CN111324199B - Terminal control method and device, terminal and readable storage medium - Google Patents

Terminal control method and device, terminal and readable storage medium

Info

Publication number
CN111324199B
CN111324199B (application CN201811528321.3A)
Authority
CN
China
Prior art keywords
user
button
gesture operation
terminal
buttons
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811528321.3A
Other languages
Chinese (zh)
Other versions
CN111324199A (en)
Inventor
黄翊凇
杨疆
梁耿
陈宣励
唐伟帼
黄坤碧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Mobile Communications Group Co Ltd
China Mobile Group Guangxi Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
China Mobile Group Guangxi Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd and China Mobile Group Guangxi Co Ltd
Priority to CN201811528321.3A
Publication of CN111324199A
Application granted
Publication of CN111324199B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 - Movements or behaviour, e.g. gesture recognition
    • G06V40/28 - Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a terminal control method and device, a terminal, and a readable storage medium. The method comprises the following steps: collecting at least two images, each of which contains information of at least one button on the user's clothing; identifying the user's hand information on the at least one button in each image and determining the user's target gesture operation; judging whether the target gesture operation exists in a pre-stored gesture operation set; and, if so, executing the pre-stored terminal control instruction corresponding to the target gesture operation. Because each of the at least two images contains information of at least one button on the user's clothing, the terminal can quickly identify the user's hand information on the at least one button in the image, which reduces the difficulty and complexity of identifying the user's hand information and improves the accuracy of terminal control.

Description

Terminal control method and device, terminal and readable storage medium
Technical Field
The present invention relates to the field of terminal technologies, and in particular, to a terminal control method, a device, a terminal, and a readable storage medium.
Background
Most terminals on the market are provided with touch screens, and a user typically controls such a terminal by operating on the touch screen. However, when the user's hands are sweaty or wet, the terminal cannot accurately sense the user's touch operations. To keep the terminal convenient to operate in these situations, an implementation scheme for controlling the terminal without touch needs to be provided for the user.
In the prior art, to realize non-touch control, the terminal collects a gesture operation performed by the user at a preset distance from the terminal, identifies the collected gesture operation, converts it into a command that the system can recognize, and executes the control corresponding to the gesture operation. However, with this method the terminal must identify the user's hand information by examining, in turn, every pixel point of each image corresponding to the collected gesture operation. Because the arrangement of pixel points in an image is irregular, the difficulty and complexity of identifying the user's hand information in the image are high, which easily leads to inaccurate control of the terminal.
Disclosure of Invention
The invention provides a terminal control method and device, a terminal, and a readable storage medium, to address the high difficulty and complexity of identifying a user's hand information and the resulting inaccurate terminal control in the prior art.
The invention provides a terminal control method, which is applied to a terminal, and comprises the following steps:
collecting at least two images, wherein each of the at least two images contains information of at least one button on the clothing of a user;
identifying hand information of a user on the at least one button in each image, and determining target gesture operation of the user;
judging whether the target gesture operation exists in a pre-stored gesture operation set or not;
and if so, executing a terminal control instruction corresponding to the pre-stored target gesture operation.
Further, the acquiring at least two images includes:
if the image is acquired for the first time, judging whether the image acquired for the first time contains information of at least one button on the clothing of the user;
if yes, the information of the at least one button is stored, and the subsequent image is collected.
Further, the acquiring at least two images includes:
At least two images are continuously acquired at set time intervals.
Further, if each image contains information of two buttons on the clothing of the user, the target gesture operation of the user includes:
the hand of the user rotates a first button of the two buttons; or
the hand of the user rotates a second button of the two buttons; or
the hand of the user moves from a first button to a second button in the two buttons; or
the hand of the user moves from the second button to the first button in the two buttons.
Further, if the target gesture operation of the user includes a rotation operation of a first button of the two buttons by a hand of the user, a terminal control instruction corresponding to the target gesture operation includes a start instruction of a first function in a terminal;
if the target gesture operation of the user comprises a rotation operation of the hand of the user on a second button of the two buttons, a terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a second function in the terminal;
if the target gesture operation of the user comprises a movement operation of the hand of the user from a first button to a second button in the two buttons, a terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a first function parameter to a first set value;
If the target gesture operation of the user comprises a movement operation of the hand of the user from a second button to a first button in the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a second function parameter to a second set value.
Further, before the capturing at least two images, the method further includes:
judging whether the user authorizes the image acquisition;
if so, the subsequent steps are performed.
The invention provides a terminal control device, which is applied to a terminal, and comprises:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring at least two images, and each image in the at least two images contains information of at least one button on the clothing of a user;
the determining module is used for identifying hand information of a user on the at least one button in each image and determining target gesture operation of the user;
the judging module is used for judging whether the target gesture operation exists in a pre-stored gesture operation set; if yes, triggering an execution module;
and the execution module is used for executing the terminal control instruction corresponding to the target gesture operation which is stored in advance.
The invention provides a terminal, which comprises a memory and a processor;
the processor is configured to read the program in the memory, and perform the following procedures: collecting at least two images, wherein each of the at least two images contains information of at least one button on the clothing of a user; identifying hand information of a user on the at least one button in each image, and determining target gesture operation of the user; judging whether the target gesture operation exists in a pre-stored gesture operation set or not; and if so, executing a terminal control instruction corresponding to the pre-stored target gesture operation.
Further, the processor is specifically configured to determine, if an image is acquired for the first time, whether the image acquired for the first time contains information of at least one button on the user's clothing; if yes, the information of the at least one button is stored, and the subsequent image is collected.
Further, the processor is specifically configured to continuously acquire at least two images at a set time interval.
Further, if each image contains information of two buttons on the clothing of the user, the target gesture operation of the user includes: the hand of the user rotates a first button of the two buttons; or
the hand of the user rotates a second button of the two buttons; or
the hand of the user moves from a first button to a second button in the two buttons; or
the hand of the user moves from the second button to the first button in the two buttons.
Further, if the target gesture operation of the user includes a rotation operation of a first button of the two buttons by a hand of the user, a terminal control instruction corresponding to the target gesture operation includes a start instruction of a first function in a terminal;
if the target gesture operation of the user comprises a rotation operation of the hand of the user on a second button of the two buttons, a terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a second function in the terminal;
if the target gesture operation of the user comprises a movement operation of the hand of the user from a first button to a second button in the two buttons, a terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a first function parameter to a first set value;
if the target gesture operation of the user comprises a movement operation of the hand of the user from a second button to a first button in the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a second function parameter to a second set value.
Further, the processor is further configured to determine whether the user authorizes image capturing; if so, at least two images are acquired.
The invention provides a terminal, comprising: the device comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory are communicated with each other through the communication bus;
the memory has stored therein a computer program which, when executed by the processor, causes the processor to perform the steps of any of the methods described above.
The present invention provides a computer readable storage medium storing a computer program executable by a terminal, which when run on the terminal causes the terminal to perform the steps of any one of the methods described above.
The invention provides a terminal control method and device, a terminal, and a readable storage medium. The method comprises the following steps: collecting at least two images, each of which contains information of at least one button on the user's clothing; identifying the user's hand information on the at least one button in each image and determining the user's target gesture operation; judging whether the target gesture operation exists in a pre-stored gesture operation set; and, if so, executing the pre-stored terminal control instruction corresponding to the target gesture operation. Because each of the at least two images contains information of at least one button on the user's clothing, the terminal can quickly identify the user's hand information on the at least one button in the image, which reduces the difficulty and complexity of identifying the user's hand information and improves the accuracy of terminal control.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of a terminal control process provided in embodiment 1 of the present invention;
fig. 2 is a schematic flow chart of terminal control provided in embodiment 5 of the present invention;
fig. 3 is a schematic structural diagram of a terminal according to embodiment 6 of the present invention;
fig. 4 is a schematic structural diagram of a terminal according to embodiment 7 of the present invention;
fig. 5 is a schematic diagram of a terminal control device according to an embodiment of the present invention.
Detailed Description
In order to reduce recognition difficulty and complexity when recognizing hand information of a user in an image and improve accuracy of terminal control, the embodiment of the invention provides a terminal control method, a device, a terminal and a readable storage medium.
The terminal includes a hardware layer, an operating system layer running on top of the hardware layer, and an application layer running on top of the operating system.
The hardware layer includes hardware such as a central processing unit (CPU), a memory management unit (MMU), and memory.
The operating system may be any one or more computer operating systems that implement terminal control through processes, such as a Linux operating system, a Unix operating system, an Android operating system, an iOS operating system, or a Windows operating system.
In addition, in the embodiment of the present invention, the terminal may be a handheld device such as a smart phone or a tablet PC, or a terminal device such as a desktop computer or a portable computer. This is not particularly limited in the embodiment of the present invention, as long as terminal control can be implemented by running a program that records the code of the terminal control method of the embodiment of the present invention.
The execution body of the terminal control in the embodiment of the invention can be a terminal or a functional module in the terminal, which can call a program and execute the program.
For the purpose of promoting an understanding of the principles and advantages of the invention, reference will now be made to the drawings; the embodiments illustrated therein are intended to illustrate the invention, not to limit it to the specific embodiments shown. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
Example 1:
fig. 1 is a schematic diagram of a terminal control process according to an embodiment of the present invention, where the process includes the following steps:
s101: at least two images are acquired, wherein each of the at least two images contains information of at least one button on the user's clothing.
The terminal control method provided by the embodiment of the invention is applied to the terminal.
The terminal is capable of acquiring at least two images. Specifically, the terminal comprises an external image capturing component, such as a front or rear camera or an infrared detector, which is used to capture external images, the motion of the user's hands, and the like.
Each of the at least two images contains information of at least one button on the user's clothing. If the external image capturing component comprises a camera, the information of the at least one button contained in the image may be the image feature information of the at least one button; if the external image capturing component comprises an infrared detector, the information of the at least one button contained in the image may be identification information of the marked at least one button, such as position information or label information.
Preferably, the external image capturing component comprises a camera.
The acquiring at least two images includes:
at least two images are continuously acquired at set time intervals.
The set time interval may be arbitrary; of course, to ensure that the continuity of the user's gesture operation is captured, the set time interval may be made small.
Specifically, the number of the at least two images may also be arbitrary; to ensure the continuity of the user's gesture operation, a larger number of images may be acquired.
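As an illustration only, the following minimal sketch shows how several frames might be acquired at a set time interval; it assumes an OpenCV-accessible camera, and the frame count and interval are placeholder values rather than values prescribed by this embodiment.

```python
# Minimal sketch of periodic image capture, assuming an OpenCV-accessible camera.
import time
import cv2

def capture_frames(num_frames=10, interval_s=0.1, device_index=0):
    """Capture num_frames images at a fixed interval and return them as a list."""
    cap = cv2.VideoCapture(device_index)
    frames, attempts = [], 0
    try:
        while len(frames) < num_frames and attempts < num_frames * 5:
            ok, frame = cap.read()
            if ok:
                frames.append(frame)
            attempts += 1
            time.sleep(interval_s)  # the "set time interval" between acquisitions
    finally:
        cap.release()
    return frames
```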
S102: and identifying hand information of the user on the at least one button in each image, and determining target gesture operation of the user.
The terminal is capable of identifying the user's hand information on the at least one button in each image, in particular by means of a GPU (Graphics Processing Unit) included in the terminal.
Because each of the acquired at least two images contains information of at least one button on the user's clothing, the terminal determines the user's hand information on the at least one button according to the button information in each image, for example by identifying the user's hand information only within a set region around each button. The buttons of the clothing worn by the user thus serve as image elements that assist gesture recognition and simplify the image background for digital graphic analysis, which reduces the difficulty and complexity of identifying the user's hand information and enriches the trigger events available to the mobile phone for gesture recognition. Using the worn clothing and its buttons as auxiliary gesture recognition elements can also improve the user experience and reduce gesture actions performed without any reference object.
After determining the hand information of the user in each of the at least two images, the terminal can determine the target gesture operation of the user according to the hand information of the user in the at least two images.
The process of determining the user's target gesture operation from the hand information in the at least two images can be realized with the prior art and is not described in detail in the embodiment of the invention.
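The idea described above of confining hand recognition to the neighbourhood of each known button can be sketched as follows. The skin-colour mask is only a stand-in detector, since the embodiment does not prescribe a particular hand-recognition algorithm, and the box format and margin are assumptions for illustration.

```python
# Sketch: use stored button positions to confine hand detection to small windows
# around each button; only these patches are analysed, not the whole frame.
import cv2

def hand_presence_near_buttons(frame_bgr, button_boxes, margin=40):
    """Return a rough 'hand present' score for the window around each button."""
    h, w = frame_bgr.shape[:2]
    scores = []
    for (x, y, bw, bh) in button_boxes:                 # (x, y, width, height) per button
        x0, y0 = max(0, x - margin), max(0, y - margin)
        x1, y1 = min(w, x + bw + margin), min(h, y + bh + margin)
        roi = cv2.cvtColor(frame_bgr[y0:y1, x0:x1], cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(roi, (0, 30, 60), (20, 150, 255))  # crude skin-like range
        scores.append(float(mask.mean()) / 255.0)       # fraction of skin-like pixels
    return scores
```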
S103: judging whether the target gesture operation exists in a pre-stored gesture operation set; if so, S104 is performed; if not, S105 is performed.
The terminal stores the gesture operation set in advance, and specifically, the gesture operation set may be stored in advance in a storage component that stores data in the terminal.
A pre-saved set of gesture operations in a terminal may be understood as a set of gesture operations for triggering an event controlling the terminal.
The terminal can judge whether the target gesture operation exists in the gesture operation set, namely whether the target gesture operation can be found in the gesture operation set.
And executing different steps according to different judging results.
S104: and executing a terminal control instruction corresponding to the pre-stored target gesture operation.
And terminal control instructions corresponding to the gesture operations in the gesture operation set are also stored in the terminal in advance, and the terminal control instructions are used for triggering events of the corresponding control terminal.
Therefore, after determining that the target gesture operation exists in the gesture operation set, the terminal can determine the terminal control instruction corresponding to the target gesture operation and execute it.
The process of executing the terminal control instruction by the terminal belongs to the prior art, and is not described in detail in the embodiment of the present invention.
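A minimal sketch of S103-S105 follows, assuming the pre-stored gesture operation set is held as a mapping from gesture identifiers to control instructions; the gesture name and handler below are illustrative placeholders, not identifiers defined by this embodiment.

```python
# Sketch of S103-S105: look the recognised target gesture up in the pre-stored set
# and execute the associated instruction if it is found.
def _start_first_function():        # placeholder control instruction
    print("start first function")

PRESTORED_GESTURE_SET = {
    "rotate_first_button": _start_first_function,
    # further gesture -> instruction entries would be registered here
}

def handle_target_gesture(target_gesture: str) -> bool:
    instruction = PRESTORED_GESTURE_SET.get(target_gesture)
    if instruction is None:         # S105: gesture not in the set, no instruction executed
        return False
    instruction()                   # S104: execute the pre-stored control instruction
    return True
```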
After the terminal executes the terminal control instruction corresponding to the target gesture operation, it can display the result of the execution. The terminal comprises a display component, including but not limited to an LCD screen, an AMOLED screen, or a screen manufactured with another technology, which is used to display the execution result, the screen content of the terminal, and the like.
S105: the terminal control instruction is not executed.
If the target gesture operation does not exist in the gesture operation set, the terminal does not execute a terminal control instruction.
Of course, in order to realize control over the terminal, the terminal may also prompt the user to reenter the gesture operation.
In addition, the terminal can also comprise an internet module for connecting with the internet and calling data materials of the server side.
According to the embodiment of the invention, each of at least two images contains the information of at least one button on the clothes of the user, and the terminal can quickly identify the hand information of the user on the at least one button in the image through the at least one button, so that the identification difficulty and complexity in identifying the hand information of the user in the image are reduced, and the accuracy of terminal control is improved.
Example 2:
on the basis of the foregoing embodiment, in an embodiment of the present invention, the capturing at least two images includes:
if the image is acquired for the first time, judging whether the image acquired for the first time contains information of at least one button on the clothing of the user;
if yes, the information of the at least one button is stored, and the subsequent image is collected.
To realize control of the terminal, when the terminal acquires an image for the first time during image acquisition, it acquires subsequent images only after identifying that the first image contains information of at least one button.
When the terminal acquires the image for the first time, judging whether the image acquired for the first time contains information of at least one button on the clothes of the user. The process of identifying the button information in the image by the terminal can be realized by adopting the prior art, and details are not repeated in the embodiment of the invention.
After the terminal identifies the information of at least one button on the user's clothes in the image, the information of the at least one button can be stored, and the information of the at least one button stored by the terminal can comprise identification information and/or position information of the at least one button and the like.
If the at least one button is at least two buttons, the at least two buttons may be buttons with a regular arrangement, for example arranged from top to bottom, from bottom to top (appearing top to bottom in the captured image), from left to right, or from right to left (appearing left to right in the captured image).
When the terminal determines that the first acquired image contains the information of at least one button on the clothes of the user and stores the information of at least one button, the terminal can acquire the subsequent image, and the terminal can acquire the subsequent image according to a set time interval.
The gesture operation of the user can be acquired by acquiring the subsequent images, so that the subsequent control of the terminal is realized.
A specific embodiment of the present invention is described below. The external image capturing component of the terminal captures an image of the user's clothing in real time, extracts image features, and transmits them to the operation component (which performs the global computation of the invention and includes a central processing unit CPU and the like for data processing). If the image processor GPU determines that the captured image contains an image element that meets the top-to-bottom button feature threshold, the element is saved as an image UI cache; the user's first and second buttons from top to bottom are identified, their approximate pixel ranges, namely the position information, are marked in the image UI cache and stored in the storage component, and the first and second buttons are marked as the image UI elements Clothes_buttons(1) and Clothes_buttons(2) respectively, namely the identification information. If the image processor GPU determines that no image element in the captured image meets the top-to-bottom button feature threshold, a recognition failure prompt is returned.
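The following sketch illustrates this first-frame step under the assumption that the "button feature threshold" test can be approximated by circle detection; the Hough-circle parameters and the cache layout are illustrative only, not part of the patent.

```python
# Sketch of the first-frame step: find button-like elements, order them top to
# bottom, and cache the first two as Clothes_buttons(1) and Clothes_buttons(2).
import cv2
import numpy as np

def detect_clothes_buttons(frame_bgr, max_buttons=2):
    gray = cv2.medianBlur(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY), 5)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                               param1=100, param2=30, minRadius=8, maxRadius=40)
    if circles is None:
        return None                      # no element meets the button feature threshold
    # Sort top to bottom and keep the first max_buttons entries as the UI cache.
    found = sorted(np.round(circles[0]).astype(int).tolist(), key=lambda c: c[1])
    cache = {f"Clothes_buttons({i + 1})": {"x": x, "y": y, "radius": r}
             for i, (x, y, r) in enumerate(found[:max_buttons])}
    return cache                         # e.g. {"Clothes_buttons(1)": {...}, ...}
```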
In the embodiment of the invention, after the terminal determines that the first acquired image contains the information of at least one button on the user's clothes, the terminal acquires the subsequent image, so that the control of the terminal is realized.
Example 3:
based on the above embodiments, in the embodiment of the present invention, if each image includes information of two buttons on the clothing of the user, the target gesture operation of the user includes:
the hand of the user rotates a first button of the two buttons; or
the hand of the user rotates a second button of the two buttons; or
the hand of the user moves from a first button to a second button in the two buttons; or
the hand of the user moves from the second button to the first button in the two buttons.
In order to improve the flexibility of terminal control, if each image contains information of two buttons on the clothes of the user, the target gesture of the user can be the operation of a single button or the operation of a plurality of buttons.
The user's target gesture operation including a rotation operation of the user's hand on the first of the two buttons can, in the above embodiment, be understood as the user grasping Clothes_buttons(1) with one hand and rotating it through a certain angle.
The user's target gesture operation including a rotation operation of the user's hand on the second of the two buttons can, in the above embodiment, be understood as the user grasping Clothes_buttons(2) with one hand and rotating it through a certain angle.
The user's target gesture operation including a movement operation of the user's hand from the first button to the second button can, in the above embodiment, be understood as the user grasping Clothes_buttons(1) with one hand and then moving the fingers downward from Clothes_buttons(1) to Clothes_buttons(2).
The user's target gesture operation including a movement operation of the user's hand from the second button to the first button can, in the above embodiment, be understood as the user grasping Clothes_buttons(2) with one hand and then moving the fingers upward from Clothes_buttons(2) to Clothes_buttons(1).
If each image includes information of one button on the user's clothing, the target gesture of the user may include a rotation operation of the user's hand on the one button, specifically, a clockwise rotation operation of the user's hand on the one button, and/or a counterclockwise rotation operation of the user's hand on the one button.
If each image includes information of more than two buttons on the user's clothing, the user's target gesture operation may include a rotation operation of the user's hand on any one of the first to N-th buttons, where N is the number of buttons, as well as movement operations of the user's hand between those buttons; that is, the target gesture operations are similar to those for the case of two buttons and are not repeated here.
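For the two-button case described above, the following sketch shows one way the target gesture operation could be classified from a per-frame sequence of hand positions relative to Clothes_buttons(1) and Clothes_buttons(2). The nearest-button and angular-sweep heuristics are assumptions for illustration, not the method prescribed by this embodiment.

```python
# Sketch of classifying the target gesture from a per-frame hand-position track
# relative to the two cached button centres.
import math

def nearest_button(hand_xy, button_centres):
    """button_centres: e.g. {"Clothes_buttons(1)": (x, y), "Clothes_buttons(2)": (x, y)}."""
    return min(button_centres, key=lambda name: math.dist(hand_xy, button_centres[name]))

def classify_gesture(hand_track, button_centres, min_sweep_deg=60.0):
    start = nearest_button(hand_track[0], button_centres)
    end = nearest_button(hand_track[-1], button_centres)
    if start != end:                              # hand moved from one button to the other
        return f"move_{start}_to_{end}"
    bx, by = button_centres[start]                # same button throughout: treat as rotation
    angles = [math.atan2(y - by, x - bx) for x, y in hand_track]
    sweep = math.degrees(abs(angles[-1] - angles[0]))
    return f"rotate_{start}" if sweep >= min_sweep_deg else None
```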
By using operations on the user's buttons as gesture recognition elements, forward and reverse operations of rotary knobs, touch buttons, and progress bars can be simulated, and motion capture can be realized with low-precision, low-frequency scanning motion-capture hardware, thereby reducing hardware cost.
Example 4:
on the basis of the above embodiments, in the embodiment of the present invention, if the target gesture operation of the user includes a rotation operation of a hand of the user on a first button of the two buttons, the terminal control instruction corresponding to the target gesture operation includes a start instruction of a first function in the terminal;
if the target gesture operation of the user comprises a rotation operation of the hand of the user on a second button of the two buttons, a terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a second function in the terminal;
If the target gesture operation of the user comprises a movement operation of the hand of the user from a first button to a second button in the two buttons, a terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a first function parameter to a first set value;
if the target gesture operation of the user comprises a movement operation of the hand of the user from a second button to a first button in the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a second function parameter to a second set value.
In order to improve the flexibility of terminal control, if each image contains information of two buttons on the clothes of the user, the user executes corresponding terminal control instructions through target gestures.
When each image contains information of two buttons on the user's clothing, if the user's target gesture operation includes a rotation operation of the user's hand on the first of the two buttons, the terminal control instruction corresponding to the target gesture operation includes a start instruction of a first function in the terminal. The start instruction of the first function can be regarded as trigger event 1 for controlling the terminal, meaning that the user needs to start a certain button or continuous behaviour operation, such as pressing the "photographing key", or starting an application and selecting a function within it.
When each image contains information of two buttons on the user's clothing, if the user's target gesture operation includes a rotation operation of the user's hand on the second of the two buttons, the terminal control instruction corresponding to the target gesture operation includes a start instruction of a second function in the terminal. The start instruction of the second function can be regarded as trigger event 2 for controlling the terminal, again meaning that the user needs to start a certain button or continuous behaviour operation. The second function may be the same as or different from the first function; if different, it may be the opposite of the first function, such as clicking the "exit/back" button.
When each image contains information of two buttons on the user's clothing, if the user's target gesture operation includes a movement operation of the user's hand from the first button to the second button, the terminal control instruction corresponding to the target gesture operation includes updating the parameter value of a first function parameter to a first set value. Updating the parameter value of the first function parameter to the first set value can be regarded as changing the value of a progress bar, or of a settable option, in the currently started application; for example, if the user is currently using a music APP, the update reduces the music volume, and if the user is currently using a text-reading APP, the update drags the page text from top to bottom, and so on.
When each image contains information of two buttons on the user's clothing, if the user's target gesture operation includes a movement operation of the user's hand from the second button to the first button, the terminal control instruction corresponding to the target gesture operation includes updating the parameter value of a second function parameter to a second set value. This can likewise be regarded as changing the value of a progress bar, or of a settable option, in the currently started application. The first and second function parameters may be the same or different, and the first and second set values may be the same or different. For example, when the second function parameter is the same as the first function parameter, if the user is currently using a music APP the update increases the music volume, and if the user is currently using a text-reading APP the update drags the page text from bottom to top, and so on.
When each image contains information of one button, or of more than two buttons, on the user's clothing, the terminal control instructions corresponding to the target gesture operations can be set similarly to the two-button case, according to the actual use requirements of the user, and are not repeated here.
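A minimal sketch of the gesture-to-instruction mapping described in this embodiment follows, using the music-volume and text-scrolling examples given above; the action functions are placeholders for whatever the terminal actually invokes, and the gesture keys match the hypothetical classifier sketched earlier.

```python
# Sketch of this embodiment's mapping from recognised gestures to control instructions.
def start_first_function():
    print("trigger event 1: e.g. press the photographing key")

def start_second_function():
    print("trigger event 2: e.g. click the exit/back button")

def set_first_parameter_to_first_value():
    print("trigger event 3: e.g. lower music volume or scroll page text downwards")

def set_second_parameter_to_second_value():
    print("trigger event 4: e.g. raise music volume or scroll page text upwards")

TWO_BUTTON_GESTURE_MAP = {
    "rotate_Clothes_buttons(1)": start_first_function,
    "rotate_Clothes_buttons(2)": start_second_function,
    "move_Clothes_buttons(1)_to_Clothes_buttons(2)": set_first_parameter_to_first_value,
    "move_Clothes_buttons(2)_to_Clothes_buttons(1)": set_second_parameter_to_second_value,
}
```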
Example 5:
based on the foregoing embodiments, in an embodiment of the present invention, before the capturing at least two images, the method further includes:
judging whether the user authorizes the image acquisition;
if so, the subsequent steps are performed.
In order to realize the flexibility of terminal control and meet different realization demands of users, when the users authorize image acquisition, the terminal acquires at least two images again.
The terminal may store an identification of whether image acquisition has been authorized. Specifically, authorization of image acquisition can be understood as the user authorizing whether image capturing of the user's clothing in real time is allowed.
A specific embodiment is described below with reference to fig. 2. 1. When a user uses a mobile phone device with the features of the present invention, the external image capturing component, with the user's permission and authorization, captures images of the user's clothing in real time, extracts image features, and transmits them to the operation component;
2. The operation component invokes the image processor GPU to identify whether the user's clothing has a top-to-bottom button feature. If the image processor GPU determines that no image element in the captured image meets the top-to-bottom button feature threshold, a recognition failure prompt is returned.
3. If the image processor GPU determines that the captured image contains an image element that meets the top-to-bottom button feature threshold, the element is saved as an image UI cache; the first and second buttons of the user from top to bottom are identified, their approximate pixel ranges are marked in the image UI cache and saved in the storage component, the first button is marked as the image UI element Clothes_buttons(1), and the second button is marked as the image UI element Clothes_buttons(2).
4. The operation component gives the following user actions the authority to trigger certain operation of the mobile phone according to the preset scheme or the scheme selected by the user:
4-1: grasping Clothes_buttons(1) with one hand and rotating it through a certain angle is regarded as start trigger event 1;
4-2: grasping Clothes_buttons(2) with one hand and rotating it through a certain angle is regarded as start trigger event 2;
4-3: the user grasping Clothes_buttons(1) with one hand and then moving the fingers downward from Clothes_buttons(1) to Clothes_buttons(2) is regarded as trigger event 3;
4-4: the user grasping Clothes_buttons(2) with one hand and then moving the fingers upward from Clothes_buttons(2) to Clothes_buttons(1) is regarded as trigger event 4;
5. The external image capturing component continuously carries out gesture recognition capturing on the actions of a user according to a certain period interval, and sends an image stream to the image processing chip GPU for recognition;
6. The image processing chip GPU retrieves the data saved in the storage component, compares it with the captured image stream data, and, if it judges that the consecutive frames of images captured over a certain period of time meet any event threshold, notifies the operation component to perform the trigger event operation of the preset scheme.
7. For a confirmed trigger event, the following processing is performed:
7-1: Trigger event 1 is regarded by the invention as the user requiring that a certain button or continuous behaviour operation be started, for example a preset in which grasping Clothes_buttons(1) and rotating it through a certain angle starts the "photographing key", or launches a certain application and selects a certain function in that application.
7-2: Trigger event 2 is regarded by the invention as the user requiring that a certain button or continuous behaviour operation be started, which may be the same as or different from event 1; for example, a preset in which grasping Clothes_buttons(2) and rotating it through a certain angle corresponds to the user clicking the "exit/back" button;
7-3: Trigger event 3 is regarded by the invention as the user operating a progress bar control of the currently used APP so that the value of the progress bar, or of a settable option of the application, is decreased; for example, if the user is currently using a music APP, event 3 lowers the music volume, and if the user is currently using a text-reading APP, event 3 drags the page text from top to bottom for display.
7-4: Trigger event 4 is regarded by the invention as the user operating a progress bar control of the currently used APP so that the value of the progress bar, or of a settable option of the application, is increased; for example, if the user is currently using a music APP, event 4 raises the music volume, and if the user is currently using a text-reading APP, event 4 drags the page text from bottom to top for display.
8. If the image processing chip GPU judges that the consecutive frames of images captured over a certain period of time do not reach any event threshold, no further processing is performed;
9. Step 5 is repeated until the user opts out of the service of the invention. A minimal sketch of this loop is given below.
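The sketch below ties the steps above together as a single loop, assuming the hypothetical helpers from the earlier sketches (frame capture, button detection, hand tracking, gesture classification, and event dispatch) are supplied as callables; it is an illustrative composition, not the reference implementation of the invention.

```python
# Illustrative composition of steps 1-9; every helper passed in is hypothetical.
import time

def run_control_loop(user_authorized, capture_frames, detect_buttons, track_hand,
                     classify_gesture, dispatch, period_s=0.5, should_stop=lambda: False):
    if not user_authorized:                       # step 1: user permission is required
        return
    first_frame = capture_frames(num_frames=1)[0]
    buttons = detect_buttons(first_frame)         # steps 2-3: cache Clothes_buttons(1)/(2)
    if buttons is None:
        print("recognition failed: no top-to-bottom button feature found")
        return
    while not should_stop():                      # steps 5-9: periodic capture and matching
        frames = capture_frames(num_frames=8)     # step 5: capture an image stream
        hand_track = [track_hand(f, buttons) for f in frames]
        gesture = classify_gesture(hand_track, buttons)   # step 6: compare with thresholds
        if gesture is not None:
            dispatch(gesture)                     # step 7: perform the mapped trigger event
        time.sleep(period_s)                      # step 9: repeat until the user opts out
```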
Example 6:
on the basis of the above embodiments, the embodiment of the present invention further provides a terminal, as shown in fig. 3, including: a processor 301 and a memory 302;
the processor 301 is configured to read the program in the memory 302 and perform the following process:
collecting at least two images, wherein each of the at least two images contains information of at least one button on the clothing of a user; identifying hand information of a user on the at least one button in each image, and determining target gesture operation of the user; judging whether the target gesture operation exists in a pre-stored gesture operation set or not; and if so, executing a terminal control instruction corresponding to the pre-stored target gesture operation.
Based on the same inventive concept, the embodiment of the present invention further provides a terminal, and since the principle of solving the problem of the terminal is similar to that of the terminal control method, implementation of the terminal can refer to implementation of the method, and repeated parts are not repeated.
In fig. 3, a bus architecture may comprise any number of interconnected buses and bridges, with one or more processors, represented by processor 301, and various circuits of memory, represented by memory 302, being linked together. The bus architecture may also link together various other circuits such as peripheral devices, voltage regulators, power management circuits, etc., which are well known in the art and, therefore, will not be described further herein. The transceiver 303 may be a number of elements, i.e. comprising a transmitter and a receiver, providing a unit for communicating with various other apparatus over a transmission medium. The processor 301 is responsible for managing the bus architecture and general processing, and the memory 302 may store data used by the processor 301 in performing operations.
Alternatively, the processor 301 may be a CPU (Central Processing Unit), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a CPLD (Complex Programmable Logic Device).
The processor 301 is specifically configured to determine, if an image is acquired for the first time, whether the image acquired for the first time contains information of at least one button on the user's clothing; if yes, the information of the at least one button is stored, and the subsequent image is collected.
The processor 301 is specifically configured to continuously acquire at least two images at a set time interval.
If each image contains information of two buttons on the clothing of the user, the target gesture operation of the user comprises: the hand of the user rotates a first button of the two buttons; or
the hand of the user rotates a second button of the two buttons; or
the hand of the user moves from a first button to a second button in the two buttons; or
the hand of the user moves from the second button to the first button in the two buttons.
If the target gesture operation of the user comprises a rotation operation of a hand of the user on a first button of the two buttons, a terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a first function in a terminal;
If the target gesture operation of the user comprises a rotation operation of the hand of the user on a second button of the two buttons, a terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a second function in the terminal;
if the target gesture operation of the user comprises a movement operation of the hand of the user from a first button to a second button in the two buttons, a terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a first function parameter to a first set value;
if the target gesture operation of the user comprises a movement operation of the hand of the user from a second button to a first button in the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a second function parameter to a second set value.
The processor 301 is further configured to determine whether the user authorizes image acquisition; if so, at least two images are acquired.
Example 7:
on the basis of the above embodiments, the embodiment of the present invention further provides a terminal, as shown in fig. 4, including: the processor 401, the communication interface 402, the memory 403 and the communication bus 404, wherein the processor 401, the communication interface 402 and the memory 403 complete communication with each other through the communication bus 404;
The memory 403 has stored therein a computer program which, when executed by the processor 401, causes the processor 401 to perform the steps of any of the methods in the embodiments described above.
The communication bus mentioned for the above terminal may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one bold line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
The communication interface 402 is used for communication between the above-described terminal and other devices.
The memory may include random access memory (RAM) or non-volatile memory (NVM), such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit, a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit, a field programmable gate array or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and the like.
Example 8:
on the basis of the above embodiments, the embodiments of the present invention further provide a computer-readable storage medium having stored therein a computer program executable by a terminal which, when run on the terminal, causes the terminal to perform the steps of any of the methods in the above embodiments.
The computer readable storage medium may be any available medium or data storage device that can be accessed by a processor in the terminal, including but not limited to magnetic memories such as floppy disks, hard disks, magnetic tapes and magneto-optical disks (MO), optical memories such as CD, DVD, BD and HVD, and semiconductor memories such as ROM, EPROM, EEPROM, non-volatile memory (NAND FLASH) and solid state disks (SSD).
Fig. 5 is a schematic diagram of a terminal control device provided in an embodiment of the present invention, where the device is applied to a terminal, and the device includes:
the acquisition module 501 is configured to acquire at least two images, where each of the at least two images includes information of at least one button on the user's clothing;
a determining module 502, configured to identify hand information of a user on the at least one button in each image, and determine a target gesture operation of the user;
A judging module 503, configured to judge whether the target gesture operation exists in a pre-stored gesture operation set; if so, the execution module 504 is triggered;
and the execution module 504 is configured to execute a terminal control instruction corresponding to the target gesture operation, where the terminal control instruction is stored in advance.
The acquiring module 501 is specifically configured to determine, if an image is acquired for the first time, whether the image acquired for the first time contains information of at least one button on the user's clothing; if yes, the information of the at least one button is stored, and the subsequent image is collected.
The acquisition module 501 is specifically configured to continuously acquire at least two images at a set time interval.
If each image contains information of two buttons on the clothing of the user, the target gesture operation of the user comprises:
the hand of the user rotates a first button of the two buttons; or
the hand of the user rotates a second button of the two buttons; or
the hand of the user moves from a first button to a second button in the two buttons; or
the hand of the user moves from the second button to the first button in the two buttons.
If the target gesture operation of the user comprises a rotation operation of a hand of the user on a first button of the two buttons, a terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a first function in a terminal;
if the target gesture operation of the user comprises a rotation operation of the hand of the user on a second button of the two buttons, a terminal control instruction corresponding to the target gesture operation comprises a starting instruction of a second function in the terminal;
if the target gesture operation of the user comprises a movement operation of the hand of the user from a first button to a second button in the two buttons, a terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a first function parameter to a first set value;
if the target gesture operation of the user comprises a movement operation of the hand of the user from a second button to a first button in the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a second function parameter to a second set value.
The acquisition module is also used for judging whether the user authorizes the image acquisition; if so, the at least two images are acquired.
According to the embodiment of the invention, each of at least two images contains the information of at least one button on the clothes of the user, and the terminal can quickly identify the hand information of the user on the at least one button in the image through the at least one button, so that the identification difficulty and complexity in identifying the hand information of the user in the image are reduced, and the accuracy of terminal control is improved.
The description of the system/device embodiments is relatively brief because they are substantially similar to the method embodiments; for relevant details, reference is made to the corresponding description of the method embodiments.
It should be noted that in this document relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (13)

1. A terminal control method, applied to a terminal, comprising:
collecting at least two images, wherein each of the at least two images contains information of at least one button on the clothing of a user;
identifying hand information of the user on the at least one button in each image, and determining a target gesture operation of the user;
judging whether the target gesture operation exists in a pre-stored gesture operation set;
if yes, executing a terminal control instruction corresponding to the pre-stored target gesture operation;
wherein, if each image contains information of two buttons on the clothing of the user, the target gesture operation of the user comprises:
the hand of the user rotates a first button of the two buttons; or
the hand of the user rotates a second button of the two buttons; or
the hand of the user moves from the first button to the second button of the two buttons; or
the hand of the user moves from the second button to the first button of the two buttons.
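For illustration only, the control flow of claim 1 can be sketched in Python roughly as follows. Every identifier here (Frame, classify_gesture, PRESTORED_GESTURES and so on) is a hypothetical placeholder chosen for this sketch, and the nearest-button heuristic is a deliberately crude stand-in for the recognition step described in the claim.

import math
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Frame:
    buttons: List[Tuple[float, float]]   # detected button centres on the user's clothing
    hand_center: Tuple[float, float]     # detected hand position in the same image

# Pre-stored gesture operation set, mapped to terminal control instructions.
PRESTORED_GESTURES = {
    "rotate_first_button": "start first function",
    "rotate_second_button": "start second function",
    "move_first_to_second": "set first function parameter to first set value",
    "move_second_to_first": "set second function parameter to second set value",
}

def nearest_button(frame: Frame) -> int:
    """Index of the button closest to the hand centre in one frame."""
    return min(range(len(frame.buttons)),
               key=lambda i: math.dist(frame.hand_center, frame.buttons[i]))

def classify_gesture(frames: List[Frame]) -> Optional[str]:
    """Crude stand-in for gesture recognition: compare which button the hand is
    nearest in the first and the last frame of the sequence."""
    if len(frames) < 2 or len(frames[0].buttons) < 2:
        return None
    start, end = nearest_button(frames[0]), nearest_button(frames[-1])
    if start == end:
        return "rotate_first_button" if start == 0 else "rotate_second_button"
    return "move_first_to_second" if (start, end) == (0, 1) else "move_second_to_first"

def execute_terminal_instruction(instruction: str) -> None:
    print("executing terminal control instruction:", instruction)

def handle_frames(frames: List[Frame]) -> None:
    gesture = classify_gesture(frames)
    if gesture in PRESTORED_GESTURES:          # act only on gestures in the pre-stored set
        execute_terminal_instruction(PRESTORED_GESTURES[gesture])

# Hand starts near the first button and ends near the second button.
handle_frames([Frame(buttons=[(0.0, 0.0), (10.0, 0.0)], hand_center=(1.0, 0.0)),
               Frame(buttons=[(0.0, 0.0), (10.0, 0.0)], hand_center=(9.0, 0.0))])

Running the example prints the instruction mapped to the "move from first to second button" gesture; a real recognizer would of course use the full image sequence and finer hand features rather than only the first and last frames.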
2. The method of claim 1, wherein the collecting at least two images comprises:
if an image is collected for the first time, judging whether the image collected for the first time contains information of at least one button on the clothing of the user;
if yes, storing the information of the at least one button, and collecting subsequent images.
3. The method of claim 1 or 2, wherein the collecting at least two images comprises:
continuously collecting at least two images at a set time interval.
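As a rough illustration of claims 2 and 3, the sketch below validates the first image for button information, stores it, and only then keeps sampling further frames at a fixed interval. The capture and detect_buttons callables are hypothetical placeholders injected by the caller, not APIs defined in the patent.

import time

def collect_frames(capture, detect_buttons, interval_s=0.2, count=10):
    """Collect `count` frames: validate and store the button information found in
    the first frame, then keep sampling at the set time interval."""
    first = capture()
    stored_buttons = detect_buttons(first)     # button information from the first image
    if not stored_buttons:                     # no button on the user's clothing: stop here
        return None, []
    frames = [first]
    for _ in range(count - 1):
        time.sleep(interval_s)                 # the "set time interval" between collections
        frames.append(capture())
    return stored_buttons, frames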
4. The method of claim 1, wherein, if the target gesture operation of the user comprises a rotation operation performed by the hand of the user on a first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a start instruction for a first function in the terminal;
if the target gesture operation of the user comprises a rotation operation performed by the hand of the user on a second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a start instruction for a second function in the terminal;
if the target gesture operation of the user comprises a movement operation of the hand of the user from the first button to the second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a first function parameter to a first set value;
if the target gesture operation of the user comprises a movement operation of the hand of the user from the second button to the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a second function parameter to a second set value.
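A hypothetical dispatch for the four cases of claim 4 might look like the following; the function names, parameter names and preset values are invented for illustration and are not defined by the patent.

FIRST_SET_VALUE = 100      # illustrative preset, e.g. a parameter raised to its maximum
SECOND_SET_VALUE = 0       # illustrative preset, e.g. the same parameter set to its minimum

class DemoTerminal:
    """Toy terminal used only to show the dispatch."""
    def start_function(self, name):
        print("start instruction for", name)
    def set_parameter(self, name, value):
        print("update", name, "to", value)

def apply_instruction(gesture, terminal):
    if gesture == "rotate_first_button":
        terminal.start_function("first function")
    elif gesture == "rotate_second_button":
        terminal.start_function("second function")
    elif gesture == "move_first_to_second":
        terminal.set_parameter("first function parameter", FIRST_SET_VALUE)
    elif gesture == "move_second_to_first":
        terminal.set_parameter("second function parameter", SECOND_SET_VALUE)

apply_instruction("move_first_to_second", DemoTerminal())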
5. The method of claim 1 or 2, wherein, prior to the collecting at least two images, the method further comprises:
judging whether the user has authorized image collection;
if yes, performing the subsequent steps.
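The authorization gate of claim 5 could be sketched as below; user_authorized stands in for whatever permission prompt or platform permission API a real terminal would use, and both callables are hypothetical.

def user_authorized(permission_name: str) -> bool:
    """Stand-in for a real permission prompt or platform permission API."""
    return True

def run_if_authorized(collect, process):
    if not user_authorized("clothing-button image collection"):
        return None                    # user declined: skip collection and all later steps
    return process(collect())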
6. A terminal control apparatus, applied to a terminal, comprising:
an acquisition module, configured to collect at least two images, wherein each of the at least two images contains information of at least one button on the clothing of a user;
a determination module, configured to identify hand information of the user on the at least one button in each image and to determine a target gesture operation of the user;
a judgment module, configured to judge whether the target gesture operation exists in a pre-stored gesture operation set, and if yes, to trigger an execution module;
the execution module, configured to execute a terminal control instruction corresponding to the pre-stored target gesture operation;
wherein, if each image contains information of two buttons on the clothing of the user, the target gesture operation of the user comprises: the hand of the user rotates a first button of the two buttons; or
the hand of the user rotates a second button of the two buttons; or
the hand of the user moves from the first button to the second button of the two buttons; or
the hand of the user moves from the second button to the first button of the two buttons.
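One possible object-oriented reading of the apparatus in claim 6, with one small class per claimed module and all dependencies injected, is sketched below; the class and method names are illustrative, and the patent does not prescribe this structure.

class AcquisitionModule:
    def __init__(self, capture):
        self.capture = capture
    def acquire(self, count=10):
        return [self.capture() for _ in range(count)]

class DeterminationModule:
    def __init__(self, classifier):
        self.classifier = classifier
    def determine(self, frames):
        return self.classifier(frames)

class JudgmentModule:
    def __init__(self, gesture_set):
        self.gesture_set = gesture_set
    def judge(self, gesture):
        return gesture is not None and gesture in self.gesture_set

class ExecutionModule:
    def __init__(self, command_table, run_command):
        self.command_table = command_table
        self.run_command = run_command
    def execute(self, gesture):
        self.run_command(self.command_table[gesture])

class TerminalControlDevice:
    """Wires the four claimed modules together; all dependencies are injected."""
    def __init__(self, acquisition, determination, judgment, execution):
        self.acquisition = acquisition
        self.determination = determination
        self.judgment = judgment
        self.execution = execution
    def run_once(self):
        frames = self.acquisition.acquire()
        gesture = self.determination.determine(frames)
        if self.judgment.judge(gesture):       # the judgment module triggers the execution module
            self.execution.execute(gesture)

Keeping the wiring in a thin TerminalControlDevice makes the claimed split (acquisition, determination, judgment, execution) directly visible in code.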
7. A terminal, comprising a memory and a processor, wherein
the processor is configured to read a program in the memory and perform the following process: collecting at least two images, wherein each of the at least two images contains information of at least one button on the clothing of a user; identifying hand information of the user on the at least one button in each image, and determining a target gesture operation of the user; judging whether the target gesture operation exists in a pre-stored gesture operation set; and if yes, executing a terminal control instruction corresponding to the pre-stored target gesture operation;
wherein, if each image contains information of two buttons on the clothing of the user, the target gesture operation of the user comprises: the hand of the user rotates a first button of the two buttons; or
the hand of the user rotates a second button of the two buttons; or
the hand of the user moves from the first button to the second button of the two buttons; or
the hand of the user moves from the second button to the first button of the two buttons.
8. The terminal of claim 7, wherein the processor is specifically configured to: if an image is collected for the first time, judge whether the image collected for the first time contains information of at least one button on the clothing of the user; and if yes, store the information of the at least one button and collect subsequent images.
9. The terminal according to claim 7 or 8, wherein the processor is configured to continuously collect at least two images at a set time interval.
10. The terminal of claim 7, wherein, if the target gesture operation of the user comprises a rotation operation performed by the hand of the user on a first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a start instruction for a first function in the terminal;
if the target gesture operation of the user comprises a rotation operation performed by the hand of the user on a second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises a start instruction for a second function in the terminal;
if the target gesture operation of the user comprises a movement operation of the hand of the user from the first button to the second button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a first function parameter to a first set value;
if the target gesture operation of the user comprises a movement operation of the hand of the user from the second button to the first button of the two buttons, the terminal control instruction corresponding to the target gesture operation comprises updating a parameter value of a second function parameter to a second set value.
11. The terminal of claim 7 or 8, wherein the processor is further configured to judge whether the user has authorized image collection; and if yes, to collect the at least two images.
12. A terminal, comprising: a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with one another through the communication bus;
the memory has stored therein a computer program which, when executed by the processor, causes the processor to perform the steps of the method of any of claims 1 to 5.
13. A computer-readable storage medium storing a computer program executable by a terminal, wherein the computer program, when run on the terminal, causes the terminal to perform the steps of the method according to any one of claims 1 to 5.
CN201811528321.3A 2018-12-13 2018-12-13 Terminal control method and device, terminal and readable storage medium Active CN111324199B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811528321.3A CN111324199B (en) 2018-12-13 2018-12-13 Terminal control method and device, terminal and readable storage medium

Publications (2)

Publication Number Publication Date
CN111324199A (en) 2020-06-23
CN111324199B (en) 2023-04-28

Family

ID=71170350

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811528321.3A Active CN111324199B (en) 2018-12-13 2018-12-13 Terminal control method and device, terminal and readable storage medium

Country Status (1)

Country Link
CN (1) CN111324199B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013109795A1 (en) * 2012-01-17 2013-07-25 Blast Motion Inc. Intelligent motion capture element
CN104978144A (en) * 2015-06-26 2015-10-14 Industrial and Commercial Bank of China Ltd. Gesture password input device and system and method for transaction based on system
CN105717900A (en) * 2016-04-26 2016-06-29 South China University of Technology Smart home control gloves and home control, custom control gesture method thereof
CN106648347A (en) * 2016-09-14 2017-05-10 Shenzhen Branch of Shanghai Shifu Network Technology Co., Ltd. Method for operating and moving target objects on touch terminal in touch mode
CN106886275A (en) * 2015-12-15 2017-06-23 BYD Co., Ltd. Control method and device for vehicle-mounted terminal, and vehicle
CN107450839A (en) * 2017-07-28 2017-12-08 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Control method, device, storage medium and mobile terminal based on blank screen gesture
CN107577411A (en) * 2017-07-19 2018-01-12 Chen Zonglin Motor speed regulation method based on touch gesture recognition
CN107831987A (en) * 2017-11-22 2018-03-23 Mobvoi Information Technology Co., Ltd. Method and device for preventing false touch in gesture operation
CN108139855A (en) * 2015-08-19 2018-06-08 LG Electronics Inc. Watch style mobile terminal
CN108298078A (en) * 2013-07-31 2018-07-20 SZ DJI Technology Co., Ltd. Long-range control method and terminal
CN108921101A (en) * 2018-07-04 2018-11-30 Baidu Online Network Technology (Beijing) Co., Ltd. Processing method, device and readable storage medium for control instructions based on gesture recognition

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2398279B1 (en) * 2012-06-22 2014-01-21 Crambo, S.A. Activation of an application on a programmable device by gesturing on an image
US9575560B2 (en) * 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device
US10853029B2 (en) * 2015-06-18 2020-12-01 Googale (2009) Ltd. Computerized system including rules for a rendering system accessible to non-literate users via a touch screen

Similar Documents

Publication Publication Date Title
CN105824559B (en) False touch recognition and processing method and electronic equipment
KR102033863B1 (en) Device and method for processing touch input based on intensity
CN106485124B (en) Operation control method of mobile terminal and mobile terminal
CN107463331B (en) Gesture track simulation method and device and electronic equipment
WO2019001152A1 (en) Photographing method and mobile terminal
CN106250021B (en) Photographing control method and mobile terminal
CN107509030B (en) focusing method and mobile terminal
KR20160149262A (en) Touch point recognition method and device
US8149281B2 (en) Electronic device and method for operating a presentation application file
CN106445328B (en) Unlocking method of mobile terminal screen and mobile terminal
CN107172347B (en) Photographing method and terminal
WO2019091215A1 (en) Method and device for implementing screen capture, and storage medium
CN106981048B (en) Picture processing method and device
CN106934389B (en) A kind of fingerprint identification method and mobile terminal
CN105159494A (en) Information display method and device
CN105808129B (en) Method and device for quickly starting software function by using gesture
CN110580124A (en) Image display method and device
CN108846339B (en) Character recognition method and device, electronic equipment and storage medium
CN111158811A (en) Advertisement processing method and device, electronic equipment and storage medium
CN107315529B (en) Photographing method and mobile terminal
CN111324199B (en) Terminal control method and device, terminal and readable storage medium
CN103870117B (en) A kind of information processing method and electronic equipment
US9235694B2 (en) Recording medium, authentication device, and authentication method
CN109002293B (en) UI element display method and device, electronic equipment and storage medium
CN107678612B (en) Mobile payment method, device, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 117, Minzu Avenue, Qingxiu District, Nanning City, Guangxi Zhuang Autonomous Region, 530028

Applicant after: GUANGXI BRANCH, CHINA MOBILE COMMUNICATION Group

Applicant after: CHINA MOBILE COMMUNICATIONS GROUP Co.,Ltd.

Address before: 530000 No. 55, Jinhu Road, Nanning, the Guangxi Zhuang Autonomous Region

Applicant before: GUANGXI BRANCH, CHINA MOBILE COMMUNICATION Group

Applicant before: CHINA MOBILE COMMUNICATIONS GROUP Co.,Ltd.

GR01 Patent grant