CN108563940B - Control method and mobile terminal - Google Patents

Control method and mobile terminal

Info

Publication number
CN108563940B
CN108563940B (Application CN201810380027.6A)
Authority
CN
China
Prior art keywords
target
texture information
track
mobile terminal
control operation
Prior art date
Legal status
Active
Application number
CN201810380027.6A
Other languages
Chinese (zh)
Other versions
CN108563940A (en)
Inventor
顾瀚之
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810380027.6A priority Critical patent/CN108563940B/en
Publication of CN108563940A publication Critical patent/CN108563940A/en
Application granted granted Critical
Publication of CN108563940B publication Critical patent/CN108563940B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72469 User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a control method and a mobile terminal. The method includes: when a target area including first texture information is displayed on a current interface, acquiring second texture information captured by a camera; when the first texture information matches the second texture information, updating the display content of the target area to a target identifier; acquiring a movement track of the target identifier; and executing a control operation corresponding to the movement track. The first texture information includes at least one of palm print information and knuckle print information. This can effectively simplify user operations.

Description

Control method and mobile terminal
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a control method and a mobile terminal.
Background
With the development of science and technology, communication technology has advanced rapidly, and the adoption of intelligent electronic products has reached an unprecedented level. Intelligent terminals and mobile terminals, such as smart phones, smart televisions and computers, have become an indispensable part of people's lives.
At present, to start an application program while a mobile terminal is in the screen-locked state, the user usually unlocks the mobile terminal first, switches to the desktop on which application icons are displayed, and then taps the corresponding application icon to start the application program, so the whole process involves cumbersome operations.
Therefore, in the prior art, starting an application program while the mobile terminal is in the screen-locked state requires cumbersome operations.
Disclosure of Invention
Embodiments of the invention provide a control method and a mobile terminal, aiming to solve the problem that starting an application program in the screen-locked state requires cumbersome operations.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a control method, including:
under the condition that a target area including first texture information is displayed on a current interface, second texture information acquired by a camera is acquired;
under the condition that the first texture information is matched with the second texture information, updating the display content of the target area into a target identifier;
acquiring a moving track of the target identifier;
executing control operation corresponding to the moving track;
wherein the first texture information includes at least one of palm print information and knuckle print information.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, including:
the first obtaining module is used for obtaining second texture information collected by the camera under the condition that a target area including the first texture information is displayed on a current interface;
the updating module is used for updating the display content of the target area into a target identifier under the condition that the first texture information is matched with the second texture information;
the second acquisition module is used for acquiring the moving track of the target identifier;
the execution module is used for executing the control operation corresponding to the movement track;
wherein the first texture information includes at least one of palm print information and knuckle print information.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and operable on the processor, where the computer program, when executed by the processor, implements the steps of the control method.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps of the control method.
In the embodiment of the invention, when a target area including first texture information is displayed on the current interface, second texture information captured by a camera is acquired; when the first texture information matches the second texture information, the display content of the target area is updated to a target identifier; a movement track of the target identifier is acquired; and a control operation corresponding to the movement track is executed, where the first texture information includes at least one of palm print information and knuckle print information. Because the corresponding control operation, such as unlocking the mobile terminal or starting an application program, is executed based on the movement track of the target identifier, user operations can be effectively simplified; during the control operation, the user moves the hand whose features correspond to the first texture information to trigger the target event corresponding to the movement track, which is convenient to operate and increases the interactivity and interest of the whole operation process.
Drawings
FIG. 1 is a flow chart of a control method according to an embodiment of the present invention;
FIG. 2 is a first operation schematic diagram provided by an embodiment of the present invention;
FIG. 3a is a second operation schematic diagram provided by an embodiment of the present invention;
FIG. 3b is a third operation schematic diagram provided by an embodiment of the present invention;
FIG. 4a is a fourth operation schematic diagram provided by an embodiment of the present invention;
FIG. 4b is a fifth operation schematic diagram provided by an embodiment of the present invention;
FIG. 4c is a sixth operation schematic diagram provided by an embodiment of the present invention;
FIG. 4d is a seventh operation schematic diagram provided by an embodiment of the present invention;
FIG. 4e is an eighth operation schematic diagram provided by an embodiment of the present invention;
FIG. 5 is a ninth operation schematic diagram provided by an embodiment of the present invention;
FIG. 6 is a block diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 7 is a block diagram of a mobile terminal according to another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of a control method according to an embodiment of the present invention, as shown in fig. 1, including the following steps:
step 101, acquiring second texture information acquired by a camera under the condition that a target area including first texture information is displayed on a current interface.
In the step, the current interface is the interface currently displayed by a display screen of the mobile terminal; the target area is a display area of the current interface, and the position of the target area in the current interface may be located in the center of the current interface, or may be located on the upper side, the lower side, the left side, or the right side of the current interface.
The shape of the target region may be a circle, a square, or an ellipse, and the shape of the target region may be fixed, or may be changed according to the different shapes of the first texture information in the target region.
The first texture information is texture information displayed at random based on pre-stored texture information of the user's hand, and includes at least one of palm print information and knuckle print information. A palm print consists of the lines on the palm; these palm lines are made up of both coarse and fine lines, form naturally, and are not easy to change. Knuckle prints are the lines on the fingers, and every finger has them. For example, if the pre-stored texture information of the user's hand includes palm print information and knuckle print information, the first texture information may be part or all of the palm print information, part or all of the knuckle print information, or texture information formed by combining part or all of both.
The second texture information is extracted based on the user hand image acquired by the camera, namely, the camera is started to acquire the user hand image and corresponding second texture information is extracted under the condition that the target area including the first texture information is displayed on the current interface.
For example, when the mobile terminal is in a screen-locked state, when an operation of triggering unlocking of the mobile terminal is detected, a target area including first texture information is displayed on a current interface (in the application scenario, the current interface is a screen-locked interface), a camera is started to acquire a hand image of a user, and second texture information in the hand image is extracted.
Step 102, under the condition that the first texture information is matched with the second texture information, updating the display content of the target area into a target identifier.
In this step, the second texture information obtained in step 101 may be matched with the first texture information, and if the matching is successful, the display content of the target area is updated to the target identifier, specifically, the display content of the target area may be scaled, and the scaled target area is updated to the target identifier; if the matching fails, the process is ended, or the texture information acquired by the camera is acquired again.
Wherein, whether the second texture information is matched with the first texture information can be determined by setting a matching threshold. For example, when the matching degree of the second texture information and the first texture information reaches 90%, it may be determined that the matching is successful, and the display content of the target area may be updated to the target identifier.
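To make the matching step concrete, the following Kotlin sketch (not part of the patent; the descriptor type, the cosine-similarity metric and the function names are illustrative assumptions) checks a second-texture descriptor against the displayed first texture using the 90% threshold mentioned above.

```kotlin
import kotlin.math.sqrt

// Hypothetical fixed-length descriptor extracted from palm print / knuckle print texture.
data class TextureFeature(val values: List<Double>)

// Illustrative similarity measure (cosine similarity); the patent does not prescribe
// a particular metric, so any score in [0, 1] read the same way would do.
fun similarity(a: TextureFeature, b: TextureFeature): Double {
    require(a.values.size == b.values.size) { "descriptors must have equal length" }
    var dot = 0.0; var normA = 0.0; var normB = 0.0
    for (i in a.values.indices) {
        dot += a.values[i] * b.values[i]
        normA += a.values[i] * a.values[i]
        normB += b.values[i] * b.values[i]
    }
    return if (normA == 0.0 || normB == 0.0) 0.0 else dot / (sqrt(normA) * sqrt(normB))
}

// Step 102 as a predicate: matching succeeds when the score reaches the threshold,
// after which the caller updates the target area's display content to the target identifier.
fun texturesMatch(first: TextureFeature, second: TextureFeature, threshold: Double = 0.9): Boolean =
    similarity(first, second) >= threshold
```

If the predicate returns false, the flow ends or texture information is re-acquired from the camera, mirroring the failure branch described above.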
Specifically, as shown in fig. 2, during matching the user may move the hand 21 so that the second texture information captured by the camera is compared against the first texture information displayed in the target area 22; when the matching succeeds, the target identifier is displayed in the target area 22.
Step 103, acquiring the moving track of the target identifier.
In this step, the movement track of the target identifier may be obtained by detecting a drag operation acting on the target identifier; alternatively, an association between the target identifier and the hand features corresponding to the second texture information may be established, and the movement track of the target identifier is then determined by acquiring the movement track of those hand features.
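The following self-contained sketch (types and method names are assumptions for illustration) shows how such a movement track could be accumulated as a timestamped polyline, regardless of whether the points come from drag events on the target identifier or from the tracked hand features.

```kotlin
// One screen-space sample of the target identifier's position.
data class TrackPoint(val x: Float, val y: Float, val timestampMs: Long)

// Accumulates the movement track of the target identifier. Both sources described
// above (a drag operation on the identifier, or the tracked hand position) can
// feed the same polyline through append().
class MovementTrack {
    private val points = mutableListOf<TrackPoint>()

    fun append(x: Float, y: Float, timestampMs: Long) {
        points += TrackPoint(x, y, timestampMs)
    }

    fun startPoint(): TrackPoint? = points.firstOrNull()
    fun endPoint(): TrackPoint? = points.lastOrNull()
    fun asList(): List<TrackPoint> = points.toList()
}
```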
Step 104, executing the control operation corresponding to the movement track.
In this step, a corresponding control operation may be performed according to the movement track of the target identifier, for example, the mobile terminal may be unlocked based on the movement track of the target identifier, or an application program associated with the movement track of the target identifier may be started.
In this embodiment, when the first texture information matches the second texture information, the display content of the target area is updated to the target identifier, and the corresponding control operation is then executed based on the movement track of the target identifier, for example unlocking the mobile terminal or starting an application program; compared with unlocking the mobile terminal first and then starting the corresponding application program, this can effectively simplify user operations. During the control operation, the user moves the hand whose features correspond to the first texture information to trigger the target event corresponding to the movement track, which is convenient to operate and increases the interactivity and interest of the whole operation process.
Optionally, before the obtaining of the moving track of the target identifier, the method further includes: detecting the position change of a target object corresponding to the second texture information acquired by the camera; under the condition that the target object is detected to move, controlling the target identifier to move along with the target object; the target object is a second hand of a preset range of the first hand corresponding to the second texture information, and the preset range includes part or all of the hand area of the first hand.
In this embodiment, the target identifier may move along with the target object by establishing an association relationship between the target identifier and the target object corresponding to the second texture information. For example, when the target object performs a circle drawing operation within the image acquisition range of the camera, the target identifier may also draw a circle on the display interface of the mobile terminal. By the method, the interactivity and the interestingness of the user in the operation process can be effectively improved.
Because the image acquisition area of the camera of the mobile terminal depends on the relative distance between the captured object and the camera, the camera may not be able to capture the whole image of the user's hand when the hand is close to the camera; therefore, part or all of the hand area of the user's hand can be used as the target object, so that the target identifier can follow the target object captured by the camera. That is, the target object may be a second hand of a preset range of the first hand corresponding to the second texture information, where the preset range includes part or all of the hand region of the first hand.
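As one possible (assumed, not patent-specified) realisation of this following behaviour, the sketch below converts the tracked hand region's centre from normalized camera coordinates to screen coordinates and returns the new position for the target identifier; the horizontal mirroring reflects a typical front-facing camera preview.

```kotlin
// Normalized centre of the tracked hand region in the camera frame (both axes in 0..1).
data class CameraPoint(val u: Float, val v: Float)

// Screen-space position for the target identifier.
data class ScreenPoint(val x: Float, val y: Float)

// Maps a camera-space detection to screen space so that the target identifier
// follows the target object. Mirroring the horizontal axis is an assumption that
// matches the usual front-facing camera preview orientation.
fun followTarget(
    detection: CameraPoint,
    screenWidth: Float,
    screenHeight: Float,
    mirrorHorizontally: Boolean = true
): ScreenPoint {
    val u = if (mirrorHorizontally) 1f - detection.u else detection.u
    return ScreenPoint(
        x = (u * screenWidth).coerceIn(0f, screenWidth),
        y = (detection.v * screenHeight).coerceIn(0f, screenHeight)
    )
}
```

Calling such a mapping on every camera frame and feeding the result into the movement track would produce the circle-drawing behaviour described above.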
Optionally, before the second texture information acquired by the camera is acquired, the method further includes: displaying N icons around the target area, where N is a positive integer.
In this embodiment, when the target area including the first texture information is displayed on the current interface, N icons may be displayed around the target area. The N icons can be set according to the user's needs and may include common icons such as an unlock icon, a camera icon and a payment icon, so that the user can quickly start the applications associated with those icons. Displaying the N icons around the target area can therefore also enrich the display effect of the current interface.
As shown in fig. 3a in particular, the current interface 30 displays a target area 31, an unlock icon 32, a flashlight icon 33, a camera icon 34, and a payment icon 35; in the present embodiment, the target area 31 may be a circular area with a radius R, and the circular target area 31 displays first texture information, where the first texture information may be finger print information, palm print information, or a combination of the palm print information and the finger print information.
Wherein N is an integer greater than 0, and the value of N is not limited. Preferably, N is 4, and 4 icons are displayed around the target area.
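For illustration only, the sketch below places the N icons evenly on a ring around a circular target area; the even angular spacing and the ring radius are assumptions, since the text only requires that the icons be displayed around the target area.

```kotlin
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.sin

// Screen position of the target area's centre or of an icon, in pixels.
data class IconPoint(val x: Float, val y: Float)

// Computes positions for N icons evenly spaced on a ring of radius ringRadius
// around the target area's centre, starting from the top of the ring.
fun iconPositions(center: IconPoint, ringRadius: Float, n: Int): List<IconPoint> {
    require(n > 0) { "N must be a positive integer" }
    return (0 until n).map { i ->
        val angle = 2.0 * PI * i / n - PI / 2
        IconPoint(
            x = center.x + ringRadius * cos(angle).toFloat(),
            y = center.y + ringRadius * sin(angle).toFloat()
        )
    }
}
```

With n = 4 this gives one possible arrangement of the four icons of fig. 3a around the target area.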
Further optionally, before the executing the control operation corresponding to the movement trajectory, the method further includes: acquiring a target icon where a track end point of the moving track is located; wherein the target icon is one of the N icons; the executing the control operation corresponding to the movement track comprises: and executing the control operation corresponding to the target icon.
In this embodiment, the icon at the track end point of the movement track may be determined as the target icon, and the control operation corresponding to the target icon may be executed.
For example, as shown in fig. 3b, if it is detected that the end point of the movement trajectory of the target identifier formed based on the update of the target area 31 is located on the unlock icon 32, the mobile terminal is unlocked; if the terminal point of the movement track of the target identifier formed by updating based on the target area 31 is detected to be located in the flashlight icon 33, starting the flashlight application program; if the terminal point of the movement track of the target identifier formed by updating based on the target area 31 is detected to be positioned in the camera icon 34, starting the camera application program; if it is detected that the end point of the movement trajectory of the target mark formed based on the update of the target area 31 is located on the payment icon 35, the payment application is started.
In this way, the mobile terminal is unlocked or the associated application program is started; compared with entering a numeric password to unlock the mobile terminal and then starting the application program, user operations can be effectively simplified. Moreover, the target identifier is controlled to follow the target object, and the mobile terminal is unlocked or the application program is started according to the movement track of the target identifier, which is convenient and increases the interactivity and interest of the user's operation process.
It should be noted that the target icon may also be determined according to the moving direction of the target identifier, or an icon associated with the movement track of the target identifier may be determined as the target icon. For example, an icon through which the movement track of the target identifier passes may be determined as the target icon; if the movement track passes through a plurality of icons during the movement of the target identifier, the target icon can be determined according to a preset rule, for example the icon through which the movement track passes last, or the icon through which it passes first, may be determined as the target icon.
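A sketch of the end-point hit-test and the "last icon passed through" fallback described above; the rectangular icon bounds and all names are assumptions used only to make the example runnable.

```kotlin
// Screen-space point and an axis-aligned bounding box for an icon (illustrative types).
data class HitPoint(val x: Float, val y: Float)
data class IconBounds(val name: String, val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: HitPoint) = p.x in left..right && p.y in top..bottom
}

// Returns the target icon: the icon containing the track end point if any, otherwise
// (as one possible preset rule) the last icon the track passed through, or null if
// the movement track never touched an icon.
fun resolveTargetIcon(track: List<HitPoint>, icons: List<IconBounds>): IconBounds? {
    val end = track.lastOrNull() ?: return null
    icons.firstOrNull { it.contains(end) }?.let { return it }
    for (p in track.asReversed()) {
        icons.firstOrNull { it.contains(p) }?.let { return it }
    }
    return null
}
```

For example, a track ending inside the bounds registered for the unlock icon would select the unlock operation, matching the behaviour described for fig. 3b.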
Optionally, before the executing the control operation corresponding to the movement trajectory, the method further includes: acquiring track characteristics of the moving track; the executing the control operation corresponding to the movement track comprises: executing control operation corresponding to the track characteristics of the moving track; wherein the trajectory feature comprises at least one of a trajectory shape, a trajectory direction.
In the embodiment, the control operation corresponding to the track characteristic can be executed according to the track characteristic of the moving track, so that the interactivity and the interestingness of the user in the operation process are increased.
The correspondence between track features and control operations may be preset; for example, when the track feature is a circle, the camera application is started, and when the track feature is an upward sliding line, the mobile terminal is unlocked.
As shown in fig. 4a to 4e, several corresponding relations between the trajectory characteristics and the control operation are illustrated:
as shown in fig. 4a, if the trajectory feature of the movement trajectory 42 of the target identifier corresponding to the target area 41 is a circular pattern, the camera application is started;
as shown in fig. 4b, if the trajectory feature of the movement trajectory 43 of the target identifier corresponding to the target area 41 is a fork pattern, the payment application program is started;
as shown in fig. 4c, if the trajectory feature of the moving trajectory 44 of the target identifier corresponding to the target area 41 is an up-slide line, unlocking the mobile terminal;
as shown in fig. 4d, if the trajectory feature of the movement trajectory 45 of the target identifier corresponding to the target area 41 is a downward slide line, the flashlight application is started;
as shown in fig. 4e, if the trajectory feature of the movement trajectory 46 of the target mark corresponding to the target area 41 is an "M" type broken line, the social application program is started.
The track characteristics of the moving track can be customized, and the corresponding control operation can also be customized.
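Since the track features and their associated operations are customizable, the following sketch shows only one rough, rule-based classification covering the closed-loop, upward-slide and downward-slide cases of figs. 4a, 4c and 4d; all thresholds are illustrative assumptions, and recognising the "X" and "M" shapes of figs. 4b and 4e would need additional shape analysis.

```kotlin
import kotlin.math.abs
import kotlin.math.hypot

data class FeaturePoint(val x: Float, val y: Float)

enum class TrackOperation { UNLOCK, START_CAMERA, START_FLASHLIGHT, NONE }

// Rough rule-based classification of a movement track. Screen y grows downward,
// so an upward slide has a negative net vertical displacement.
fun classifyTrack(track: List<FeaturePoint>): TrackOperation {
    if (track.size < 2) return TrackOperation.NONE
    val start = track.first()
    val end = track.last()
    val net = hypot((end.x - start.x).toDouble(), (end.y - start.y).toDouble())
    val length = track.zipWithNext().sumOf { (a, b) ->
        hypot((b.x - a.x).toDouble(), (b.y - a.y).toDouble())
    }
    return when {
        // Closed-loop (circle-like) track: long path ending near where it started (fig. 4a).
        length > 0.0 && net < 0.2 * length -> TrackOperation.START_CAMERA
        // Predominantly upward slide (fig. 4c).
        (start.y - end.y) > abs(end.x - start.x) -> TrackOperation.UNLOCK
        // Predominantly downward slide (fig. 4d).
        (end.y - start.y) > abs(end.x - start.x) -> TrackOperation.START_FLASHLIGHT
        else -> TrackOperation.NONE
    }
}
```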
Optionally, before step 101, the method further includes: collecting a picture of a hand of a user; and extracting palm print information and finger joint print information in the hand photo, and storing the palm print information and the finger joint print information into a preset texture information base.
In this embodiment, texture information of a hand used by a user for unlocking may be stored in advance. Specifically, as shown in fig. 5, a camera of the mobile terminal is used to capture a picture of the hand 51 of the user, and to extract the palm print features and the finger joint print features in the hand picture, and then to store the extracted palm print features and finger joint print features in the preset texture information base. Therefore, when the control operation is executed, the texture information can be randomly called from the preset texture information base to be used as the first texture information, and the first texture information is displayed in the target area of the current interface.
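A sketch of the enrolment flow and of drawing a random first texture from the preset texture information base; the extraction step is a placeholder interface because the patent does not specify a palm print or knuckle print extraction algorithm, and all names here are assumptions.

```kotlin
import kotlin.random.Random

// Kind and descriptor of a stored hand-texture feature.
enum class TextureKind { PALM_PRINT, KNUCKLE_PRINT }
data class StoredTexture(val kind: TextureKind, val descriptor: List<Double>)

// Placeholder for the extraction step; an actual implementation would analyse the
// hand photo captured by the camera and return palm print and knuckle print features.
interface TextureExtractor {
    fun extract(handPhoto: ByteArray): List<StoredTexture>
}

// Preset texture information base: stores enrolled features and serves a random
// selection to be displayed as the first texture information in the target area.
class TextureInfoBase(private val extractor: TextureExtractor) {
    private val stored = mutableListOf<StoredTexture>()

    // Enrolment: extract palm print and knuckle print features from the hand photo and store them.
    fun enroll(handPhoto: ByteArray) {
        stored += extractor.extract(handPhoto)
    }

    // Randomly pick up to `count` stored features to compose the first texture information.
    fun randomFirstTexture(count: Int = 1, random: Random = Random.Default): List<StoredTexture> {
        require(stored.isNotEmpty()) { "no texture information enrolled yet" }
        return stored.shuffled(random).take(count)
    }
}
```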
In the embodiment of the present invention, the method may be applied to a mobile terminal, for example: a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile internet device (MID), a wearable device, or the like.
According to the control method provided by the embodiment of the invention, when a target area including first texture information is displayed on the current interface, second texture information captured by a camera is acquired; when the first texture information matches the second texture information, the display content of the target area is updated to a target identifier; a movement track of the target identifier is acquired; and a control operation corresponding to the movement track is executed, where the first texture information includes at least one of palm print information and knuckle print information. Because the corresponding control operation, such as unlocking the mobile terminal or starting an application program, is executed based on the movement track of the target identifier, user operations can be effectively simplified; during the control operation, the user moves the hand whose features correspond to the first texture information to trigger the target event corresponding to the movement track, which is convenient to operate and increases the interactivity and interest of the whole operation process.
Referring to fig. 6, fig. 6 is a structural diagram of a mobile terminal according to an embodiment of the present invention, as shown in fig. 6, a mobile terminal 600 includes a first obtaining module 601, an updating module 602, a second obtaining module 603, and an executing module 604, where the first obtaining module 601 is connected to the updating module 602, the updating module 602 is further connected to the second obtaining module 603, and the second obtaining module 603 is further connected to the executing module 604:
a first obtaining module 601, configured to obtain second texture information acquired by a camera when a target area including first texture information is displayed on a current interface;
an updating module 602, configured to update the display content of the target area to a target identifier when the first texture information matches the second texture information;
a second obtaining module 603, configured to obtain a moving trajectory of the target identifier;
an executing module 604, configured to execute a control operation corresponding to the moving trajectory;
wherein the first texture information includes at least one of palm print information and knuckle print information.
Optionally, the mobile terminal 600 further includes:
the detection module is used for detecting the position change of the target object corresponding to the second texture information acquired by the camera;
the control module is used for controlling the target identifier to move along with the target object under the condition that the target object is detected to move;
the target object is a second hand of a preset range of the first hand corresponding to the second texture information, and the preset range includes part or all of the hand area of the first hand.
Optionally, the mobile terminal 600 further includes:
a display module for displaying N icons around the target area;
wherein N is a positive integer.
Optionally, the mobile terminal 600 further includes:
the third acquisition module is used for acquiring a target icon where the track end point of the moving track is located;
wherein the target icon is one of the N icons;
the executing module 604 is specifically configured to execute the control operation corresponding to the target icon.
Optionally, the mobile terminal 600 further includes:
the fourth acquisition module is used for acquiring the track characteristics of the moving track;
the executing module 604 is specifically configured to execute a control operation corresponding to a trajectory feature of the moving trajectory;
wherein the trajectory feature comprises at least one of a trajectory shape, a trajectory direction.
The mobile terminal 600 can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 to fig. 5, and is not described herein again to avoid repetition.
The mobile terminal 600 of the embodiment of the invention acquires second texture information captured by a camera when a target area including first texture information is displayed on the current interface; updates the display content of the target area to a target identifier when the first texture information matches the second texture information; acquires a movement track of the target identifier; and executes a control operation corresponding to the movement track, where the first texture information includes at least one of palm print information and knuckle print information. Because the corresponding control operation, such as unlocking the mobile terminal or starting an application program, is executed based on the movement track of the target identifier, user operations can be effectively simplified; during the control operation, the user moves the hand whose features correspond to the first texture information to trigger the target event corresponding to the movement track, which is convenient to operate and increases the interactivity and interest of the whole operation process.
Fig. 7 is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, and as shown in fig. 7, the mobile terminal 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, a power supply 711, and the like. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 7 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 710 obtains second texture information acquired by the camera when the target area including the first texture information is displayed on the current interface; under the condition that the first texture information is matched with the second texture information, updating the display content of the target area into a target identifier; acquiring a moving track of the target identifier; executing control operation corresponding to the moving track; wherein the first texture information includes at least one of palm print information and knuckle print information.
Optionally, the processor 710 is further configured to: detecting the position change of a target object corresponding to the second texture information acquired by the camera; under the condition that the target object is detected to move, controlling the target identifier to move along with the target object; the target object is a second hand of a preset range of the first hand corresponding to the second texture information, and the preset range includes part or all of the hand area of the first hand.
Optionally, the processor 710 is further configured to: displaying N icons around the target area; wherein N is a positive integer.
Optionally, the processor 710 is further configured to: acquiring a target icon where a track end point of the moving track is located; wherein the target icon is one of the N icons; and executing the control operation corresponding to the target icon.
Optionally, the processor 710 is further configured to: acquiring track characteristics of the moving track; executing control operation corresponding to the track characteristics of the moving track; wherein the trajectory feature comprises at least one of a trajectory shape, a trajectory direction.
The mobile terminal 700 can implement the processes implemented by the mobile terminal in the foregoing embodiments, and details are not repeated here to avoid repetition.
The mobile terminal 700 of the embodiment of the invention acquires second texture information captured by a camera when a target area including first texture information is displayed on the current interface; updates the display content of the target area to a target identifier when the first texture information matches the second texture information; acquires a movement track of the target identifier; and executes a control operation corresponding to the movement track, where the first texture information includes at least one of palm print information and knuckle print information. Because the corresponding control operation, such as unlocking the mobile terminal or starting an application program, is executed based on the movement track of the target identifier, user operations can be effectively simplified; during the control operation, the user moves the hand whose features correspond to the first texture information to trigger the target event corresponding to the movement track, which is convenient to operate and increases the interactivity and interest of the whole operation process.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 701 may be used for receiving and sending signals during message transmission and reception or during a call; specifically, it receives downlink data from a base station and forwards it to the processor 710 for processing, and it sends uplink data to the base station. In general, the radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 701 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access via the network module 702, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702 or stored in the memory 709 into an audio signal and output as sound. Also, the audio output unit 703 may also provide audio output related to a specific function performed by the mobile terminal 700 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
The input unit 704 is used to receive audio or video signals. The input unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042, and the graphics processor 7041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 706. The image frames processed by the graphics processor 7041 may be stored in the memory 709 (or other storage medium) or transmitted via the radio frequency unit 701 or the network module 702. The microphone 7042 may receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 701.
The mobile terminal 700 also includes at least one sensor 705, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 7061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 7061 and/or a backlight when the mobile terminal 700 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 705 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 706 is used to display information input by the user or information provided to the user. The Display unit 706 may include a Display panel 7061, and the Display panel 7061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 707 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 7071 (e.g., operations by a user on or near the touch panel 7071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 7071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 710, receives a command from the processor 710, and executes the command. In addition, the touch panel 7071 can be implemented by various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 707 may include other input devices 7072 in addition to the touch panel 7071. In particular, the other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 7071 may be overlaid on the display panel 7061, and when the touch panel 7071 detects a touch operation on or near the touch panel 7071, the touch operation is transmitted to the processor 710 to determine the type of the touch event, and then the processor 710 provides a corresponding visual output on the display panel 7061 according to the type of the touch event. Although the touch panel 7071 and the display panel 7061 are shown in fig. 7 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 7071 and the display panel 7061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 708 is an interface through which an external device is connected to the mobile terminal 700. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 708 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 700 or may be used to transmit data between the mobile terminal 700 and external devices.
The memory 709 may be used to store software programs as well as various data. The memory 709 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 709 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 710 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 709 and calling data stored in the memory 709, thereby integrally monitoring the mobile terminal. Processor 710 may include one or more processing units; preferably, the processor 710 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 710.
The mobile terminal 700 may also include a power supply 711 (e.g., a battery) for powering the various components, and the power supply 711 may be logically coupled to the processor 710 via a power management system that may enable managing charging, discharging, and power consumption by the power management system.
In addition, the mobile terminal 700 includes some functional modules that are not shown, and thus will not be described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 710, a memory 709, and a computer program stored in the memory 709 and capable of running on the processor 710, where the computer program is executed by the processor 710 to implement each process of the above control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (6)

1. A control method, comprising:
under the condition that a target area including first texture information is displayed on a current interface, second texture information acquired by a camera is acquired;
under the condition that the first texture information is matched with the second texture information, updating the display content of the target area into a target identifier;
the updating the display content of the target area to the target identification comprises: zooming the display content of the target area, and updating the zoomed target area as a target identifier;
acquiring a moving track of the target identifier;
executing control operation corresponding to the moving track;
wherein the first texture information comprises at least one of palm print information and knuckle print information;
before the obtaining of the moving track of the target identifier, the method further includes:
detecting the position change of a target object corresponding to the second texture information acquired by the camera;
under the condition that the target object is detected to move, controlling the target identifier to move along with the target object;
the target object is a second hand of a preset range of the first hand corresponding to the second texture information, and the preset range comprises part or all of a hand area of the first hand;
before the second texture information acquired by the camera is acquired, the method further includes:
displaying N icons around the target area;
wherein N is a positive integer;
before the control operation corresponding to the movement track is executed, the method further includes:
acquiring a target icon where a track end point of the moving track is located;
wherein the target icon is one of the N icons;
the executing the control operation corresponding to the movement track comprises:
and executing the control operation corresponding to the target icon.
2. The method according to claim 1, wherein before the performing the control operation corresponding to the movement trajectory, further comprising:
acquiring track characteristics of the moving track;
the executing the control operation corresponding to the movement track comprises:
executing control operation corresponding to the track characteristics of the moving track;
wherein the trajectory feature comprises at least one of a trajectory shape, a trajectory direction.
3. A mobile terminal, comprising:
the first obtaining module is used for obtaining second texture information collected by the camera under the condition that a target area including the first texture information is displayed on a current interface;
the updating module is used for updating the display content of the target area into a target identifier under the condition that the first texture information is matched with the second texture information;
the updating the display content of the target area to the target identification comprises: zooming the display content of the target area, and updating the zoomed target area as a target identifier;
the second acquisition module is used for acquiring the moving track of the target identifier;
the execution module is used for executing the control operation corresponding to the movement track;
wherein the first texture information comprises at least one of palm print information and knuckle print information;
the detection module is used for detecting the position change of the target object corresponding to the second texture information acquired by the camera;
the control module is used for controlling the target identifier to move along with the target object under the condition that the target object is detected to move;
the target object is a second hand of a preset range of the first hand corresponding to the second texture information, and the preset range comprises part or all of a hand area of the first hand;
a display module for displaying N icons around the target area;
wherein N is a positive integer;
the third acquisition module is used for acquiring a target icon where the track end point of the moving track is located;
wherein the target icon is one of the N icons;
the execution module is specifically configured to execute the control operation corresponding to the target icon.
4. The mobile terminal of claim 3, wherein the mobile terminal further comprises:
the fourth acquisition module is used for acquiring the track characteristics of the moving track;
the execution module is specifically configured to execute a control operation corresponding to a trajectory feature of the movement trajectory;
wherein the trajectory feature comprises at least one of a trajectory shape, a trajectory direction.
5. A mobile terminal, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, implements the steps of the control method according to any one of claims 1 to 2.
6. A computer-readable storage medium, characterized in that a computer program is stored thereon, which computer program, when being executed by a processor, carries out the steps of the control method according to any one of claims 1 to 2.
CN201810380027.6A 2018-04-25 2018-04-25 Control method and mobile terminal Active CN108563940B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810380027.6A CN108563940B (en) 2018-04-25 2018-04-25 Control method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810380027.6A CN108563940B (en) 2018-04-25 2018-04-25 Control method and mobile terminal

Publications (2)

Publication Number Publication Date
CN108563940A CN108563940A (en) 2018-09-21
CN108563940B true CN108563940B (en) 2020-04-28

Family

ID=63536556

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810380027.6A Active CN108563940B (en) 2018-04-25 2018-04-25 Control method and mobile terminal

Country Status (1)

Country Link
CN (1) CN108563940B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112698723B (en) * 2020-12-29 2023-08-25 维沃移动通信(杭州)有限公司 Payment method and device and wearable equipment
CN116994248B (en) * 2023-09-25 2024-03-15 支付宝(杭州)信息技术有限公司 Texture detection processing method and device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106485115A (en) * 2015-08-25 2017-03-08 阿里巴巴集团控股有限公司 A kind of information processing method and device
CN107169470A (en) * 2017-06-06 2017-09-15 中控智慧科技股份有限公司 A kind of gesture identification method, apparatus and system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104123537B (en) * 2014-07-04 2017-06-20 西安理工大学 A kind of quick auth method based on hand and personal recognition
CN107707762A (en) * 2017-10-19 2018-02-16 维沃移动通信有限公司 A kind of method for operating application program and mobile terminal

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106485115A (en) * 2015-08-25 2017-03-08 阿里巴巴集团控股有限公司 A kind of information processing method and device
CN107169470A (en) * 2017-06-06 2017-09-15 中控智慧科技股份有限公司 A kind of gesture identification method, apparatus and system

Also Published As

Publication number Publication date
CN108563940A (en) 2018-09-21

Similar Documents

Publication Publication Date Title
CN108427876B (en) Fingerprint identification method and mobile terminal
CN108459797B (en) Control method of folding screen and mobile terminal
CN109032734B (en) Background application program display method and mobile terminal
CN109078319B (en) Game interface display method and terminal
CN108924037B (en) Display method of rich media communication RCS message and mobile terminal
CN108897473B (en) Interface display method and terminal
CN107783747B (en) Interface display processing method and mobile terminal
CN108958593B (en) Method for determining communication object and mobile terminal
CN111339515A (en) Application program starting method and electronic equipment
CN108108113B (en) Webpage switching method and device
CN111124179A (en) Information processing method and electronic equipment
CN109544172B (en) Display method and terminal equipment
CN108509131B (en) Application program starting method and terminal
CN108093137B (en) Dialing method and mobile terminal
CN110941469B (en) Application splitting creation method and terminal equipment thereof
CN109669656B (en) Information display method and terminal equipment
CN111078002A (en) Suspended gesture recognition method and terminal equipment
CN108459864B (en) Method for updating display content and mobile terminal
CN107809515B (en) Display control method and mobile terminal
CN108021315B (en) Control method and mobile terminal
CN108563940B (en) Control method and mobile terminal
CN110007821B (en) Operation method and terminal equipment
CN108897467B (en) Display control method and terminal equipment
CN111460537A (en) Method for hiding page content and electronic equipment
CN111444737A (en) Graphic code identification method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant