CN112446019A - Application operation control method and device and storage medium - Google Patents

Application operation control method and device and storage medium

Info

Publication number
CN112446019A
CN112446019A
Authority
CN
China
Prior art keywords
gesture
application
application operation
matched
interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910817119.0A
Other languages
Chinese (zh)
Inventor
刘楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201910817119.0A
Publication of CN112446019A
Legal status: Pending (Current)


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/36 User authentication by graphic or iconic representation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 20/00 Payment architectures, schemes or protocols
    • G06Q 20/38 Payment protocols; Details thereof
    • G06Q 20/40 Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
    • G06Q 20/401 Transaction verification
    • G06Q 20/4014 Identity check for transactions
    • G06Q 20/40145 Biometric identity checks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12 Fingerprints or palmprints
    • G06V 40/13 Sensors therefor
    • G06V 40/1306 Sensors therefor non-optical, e.g. ultrasonic or capacitive sensing

Abstract

The disclosure relates to an application operation control method, an application operation control device, and a storage medium. In the method, a fingerprint entry operation and a gesture action input by a user on an application interface to be operated are detected; it is identified whether the fingerprint entered by the fingerprint entry operation matches preset fingerprint information and whether the gesture generated by the gesture action matches a preset gesture; and if the entered fingerprint matches the preset fingerprint information and the gesture matches the preset gesture, the application operation to be executed on the application interface to be operated is executed. By combining fingerprint operation with gesture operation for interactive control of application operations, the disclosure enriches the interaction between the user and application operations and improves user experience.

Description

Application operation control method and device and storage medium
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to an application operation control method, an application operation control device, and a storage medium.
Background
As under-display fingerprint technology matures, the front-of-screen unlocking interaction has been accepted by more and more users.
Under-display fingerprint technology provides a fingerprint recognition region on the display screen of an electronic device such as a smartphone or tablet computer. When the user operates in the fingerprint recognition region, the user's fingerprint features are scanned and the user's identity is recognized from those features. When the identity verification passes, corresponding functions of the electronic device, such as unlocking or payment, are enabled.
Most models currently on the market use a small-area under-display fingerprint sensor, which strongly limits user interaction. As the technology iterates, large-area fingerprint solutions are being called for. How to use large-area fingerprint technology to enrich the interaction between the user and application operations and to improve user experience is a research hotspot.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an application operation control method, apparatus, and storage medium.
According to a first aspect of the embodiments of the present disclosure, there is provided an application operation control method, including:
detecting a fingerprint entry operation and a gesture action input by a user on an application interface to be operated; identifying whether the fingerprint entered by the fingerprint entry operation matches preset fingerprint information and whether the gesture generated by the gesture action matches a preset gesture; and if the entered fingerprint matches the preset fingerprint information and the gesture matches the preset gesture, executing the application operation to be executed on the application interface to be operated.
In one example, the application operation control method of the present disclosure further includes:
presetting the fingerprint and/or gesture matched with an application operation, where there are one or more application operations, different application operations are matched with different gestures, and different application operations are matched with the same fingerprint or with different fingerprints.
In another example, presetting the gesture matched with the application operation includes:
prompting a user to select an application operation that is to be triggered and executed by a gesture; after the user selects the application operation, prompting the user to input the gesture to be set for the selected application operation; and, while the user inputs the gesture, recording and storing, by time period, the gesture actions and gesture action trajectories input by the user.
In another example, determining that the gesture generated by the gesture action matches the preset gesture includes:
if, over a set number of consecutive time periods, the gesture actions and gesture trajectories input by the user on the application interface to be operated match, in time order, the gesture actions and gesture trajectories of a stored preset gesture, determining that the gesture generated by the gesture action matches the stored preset gesture.
In another example, the application operation to be executed includes starting and running an application program, and the application interface to be operated is a lock screen standby interface.
Executing the application operation to be executed on the application interface to be operated includes:
unlocking the lock screen standby interface; calling the application program matched with the gesture generated by the gesture action; and starting and running the called application program.
In another example, the application operation to be executed includes an application operation executed after identity verification passes, and the application interface to be operated is an identity verification interface of the application operation to be executed.
In another example, the application operation executed after identity verification passes includes executing a payment operation, and the identity verification interface is an interface to be paid.
According to a second aspect of the embodiments of the present disclosure, there is provided an application operation control apparatus including:
a detection unit configured to detect a fingerprint entry operation and a gesture action input by a user on an application interface to be operated; a recognition unit configured to identify whether the fingerprint entered by the fingerprint entry operation matches preset fingerprint information and whether the gesture generated by the gesture action matches a preset gesture; and an execution unit configured to execute the application operation to be executed on the application interface to be operated when the recognition unit identifies that the entered fingerprint matches the preset fingerprint information and the gesture matches the preset gesture.
In one example, the application operation control apparatus of the present disclosure further includes a setting unit configured to:
preset the fingerprint and/or gesture matched with an application operation, where there are one or more application operations, different application operations are matched with different gestures, and different application operations are matched with the same fingerprint or with different fingerprints.
In another example, the setting unit presets the gesture matched with the application operation in the following manner:
prompting a user to select an application operation that is to be triggered and executed by a gesture; after the user selects the application operation, prompting the user to input the gesture to be set for the selected application operation; and, while the user inputs the gesture, recording and storing, by time period, the gesture actions and gesture action trajectories input by the user.
In another example, the recognition unit determines that the gesture generated by the gesture action matches a preset gesture as follows:
if, over a set number of consecutive time periods, the gesture actions and gesture trajectories input by the user on the application interface to be operated match, in time order, the gesture actions and gesture trajectories of a stored preset gesture, determining that the gesture generated by the gesture action matches the stored preset gesture.
In another example, the application operation to be executed includes starting and running an application program, and the application interface to be operated is a lock screen standby interface.
The execution unit executes the application operation to be executed on the application interface to be operated in the following manner:
unlocking the lock screen standby interface; calling the application program matched with the gesture generated by the gesture action; and starting and running the called application program.
In another example, the application operation to be executed includes an application operation executed after identity verification passes, and the application interface to be operated is an identity verification interface of the application operation to be executed.
In another example, the application operation executed after identity verification passes includes executing a payment operation, and the identity verification interface is an interface to be paid.
According to a third aspect of the embodiments of the present disclosure, there is provided an application operation control apparatus including:
a processor; a memory for storing processor-executable instructions; wherein the processor is configured to: executing the application operation control method according to the first aspect or any example of the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium, wherein instructions of the storage medium, when executed by a processor of a mobile terminal, enable the mobile terminal to perform the application operation control method according to the first aspect or any example of the first aspect.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: by combining fingerprint operation with gesture operation for interactive control of application operations, the interaction between the user and application operations is enriched, and user experience is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a schematic diagram illustrating a matrix arrangement of fingerprint sensors, according to an exemplary embodiment.
FIG. 2 is a flow chart illustrating a method of application operation control according to an exemplary embodiment.
FIG. 3 is a flowchart illustrating presetting a gesture for an application operation according to an exemplary embodiment.
FIG. 4 is a schematic diagram illustrating a process for unlocking and starting an APP according to an exemplary embodiment.
FIG. 5 is a block diagram illustrating an application operation control apparatus according to an exemplary embodiment.
FIG. 6 is a block diagram illustrating an apparatus in accordance with an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
In the related art, under-display large-area fingerprint sensors are arranged in an array, as shown in FIG. 1. Fingerprint sensors arranged in an array can be used not only for fingerprint entry and matching, but also for detecting the movement trend of a finger and thus predicting gestures. The present disclosure therefore provides an application operation control method that multiplexes the under-display large-area fingerprint sensor with preset gesture actions, enriching the interaction between the user and application operations and improving user experience.
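As a non-limiting illustration of how such an array could be multiplexed, the following Kotlin sketch derives both a fingerprint capture and a finger trajectory from the same stream of sensor readings; the SensorCell and SensorFrame types, the pressure-weighted centroid, and the "densest frame" heuristic are assumptions made for illustration only, not part of any actual sensor driver.

```kotlin
// Minimal sketch of reusing an under-display fingerprint sensor matrix both for
// fingerprint capture and for finger-movement (gesture) detection.
// All types, field names and the sampling model are illustrative assumptions.

data class SensorCell(val row: Int, val col: Int, val pressure: Float)
data class TouchSample(val timeMs: Long, val x: Float, val y: Float)

typealias SensorFrame = List<SensorCell>   // cells touched in one read of the matrix

/** Pressure-weighted centroid of a frame: one point of the finger trajectory. */
fun centroid(frame: SensorFrame, timeMs: Long): TouchSample? {
    if (frame.isEmpty()) return null
    val total = frame.sumOf { it.pressure.toDouble() }
    val x = frame.sumOf { it.col * it.pressure.toDouble() } / total
    val y = frame.sumOf { it.row * it.pressure.toDouble() } / total
    return TouchSample(timeMs, x.toFloat(), y.toFloat())
}

/** Splits a window of timestamped frames into a fingerprint capture and a trajectory. */
fun splitWindow(frames: List<Pair<Long, SensorFrame>>): Pair<SensorFrame?, List<TouchSample>> {
    // The densest frame (most cells touched) is treated as the fingerprint capture;
    // the centroid of every frame contributes to the movement trajectory.
    val fingerprintFrame = frames.maxByOrNull { it.second.size }?.second
    val trajectory = frames.mapNotNull { (t, f) -> centroid(f, t) }
    return fingerprintFrame to trajectory
}
```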
FIG. 2 is a flowchart illustrating an application operation control method according to an exemplary embodiment. As shown in FIG. 2, the application operation control method is used in a terminal and may also be applied to an application processor (AP) of the terminal. The method includes the following steps.
In step S11, a fingerprint entry operation and a gesture action input by the user on the application interface to be operated are detected.
In the present disclosure, the application interface to be operated may be understood as a display interface of a terminal screen that includes an under-display large-area fingerprint sensor. The display interface may be a display interface on which an application program is running, or a lock screen standby interface on which no application program is running. The display interface of a terminal screen including an under-display large-area fingerprint sensor can sense and detect the fingerprint entry operation input by a user as well as the gesture action generated by finger touch operations.
In one embodiment of the present disclosure, when the user triggers a specific application operation to be performed, for example unlocking the terminal or running an application program (APP), the terminal may display prompt information on the application interface to be operated to prompt the user to input a fingerprint and a gesture action. When the user inputs a fingerprint entry operation and a gesture action on the application interface to be operated, the fingerprint entry operation and the gesture action can be detected.
In step S12, it is identified whether the fingerprint entered by the fingerprint entry operation input by the user on the application interface to be operated matches the preset fingerprint information, and whether the gesture generated by the gesture action matches the preset gesture.
In the present disclosure, fingerprint information of the user may be collected in advance as the preset fingerprint information. On the one hand, the preset fingerprint information may be a unified fingerprint set for the terminal, for example the fingerprint that the user enters to identify the user's identity when using the terminal for the first time. In this case, a fingerprint does not need to be set separately for each application operation, and every application operation matches the same fingerprint. On the other hand, fingerprints may also be set separately for different application operations to be executed and matched with those operations, that is, different application operations match different fingerprints.
Identifying whether the fingerprint entered by the fingerprint entry operation input by the user on the application interface to be operated matches the preset fingerprint information may be done with an existing fingerprint recognition technology, such as optical or ultrasonic fingerprint recognition, by judging whether the entered fingerprint is the same as the preset fingerprint information. If the entered fingerprint is the same as the preset fingerprint information, it is identified that the fingerprint matches the preset fingerprint information; if it is not, it is identified that the fingerprint does not match the preset fingerprint information.
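A minimal sketch of that match decision, assuming an existing recognition engine that returns a similarity score; the FingerprintEngine interface and the 0.95 threshold are illustrative assumptions, not a real optical or ultrasonic SDK.

```kotlin
// Hypothetical recognition engine; a real implementation would wrap the device's
// optical or ultrasonic fingerprint recognition stack mentioned in the text.
interface FingerprintEngine {
    /** Similarity in [0.0, 1.0] between an entered fingerprint and a stored template. */
    fun similarity(entered: ByteArray, template: ByteArray): Double
}

/** True if the entered fingerprint matches any preset fingerprint template. */
fun fingerprintMatches(
    engine: FingerprintEngine,
    entered: ByteArray,
    presetTemplates: List<ByteArray>,
    threshold: Double = 0.95          // assumed decision threshold, not specified by the text
): Boolean = presetTemplates.any { engine.similarity(entered, it) >= threshold }
```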
Further, in the present disclosure, a gesture needs to be preset so that execution of the application operation can be triggered by the combination of the fingerprint and the gesture.
In one embodiment, a matched gesture may be preset for an application operation to be executed, so that execution of the application operation is triggered by the gesture. The matched gesture may be preset for one or more application operations. When gestures are preset for a plurality of application operations, different gestures are preset for different application operations, so that different application operations are triggered by different gestures.
In the present disclosure, an entry for presetting gestures may be provided on the terminal; this entry may also be understood as an application operation interface. When the user wants to trigger application operations with preset gestures, the user opens this application operation interface and presets the gestures on it. When presetting a gesture for an application operation, the manner shown in FIG. 3 may be adopted. FIG. 3 is a flowchart illustrating presetting a gesture for an application operation according to an exemplary embodiment of the present disclosure, which includes the following steps.
In step S21, the user is prompted to select an application operation, which is subsequently triggered and executed by a gesture.
In step S22, after the user selects an application operation, the user is prompted to input the gesture to be set for the selected application operation.
In step S23, while the user inputs the gesture, the gesture actions and gesture action trajectories input by the user are recorded and saved by time period.
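A minimal sketch of the recording in steps S21 to S23, assuming the gesture arrives as timestamped trajectory samples that are bucketed into fixed-length time periods; the period length and the GestureTemplate layout are illustrative assumptions.

```kotlin
// Sketch of steps S21–S23: recording a preset gesture, segmented into time periods.
// TouchSample repeats the structure from the earlier sketch.

data class TouchSample(val timeMs: Long, val x: Float, val y: Float)
data class GesturePeriod(val index: Int, val samples: List<TouchSample>)
data class GestureTemplate(val operationId: String, val periods: List<GesturePeriod>)

/** Records the gesture entered for the selected application operation, by time period. */
fun recordGestureTemplate(
    operationId: String,          // the application operation selected by the user (step S21)
    samples: List<TouchSample>,   // trajectory captured while the user inputs the gesture (step S22)
    periodMs: Long = 100          // assumed length of one recording time period
): GestureTemplate {
    val start = samples.firstOrNull()?.timeMs ?: 0L
    val periods = samples
        .groupBy { ((it.timeMs - start) / periodMs).toInt() }   // step S23: split by time period
        .toSortedMap()
        .map { (index, points) -> GesturePeriod(index, points) }
    return GestureTemplate(operationId, periods)
}
```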
In the present disclosure, after the gesture operation input by the user is detected, it is judged whether the gesture input by the user is the same as the preset gesture.
In one embodiment, if the terminal recorded and stored the gesture actions and gesture action trajectories by time period when the user preset the gesture, the following manner may be adopted to judge whether the gesture input by the user is the same as the preset gesture:
it is judged whether, over a set number of consecutive time periods, the gesture actions and gesture trajectories input by the user on the application interface to be operated match, in time order, the gesture actions and gesture trajectories of a stored preset gesture. If they do, it is determined that the gesture generated by the gesture action matches the stored preset gesture. Restricting the comparison to a set number of consecutive time periods improves the efficiency of the gesture matching judgment.
In step S13, if the fingerprint entered by the fingerprint entry operation matches the preset fingerprint information and the gesture generated by the gesture action matches the preset gesture, the application operation to be executed on the application interface to be operated is executed.
In the present disclosure, the application operation to be executed on the application interface to be operated is executed only when the fingerprint entered by the fingerprint entry operation matches the preset fingerprint information and the gesture generated by the gesture action matches the preset gesture. If the entered fingerprint does not match the preset fingerprint information, or the gesture does not match the preset gesture, the application operation is not executed.
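Taken together, steps S11 to S13 can be pictured as the following sketch, in which the lambdas stand in for the detection and matching logic above; the function signature is an illustrative assumption, not a concrete terminal API.

```kotlin
/**
 * Sketch of the overall flow of steps S11–S13: the application operation is executed
 * only when BOTH the entered fingerprint and the gesture match their presets.
 */
fun <F, G> controlApplicationOperation(
    detectInput: () -> Pair<F, G>,         // S11: fingerprint entry operation + gesture action
    fingerprintMatches: (F) -> Boolean,    // S12: compare with preset fingerprint information
    gestureMatches: (G) -> Boolean,        // S12: compare with the preset gesture
    executeOperation: () -> Unit           // S13: the application operation to be executed
) {
    val (fingerprint, gesture) = detectInput()
    if (fingerprintMatches(fingerprint) && gestureMatches(gesture)) {
        executeOperation()
    }
    // If either check fails, the operation is simply not executed.
}
```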
According to the application operation control method of the present disclosure, interactive control of application operations is performed by combining fingerprint operation with gesture operation, which enriches the interaction between the user and application operations and improves user experience.
Application scenarios of the application operation control method described above are described below.
In one embodiment, the application operation control method of the present disclosure may be used for quickly starting and running an application program (APP). For example, on the lock screen standby interface the user inputs a fingerprint and a gesture action; if the entered fingerprint matches the preset fingerprint and the gesture generated by the gesture action matches a preset gesture, the terminal is controlled to unlock and quickly start and run the APP.
In an example, when the application operation to be executed is starting and running an APP while the terminal is in the lock screen standby state, assume that the fingerprint preset by the user is a unified fingerprint and that different APPs are matched in advance with different gestures. The user enters a fingerprint and inputs a gesture action on the lock screen standby interface of the terminal. The terminal detects the fingerprint entry operation and the gesture action input by the user, identifies whether the entered fingerprint matches the preset fingerprint, and identifies whether the gesture generated by the gesture action matches a preset gesture. If both match, the lock screen standby interface of the terminal is unlocked, the application program matched with the gesture is called, and the called application program is started and run.
In an example, when an APP is triggered to run by a fingerprint and a gesture in the lock screen standby state, the fingerprint and gesture input by the user may be performed in the unlock region of the terminal for convenience of operation. For example, if the gesture preset by the user is sliding down in the unlock region, the process in which the user triggers the terminal to unlock and start the APP by fingerprint and gesture may be as shown in FIG. 4.
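A minimal sketch of the execute step in this scenario, assuming a hypothetical LockScreenController hook and an assumed gesture-to-application mapping; neither is a real Android API.

```kotlin
// Sketch of the lock-screen scenario: unlock, then call and run the APP matched
// to the recognized gesture. All names here are hypothetical.

interface LockScreenController {
    fun unlockScreen()
    fun launchApp(packageName: String)
}

/** Gesture identifier -> application that was matched to it when the gesture was preset. */
val gestureToApp: Map<String, String> = mapOf(
    "swipe_down_in_unlock_area" to "com.example.notes",   // assumed example mappings
    "circle_in_unlock_area" to "com.example.camera"
)

/** Executes "unlock and quickly start the APP" once fingerprint and gesture both match. */
fun unlockAndLaunch(controller: LockScreenController, matchedGestureId: String) {
    val pkg = gestureToApp[matchedGestureId] ?: return   // no APP matched with this gesture
    controller.unlockScreen()                            // unlock the lock screen standby interface
    controller.launchApp(pkg)                            // call, start and run the matched APP
}
```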
Through the above embodiment, the present disclosure multiplexes the under-display large-area fingerprint sensor together with preset gesture actions to realize quick unlocking and opening of the corresponding APP, thereby improving user experience.
In another embodiment, the application operation control method of the present disclosure may be applied to scenarios in which an application operation requires identity verification before being performed, such as a large payment, or downloading and installing an application from an application store. To realize multi-dimensional encryption of the trigger for such an application operation, an identity verification interface may be displayed before the application operation is executed, prompting the user to input a fingerprint and a gesture on the identity verification interface. When the fingerprint input by the user matches the preset fingerprint and the input gesture action matches the gesture matched with the application operation to be executed, it is determined that the identity verification passes, and the corresponding application operation is executed.
A typical application scenario is that the application operation executed after identity verification passes is a payment operation, and the identity verification interface is the interface to be paid. The payment operation is executed only when the fingerprint entered by the fingerprint entry operation matches the preset fingerprint information and the gesture generated by the gesture action matches the preset gesture, which improves security.
It can be understood that the interface to be paid in the present disclosure may be a payment interface displayed by the current application when that application itself has a payment function, or a payment interface displayed by a third-party application called by the current application when it does not have a payment function.
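A minimal sketch of the identity-verification gate for the payment case, assuming a hypothetical PaymentSession interface and reusing the fingerprint and gesture checks sketched earlier.

```kotlin
// Sketch of the payment scenario: both factors must match before the payment executes.
// PaymentSession and its methods are hypothetical placeholders.

interface PaymentSession {
    fun showVerificationPrompt()   // prompt for fingerprint and gesture on the interface to be paid
    fun executePayment()           // the application operation performed after verification passes
    fun rejectVerification()       // verification failed; the payment is not executed
}

/** Multi-factor check before payment: both the fingerprint and the gesture must match. */
fun verifyAndPay(
    session: PaymentSession,
    checkFingerprint: () -> Boolean,   // matches the preset fingerprint information?
    checkGesture: () -> Boolean        // matches the gesture preset for the payment operation?
) {
    session.showVerificationPrompt()
    if (checkFingerprint() && checkGesture()) {
        session.executePayment()
    } else {
        session.rejectVerification()
    }
}
```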
Through the above embodiment, the present disclosure multiplexes the large-area fingerprint sensor together with preset gestures to realize multi-dimensional encryption of application operations that require identity verification, improving security.
Based on the same conception, an embodiment of the present disclosure further provides an application operation control apparatus.
It is understood that the application operation control device provided by the embodiment of the present disclosure includes a hardware structure and/or a software module corresponding to each function for implementing the above functions. The disclosed embodiments can be implemented in hardware or a combination of hardware and computer software, in combination with the exemplary elements and algorithm steps disclosed in the disclosed embodiments. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
FIG. 5 is a block diagram illustrating an application operation control apparatus according to an exemplary embodiment. Referring to FIG. 5, the application operation control apparatus 100 includes a detection unit 101, a recognition unit 102, and an execution unit 103.
The detection unit 101 is configured to detect a fingerprint entry operation and a gesture action input by a user on an application interface to be operated. The recognition unit 102 is configured to identify whether the fingerprint entered by the fingerprint entry operation matches preset fingerprint information and whether the gesture generated by the gesture action matches a preset gesture. The execution unit 103 is configured to execute the application operation to be executed on the application interface to be operated when the recognition unit 102 identifies that the entered fingerprint matches the preset fingerprint information and the gesture matches the preset gesture.
In an example, the application operation control apparatus 100 of the present disclosure further includes a setting unit 104. The setting unit 104 is configured to: preset the fingerprint and/or gesture matched with an application operation, where there are one or more application operations, different application operations are matched with different gestures, and different application operations are matched with the same fingerprint or with different fingerprints.
In another example, the setting unit 104 presets the gesture matched with the application operation in the following manner:
prompting the user to select an application operation that is to be triggered and executed by a gesture; after the user selects the application operation, prompting the user to input the gesture to be set for the selected application operation; and, while the user inputs the gesture, recording and storing, by time period, the gesture actions and gesture action trajectories input by the user.
In another example, the recognition unit 102 determines that the gesture generated by the gesture action matches a preset gesture as follows:
if, over a set number of consecutive time periods, the gesture actions and gesture trajectories input by the user on the application interface to be operated match, in time order, the gesture actions and gesture trajectories of a stored preset gesture, it is determined that the gesture generated by the gesture action matches the stored preset gesture.
In another example, the application operation to be executed includes starting and running an application program, and the application interface to be operated is the lock screen standby interface.
The execution unit 103 executes the application operation to be executed on the application interface to be operated in the following manner:
unlocking the lock screen standby interface; calling the application program matched with the gesture generated by the gesture action; and starting and running the called application program.
In another example, the application operation to be executed includes an application operation executed after the identity verification passes, and the application interface to be operated is an identity verification interface of the application operation to be executed.
In another example, the application operation executed after the identity verification is passed includes executing a payment operation, and the identity verification interface is an interface to be paid.
FIG. 6 is a block diagram illustrating an apparatus 200 for application operation control according to an example embodiment. For example, the apparatus 200 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 6, the apparatus 200 may include one or more of the following components: a processing component 202, a memory 204, a power component 206, a multimedia component 208, an audio component 210, an input/output (I/O) interface 212, a sensor component 214, and a communication component 216.
The processing component 202 generally controls overall operation of the device 200, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 202 may include one or more processors 220 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 202 can include one or more modules that facilitate interaction between the processing component 202 and other components. For example, the processing component 202 can include a multimedia module to facilitate interaction between the multimedia component 208 and the processing component 202.
Memory 204 is configured to store various types of data to support operation at device 200. Examples of such data include instructions for any application or method operating on the device 200, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 204 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 206 provide power to the various components of device 200. Power components 206 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 200.
The multimedia component 208 includes a screen that provides an output interface between the device 200 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 208 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 200 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 210 is configured to output and/or input audio signals. For example, audio component 210 includes a Microphone (MIC) configured to receive external audio signals when apparatus 200 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 204 or transmitted via the communication component 216. In some embodiments, audio component 210 also includes a speaker for outputting audio signals.
The I/O interface 212 provides an interface between the processing component 202 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 214 includes one or more sensors for providing various aspects of status assessment for the device 200. For example, the sensor component 214 may detect an open/closed state of the device 200 and the relative positioning of components, such as the display and keypad of the device 200. The sensor component 214 may also detect a change in position of the device 200 or of a component of the device 200, the presence or absence of user contact with the device 200, the orientation or acceleration/deceleration of the device 200, and a change in temperature of the device 200. The sensor component 214 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 214 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 214 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 216 is configured to facilitate wired or wireless communication between the apparatus 200 and other devices. The device 200 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 216 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 216 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 200 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as memory 204, comprising instructions executable by processor 220 of device 200 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
For example, a non-transitory computer-readable storage medium, in which instructions are executed by a processor of a mobile terminal to enable the mobile terminal to perform the application operation control method referred to above.
It is understood that "a plurality" in this disclosure means two or more, and other words are analogous. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. The singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It is to be understood that although operations are depicted in the drawings in a particular order, this is not to be understood as requiring that such operations be performed in the particular order shown or in serial order, or that all illustrated operations be performed, to achieve desirable results. In certain environments, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (16)

1. An application operation control method, comprising:
detecting a fingerprint entry operation and a gesture action input by a user on an application interface to be operated;
identifying whether the fingerprint entered by the fingerprint entry operation matches preset fingerprint information and whether the gesture generated by the gesture action matches a preset gesture; and
if the fingerprint entered by the fingerprint entry operation matches the preset fingerprint information and the gesture generated by the gesture action matches the preset gesture, executing the application operation to be executed on the application interface to be operated.
2. The application operation control method according to claim 1, characterized in that the method further comprises:
presetting the fingerprint and/or gesture matched with an application operation, wherein there are one or more application operations, different application operations are matched with different gestures, and different application operations are matched with the same fingerprint or with different fingerprints.
3. The application operation control method according to claim 2, wherein presetting the gesture matched with the application operation comprises:
prompting a user to select an application operation that is to be triggered and executed by a gesture;
after the user selects the application operation, prompting the user to input the gesture to be set for the selected application operation; and
while the user inputs the gesture, recording and storing, by time period, the gesture actions and gesture action trajectories input by the user.
4. The application operation control method according to claim 3, wherein the gesture generated by the gesture action matching the preset gesture comprises:
if, over a set number of consecutive time periods, the gesture actions and gesture trajectories input by the user on the application interface to be operated match, in time order, the gesture actions and gesture trajectories of a stored preset gesture, determining that the gesture generated by the gesture action matches the stored preset gesture.
5. The application operation control method according to any one of claims 1 to 4, wherein the application operation to be executed comprises starting and running an application program, and the application interface to be operated is a lock screen standby interface;
executing the application operation to be executed on the application interface to be operated comprises:
unlocking the lock screen standby interface;
calling an application program matched with the gesture generated by the gesture action; and
starting and running the called application program.
6. The application operation control method according to any one of claims 1 to 4, wherein the application operation to be executed includes an application operation executed after passing the identity verification, and the application interface to be operated is an identity verification interface of the application operation to be executed.
7. The application operation control method according to claim 6, wherein the application operation executed after the identity verification is passed includes executing a payment operation, and the identity verification interface is an interface to be paid.
8. An application operation control apparatus, characterized by comprising:
a detection unit configured to detect a fingerprint entry operation and a gesture action input by a user on an application interface to be operated;
a recognition unit configured to identify whether the fingerprint entered by the fingerprint entry operation matches preset fingerprint information and whether the gesture generated by the gesture action matches a preset gesture; and
an execution unit configured to execute the application operation to be executed on the application interface to be operated when the recognition unit identifies that the fingerprint entered by the fingerprint entry operation matches the preset fingerprint information and the gesture generated by the gesture action matches the preset gesture.
9. The application operation control device according to claim 8, characterized in that the device further comprises a setting unit configured to:
preset the fingerprint and/or gesture matched with an application operation, wherein there are one or more application operations, different application operations are matched with different gestures, and different application operations are matched with the same fingerprint or with different fingerprints.
10. The application operation control device according to claim 9, wherein the setting unit presets the gesture matched with the application operation in the following manner:
prompting a user to select an application operation that is to be triggered and executed by a gesture;
after the user selects the application operation, prompting the user to input the gesture to be set for the selected application operation; and
while the user inputs the gesture, recording and storing, by time period, the gesture actions and gesture action trajectories input by the user.
11. The application operation control device according to claim 10, wherein the recognition unit determines that the gesture generated by the gesture action matches a preset gesture by:
if, over a set number of consecutive time periods, the gesture actions and gesture trajectories input by the user on the application interface to be operated match, in time order, the gesture actions and gesture trajectories of a stored preset gesture, determining that the gesture generated by the gesture action matches the stored preset gesture.
12. The application operation control device according to any one of claims 8 to 11, wherein the application operation to be executed comprises starting and running an application program, and the application interface to be operated is a lock screen standby interface;
the execution unit executes the application operation to be executed on the application interface to be operated in the following manner:
unlocking the lock screen standby interface;
calling an application program matched with the gesture generated by the gesture action; and
starting and running the called application program.
13. The application operation control device according to any one of claims 8 to 11, wherein the application operation to be executed includes an application operation executed after passing an identity verification, and the application interface to be operated is an identity verification interface of the application operation to be executed.
14. The application operation control device according to claim 13, wherein the application operation executed after the identity verification is passed includes executing a payment operation, and the identity verification interface is an interface to be paid.
15. An application operation control apparatus, characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to: executing the application operation control method of any of claims 1 to 7.
16. A non-transitory computer-readable storage medium having instructions therein, which when executed by a processor of a mobile terminal, enable the mobile terminal to perform the application operation control method of any one of claims 1 to 7.
CN201910817119.0A 2019-08-30 2019-08-30 Application operation control method and device and storage medium Pending CN112446019A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910817119.0A CN112446019A (en) 2019-08-30 2019-08-30 Application operation control method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910817119.0A CN112446019A (en) 2019-08-30 2019-08-30 Application operation control method and device and storage medium

Publications (1)

Publication Number Publication Date
CN112446019A 2021-03-05

Family

ID=74734853

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910817119.0A Pending CN112446019A (en) 2019-08-30 2019-08-30 Application operation control method and device and storage medium

Country Status (1)

Country Link
CN (1) CN112446019A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination