WO2019091124A1 - User interface display method for a terminal, and terminal

User interface display method for a terminal, and terminal

Info

Publication number
WO2019091124A1
WO2019091124A1 · PCT/CN2018/092680 · CN2018092680W
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
user
sensor
fingerprint
input
Prior art date
Application number
PCT/CN2018/092680
Other languages
English (en)
French (fr)
Inventor
高伟强
崔闯
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to CN201880094953.5A priority Critical patent/CN112313623A/zh
Priority to EP18876036.7A priority patent/EP3798829A4/en
Priority to PCT/CN2018/092680 priority patent/WO2019091124A1/zh
Priority to US17/255,788 priority patent/US11482037B2/en
Publication of WO2019091124A1 publication Critical patent/WO2019091124A1/zh
Priority to US17/948,463 priority patent/US11941910B2/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1365Matching; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72463User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions to restrict the functionality of the device
    • H04M1/724631User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions to restrict the functionality of the device by limiting the access to the user interface, e.g. locking a touch-screen or a keypad
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons

Definitions

  • The present application relates to the field of user interface technologies, and in particular, to a user interface (UI) display method for a terminal, and to a terminal.
  • The lock screen interface 1 is generally displayed after the terminal lights up the screen, and the terminal is then in the lock screen state. If the terminal is not unlocked, the user cannot enter the unlocked interface and cannot use the full functionality of the terminal.
  • The full functionality of the terminal, for example the system features of the terminal or an installed application ("application" or "APP"), can only be used after the user has unlocked it.
  • the user can unlock the terminal in various ways, such as sliding unlocking, fingerprint unlocking, iris unlocking, face recognition unlocking, and the like.
  • As shown in FIG. 3A, the terminal has a rear fingerprint module 3. As shown in FIG. 3B, the user needs at least one operation of touching or pressing the rear fingerprint module 3 to unlock the terminal, after which the terminal displays the unlocked interface.
  • As shown in FIG. 3C, the user then needs at least one further operation to trigger the terminal to perform a certain function of the operating system, for example clicking an app's icon on the unlocked interface, such as WeChat, to open the app (the opened interface is shown in FIG. 3D).
  • Some terminals also provide, on the lock screen interface, shortcuts to applications that do not involve user privacy or security, such as a "Camera" shortcut (for example, the "Camera" icon), as shown in FIG. 4.
  • The user can click the shortcut, or press and hold it and drag it a distance in a preset direction, to open the "Camera" app and use the camera's shooting function.
  • In this case, however, the terminal operating system is not unlocked, and the full functionality of the terminal within the scope of the user's authority cannot be used.
  • After exiting the "Camera" application entered through this shortcut, the terminal remains in the lock screen state, for example returning to the lock screen interface 1 and prompting the user to unlock the terminal.
  • An embodiment of the present application provides a user interface display method for a terminal, and a terminal, where the user can unlock the terminal and enable the terminal to perform a certain function with one coherent operation, thereby improving operation efficiency.
  • In a first aspect, an embodiment of the present application provides a display method of a terminal user interface, including:
  • receiving, by the terminal, a first operation for unlocking the terminal that is input by the user when the screen is off or when a first interface is displayed, where the first interface is a lock screen interface of the terminal;
  • when the terminal confirms that the first operation meets a preset unlocking condition and that the user does not terminate the operation of the terminal after inputting the first operation, displaying a third user interface; and
  • receiving, by the terminal, a second operation input by the user on the third user interface, and performing a function corresponding to the second operation.
  • That is, when the terminal determines that the user does not terminate the operation of the terminal after inputting the unlocking operation, the terminal provides a third user interface; the second operation is input on this interface, and the terminal directly performs the function corresponding to the second operation.
  • In this way, the user can input the first operation and the second operation consecutively without interrupting the operation of the terminal; that is, the user can unlock the terminal and enable the terminal to perform a certain function with one coherent operation, thereby improving operation efficiency.
  • That the terminal confirms that the first operation meets a preset unlocking condition is specifically:
  • the terminal confirms that the first operation is a sliding operation detected on a touch sensor;
  • the terminal confirms that the first operation is an operation of drawing a graphic on the unlocking interface through a touch sensor, and the graphic is consistent with a preset graphic;
  • the terminal confirms that the first operation is an operation of inputting a fingerprint through a fingerprint sensor, and the input fingerprint matches a preset fingerprint;
  • the terminal confirms that the first operation is an operation of inputting an iris through an iris sensor, and the input iris matches a preset iris; or
  • the terminal confirms that the first operation is an operation of inputting a face image through a face sensor, and the input face image matches a preset face.
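
The listed alternatives amount to a simple dispatch over the kind of first operation. The Kotlin sketch below illustrates such a check; it is only an illustration, and every type and function name in it is an assumption rather than part of the application.

    // Illustrative sketch: checking whether a first operation meets a preset
    // unlocking condition of one of the kinds listed above.
    sealed class FirstOperation {
        data class Slide(val distancePx: Float) : FirstOperation()
        data class Pattern(val drawnNodes: List<Int>) : FirstOperation()      // nodes of the drawn graphic
        data class Fingerprint(val matchesEnrolled: Boolean) : FirstOperation()
        data class Iris(val matchesEnrolled: Boolean) : FirstOperation()
        data class Face(val matchesEnrolled: Boolean) : FirstOperation()
    }

    fun meetsPresetUnlockCondition(op: FirstOperation, presetPattern: List<Int>): Boolean =
        when (op) {
            is FirstOperation.Slide -> op.distancePx > 0f                 // any detected sliding operation
            is FirstOperation.Pattern -> op.drawnNodes == presetPattern   // drawn graphic equals the preset graphic
            is FirstOperation.Fingerprint -> op.matchesEnrolled           // biometric matching is done elsewhere
            is FirstOperation.Iris -> op.matchesEnrolled
            is FirstOperation.Face -> op.matchesEnrolled
        }
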
  • the method may further include:
  • If the terminal confirms that the first operation meets the preset unlocking condition and that the user terminates the operation of the terminal after inputting the first operation, the terminal displays a second user interface; the second user interface is a user interface displayed after the terminal is unlocked.
  • The terminal receives the first operation through a first sensor; that the terminal confirms that the user does not terminate the operation of the terminal after inputting the first operation includes:
  • after the terminal receives, through the first sensor, the first operation that satisfies the preset unlocking condition, it detects that the user continues to operate the first sensor.
  • the first sensor is a fingerprint sensor
  • that the first operation satisfying the preset unlocking condition is received is specifically:
  • a fingerprint input by the user is collected by the fingerprint sensor, and the fingerprint input by the user matches the preset fingerprint;
  • that it is detected that the user continues to operate the first sensor includes:
  • after determining that the collected fingerprint matches the pre-stored fingerprint, the terminal determines whether the duration of the user's operation of the sensor is greater than a second threshold duration; if it is greater, the terminal determines that the user continues to operate the first sensor after inputting the first operation that meets the preset unlocking condition.
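
A minimal Kotlin sketch of the timing check just described, assuming the terminal receives a timestamp when the fingerprint matches and can query the current time while the finger is still on the sensor; the class name, method names and default threshold are invented for the example.

    // After the fingerprint matches, measure how long the user keeps operating
    // the sensor and compare it with a second threshold duration.
    class ContinuedOperationDetector(private val secondThresholdMs: Long = 300) {
        private var matchTimeMs: Long = -1

        fun onFingerprintMatched(nowMs: Long) { matchTimeMs = nowMs }

        // Called while the finger is still on the sensor; returns true once the
        // operation has continued past the second threshold after the match.
        fun isContinuedOperation(nowMs: Long): Boolean =
            matchTimeMs >= 0 && (nowMs - matchTimeMs) > secondThresholdMs
    }
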
  • the terminal receives the first operation through a first sensor and receives a third operation input by a user through a second sensor;
  • the terminal confirms that the operation of the terminal is not terminated after the user inputs the first operation, including:
  • before the terminal detects the first operation that meets the preset unlocking condition, the terminal detects that the user starts to input the third operation, and the duration of the third operation after the first operation is greater than a third threshold duration; it is then determined that the user does not terminate the input operation to the terminal after inputting the first operation.
  • the first sensor is a fingerprint sensor
  • the second sensor is a touch sensor
  • that the terminal detects, before detecting the first operation that satisfies the preset unlocking condition, that the user starts to input the third operation is specifically:
  • the terminal detects a touch operation input by the user through the touch sensor before the fingerprint sensor detects a fingerprint matching the preset fingerprint;
  • that the duration of the third operation after the first operation is greater than the third threshold duration is specifically:
  • after the fingerprint sensor detects the fingerprint matching the preset fingerprint, the duration of the touch operation input by the user through the touch sensor exceeds the third threshold duration.
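
The two-sensor variant can be sketched in Kotlin as follows, under the assumption that touch-down, touch-up and fingerprint-match events arrive with timestamps; the class, field and threshold names are assumptions for illustration only.

    // A touch (third operation) that began before the fingerprint match and is
    // still held longer than a third threshold after the match means the user
    // has not terminated the input operation to the terminal.
    class DualSensorTracker(private val thirdThresholdMs: Long = 300) {
        private var touchStartMs: Long = -1        // -1 means no finger on the touch sensor
        private var fingerprintMatchMs: Long = -1  // -1 means no match yet

        fun onTouchDown(nowMs: Long) { touchStartMs = nowMs }
        fun onTouchUp() { touchStartMs = -1 }
        fun onFingerprintMatched(nowMs: Long) { fingerprintMatchMs = nowMs }

        fun userStillOperating(nowMs: Long): Boolean {
            val stillTouching = touchStartMs >= 0
            val touchBeganBeforeMatch = stillTouching && fingerprintMatchMs >= 0 &&
                    touchStartMs < fingerprintMatchMs
            val heldLongEnough = fingerprintMatchMs >= 0 &&
                    (nowMs - fingerprintMatchMs) > thirdThresholdMs
            return touchBeganBeforeMatch && heldLongEnough
        }
    }
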
  • one or more shortcuts are displayed on the third user interface; each of the one or more shortcuts corresponds to one function in the terminal;
  • that the terminal receives the second operation input by the user and performs the function corresponding to the second operation is specifically: the terminal determines that the second operation is for triggering a first shortcut of the one or more shortcuts, and performs the function corresponding to the first shortcut.
  • the displaying of the third user interface includes:
  • the touch screen of the terminal detects a touch operation of the user, and the terminal displays the one or more shortcuts on the third user interface according to the location of the touch point of the user's touch operation on the touch screen.
  • Alternatively, the second operation is a first gesture operation input by the user; that the terminal receives the second operation input by the user and performs the function corresponding to the second operation is specifically:
  • mapping information is displayed on the third user interface, and the mapping information is used to indicate the gesture corresponding to one or more functions.
  • The third user interface is overlaid on the second user interface, and the second user interface displayed under the third user interface does not respond to user operations.
  • the second user interface is a user interface after the terminal is unlocked.
  • Before the terminal performs the function corresponding to the second operation, the method further includes:
  • the terminal determines that the user's input operation to the terminal is continuously detected from the time the first operation is detected to the time the second operation is detected.
  • the first sensor is a fingerprint sensor
  • the second sensor is a touch sensor
  • the second operation and the third operation are both input through the touch sensor.
  • Before the function corresponding to the second operation is performed, the method further includes: the terminal determining that the third operation and the second operation are input operations performed by the user continuously through the touch sensor.
  • the first operation is an operation of inputting a fingerprint by a user through an in-screen fingerprint sensor; and the in-screen fingerprint sensor is configured to detect a fingerprint input by a user in a screen display area;
  • the confirming that the first operation meets the preset unlocking condition is specifically:
  • the terminal confirms that the fingerprint input by the user through the fingerprint sensor matches the preset fingerprint
  • that the terminal confirms that the user does not terminate the operation of the terminal after inputting the first operation is specifically:
  • after the terminal confirms that the user has input a matching fingerprint, the user's finger still keeps touching the screen of the terminal;
  • the second operation is an operation input while the user keeps touching the screen after inputting the first operation.
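
For the in-screen fingerprint case just described, the Kotlin sketch below shows one way a single finger-down sequence could drive unlocking, the third user interface and the second operation; the event types and the matchFingerprint callback are hypothetical and only illustrate the idea.

    sealed class TouchEvent {
        data class Down(val x: Float, val y: Float) : TouchEvent()
        data class Move(val x: Float, val y: Float) : TouchEvent()
        object Up : TouchEvent()
    }

    class InScreenFingerprintFlow(private val matchFingerprint: () -> Boolean) {
        var thirdUiShown = false
            private set

        fun onEvent(e: TouchEvent) {
            when (e) {
                // Finger lands on the in-screen sensor area; if the fingerprint
                // matches while the finger stays down, show the third UI.
                is TouchEvent.Down -> if (matchFingerprint()) thirdUiShown = true
                // Movement while the finger is still down is the second operation.
                is TouchEvent.Move -> if (thirdUiShown) trackSecondOperation(e.x, e.y)
                // Lifting the finger commits the shortcut under it.
                is TouchEvent.Up -> if (thirdUiShown) commitSecondOperation()
            }
        }

        private fun trackSecondOperation(x: Float, y: Float) =
            println("highlight the shortcut under ($x, $y)")

        private fun commitSecondOperation() =
            println("execute the shortcut the finger was lifted on")
    }
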
  • an embodiment of the present application provides a terminal, where the terminal includes:
  • an interaction device configured to receive a first operation, input by the user for unlocking the terminal, when the terminal is in an off-screen state or displays a first interface; the first interface is a lock screen interface of the terminal;
  • a processor configured to confirm whether the first operation meets a preset unlocking condition and whether the user terminates the operation of the terminal after inputting the first operation;
  • a display device configured to display a third user interface when the processor confirms that the first operation meets a preset unlocking condition, and the user does not terminate the operation of the terminal after inputting the first operation;
  • the interaction device is further configured to receive a second operation input by the user when the terminal displays the third user interface;
  • the processor is further configured to control the terminal to perform the function corresponding to the second operation.
  • The display device is further configured to display a second user interface after the processor confirms that the first operation meets the preset unlocking condition and that the user terminates the operation of the terminal after inputting the first operation;
  • the second user interface is a user interface displayed after the terminal is unlocked.
  • the interaction device includes:
  • a first sensor configured to receive the first operation
  • the processor confirms that the operation of the terminal is not terminated after the user inputs the first operation, including:
  • after the processor confirms that the first sensor has received the first operation that satisfies the preset unlocking condition, the first sensor continues to detect the user's operation.
  • the first sensor is a fingerprint sensor, specifically for collecting a fingerprint input by a user
  • that the processor confirms that the first sensor receives the first operation satisfying the preset unlocking condition is specifically:
  • the processor confirms that the fingerprint collected by the fingerprint sensor matches the preset fingerprint
  • that the processor confirms that the first sensor continues to detect the user's operation includes:
  • after determining that the collected fingerprint matches the preset fingerprint, the processor determines whether the duration of the user's operation of the sensor is greater than a second threshold duration; if it is greater, the first sensor is continuously operated after the first operation.
  • the interaction device includes:
  • a first sensor configured to receive the first operation
  • a second sensor for receiving a third operation input by a user
  • that the processor confirms that the user does not terminate the operation of the terminal after inputting the first operation includes:
  • the second sensor detects that the user starts to input the third operation, and the duration of the third operation after the first operation is greater than the third threshold duration.
  • the first sensor is a fingerprint sensor
  • the second sensor is a touch sensor
  • that, before the processor confirms that the first sensor has received the first operation satisfying the preset unlocking condition, the second sensor detects that the user starts to input the third operation is specifically:
  • the processor detects a touch operation input by the user through the touch sensor before the fingerprint sensor detects a fingerprint matching the preset fingerprint
  • that the processor confirms that the duration of the third operation after the first operation is greater than the third threshold duration is specifically:
  • after the processor detects, through the fingerprint sensor, the fingerprint matching the preset fingerprint, it detects that the duration of the touch operation input by the user through the touch sensor exceeds the third threshold duration.
  • one or more shortcuts are displayed on the third user interface; each of the one or more shortcuts corresponds to one function in the terminal;
  • that the processor controls the terminal to perform the function corresponding to the second operation is specifically:
  • the processor determines that the second operation is for triggering a first shortcut of the one or more shortcuts, and controls the terminal to perform the function corresponding to the first shortcut.
  • that the display device displays the third user interface includes:
  • the touch screen of the terminal detects a touch operation of the user, and the display device displays the one or more shortcuts on the third user interface according to the touch point of the user's touch operation on the touch screen.
  • Alternatively, the second operation is a first gesture operation input by the user; that the processor controls the terminal to perform the function corresponding to the second operation is specifically:
  • mapping information is displayed on the third user interface, and the mapping information is used to indicate a gesture corresponding to one or more functions.
  • the processor is further configured to: before controlling the terminal to perform the function corresponding to the second operation, determine that the user's input operation to the terminal is continuously detected during the period from when the interaction device detects the first operation to when it detects the second operation.
  • the first sensor is a fingerprint sensor
  • the second sensor is a touch sensor
  • the second operation and the third operation are both input through the touch sensor.
  • the processor is further configured to: before controlling the terminal to perform the function corresponding to the second operation, determine that the user performs the third operation and the second operation continuously through the touch sensor.
  • the display device is a touch screen integrated with a screen of the terminal and a touch sensor;
  • the interaction device is an in-screen fingerprint sensor for detecting a fingerprint input by the user in the screen display area; the first operation is an operation of the user's finger touching or pressing the screen display area to input a fingerprint;
  • that the processor confirms that the first operation meets the preset unlocking condition is specifically:
  • the processor confirms that the fingerprint input by the user through the in-screen fingerprint sensor matches the preset fingerprint
  • the interaction device further includes a touch screen for detecting a touch operation of the user;
  • that the processor confirms that the user does not terminate the operation of the terminal after inputting the first operation is specifically:
  • after the processor confirms that the user has input a matching fingerprint, the user's finger still keeps touching the touch screen;
  • the second operation is an operation input while the user keeps touching the screen after inputting the first operation.
  • An embodiment of the present application further provides a computer-readable storage medium storing instructions, where, when the instructions are run on a terminal, the terminal is caused to perform the interface display method described in the first aspect.
  • An embodiment of the present application further provides a computer program product comprising instructions, where, when the computer program product is run on a terminal, the terminal is caused to perform the interface display method according to the first aspect.
  • FIG. 1 is a schematic diagram of a terminal lock screen interface in the prior art
  • FIG. 2 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
  • FIG. 3A is a schematic diagram of a position setting of a terminal rear fingerprint module in the prior art;
  • FIG. 3B is a schematic diagram of unlocking a terminal by using a rear fingerprint in the prior art
  • FIG. 3C is a schematic diagram of an open application after unlocking a terminal in the prior art
  • FIG. 3D is a schematic diagram of an interface after opening an application in the prior art
  • FIG. 4 is a schematic diagram of a process in which a user operates a terminal to open an application in the prior art;
  • FIG. 5 is a schematic flowchart of a method for displaying a terminal user interface according to an embodiment of the present disclosure
  • FIG. 6A is a schematic diagram 1 of an interface of a shortcut interface (third UI) of a terminal according to an embodiment of the present invention
  • FIG. 6B is a schematic diagram 1 of a user performing operations on a third UI interface according to an embodiment of the present invention.
  • FIG. 6C is an interface displayed by the terminal after the application is opened according to the operation shown in FIG. 6B according to an embodiment of the present invention
  • FIG. 7A is a second schematic diagram of an interface of a shortcut access interface (third UI) of a terminal according to an embodiment of the present disclosure
  • FIG. 7B is a third schematic diagram of an interface of a shortcut access interface (third UI) of a terminal according to an embodiment of the present disclosure
  • FIG. 7C is a schematic diagram 4 of an interface of a shortcut access interface (third UI) of a terminal according to an embodiment of the present disclosure
  • FIG. 8 is a schematic diagram of a setting interface of a third UI of a terminal according to an embodiment of the present disclosure.
  • FIG. 9A is a schematic diagram 5 of an interface of a shortcut interface (third UI) of a terminal according to an embodiment of the present disclosure.
  • FIG. 9B is a schematic diagram 2 of a user performing operations on a third UI interface according to an embodiment of the present invention.
  • FIG. 9C is an interface displayed by the terminal after the application is opened according to the operation shown in FIG. 9B according to an embodiment of the present invention
  • FIG. 10 is a schematic diagram 1 of a user operation interface of a method for displaying a terminal user interface according to an embodiment of the present invention
  • FIG. 11A is a schematic diagram 3 of a user performing operations on a third UI interface according to an embodiment of the present invention.
  • FIG. 11B is an interface displayed by the terminal after the application is opened according to the operation shown in FIG. 11A according to an embodiment of the present invention
  • FIG. 11C is a schematic diagram 4 of a user performing operations on a third UI interface according to an embodiment of the present invention;
  • FIG. 11D is an interface displayed by the terminal after the application is opened according to the operation shown in FIG. 11C according to an embodiment of the present invention
  • FIG. 12A is a schematic diagram of a user using a rear fingerprint to unlock on a terminal lock screen interface according to an embodiment of the present invention
  • FIG. 12B is the third UI displayed by the terminal during the operation shown in FIG. 12A according to an embodiment of the present invention;
  • FIG. 12C is a schematic diagram of a user performing operations on the UI shown in FIG. 12B according to an embodiment of the present invention.
  • FIG. 12D is an interface displayed by the terminal after the application is opened according to the operation shown in FIG. 12C according to an embodiment of the present invention.
  • The terms "first" and "second" are used merely for descriptive purposes and shall not be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined by "first" and "second" may explicitly or implicitly include one or more of the features. In the description of the embodiments of the present application, "multiple" means two or more unless otherwise stated.
  • In the embodiments of the present application, when the terminal determines that the user has performed the operation for unlocking the terminal and has not terminated the operation on the terminal (for example, after fingerprint unlocking, the finger is not immediately lifted), the user is provided with a quick entry to the terminal's functions, and the terminal can be unlocked and a function of the terminal can be performed (for example, an application of the terminal can be opened) through one continuous, uninterrupted operation, thereby improving operation efficiency and the user experience.
  • Performing a function of the terminal includes opening an application, opening a file, opening a folder, or opening an application and using the opened application to perform a function of that application, for example:
  • open the "Camera" app to take a photo;
  • open the "Recording" app to record; open the "Music" app to play music;
  • open the "Weather" app to display the weather;
  • open the "Phone" app to call a preset contact;
  • open the "Maps" app and navigate to a preset location; open the "Browser" app and open a preset web page; and so on.
  • Performing a function of the terminal may also mean performing a system function, such as switching the displayed page, changing the display/scenario/power-saving mode, or switching the system to the normal/game mode.
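
As a rough illustration of "performing a function of the terminal", the Kotlin sketch below models the functions as a registry keyed by shortcut identifiers; the identifiers and registered actions are hypothetical examples and not part of the application.

    // Map a shortcut identifier to an action (open an app, open a file, or run
    // a system function) and execute it on demand.
    object FunctionRegistry {
        private val actions = mutableMapOf<String, () -> Unit>()

        fun register(id: String, action: () -> Unit) { actions[id] = action }

        fun execute(id: String): Boolean {
            val action = actions[id] ?: return false   // unknown shortcut: nothing happens
            action()
            return true
        }
    }

    fun main() {
        // Hypothetical registrations mirroring the examples above.
        FunctionRegistry.register("camera.capture") { println("open Camera and take a photo") }
        FunctionRegistry.register("music.play") { println("open Music and play music") }
        FunctionRegistry.register("system.gameMode") { println("switch the system to game mode") }
        FunctionRegistry.execute("camera.capture")
    }
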
  • a display method of a terminal user interface in the embodiment of the present invention includes:
  • Step S100: The terminal receives a first operation, input by the user for unlocking the terminal, when the screen is off or the lock screen is displayed.
  • That is, when the terminal screen is off, or on the lock screen of the terminal (referred to herein as the first user interface), the user unlocks the operating system of the terminal through the first operation.
  • The user can set the terminal's locking and unlocking methods. For example, the terminal can be set to lock when the screen is turned off; when the screen lights up again, the terminal is in the lock screen state, and the user cannot enter any interface other than the lock screen interface, so the full functionality of the terminal cannot be used. At this time, the terminal needs to be unlocked in a preset unlocking manner so that it leaves the lock screen interface, after which the full functionality of the terminal within the user's authority can be used.
  • For example, when the screen is off, the user may light up the terminal screen and unlock the terminal by pressing or touching the fingerprint sensor. In this case, the terminal may light up the screen and unlock when the fingerprint collected by the fingerprint sensor matches the preset fingerprint; it may also light up the screen as soon as it detects that the fingerprint sensor is touched, and unlock the terminal if a fingerprint matching the preset fingerprint is collected.
  • On the lock screen interface of the terminal, the user can also press or touch the fingerprint sensor with a finger to unlock the terminal; in this case, the terminal unlocks if the fingerprint sensor collects a matching fingerprint.
  • The fingerprint sensor may be an off-screen fingerprint sensor that collects fingerprints outside the screen area (such as a fingerprint sensor disposed on the HOME button or a rear fingerprint sensor disposed on the back of the terminal), or an in-screen fingerprint sensor that collects fingerprints in the screen area, for example a fingerprint sensor disposed under the screen, a fingerprint sensor integrated with the touch screen, or sensors disposed on the upper and lower or left and right sides of the screen that collect fingerprints in the screen area.
  • When the fingerprint sensor is disposed under the screen, its position and size are generally fixed.
  • The area of the screen corresponding to the fingerprint sensor generally displays an icon (for example, a fingerprint-shaped icon) as a prompt.
  • the user can also use the input of other sensors to achieve the unlock function.
  • the following sensors can also be used to implement the input of the first operation:
  • a touch sensor (also referred to as a touch-sensitive sensor), such as a touch screen or a touch pad, used for unlocking methods such as sliding unlocking or graphic (pattern) unlocking;
  • an iris sensor, which recognizes the operation of the user inputting an iris; the terminal unlocks when an iris matching the preset iris is recognized;
  • an image sensor, which recognizes the operation of the user inputting a face image; the terminal unlocks when a face image matching the preset face is recognized.
  • the user can also use other sensors of the terminal to implement the input of the first operation, which is not limited by the present invention.
  • Step S102: The terminal determines whether the first operation input by the user satisfies the preset unlocking condition. If yes, step S106 is performed; otherwise, step S104 is performed.
  • the first operation that satisfies the unlocking condition preset by the terminal may include:
  • the preset unlocking condition is that the sliding operation is detected, and the first operation input by the user is a sliding operation on the screen;
  • the preset unlocking condition is that the unlocking graphic input is detected, and the first operation input by the user is to draw a preset unlocking graphic on the screen;
  • the preset unlocking condition is that the fingerprint matching the preset fingerprint is detected, and the first operation input by the user is to press the corresponding finger on the fingerprint sensor, that is, input the matching fingerprint through the fingerprint sensor;
  • the preset unlocking condition is that the iris matching the preset iris is detected, and the first operation input by the user is an operation of inputting the iris through the iris sensor;
  • the preset unlocking condition is that the face image matching the preset face is detected, and the first operation input by the user is an operation of inputting the face image through the face sensor.
  • the face sensor includes a camera, a structured light sensor, and the like, which are not limited by the present invention.
  • Step S104: The terminal keeps the screen off or displays the lock screen interface.
  • If the screen was off when the first operation was input, the terminal may keep the screen off or display the lock screen interface; if the screen was lit and the lock screen was displayed when the first operation was input, the lock screen interface is still displayed.
  • The lock screen interface displayed at this time may or may not be the same as the first interface; for example, a prompt message may be overlaid on the first interface to prompt the user that the input is incorrect and unlocking has failed.
  • Step S106: The terminal determines whether the user terminates the operation of the terminal after inputting the first operation. If yes, step S108 is performed; otherwise, step S110 is performed.
  • For example, the user performs the first operation through the first sensor and keeps operating the first sensor after the terminal is unlocked; in this case, the terminal determines that the user, having input the first operation through the first sensor, has not terminated the operation of the terminal.
  • The following description takes the case where the first sensor is a fingerprint sensor as an example.
  • the first sensor may also be another sensor that can detect the user input, which is not limited in the present invention.
  • the terminal can determine whether the user continues to operate the fingerprint sensor after the terminal is unlocked by the following two methods:
  • Mode 1: The terminal determines whether the duration of the user's pressing or touch operation on the fingerprint sensor is greater than a first threshold duration. If it is greater, the terminal determines that the user continues to operate the sensor after inputting the first operation and does not terminate the operation of the terminal, and the terminal performs step S110; if not, it determines that the user terminates the operation of the terminal after unlocking it, and the terminal performs step S108.
  • Mode 2: The terminal determines whether the duration of the user's operation of the sensor after it determines that the collected fingerprint matches the pre-stored fingerprint is greater than a second threshold duration. If it is greater, it determines that the user continues to operate the sensor after inputting the first operation and does not terminate the input operation to the terminal, and the terminal performs step S110; if not, it determines that the user terminates the input operation to the terminal after inputting the first operation, and the terminal performs step S108.
  • The terminal may also use another time point during the continuous operation of the sensor as the starting point for timing the continuous operation, and determine, based on that duration, whether the user continues to operate the sensor after the terminal is unlocked; this is not limited in the present invention.
  • Alternatively, the user may perform the first operation using the first sensor and perform a third operation using a second sensor. If the third operation begins before the terminal detects the first operation that satisfies the preset unlocking condition, and the duration of the third operation after the terminal detects that first operation is greater than a third threshold duration, it is determined that the user does not terminate the input operation to the terminal after inputting the first operation.
  • the first sensor can be an iris sensor and the second sensor can be a touch sensor:
  • The user inputting the first operation may mean that the user inputs his or her iris through the iris sensor, for example, the terminal collects the iris of the human eye through a front iris-acquisition camera.
  • Meanwhile, the terminal detects the user's touch operation through the touch sensor (i.e., the third operation, for example, the operation of the user touching the touch screen of the terminal). The terminal then determines whether the duration of the touch operation, counted after the terminal determines that the iris input by the user matches the preset iris, is greater than the third threshold duration.
  • If it is greater, the terminal determines that the user does not terminate the input operation to the terminal after inputting the first operation, and performs step S110; if it is less, it determines that the user terminates the input operation to the terminal after inputting the first operation, and performs step S108.
  • For another example, the first sensor may be an in-screen fingerprint sensor that collects fingerprints in the screen area, for example a fingerprint sensor disposed under the screen, a fingerprint sensor integrated in the touch screen, or sensors disposed at the upper and lower or left and right ends of the screen that collect fingerprints in the screen area.
  • the second sensor can be a touch sensor.
  • The user inputting the first operation may mean that the user presses a finger on or touches the screen to input a fingerprint.
  • After the fingerprint is matched, the touch sensor (in this example, the touch screen) determines whether the user still keeps touching for the third preset duration, that is, whether the finger has not been lifted. If the finger has not been lifted, the terminal performs step S110; if the finger has been lifted, the terminal performs step S108.
  • the present invention does not limit the specific configuration of the first sensor and the second sensor, as long as the first sensor can collect the first operation of the user for unlocking, and the second sensor can be used to collect the third operation.
  • the terminal may determine, according to the signals collected by the first sensor and the second sensor, whether the user performs an input operation on the terminal within a preset duration (e.g., the third threshold duration) after inputting the first operation.
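
The decision flow of steps S100 to S112 can be summarized with the small Kotlin sketch below, assuming the unlock-condition check (step S102) and the "operation still held" check (step S106) are computed as described above; the enum and function names are illustrative only.

    enum class UiState { OFF_OR_LOCKED, SECOND_UI_UNLOCKED, THIRD_UI_SHORTCUTS }

    fun handleFirstOperation(
        meetsUnlockCondition: Boolean,   // result of step S102
        operationStillHeld: Boolean      // result of step S106 (mode 1, mode 2, or the two-sensor check)
    ): UiState = when {
        !meetsUnlockCondition -> UiState.OFF_OR_LOCKED       // step S104: keep the screen off or the lock screen
        operationStillHeld    -> UiState.THIRD_UI_SHORTCUTS  // step S110: show the shortcut UI, then wait for the second operation (S112)
        else                  -> UiState.SECOND_UI_UNLOCKED  // step S108: show the unlocked interface
    }
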
  • Step S108: The terminal displays a second user interface, where the second user interface is an unlocked user interface, referred to as the unlocking interface. After entering the unlocking interface, the user can use the full functionality of the terminal without performing the unlocking operation again.
  • That is, the terminal displays the second user interface; the interface displayed after the user unlocks the terminal is this same interface.
  • the second user interface may be the main interface 3 of the operating system displayed after the terminal is powered on.
  • The main interface 3 of the terminal may be composed of a status bar 31, a page 32, and a resident bar (also called a "Dock bar") 33.
  • The page 32 is used to display APP icons, widgets, file icons, folder icons, or other elements for triggering the execution of terminal functions;
  • the resident bar 33 is used to display icons of the user's commonly used applications or functions, for example the "Contact", "Phonebook", "Camera" and other APP icons.
  • The status bar 31, the page 32 and the resident bar 33 are arranged in the upper, middle and lower parts of the user interface. There may be multiple pages, and the displayed page 32 can be switched by the user's operation; during page switching, the status bar 31 and the resident bar 33 do not switch.
  • The second UI may also be the user interface that the terminal displayed when it was last unlocked.
  • For example, the terminal opens the "Album" application and uses it to browse picture A; the user then turns off the screen with the power button, and the terminal is locked. After the terminal is unlocked again, the interface it displays is the interface of browsing picture A in the "Album" application.
  • Step S110: The terminal displays a third UI.
  • the third UI provides a quick entry to the functionality of the terminal.
  • That is, when the terminal receives the first operation input by the user (the first operation satisfies the unlocking condition preset by the terminal) and detects that the user does not terminate the operation of the terminal, the terminal displays the third UI.
  • The third UI may take the second UI as its background, with the content of the third UI (also referred to as interface elements) superimposed as a layer on the second UI; the two layers may be displayed with transparency or blurring.
  • Alternatively, the third UI may not present the second UI as the background and may display only the content of the third UI.
  • The third UI provides shortcuts to the functions of the terminal in at least two ways:
  • Mode 1: shortcuts to functions of the terminal are provided on the third UI; when a shortcut is triggered by the user's operation, the function corresponding to that shortcut is performed;
  • Mode 2: a correspondence between shortcut gestures and functions of the terminal is configured for the third UI; when the user inputs a shortcut gesture on the third UI, the function corresponding to that gesture is executed.
  • The two modes are described separately below.
  • In Mode 1, the content presented on the third UI may be shortcuts displayed according to the position of the finger press; for example, icons of applications or functions, or other shortcuts, may be distributed around the pressed position.
  • The terminal can set how the function shortcuts are distributed, such as the number of shortcuts, the distance between the shortcuts and the pressed position, the order of the shortcuts, the display transparency of the shortcuts, and so on.
  • The shortcuts may be distributed in a circle (including ellipses and irregular circles) around the touch position of the finger, or in a fan shape above or below the finger, or in other forms, which is not limited in the embodiments of the present invention.
  • A shortcut is an element that, when triggered by the user, causes the corresponding function to be performed.
  • The terminal may display the shortcuts on the UI to prompt the user to trigger them in a preset operation manner.
  • When the terminal detects the preset operation, it determines, according to the layout of the shortcuts on the UI and the user's specific operation, which displayed shortcut the operation corresponds to, and then performs the corresponding function.
  • shortcuts include icons and widgets on the page. When the user clicks on these icons or widgets, they can open the corresponding application, file or folder, or open the corresponding application and perform a certain function of the application.
  • The terminal may adjust where and how the shortcuts are displayed according to the finger press position, for example in a fan shape or another presentation, to ensure that the application shortcuts are fully displayed.
  • The way application shortcuts are displayed on the third UI can include one of the following:
  • Mode 1: a fixed arrangement, in which the shortcuts displayed on the third UI and their order are customized by the user;
  • Mode 2: the terminal determines, according to how frequently the user uses each application, the shortcuts of the N most frequently used applications, displays them on the third UI, and arranges them by frequency;
  • Mode 3: the terminal determines, according to when the user last used each application, the shortcuts of the N applications used most recently before the third UI is displayed, displays them on the third UI, and arranges them in chronological order;
  • Mode 4: the user customizes the shortcuts of X applications according to Mode 1, and the other, non-fixed N-X shortcuts are determined according to Mode 2 or Mode 3.
  • X and N are natural numbers and X is less than N.
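
The hybrid arrangement in the last mode above (X user-customized shortcuts plus N-X slots filled by usage frequency) can be sketched in Kotlin as follows; the data shapes are assumptions made for illustration.

    data class AppUsage(val id: String, val launchCount: Int)

    fun chooseShortcuts(pinned: List<String>, usage: List<AppUsage>, n: Int): List<String> {
        require(pinned.size < n) { "X must be less than N" }
        val byFrequency = usage.sortedByDescending { it.launchCount }
            .map { it.id }
            .filterNot { it in pinned }          // do not duplicate the pinned entries
        return (pinned + byFrequency).take(n)    // pinned shortcuts first, most-used fill the rest
    }
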
  • In Mode 2, the terminal is provided with a correspondence between shortcut gestures on the third UI and functions of the terminal; for example, "S"-, "C"- and "O"-shaped gestures correspond, respectively, to the "Calendar" app, the "Close all running apps" feature, and the "Play music" feature.
  • When the user draws the corresponding pattern on the third UI, the terminal opens the "Calendar" application, closes all running applications, or plays music.
  • the user can set the correspondence between the gesture and the function on the third UI.
  • the user can enter the setting interface of the third UI, select a function that needs to be quickly executed on the third UI in the setting interface, and input a corresponding gesture in the gesture input area of the setting interface.
  • the setting interface of the third UI may also include two sub-setting interfaces, wherein the selecting sub-interface is used to select a function (including an application) that needs to be displayed on the third UI, and the gesture sub-interface is used to set a gesture corresponding to the selected shortcut.
  • the setting interface of the third UI may also provide only a preset gesture for the user to select instead of recording the gesture input by the user.
  • Preset gestures can also be expressed in letter shapes, such as the letter C representing a "C” shaped input gesture.
  • The terminal recognizes the gesture in a manner similar to a stroke-based input method: it recognizes the letter C and then, according to the set correspondence between the letter C and a function, performs the function corresponding to the letter C.
  • Mapping information may be displayed on the third UI, where the mapping information is used to indicate the gesture corresponding to one or more functions, that is, to display the correspondence between the gesture patterns and the functions set by the user. For example, as shown in FIG. 9, the "S" gesture and the "WeChat" application it corresponds to, as set by the user, are displayed side by side.
  • Alternatively, the third UI may not display the correspondence between gesture patterns and functions set by the user; the user knows, from memory or from his or her own settings, which gesture to input on the third UI to trigger the terminal to execute which function.
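
Assuming a gesture recognizer (not shown) has already reduced the drawn trajectory to a letter, executing the mapped function is a simple lookup, as in the Kotlin sketch below; the concrete mapping mirrors the "S"/"C"/"O" example above, and everything else is hypothetical.

    val gestureToFunction = mapOf(
        'S' to "open Calendar",
        'C' to "close all running apps",
        'O' to "play music"
    )

    fun onGestureRecognized(letter: Char) {
        val function = gestureToFunction[letter.uppercaseChar()]
        if (function != null) println("executing: $function")
        else println("no function mapped to '$letter'; fall back to the unlocked second UI")
    }
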
  • Step S112: While displaying the third UI, the terminal receives a second operation input by the user and performs the function corresponding to the second operation.
  • That is, while the terminal displays the third UI, the user inputs the second operation and triggers the terminal to perform a preset function of the terminal.
  • the user does not terminate the input operation to the terminal from the input of the first operation to the input of the second operation.
  • the terminal continuously detects the user's input operation to the terminal from the time when the first operation is detected to the time when the second operation is detected.
  • In this way, the whole process, from the user inputting the unlocking operation to the terminal performing a function (for example, opening an application), can be completed with one coherent action that ends with inputting the second operation, which improves the user's operation efficiency and provides a good user experience.
  • The user may also terminate the input to the terminal after the terminal displays the third UI and then continue by inputting the second operation.
  • The following takes the case where the first sensor is an in-screen fingerprint sensor and the second sensor is a touch screen as an example.
  • The terminal detects, through the fingerprint sensor, the user's first operation for unlocking the terminal and, after the determinations of steps S102 and S106, displays the third UI. The touch screen then continues to detect the user's input operation, and when the preset second operation is detected, the function corresponding to that second operation is performed.
  • the second operation is exemplarily described below in conjunction with the third UI in the above embodiment.
  • the second operation may be any of the following:
  • The finger slides, starting from the press or touch position on the touch screen, and the user lifts the finger when it reaches a shortcut on the third UI. For example, as shown in FIGS. 6A-6C, the user moves the finger onto the "WeChat" icon and lifts it, and the phone opens the "WeChat" app.
  • The finger, starting from the press or touch position on the touch screen, performs a flick operation toward a shortcut on the third UI; the flick operation consists of a touch movement plus an accelerated movement, similar to a flicking motion of a human finger, and the terminal determines the direction of the flick to decide which shortcut is triggered.
  • One finger keeps touching or pressing the touch screen, and another finger clicks a function (including application) shortcut. For example, as shown in FIG. 10, the right thumb keeps pressing the touch screen, and the left thumb clicks the "WeChat" icon to open the "WeChat" application.
  • The touch screen can detect the pressure of the user's press.
  • In this case, the terminal prominently displays the shortcut corresponding to the current pressure, for example by floating the icon, flashing it, adding a tick in its upper-right corner, or framing it, to prompt the user which shortcut the current pressure corresponds to.
  • The function corresponding to the shortcut selected by the user with the pressure-press operation is then performed. For example, as shown in FIG. 11A, when the user presses with a lighter force, the "Browser" icon is framed to indicate that it is selected; if the user lifts the finger at this moment, the "Browser" application is opened (as shown in FIG. 11B), that is, the terminal performs the function corresponding to the selected shortcut.
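
One way the pressure-based selection described above could be approximated is to split the reported pressure range into bands, one per shortcut, and confirm the highlighted shortcut when the finger is lifted; the Kotlin sketch below is an assumption-laden illustration, not the application's algorithm.

    // Pressure is assumed to be normalized to 0..1 by the touch screen driver.
    fun shortcutForPressure(pressure: Float, shortcuts: List<String>): String? {
        if (shortcuts.isEmpty()) return null
        val band = (pressure.coerceIn(0f, 0.999f) * shortcuts.size).toInt()
        return shortcuts[band]   // e.g. a light press selects "Browser", a firmer press the next shortcut
    }

    fun onFingerLifted(currentPressure: Float, shortcuts: List<String>) {
        shortcutForPressure(currentPressure, shortcuts)?.let { println("open $it") }
    }
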
  • The embodiments of the present invention are not limited to the above operations; the terminal may preset other types of operation modes and point to different function shortcuts through different operations of those modes, thereby determining which corresponding function to perform. That is, the user triggers the corresponding shortcut through the second operation.
  • In the above embodiments, the user performs an unlocking operation of touching or pressing the touch screen; when the terminal determines that the user has not terminated the operation, it presents a UI that provides shortcut entries to the terminal's functions, and the user can, without the finger leaving the touch screen, continue operating on the touch screen to trigger the terminal to perform a function, which improves operation efficiency and the user experience.
  • the terminal may also detect the first operation and the second operation input by the user using a fingerprint sensor disposed outside the screen of the terminal.
  • the first operation and the second operation can be detected using a fingerprint sensor disposed on the back side of the terminal (the back side with respect to the screen).
  • For example, when a matching fingerprint input is detected on the lock screen interface as shown in FIG. 12A and the user does not lift the finger, the third UI is displayed, as shown in FIG. 12B.
  • If the third UI displays shortcuts, the rear fingerprint sensor can detect the moving direction of the user's finger, such as from top to bottom or from left to right.
  • A reference point (for example, a blinking dot or cursor, as shown in FIG. 12C) is displayed on the third UI.
  • The third UI shows four shortcuts around the reference point: shortcut A (for example, "Facebook") above the reference point, shortcut B (for example, "WeChat") below the reference point, shortcut C (for example, "Browser") to the left of the reference point, and shortcut D (for example, "Game") to the right of the reference point.
  • If the terminal detects, through the rear fingerprint sensor, that the user's second operation is a top-to-bottom sliding operation on the sensor, it determines that shortcut B is triggered and, as shown in FIG. 12D, opens the "WeChat" application. If the terminal detects, through the rear fingerprint sensor, that the user's second operation is a left-to-right sliding operation on the sensor, it determines that shortcut D is triggered and opens the "Game" application.
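
The rear-sensor example above amounts to a mapping from sliding direction to one of the four shortcuts around the reference point. The Kotlin sketch below assumes direction detection is provided by the sensor driver and is otherwise illustrative.

    enum class SwipeDirection { UP, DOWN, LEFT, RIGHT }

    fun shortcutForSwipe(direction: SwipeDirection): String = when (direction) {
        SwipeDirection.UP    -> "Facebook"   // shortcut A, above the reference point
        SwipeDirection.DOWN  -> "WeChat"     // shortcut B, below the reference point (top-to-bottom slide)
        SwipeDirection.LEFT  -> "Browser"    // shortcut C, to the left of the reference point
        SwipeDirection.RIGHT -> "Game"       // shortcut D, to the right of the reference point (left-to-right slide)
    }
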
  • As described above, the user performs the first operation using the first sensor and performs the third operation using the second sensor; if the third operation starts before the terminal detects the first operation that satisfies the preset unlocking condition, and its duration after the terminal detects that first operation is greater than the third threshold duration, it is determined that the user does not terminate the input operation to the terminal after inputting the first operation, and the terminal displays the third UI.
  • the second operation and the third operation may each be input by a second sensor, for example, the second sensor is a touch sensor. The user can continuously input the third operation and the second operation through the touch sensor, and the user can open a specific function of the terminal through a continuous touch action before and after unlocking, and the operation efficiency is high and the user experience is good.
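A minimal sketch of the timing rule in this implementation: the touch (third) operation must have started before the matching fingerprint was detected and must still be held for longer than the third threshold afterwards before the third UI is shown. The class name and the default threshold value below are assumptions made for illustration.

```kotlin
// Illustrative timing check: show the third UI only if the touch began before the
// fingerprint matched and is still held longer than a threshold after the match.
class ContinuityChecker(private val thirdThresholdMs: Long = 300) {
    private var touchStartedAt: Long? = null
    private var fingerprintMatchedAt: Long? = null

    fun onTouchDown(now: Long) { touchStartedAt = now }
    fun onTouchUp() { touchStartedAt = null }
    fun onFingerprintMatched(now: Long) { fingerprintMatchedAt = now }

    // True when the user has not terminated input after the unlock operation.
    fun shouldShowThirdUi(now: Long): Boolean {
        val touch = touchStartedAt ?: return false
        val match = fingerprintMatchedAt ?: return false
        return touch < match && now - match > thirdThresholdMs
    }
}
```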
  • when the third UI provides shortcut entries in the above manner 2, the second operation may be the user drawing a prompted gesture pattern on the third UI; the terminal recognizes the user's gesture and performs the function corresponding to it. For example, as shown in FIG. 9B, if the gesture pattern "S" corresponds to the "WeChat" application, then while the terminal presents the third UI the user draws an "S"-shaped trajectory with a finger on the touch screen (as shown in FIG. 9B), and the terminal opens the "calendar" application (as shown in FIG. 9C).
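When the third UI works with gesture patterns rather than icons, all the terminal needs at dispatch time is a lookup from the recognized pattern to a preset function; the pattern recognizer itself is out of scope here. A hedged sketch follows, using the example mappings mentioned in the text ("S", "C", "O"); the actual mapping is user-configurable.

```kotlin
// Illustrative only: dispatch a recognized gesture letter to a preset function.
val gestureActions: Map<Char, () -> Unit> = mapOf(
    'S' to { println("open Calendar") },          // or "WeChat", per the user's own setting
    'C' to { println("close all running apps") },
    'O' to { println("play music") }
)

fun onGestureRecognized(letter: Char) {
    gestureActions[letter]?.invoke()              // unrecognized gestures are simply ignored
}
```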
  • it should be noted that, in one implementation of the present invention, if no second operation from the user is detected after the third UI is displayed, the terminal may be unlocked directly and display the unlocked interface, for example the second UI. For instance, when the third UI provides shortcut entries in the above manner 1, the user may fail to input a second operation that triggers any shortcut: the user may directly terminate the operation on the terminal after the third UI is displayed (for example, the finger leaves the touch screen), or the operation after the third UI is displayed may not conform to the terminal's predetermined setting. In one embodiment, for example, the predetermined second operation is to move the finger onto a shortcut and lift it, but the terminal only detects a movement of the user's finger that does not reach any shortcut displayed on the third UI; in this case, the terminal can unlock the screen and display the second UI.
  • as another example, when the third UI provides shortcut entries in the above manner 2, if the terminal detects that the user directly terminates the operation on the terminal while the third UI is displayed, or no predetermined gesture is detected, the terminal can unlock the screen and display the second UI.
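The fallback behaviour in these paragraphs reduces to a simple decision: if the gesture on the third UI either ends without reaching a shortcut or does not match a preset pattern, just finish unlocking. A rough sketch with assumed names:

```kotlin
// Illustrative only: after the third UI is shown, either run the triggered shortcut
// or fall back to the ordinary unlocked screen (the second UI).
sealed class SecondOpResult {
    data class Triggered(val run: () -> Unit) : SecondOpResult()
    object NoneOrInvalid : SecondOpResult()        // finger lifted early, gesture not recognized, etc.
}

fun finishFromThirdUi(result: SecondOpResult, showSecondUi: () -> Unit) {
    when (result) {
        is SecondOpResult.Triggered -> result.run()    // execute the selected function
        SecondOpResult.NoneOrInvalid -> showSecondUi() // just unlock and show the second UI
    }
}
```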
  • the user interface display method provided by the embodiments of the present application can be applied to any terminal device that can display an interface, such as a mobile phone, a tablet computer, a wearable device, an in-vehicle device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); the embodiments of the present application impose no limitation in this respect.
  • as shown in FIG. 2, the mobile phone 200 may include a communication module 220, a memory 230, a sensor module 240, an interaction device 250, a display screen 260, an audio module 280, a processor 210, a camera module 291, a power management module 295, and the like. These components can be connected by a bus or connected directly. Those skilled in the art will understand that the structure shown in FIG. 2 does not constitute a limitation on the mobile phone, which may include more components than those shown in FIG. 2, combine certain components, or arrange the components differently.
  • the communication module 220 is configured to communicate with other network entities, for example to receive information from a server or to send related data to a server. The communication module 220 may include a radio frequency (RF) module 229, a cellular module 221, a wireless fidelity (WIFI) module 223, a GPS module 227, and the like. The RF module 229 can be used to receive and send signals when information is received or sent or during a call; in particular, it passes received information to the processor 210 for processing and transmits signals generated by the processor 210. Typically, the RF circuit 21 may include, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 21 can also communicate with networks and other devices through wireless communication. The cellular module 221 and the WIFI module 223 can be used to connect to a network, and the GPS module can be used for positioning or navigation.
  • the processor 210 is the control center of the mobile phone 200. It connects the various parts of the entire phone through various interfaces and lines, and performs the various functions of the mobile phone 200 and processes data by running or executing software programs and/or modules stored in the memory 230 and calling data stored in the memory 230, thereby monitoring the mobile phone 200 as a whole. In a specific implementation, as an embodiment, the processor 210 may include one or more processing units and may integrate an application processor and a modem processor. The application processor mainly handles the operating system, the graphical user interface, applications, and so on, while the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 210.
  • the memory 230 can be used to store data, software programs, and modules, and may be a volatile memory such as a random-access memory (RAM), a non-volatile memory such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD), or a combination of the above types of memory. Specifically, the memory 230 may store program code, and the program code is used to enable the processor 210, by executing it, to perform the user interface display method provided by the embodiments of the present application. The memory 230 can include an internal memory 232 and an external memory 234.
  • in the embodiments of the present invention, the processor 210 executes the code in the memory 230 to determine whether the first operation received through the sensor satisfies the preset unlocking condition (that is, to perform step S102 of the method embodiments, together with the specific implementations of that step), and, when it determines that the first operation satisfies the unlocking condition, continues to evaluate the signal received through the sensor to determine whether the user terminated the operation on the terminal after inputting the first operation (that is, to perform step S106 of the method embodiments, together with the implementations of that step). When implementing step S106, the processor 210 may set a timer to determine whether the input operation on the terminal is terminated after the user inputs the first operation: the timer is started after the first operation satisfying the unlocking condition is detected, and if the terminal detects, before the timer expires, that the user has terminated the input operation (for example, no touch signal is continuously detected before the timer expires), this means that the user terminated the input operation after inputting the first operation. Of course, the processor 210 can also set a timer at other moments to determine whether the user terminates the input operation after inputting the first operation, so as to implement the various judgment manners of the method embodiments; details are not repeated here.
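One possible shape of the step S106 timer, shown only as an assumption about an implementation rather than the patented design itself: start a countdown when the matching first operation is detected and, when it expires, show the third UI if the user's input is still present, otherwise unlock to the second UI. All names and the callback wiring below are invented for the sketch.

```kotlin
import java.util.Timer
import kotlin.concurrent.schedule

// Illustrative sketch of the step S106 timer described above.
class TerminationJudge(
    private val windowMs: Long,
    private val isInputStillHeld: () -> Boolean,   // e.g. finger still on the sensor or touch screen
    private val showThirdUi: () -> Unit,
    private val showSecondUi: () -> Unit
) {
    fun onFirstOperationMatched() {
        Timer().schedule(windowMs) {
            // If input persisted through the window, the user did not terminate the operation.
            if (isInputStillHeld()) showThirdUi() else showSecondUi()
        }
    }
}
```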
  • the processor 210 executes the code in the memory 230 and, when it determines that the first operation does not satisfy the unlocking condition, the terminal device keeps the screen off or displays the lock screen interface (that is, step S104 of the method embodiments is performed). The processor 210 executes the code in the memory 230 and, when it determines that the user did not terminate the operation on the terminal after inputting the first operation (for example, the sensor detects that the user's input operation has not been interrupted), displays the third UI through the display device (for example, the display screen 260) (that is, step S110 of the method embodiments is performed, together with the specific implementations of that step). The various presentation forms of the third UI in the method embodiments are represented by UI data, which is stored in the memory 230, read by the processor 210, and displayed by the display screen 260.
  • the processor 210 executes the code in the memory 230 and, upon receiving the second operation, determines that the currently displayed UI is the third UI and performs the function corresponding to the second operation (that is, step S112 of the method embodiments is performed, together with the implementations of that step). Specifically, when implementing some implementations of the method embodiments, the processor 210 executes the code in the memory 230 and, on receiving the second operation through the sensor, first determines that the currently displayed user interface is the third UI and then maps the trajectory of the second operation onto the third UI to determine which shortcut the user's second operation triggered. For example, for a touch point detected on the touch sensor 240, its coordinates on the touch sensor 240 are converted into coordinates of the display screen 260, so that the trajectory formed by the touch points is mapped onto the third UI. When implementing other implementations of the method embodiments, the processor 210 executes the code in the memory 230, receives the second operation through the sensor, and likewise first determines that the currently displayed interface is the third UI, but it may skip mapping the trajectory of the second operation and only needs to determine the shape the user traced on the touch sensor 252; for example, if the gesture is recognized as "S", it determines that the "calendar" application should be opened. Of course, the touch coordinates of the second operation may also be mapped onto the third UI first, and the gesture of the second operation determined afterwards.
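The trajectory mapping described here is essentially a scale conversion from touch-sensor coordinates to display coordinates so that the touch path can be compared with the shortcut layout on the third UI. A simplified sketch follows; the coordinate spaces and function names are assumptions for illustration.

```kotlin
// Illustrative only: convert a point reported by the touch sensor into display
// coordinates so the touch trajectory can be matched against the third UI layout.
data class Point(val x: Float, val y: Float)

fun sensorToDisplay(
    p: Point,
    sensorWidth: Float, sensorHeight: Float,    // touch sensor coordinate space
    displayWidth: Float, displayHeight: Float   // display screen coordinate space
): Point = Point(
    x = p.x / sensorWidth * displayWidth,
    y = p.y / sensorHeight * displayHeight
)

// Mapping a whole trajectory is just mapping each sampled touch point.
fun mapTrajectory(points: List<Point>, sw: Float, sh: Float, dw: Float, dh: Float): List<Point> =
    points.map { sensorToDisplay(it, sw, sh, dw, dh) }
```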
  • it should be noted that the processor 210 herein refers to a unit that can perform the calculation, processing, and control functions described above; it may be a single stand-alone device or may be composed of several separate devices. For example, in one specific implementation of the present invention, the processor 210 may include a coprocessor and a main processor. The coprocessor (also known as a sensor hub) is used to centrally control multiple sensors and process the data collected by those sensors. Under this architecture, the processing of the data collected by the sensors used for unlocking the terminal (such as the fingerprint sensor, iris sensor, face recognition sensor, and touch sensor mentioned above) can be completed by the coprocessor; for example, the coprocessor may compare the data collected for unlocking with the preset unlocking data, determine whether they match, and pass the result to the main processor, which performs other operations according to that result. Under this architecture, the main processor can sleep while the coprocessor works and does not need to be woken up frequently, which helps reduce the power consumption of the terminal. Of course, multiple cooperating processors may also be provided in the terminal according to other needs; the present invention is not limited in this respect.
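The division of labour between the sensor hub and the main processor can be sketched as: the coprocessor compares the captured unlock data against the enrolled template and forwards only a boolean verdict, so the main processor can stay asleep until it is needed. The interfaces and the byte-comparison placeholder below are assumptions for illustration; real biometric matching is far more involved.

```kotlin
// Illustrative only: the coprocessor (sensor hub) does the matching and hands the
// main processor just the result, so the main processor is not woken for raw data.
interface MainProcessor { fun onUnlockResult(matched: Boolean) }

class SensorHub(private val enrolledTemplate: ByteArray, private val main: MainProcessor) {
    fun onUnlockDataCaptured(sample: ByteArray) {
        val matched = matches(sample, enrolledTemplate)
        main.onUnlockResult(matched)     // wake the main processor only with the verdict
    }

    // Placeholder comparison standing in for a proper biometric matcher.
    private fun matches(sample: ByteArray, template: ByteArray): Boolean =
        sample.contentEquals(template)
}
```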
  • the sensor module 240 may include an air pressure sensor 240C, a magnetic sensor 240D, a temperature/humidity sensor 240J, an illuminance sensor 240K, a UV sensor 240M, and the like. These sensors can be used to measure environmental parameters, and the processor can control the mobile phone according to a preset strategy; for example, these parameter values can be displayed, or the working mode of the mobile phone can be set according to them. It should be noted that the mobile phone 200 may further include other sensors, such as a distance sensor and an RGB sensor, and details are not described here.
  • the interaction device 250 detects the user's operations on the terminal. It includes a keyboard (not shown in the figure) and also includes sensors that can detect user operations, such as a gyro sensor 256, which detects the magnitude of the mobile phone's acceleration in all directions (generally three axes) and, at rest, can detect the magnitude and direction of gravity; it can be used for applications that recognize the attitude of the phone (such as the tilt angle of the phone, landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The acceleration sensor 258 is configured to detect the acceleration of the mobile phone. These two sensors can sense the actions the user applies to the phone, such as landscape/portrait switching and shaking of the phone. The interaction device 250 also includes a fingerprint sensor 257, an iris sensor 259, a proximity sensor (not shown), and the like. The fingerprint sensor may be disposed separately from the display screen, that is, an off-screen fingerprint sensor, such as a fingerprint sensor disposed on the home button or a rear fingerprint sensor disposed on the rear case; it may also be an in-screen fingerprint sensor. The proximity sensor is a device capable of sensing the approach of an object; it can use the sensitivity of a displacement sensor to an approaching object to recognize the approach and output a corresponding switching signal. The distance sensor can be used to detect hover events.
  • the display screen 260 can include devices such as a display panel 262, a holographic device 264, and a projector 266. The display panel 262 can be used for UI display on the mobile phone, for example displaying a graphical user interface (GUI) that includes various controls or various application interfaces.
  • the touch sensor 252 and the display panel 262 may also be referred to as a touch display screen or touch screen. The touch screen can collect touch operations by the user on or near it (such as operations performed on or near the touch screen by the user with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. It can also be used to display information entered by the user or information provided to the user (such as images captured by the camera) as well as the various menus of the mobile phone. The touch screen can be implemented in various types, such as resistive, capacitive, infrared light sensing, and ultrasonic, which is not limited in the embodiments of the present invention. An operation performed by the user in the vicinity of the touch screen may be referred to as a hovering (floating) touch, and the touch operations in this document include such hovering touch operations. Touch screens capable of hovering touch can be implemented with capacitive, infrared light sensing, and ultrasonic technologies. For example, when a target such as a finger approaches or moves away from a capacitive touch screen, the self-capacitance and mutual-capacitance currents in the touch screen change accordingly, allowing the electronic device to detect the hover. As another example, an infrared light sensing touch screen can emit light using infrared LEDs and infrared light-emitting diodes, and the mobile phone recognizes and tracks hover gestures by detecting the screen light reflected back by a target such as the user's finger.
  • the camera module 291 can be used to capture images for taking photos, recording video, or scanning two-dimensional codes/barcodes, and can also be used for facial information recognition, user expression recognition, user head motion recognition, and the like.
  • the audio module 280 can include a speaker 282, a receiver 284, an earphone 286, a microphone 288, and the like, with which audio signals are captured or played.
  • the power management module 295 can include a battery 296, which is logically connected to the processor 210 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
  • in addition, the mobile phone 200 may further include function modules such as a user identification module, an indicator, and a motor, which are not described one by one here.
  • in the above embodiments, the implementation may be realized entirely or partly by software, hardware, firmware, or any combination thereof. When a software program is used, the implementation may appear entirely or partly in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transferred from one computer-readable storage medium to another; for example, the computer instructions may be transferred from one website, computer, server, or data center to another website, computer, server, or data center by wire (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wirelessly (for example, infrared, radio, or microwave). The computer-readable storage medium can be any available medium that a computer can access, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, hard disk, or magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).

Abstract

Embodiments of this application provide a method for displaying a user interface of a terminal, and a terminal. The method includes the following steps: when the screen is off or a first interface is displayed, the terminal receives a first operation input by a user for unlocking the terminal, the first interface being the lock screen interface of the terminal; when the terminal confirms that the first operation satisfies a preset unlocking condition and the user has not terminated operation of the terminal after inputting the first operation, a third user interface is displayed; and on the third user interface, the terminal receives a second operation input by the user and performs the function corresponding to the second operation. In this solution, when the terminal determines that the user has not terminated operation of the terminal after inputting the unlocking operation, it provides the third user interface; by inputting the second operation on this interface, the user can have the terminal directly perform the corresponding function. The user can therefore unlock the terminal and have it perform a given function with one continuous operation, which improves operation efficiency.

Description

一种终端的用户界面显示方法和终端 技术领域
本申请涉及用户界面技术领域,特别涉及一种终端的用户界面(User Interface,简称UI)显示方法和终端。
背景技术
现有技术中,如图1所示,为了防止用户误操作或者保护用户隐私,终端点亮屏幕之后一般会显示锁屏界面1,终端处于锁屏状态。若不解锁终端,用户不能进入解锁后的界面,以使用终端的完整功能。用户解锁之后才能使用终端的完整功能,例如使用终端的系统功能或者安装的应用程序(简称“应用”或者“APP”)。用户可以通过多种方式对终端进行解锁,比如滑动解锁,指纹解锁,虹膜解锁,人脸识别解锁等等。例如,参见图3A,终端具有后置指纹模块3。如图3B所示,在锁屏界面1,用户至少需要一个接触或按压后置指纹模块3的操作来进行解锁。如图3C-3D所示,解锁之后终端显示解锁后的界面,此时,用户至少还需要另一个操作来触发终端执行操作系统的某个功能。例如,点击解锁后的界面上的一个应用的图标,如微信,以打开该应用。
也有些终端会在锁屏界面上提供一些不涉及用户隐私和安全的应用的快捷入口,例如在锁屏界面上提供“相机”的快捷方式(例如,“相机”的图标),如图4所示,用户点击该快捷方式或者按住并沿预设方向拖动该快捷方式一段距离,即可打开“相机”应用,使用“相机”的拍摄功能。此时,用户虽然能使用“相机”应用,但终端操作系统并未解锁,不能使用该用户权限范围内的终端的完整功能。当退出通过这种快捷方式进入的“相机”应用后,终端仍然保持锁屏状态,例如,返回锁屏界面1,提示用户对终端进行解锁。
由此可见,在现有技术中,用户不能使用一个连续的操作既对终端屏幕解锁,又执行某一功能(例如,打开一个应用),用户的操作体验不好。
发明内容
本申请的实施例提供一种终端的用户界面显示方法和终端,用户可使用一个连贯的操作对终端进行解锁,且使终端执行某一功能,提升了操作效率。
为达到上述目的,本申请的实施例采用如下技术方案:
第一方面,本申请的实施例提供一种终端用户界面的显示方法,包括:
所述终端在熄屏或显示第一界面时,接收用户输入的用于解锁终端的第一操作;所述第一界面为终端的锁屏界面;
所述终端确认所述第一操作满足预置解锁条件,且用户输入所述第一操作后未终止对所述终端的操作时,显示第三用户界面;
所述终端在所述第三用户界面下,接收用户输入的第二操作,执行与所述第二操作对应的功能。
在本申请实施例中,终端在判断用户在输入解锁操作之后未终止对终端操作时, 提供了第三用户界面,在该界面下输入第二操作,终端即可直接执行与第二操作对应的功能。故,用户可以不中断对终端的操作,即可连续地输入第一操作和第二操作,即用户可使用一个连贯的操作对终端进行解锁,且使终端执行某一功能,提升了操作效率。
在一种可能的设计方法中,所述终端确认所述第一操作满足预置解锁条件具体为:
所述终端确认所述第一操作为在触摸传感器上检测到的滑动操作;或
所述终端确认所述第一操作为通过触摸传感器在所述解锁界面上画出图形的操作,且所述图形与预置图形相符;或
所述终端确认所述第一操作为通过指纹传感器输入指纹的操作,且输入的指纹与预置指纹匹配;或
所述终端确认所述第一操作为通过虹膜传感器输入虹膜的操作,且输入的虹膜与预置虹膜匹配;或
所述终端确认所述第一操作为通过人脸传感器输入人脸图像的操作,且输入的人脸图像与预置人脸匹配。
在一种可能的设计方法中,该方法还可包括:
若所述第一操作满足预置解锁条件,且用户输入所述第一操作后终止对所述终端的操作,所述终端显示第二用户界面;所述第二用户界面为所述终端解锁后的用户界面。
在一种可能的设计方法中,所述终端通过第一传感器接收所述第一操作,所述终端确认用户输入所述第一操作后未终止对所述终端的操作包括:
所述终端通过所述第一传感器接收到满足预置解锁条件的所述第一操作之后,检测到用户继续保持操作所述第一传感器。
在一种可能的设计方法中,所述第一传感器为指纹传感器,所述接收到满足预置解锁条件的第一操作具体为:
通过所述指纹传感器采集到用户输入的指纹,所述用户输入的指纹与预置的指纹匹配;
所述检测到用户继续保持操作所述第一传感器包括:
终端判断用户对所述指纹传感器按压或者触摸操作的持续时间是否大于第一阈值时长,若大于,且采集的指纹与预置的指纹相匹配,则确定用户输入满足预置的解锁条件的所述第一操作后,检测到用户继续保持操作所述第一传感器;或
终端判断用户在判断采集到的指纹和预存的指纹匹配后,用户对传感器的操作持续时间是否大于第二阈值时长,若大于,则确定用户在输入满足预置的解锁条件的所述第一操作后持续保持操作所述第一传感器。
在一种可能的设计方法中,所述终端通过第一传感器接收所述第一操作,并通过第二传感器接收用户输入的第三操作;
所述终端确认用户输入所述第一操作后未终止对所述终端的操作包括:
若终端检测到满足预置解锁条件的所述第一操作之前,检测到用户开始输入所述第三操作,并且所述第三操作在所述第一操作之后的持续时长大于第三阈值时长,则用户输入第一操作后,未终止对该终端的输入操作。
在一种可能的设计方法中,所述第一传感器为指纹传感器,所述第二传感器为触 摸传感器,所述终端检测到满足预置解锁条件的所述第一操作之前,检测到用户开始输入所述第三操作具体为:
所述终端通过所述指纹传感器检测到与预置指纹匹配的指纹之前,检测到用户通过所述触摸传感器输入的触摸操作;
所述第三操作在所述第一操作之后的持续时长大于第三阈值时长具体为:
所述终端通过所述指纹传感器检测到与预置指纹匹配的指纹之后,用户通过所述触摸传感器输入的触摸操作的持续时长超过所述第三阈值时长。
在一种可能的设计方法中,所述第三用户界面上显示有一个或多个快捷方式;所述一个或多个快捷方式中的每个与所述终端中一个功能相对应;
所述接收用户输入的第二操作,并执行与所述第二操作对应的功能具体为:
所述终端接收用户输入的第二操作,确定所述第二操作用于触发所述一个或多个快捷方式中的第一快捷方式;
执行与所述第一快捷方式对应的功能。
在一种可能的设计方法中,所述显示第三用户界面包括:
若所述终端确认所述第一操作满足预置解锁条件,且用户输入所述第一操作后未终止对所述终端的操作时,所述终端的触摸屏检测到用户的触摸操作,所述终端根据所述用户触摸操作在所述触摸屏上的触摸点的位置在所述第三界面上显示所述一个或多个快捷方式。
在一种可能的设计方法中,其特征在于,所述接收用户输入的第二操作,并执行与所述第二操作对应的功能具体为:
接收用户输入的第一手势,根据预置的手势与所述终端的功能的对应关系,确定所述用户输入的第一手势对应的第一功能;
执行所述第一功能。
在一种可能的设计方法中,所述第三用户界面上显示有映射信息,所述映射信息用于指示一个或多个功能所对应的手势。
在一种可能的设计方法中,其特征在于,所述第三用户界面叠加显示显示在第二用户界面之上,显示在所述第三用户界面下的所述第二用户界面不响应用户操作;所述第二用户界面为所述终端解锁后的用户界面。
在一种可能的设计方法中,所述终端执行与所述第二操作对应的功能之前,所述方法还包括:
所述终端确定从检测到所述第一操作到检测到所述第二操作的时段内,持续检测到用户对终端的输入操作。
在一种可能的设计方法中,所述第一传感器为指纹传感器,所述第二传感器为触摸传感器,所述第二操作和所述第三操作均通过所述触摸传感器输入。
在一种可能的设计方法中,所述执行与所述第二操作对应的功能之前,所述方法还包括:所述终端确定第三操作和第二操作期间,用户持续通过所述触摸传感器进行输入操作。
在一种可能的设计方法中,所述第一操作为用户通过屏内指纹传感器输入指纹的操作;所述屏内指纹传感器用于检测屏幕显示区域内用户输入的指纹;
所述确认所述第一操作满足预置解锁条件具体为:
所述终端确认用户通过所述指纹传感器输入的指纹与预置指纹相匹配;
所述终端确认用户输入所述第一操作后未终止对所述终端的操作具体为:
所述终端确认用户输入匹配的指纹后手指仍保持触摸所述终端的屏幕;
所述第二操作为用户在输入第一操作后,手指持续保持触摸所述屏幕的情况下输入的操作。
第二方面,本申请的实施例提供了一种终端,该终端包括:
交互设备,用于在所述终端处于熄屏状态或显示第一界面时,接收用户输入的用于解锁终端的第一操作;所述第一界面为终端的锁屏界面;
处理器,用于确认所述第一操作是否满足预置解锁条件,且用户输入所述第一操作后是否未终止对所述终端的操作;
显示设备,用于在所述处理器确认所述第一操作满足预置解锁条件,且用户输入所述第一操作后未终止对所述终端的操作时,显示第三用户界面;
所述交互设备还用于在所述终端显示所述第三用户界面时,接收用户输入的第二操作;
所述处理器还用于,控制所述终端执行所述第二操作对应的功能。
在一种可能的设计方法中,所述显示设备还用于,在所述处理器确认所述第一操作满足预置解锁条件,且用户输入所述第一操作后终止对所述终端的操作时,显示第二用户界面;所述第二用户界面为所述终端解锁后的用户界面。
在一种可能的设计方法中,所述交互设备包括:
第一传感器,用于接收所述第一操作;
所述处理器确认用户输入所述第一操作后未终止对所述终端的操作包括:
所述处理器确认所述第一传感器接收到满足预置解锁条件的所述第一操作之后,所述第一传感器继续检测到用户的操作。
在一种可能的设计方法中,所述第一传感器为指纹传感器,具体用于采集到用户输入的指纹;
所述处理器确认所述第一传感器接收到满足预置解锁条件的所述第一操作具体为:
所述处理器确认所述指纹传感器采集的指纹与预置指纹相匹配;
所述处理器确认所述第一传感器继续检测到用户的操作包括:
所述处理器判断用户对所述指纹传感器按压或者触摸操作的持续时间是否大于第一阈值时长,若大于,且采集的指纹与预置的指纹相匹配,则确定用户输入满足预置的解锁条件的所述第一操作后,检测到用户继续保持操作所述第一传感器;或
所述处理器判断用户在判断采集到的指纹和预存的指纹匹配后,用户对传感器的操作持续时间是否大于第二阈值时长,若大于,则确定用户在输入满足预置的解锁条件的所述第一操作后持续保持操作所述第一传感器。
在一种可能的设计方法中,所述交互设备包括:
第一传感器,用于接收所述第一操作;
第二传感器,用于接收用户输入的第三操作;
所述处理器确认用户输入所述第一操作后未终止对所述终端的操作包括:
所述处理器确认所述第一传感器接收到满足预置解锁条件的所述第一操作之前,通过所述第二传感器检测到用户开始输入所述第三操作,并且所述第三操作在所述第 一操作之后的持续时长大于第三阈值时长。
在一种可能的设计方法中,所述第一传感器为指纹传感器,所述第二传感器为触摸传感器,所述处理器确认所述第一传感器接收到满足预置解锁条件的所述第一操作之前,通过所述第二传感器检测到用户开始输入所述第三操作具体为:
所述处理器通过所述指纹传感器检测到与预置指纹匹配的指纹之前,检测到用户通过所述触摸传感器输入的触摸操作;
所述处理器确认所述第三操作在所述第一操作之后的持续时长大于第三阈值时长具体为:
所述处理器通过所述指纹传感器检测到与预置指纹匹配的指纹之后,检测到用户通过所述触摸传感器输入的触摸操作的持续时长超过所述第三阈值时长。
在一种可能的设计方法中,所述第三用户界面上显示有一个或多个快捷方式;所述一个或多个快捷方式中的每个与所述终端中一个功能相对应;
所述处理器控制所述终端执行与所述第二操作对应的功能具体为:
所述处理器确定所述第二操作用于触发所述一个或多个快捷方式中的第一快捷方式;
控制所述终端执行与所述第一快捷方式对应的功能。
在一种可能的设计方法中,所述显示设备显示第三用户界面包括:
若所述处理器确认所述第一操作满足预置解锁条件,且用户输入所述第一操作后未终止对所述终端的操作时,所述终端的触摸屏检测到用户的触摸操作,所述显示设备根据所述用户触摸操作在所述触摸屏上的触摸点在所述第三界面上显示所述一个或多个快捷方式。
在一种可能的设计方法中,所述第二操作为用户输入的第一手势操作;所述处理器控制所述终端执行所述第二操作对应的功能具体为:
所述处理器根据预置的手势与所述终端的功能的对应关系,确定所述用户输入的第一手势对应的第一功能;
控制所述终端执行所述第一功能。
在一种可能的设计方法中,所述第三用户界面上显示有映射信息,所述映射信息用于指示一个或多个功能所对应的手势。
在一种可能的设计方法中,所述处理器还用于,在控制所述终端执行与所述第二操作对应的功能之前,确定所述交互设备从检测到所述第一操作到检测到所述第二操作的时段内,持续检测到用户对终端的输入操作。
在一种可能的设计方法中,所述第一传感器为指纹传感器,所述第二传感器为触摸传感器,所述第二操作和所述第三操作均通过所述触摸传感器输入。
在一种可能的设计方法中,所述处理器还用于,在控制所述终端执行与所述第二操作对应的功能之前,确定第三操作和第二操作期间,用户持续通过所述触摸传感器进行输入操作。
在一种可能的设计方法中,所述显示设备为集成了终端的屏幕及触摸传感器的触摸屏;所述交互设备为屏内指纹传感器,用于检测屏幕显示区域内用户输入的指纹,所述第一操作为用户手指触摸或按压屏幕显示区域以输入指纹的操作;
所述处理器确认所述第一操作满足预置解锁条件具体为:
所述处理器确认用户通过所述屏内指纹传感器输入的指纹与预置指纹相匹配;
所述交互设备还包括触摸屏,用于检测用户的触摸操作;
所述处理器确认用户输入所述第一操作后未终止对所述终端的操作具体为:
所述处理器确认用户输入匹配的指纹后手指仍保持触摸所述触摸屏;
所述第二操作为用户在输入第一操作后,手指持续保持触摸所述屏幕的情况下输入的操作。
第三方面,本申请的实施例提供一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,其特征在于,当所述指令在终端上运行时,使得所述终端执行第一方面所述的界面展示方法。
第四方面,本申请的实施例提供一种包含指令的计算机程序产品,其特征在于,当所述计算机程序产品在终端上运行时,使得所述终端执行第一方面所述的界面展示方法。
附图说明
图1为现有技术中终端锁屏界面的示意图;
图2为本发明实施例提供的一种终端的结构示意图;
图3A为现有技术中终端后置指纹设置位置的示意图;
图3B为现有技术中通过后置指纹解锁终端的示意图;
图3C为现有技术中解锁终端后打开应用的一个示意图;
图3D为现有技术中打开应用之后的界面一个示意图;
图4为现有技术中用户操作终端打开应用的过程的一个示意图;
图5为本发明实施例提供的一种终端用户界面的显示方法的流程示意图;
图6A为本发明实施例提供的一种终端的快捷界面(第三UI)的界面示意图一;
图6B为本发明实施例提供的一种用户在第三UI界面上进行操的示意图一;
图6C为本发明实施例提供的根据图6B所示操作打开应用后终端显示的界面;
图7A为本发明实施例提供的一种终端的快捷访问界面(第三UI)的界面示意图二;
图7B为本发明实施例提供的一种终端的快捷访问界面(第三UI)的界面示意图三;
图7C为本发明实施例提供的一种终端的快捷访问界面(第三UI)的界面示意图四;
图8为本发明实施例提供的一种终端的第三UI的设置界面示意图;
图9A为本发明实施例提供的一种终端的快捷界面(第三UI)的界面示意图五;
图9B为本发明实施例提供的一种用户在第三UI界面上进行操的示意图二;
图9C为本发明实施例提供的根据图9B所示操作打开应用后终端显示的界面;
图10为本发明实施例提供的一种终端用户界面的显示方法的用户操作界面示意图一;
图11A为本发明实施例提供的一种用户在第三UI界面上进行操的示意图三;
图11B为本发明实施例提供的根据图11A所示的操作打开应用后终端显示的界面;
图11C为本发明实施例提供的一种用户在第三UI界面上进行操的示意图四;
图11D为本发明实施例提供的根据图11C所示的操作打开应用后终端显示的界面;
图12A为本发明实施例提供的用户在终端锁屏界面上使用后置指纹进行解锁的示意图;
图12B为本发明实施例提供的在图12A所示的操作下,终端展示的第三UI;
图12C为本发明实施例提供的用户在图12B所示的UI上进行操作的示意图;
图12D为本发明实施例提供的根据图12C所示的操作打开应用后终端显示的界面。
具体实施方式
本文中,“第一”、“第二”的表述仅用于指代目的,而不表示其指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本申请实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。
在本发明实例提供的终端界面的显示方法中,终端判断用户进行了用于解锁终端的操作之后,如果用户并未终止对终端的操作,例如,用户通过指纹解锁之后,并未立即抬起手指,则给用户提供终端功能的快捷入口,终端通过无需中断的连贯操作即可对终端进行解锁并执行终端的功能(例如,打开终端的应用),提高了操作效率,用户体验好。
本文中,执行终端的一个功能,包括:打开一个应用,打开一个文件,打开一个文件夹,或者打开某一应用,并使用打开的应用执行该应用的一个功能。例如,打开“相机”应用进行拍照、打开“录音”应用并进行录音、打开“音乐”应用播放音乐、打开天气应用显示天气、打开“备忘录”并新建备忘录、打开“电话”应用拨打预设联系人电话、打开“短信”/“微信”应用并打开给预设联系人发送短信/微信消息的信息编辑窗口、打开“地图”应用并导航到预设位置、打开“浏览器”应用并打开预设网页等。当然,执行终端的一个功能,也可以是执行系统功能,例如切换显示的页面,更改显示/情景/省电模式,将系统切换到正常/游戏模式等。
如图5所示,本发明实施例中的一种终端用户界面的显示方法包括:
步骤S100,终端在熄屏或显示锁屏界面时,接收用户输入的用于解锁终端的第一操作。
在本发明的一个实施例中,在终端熄屏的情况下,或者在终端的锁屏界面下(本文称其为第一用户界面),用户通过第一操作来解锁终端的操作系统。为防止误操作或保护用户隐私及数据安全,用户一般会设置终端的锁定方式及解锁方式。例如,可设置终端在熄屏时锁定。再次亮屏时,终端处于锁屏状态,用户不能进入除锁屏界面外的其他界面,从而不能使用终端的完整功能。此时,需要按照预设的解锁方式对终端进行解锁,以使终端跳过锁屏界面,从而使用该用户的权限内的终端的完整功能。
例如,用户可以在终端熄屏的情况下,通过手指按压或触摸指纹传感器来点亮终端屏幕并解锁终端;此时,终端可以在指纹传感器采集到与预置指纹匹配的指纹的情况下,点亮屏幕并解锁终端,也可以在检测到指纹传感器被触摸时点亮屏幕,并在采 集到与预置指纹匹配的指纹的情况下解锁终端;或者,用户可以在终端的锁屏界面下,通过手指按压或者触摸指纹传感器来解锁终端;此时,终端可以在指纹传感器采集到匹配的指纹的情况下,解锁终端。
在本发明的实施例中,指纹传感器可以是在屏幕区域外采集指纹的屏外指纹传感器(如设置在HOME键上的指纹传感器,再例如,设置在终端背面的后置指纹传感器),也可以是在屏幕区域内采集指纹的屏内指纹传感器,例如,设置在屏幕下方的指纹传感器,或与触摸屏集成在一起的指纹传感器,或设置在屏幕的上下两侧或左右两侧,实现在屏幕区域内采集指纹的传感器。当指纹传感器设置在屏幕下方时,一般位置和大小固定,当需要采集用户指纹时,屏幕上与指纹传感器对应的区域一般会显示图标(例如,指纹形状的图标)进行提示。
当然,用户也可以使用其他传感器的输入来实现解锁功能。例如,还可使用如下传感器来实现第一操作的输入:
1、通过触摸传感器(也可称为触敏传感器,如触摸屏,触摸板)来识别用户的触摸操作来解锁,如滑动解锁,或者图形解锁;
2、通过虹膜传感器识别用户的输入虹膜的操作,终端在识别到与预置的虹膜匹配的虹膜时,解锁终端;
3、通过图像传感器来识别用户的输入人脸图像的操作,终端在识别到与预置的人脸匹配的人脸图像时,解锁终端。
当然,用户还可以使用终端的其他传感器来实现第一操作的输入,本发明不做限定。
步骤S102,终端判断用户输入的第一操作是否满足预置的解锁条件。若满足,则执行步骤S106;否则,执行步骤S104。
与步骤S100中第一操作的多种实现方式相应,第一操作满足终端预置的解锁条件匹配可以包括:
1、预置解锁条件为检测到滑动操作,用户输入的第一操作为在屏幕上的滑动操作;
2、预置解锁条件为检测到解锁图形输入,用户输入的第一操作为在屏幕上划出预置的解锁图形;
3、预置的解锁条件为检测到与预置指纹匹配的指纹,用户输入的第一操作为在指纹传感器上按下相应的手指,即通过指纹传感器输入匹配的指纹;
4、预置的解锁条件为检测到与预置虹膜匹配的虹膜,用户输入的第一操作为通过虹膜传感器输入虹膜的操作;
5、预置的解锁条件为检测到与预置人脸匹配的人脸图像,用户输入的第一操作为通过人脸传感器输入人脸图像的操作。人脸传感器包括摄像头、结构光传感器等,本发明不做限定。
步骤S104,终端保持熄屏或显示锁屏界面。
在用户输入的第一操作与预置的解锁条件不一致的情况下,若终端在输入第一操作之前处于熄屏状态,则终端可以保持熄屏或显示锁屏界面;若终端在输入第一操作之前处于亮屏状态并显示锁屏界面,则仍显示锁屏界面,该锁屏界面可以与第一界面 一致,也可与第一界面不一致,例如,该锁屏界面可以是在第一界面上叠加显示提示信息,以提示用户输入错误,解锁失败。
步骤106,判断用户输入第一操作后是否终止对终端的操作,若是,执行步骤S108;否则,执行步骤S110。
在本发明的一个实施例中,用户使用第一传感器进行第一操作,用户在终端解锁之后仍保持操作该第一传感器,则终端判断用户使用第一传感器输入第一操作后,未终止对该终端的操作。
下面以第一传感器可以为指纹传感器为例进行说明,第一传感器也可以是其他可检测用户输入的传感器,本发明不做限定。
用户通过按压或者接触指纹传感器对终端进行解锁之后,手指继续保持按压或者触摸指纹传感器,则用户未终止对该终端的操作。此时,终端至少可以通过以下两种方式判断终端解锁后用户是否继续保持操作指纹传感器:
方式1:终端判断用户对指纹传感器按压或者触摸操作的持续时间是否大于第一阈值时长,若大于,且采集的指纹与预置的指纹相匹配,则确定用户输入第一操作后持续保持操作传感器,未终止对该终端的操作,此时终端执行步骤S110;若小于,则确定用户在解锁终端后,终止了对终端的操作,此时终端执行步骤S108。
方式2:终端判断用户在判断采集到的指纹和预存的指纹匹配后,用户对传感器的操作持续时间是否大于第二阈值时长,若大于,则确定用户在输入第一操作后持续保持操作传感器,未终止对该终端的输入操作,此时终端执行步骤S110;若小于,则确定用户在输入第一操作后,终止了对终端的输入操作,此时终端执行步骤S108。
当然,终端可以采用在用户持续操作传感器过程中的其他时间点作为起点来对持续操作时间计时,并基于计时时长来判断用户是否在终端解锁之后,持续保持操作传感器。本发明不做限定。
在本发明的另一实施例中,用户可使用第一传感器进行第一操作,并使用第二传感器进行第三操作,若第三操作在终端检测到满足预置解锁条件的第一操作之前开始,并在终端检测到满足预置解锁条件的第一操作之后的持续时长大于第三阈值时长,则用户输入第一操作后,未终止对该终端的输入操作。
举例而言,第一传感器可以为虹膜传感器,第二传感器可以为触摸传感器:
在终端展示第一用户界面时,用户输入第一操作可以是用户通过虹膜传感器输入用户的虹膜,例如,终端通过前置的虹膜采集摄像头采集人眼虹膜。在终端判断用户输入的虹膜与预置的虹膜匹配并解锁终端前,终端通过触摸传感器检测到用户的触摸操作(即第三操作,例如可以是,用户触摸终端的触摸屏的操作),此时,终端判断该触摸操作在终端确定用户输入的虹膜与预置的虹膜匹配之后的持续时长是否大于第三阈值时长,若大于,则终端判断用户在输入第一操作后,未终止对该终端的输入操作,此时终端执行步骤S110;若小于,则确定用户在输入第一操作后,终止了对终端的输入操作,此时终端执行步骤S108。
在另一实施方式中,第一传感器可以为实现在屏幕区域内进行指纹采集的屏内指纹传感器,例如,设置在屏幕下指纹传感器,或集成在触摸屏内指纹传感器,或设置在屏幕上下或左右两端,以实现屏幕区域内采集指纹的传感器。第二传感器可以为触 摸传感器。
在终端展示第一用户界面时,用户输入第一操作可以是用户用手指按压或触摸屏进行指纹输入,终端通过指纹传感器采集到匹配的指纹之后,利用触摸传感器(本例中为触摸屏)判断用户是否在第三预置时长内仍保持触摸,即判断是否未抬起手指,若未抬起手指,终端执行步骤S110;若抬起了手指,终端执行步骤S108。
本领域技术人员通过以上举例可知,本发明并不限定第一传感器和第二传感器的具体形态,只要第一传感器可以采集用于解锁的用户的第一操作,第二传感器可用于采集第三操作,并且终端可根据第一传感器和第二传感器采集到信号判断用户输入第一操作后是否在一预设时长(如,第三阈值时长)内还在对终端进行输入操作即可。
步骤S108,终端显示第二用户界面,该第二用户界面为解锁后的用户界面,简称解锁界面。进入解锁界面后,用户无需再进行解锁操作,就能使用终端的完整功能。
若用户通过第一操作解锁终端后,终止对终端的操作(例如,手指离开指纹传感器或者触摸屏),终端展示第二用户界面,第二UI与传统技术方案中,用户对终端解锁之后的界面相同。例如,如图3C所示,第二用户界面可以为终端开机之后显示的操作系统的主界面3,例如,终端的主界面3可以由状态栏31,页面32,以及常驻栏(也称“Dock栏”)33组成。其中,页面32用于显示APP图标,小组件(widget),文件图标,文件夹图标或其他用于触发执行终端功能的快捷方式,常驻栏33用于显示用户常用的应用或功能的图标,例如“联系人”,“电话本”,“相机”等APP的图标。通常,状态栏31,页面32和常驻栏33分列用户界面的上、中、下部分,页面可以有多个,可以通过用户的操作切换显示的页面32,页面切换过程中,状态栏31和常驻栏33并不发生切换。
第二UI也可以是本次解锁终端的前一次锁定终端时,终端展示的用户界面。例如,终端打开了“相册”应用,并使用该应用浏览A图片,此时,用户通过电源键灭屏,终端锁定,用户再次解锁终端时终端展示的界面,就是使用“相册”浏览A图片的界面。
以上描述为第二用户界面的一个实例,也可包括其他形态的界面。
步骤S110,终端显示第三UI。所述第三UI提供终端的功能的快捷入口。
若用户输入用于解锁终端的第一操作后,未终止对该终端的操作,终端展示第三UI。
也即,终端接收到用户输入的第一操作(该第一操作满足终端预置的解锁条件)时,检测到用户未终止对该终端的操作,则展示第三UI。
如图6A所示,第三UI可以是以第二UI为背景,在第二UI层上叠加显示第三UI的内容(也可称为界面元素),两层界面可以进行适当的透明或虚化处理,以达到较好的UI效果,同时,进入第三UI后,第二UI内容虽然仍有显示,但不再响应第二UI所定义的手势操作,只响应第三UI所定义的手势。也即,在显示第三UI时,终端检测到的用户输入操作仅对第三UI起作用,并不对第二UI起作用。例如,在两层UI的显示界面上,若用户通过触摸屏输入用户操作,且该操作作用于第三UI的图标A,且在图标A的坐标处,下层虚化显示的第二UI上有图标A’,则该操作仅对图 标A起作用,对图标A’不起作用。
当然,第三UI呈现的时候也可以不以第二UI作为背景,而仅仅呈现第三UI的内容。
第三UI至少有以下两种方式提供终端的功能的快捷入口:
方式1、在第三UI上提供终端的功能的快捷方式,当快捷方式被用户的操作触发时,执行快捷方式对应的功能;
方式2、第三UI设置有快捷手势和终端的功能的对应关系,当在第三UI上检测到预设的快捷手势时,执行该快捷手势对应的功能。下面对这两种方式分别进行说明。
在方式1中,如图6A,图7A及图7B所示,第三UI上呈现的内容可以是根据手指按压位置来显示快捷方式,例如,可以以手指按压位置为中心,围绕分布多个常用应用程序或功能的图标,或其他快捷方式,终端可以设置功能的快捷方式的分布情况,如,快捷方式的数量、快捷方式与手指按压位置之间的距离、快捷方式的排列顺序、快捷方式显示的透明度等。应用程序分布可以围绕手指触摸位置呈圆形(包括椭圆机及不规则圆形)分布,也可在手指上方或下方呈扇形分布,也可以其他形式分布,本发明实施例不做限定。
快捷方式是指,当用户触发这些快捷方式时,可以执行相应的功能。终端可以在UI上显示这些快捷方式,以提示用户以预设的操作方式去触发这些快捷方式,当终端检测到预设的操作,并且根据快捷方式在UI上的布局和用户的具体操作,确定用户具体操作与显示的快捷方式对应性,终端确定该具体操作对应的快捷方式,进而执行相应的功能。例如,快捷方式包括页面上的图标和widget,当用户点击这些图标或widget,可以打开相应的应用,文件或文件夹,或者打开相应的应用并执行应用的某一功能。
如图7A-7C所示,当手指按压位置位于或接近触摸屏边缘位置时,应用程序的快捷方式如果按照圆形排列可能导致部分快捷方式显示不完整或者完全显示不出来,终端可根据手指按压的位置对快捷方式的显示方式及数量进行调整,以扇形或者其他呈现形式以确保应用的快捷方式能够完整显示。
应用程序快捷方式在第三UI上的显示方式可包括以下之一:
1、固定顺序排列,且由用户自定义在第三UI上显示的快捷方式及其排序;
2、终端根据用户使用应用的频率,确定使用频率最高的N个应用的快捷方式显示在第三UI上,并按照频率高低进行排列;
3、终端根据用户最近使用应用的时间,确定距显示第三UI的时间最近的用户曾经使用的N个应用的快捷方式显示在第三UI上,并按照时间顺序进行排列;
4、组合排列方式,用户可按方式1自定义X个应用的快捷方式,其他未固定N-X个快捷方式按照方式2或方式3确定。以上,X和N为自然数,X小于N。
在第三UI以方式2提供终端的功能的快捷入口的情况下,终端设置有第三UI上的快捷手势和终端的功能的对应关系,如“S”,“C”和“O”形手势图案分别对应“日历”应用,“关闭所有运行的应用”功能,和“播放音乐”功能。当用户在第三UI上输入“S”,“C”或“O”形等手势图案时,终端打开“日历”应用,关闭所有运行的应用,或者播放音乐。
用户可以设置在第三UI上的手势与功能的对应关系。用户可以进入第三UI的设置界面,在设置界面选择需要在第三UI上快捷执行的功能,并在该设置界面手势输入 区输入对应的手势。第三UI的设置界面也可以包括两个子设置界面,其中选择子界面用于选择需要在第三UI上显示的功能(包括应用),手势子界面用于设置选择的快捷方式对应的手势。如图8所示,第三UI的设置界面也可以仅提供预置的手势让用户选择,而不是记录用户输入的手势。预置的手势也可以用字母形状来表达,例如用字母C代表“C”形的输入手势。当用户在第三UI界面用手指在触摸屏上划出“C”形时,终端对手势进行类似于笔画输入法的识别,识别得到字母C,然后根据设定的字母C和功能的对应关系,执行字母C对应的功能。
在方式2中,第三UI上可以展示映射信息,所述映射信息用于指示一个或多个者功能所对应的手势,即,展示用户设置的手势图案与功能的对应关系信息。例如,如图9所示,将用户设置好的“微信”应用对应的“S”手势并排展示。
当然,在方式2中,第三UI也可以不展示用户设置的手势图案与功能的对应关系信息,用户可以凭记忆或自己的设定知道在第三UI上输入何种手势可以触发终端执行何种功能。
步骤S112,终端在展示第三UI时,接收用户输入的第二操作,执行与所述第二操作对应的功能。
在本发明的一个实施例中,用户在终端展示所述第三UI时,输入第二操作,触发终端执行终端的预设功能。用户从输入第一操作到输入第二操作,未终止过对终端的输入操作。换言之,终端从检测到第一操作到检测到第二操作的时段内,持续检测到用户对终端的输入操作。
在该实施例中,用户从输入用于解锁的第一操作到执行终端的一个功能,例如,打开一个应用,可以用用于输入第二操作的一个连贯动作来完成,提高了操作效率,用户体验好。
在本发明的另一个实施例中,用户也可以在终端展示第三UI后,终止了对终端的输入之后,再继续输入第二操作。
仍以第一传感器为屏内指纹传感器,第二传感器为触摸屏为例进行说明。终端使用指纹传感器检测到用户输入的用于解锁终端的第一操作,并经过步骤S102和S106的判断,显示第三UI时,触摸屏继续检测用户的输入操作,当检测到预设的第二操作时,执行与预设的第二操作对应的功能。
下面结合上述实施例中的第三UI,对第二操作进行示例性说明。在第三UI以上述方式1提供终端的功能的快捷入口时,第二操作可以是以下的任一种方式:
1、手指以触摸屏上的按压或者触摸位置为起点进行滑动,当手指移动到第三UI上的快捷方式上并抬起手指;例如,如图6A-6C所示,用户将手指移动到“微信”图标上并抬起手指,手机打开“微信”应用。
2、手指以触摸屏上的按压或者触摸位置为起点,朝向第三UI上的某一个快捷方式进行拂动(flick)的操作;其中,拂动操作由触摸移动和加速移动组成,类似人用手指拂去桌面上的灰尘的动作;终端判断拂动操作的方向,以确定该操作触发的快捷方式。
3、一个手指保持触摸或者按压触摸屏,另一手指点击功能(包括应用)的快捷方式;例如,如图10所示,右手拇指保持按压触摸屏,左手拇指点击“微信”图标,打 开“微信”应用。
4、手指按压触摸屏,终端根据按压的力量的不同,确定不同的快捷方式。在此方式中,触摸屏可检测用户的压力,根据压力的不同,终端会将与该压力对应的快捷方式显著性显示,如图标悬浮,闪动,右上角打勾,或者加框等,以提示用户该快捷方式对应于此时的压力。当用户抬起手指,则执行用户使用压力按压操作选中的快捷方式对应的功能。例如,如图11A所示,当用户用较轻力度按压时,“浏览器”图标加框,代表其被选中,若用户此时抬起手指,则打开“浏览器”应用(如图11B所示)。若“浏览器”图标被加框,指示其被选中时,用户并未抬手,而是加重力度,则如11C所示,“微信”图标被加框,指示其被选中,用户此时抬起手指,则打开“微信”应用(如图11D所示)。
通过以上操作,终端将执行该快捷方式对应的功能。当然,本发明的实施例并不限定以上操作,终端可以预设其他的类型的操作方式,并以该类型的操作方式的不同操作指向不同的功能的快捷方式,从而确定执行对应的功能。也即,用户通过第二操作触发了对应的快捷方式。
在以上实施方式中,用户通过一个触摸或者按压触摸屏的解锁操作,终端在判断用户未终止操作时,呈现提供终端功能的快捷入口的UI,用户可以在手指不离开触摸屏的情况下,继续通过在触摸屏上的操作触发终端执行终端的功能,提高了操作的效率,提升了用户体验。
在本发明的一个实施例中,终端也可以使用设置在终端的屏幕外的指纹传感器来检测用户输入的第一操作和第二操作。例如,可以使用设置在终端背面(相对于屏幕而言的背面)的指纹传感器,来检测第一操作和第二操作。如图12所示,在如图12A所示的锁屏界面上检测到匹配的指纹输入,并且用户未抬起手指,则如图12B所示,展示第三UI。在第三UI展示快捷方式时,后置指纹传感器可以检测到用户手指的移动方向,例如从上到下,从左到右等。在该实施例中,当终端展现第三UI时,在第三UI上展示一个参考点(例如,闪烁的圆点或光标,如图12C所示),根据该参考点相对于快捷方式的位置关系,以及第二操作的方向,确定第二操作触发的快捷方式。例如,第三UI展示的快捷方式有四个,分别为位于该参考点上方的快捷方式A(例如,“Facebook”),位于该参考点下方的快捷方式B(例如,“微信”),位于该参考点左边的快捷方式C(例如,“浏览器”),和位于该参考点右边的快捷方式D(例如,“游戏”)。若终端通过后置指纹传感器检测到用户的第二操作为在传感器上从上往下的滑动操作,则确定触发的是快捷方式B,如图12D所示,打开“微信”应用。若终端通过后置指纹传感器检测到用户的第二操作为在传感器上从左往右的滑动操作,则确定触发的是快捷方式D,从而打开“游戏”应用。
在前述的一个实施方式中,用户使用第一传感器进行第一操作,并使用第二传感器进行第三操作,若第三操作在终端检测到满足预置解锁条件的第一操作之前开始,并在终端检测到满足预置解锁条件的第一操作之后的持续时长大于第三阈值时长,则确定用户输入第一操作后,未终止对该终端的输入操作,此时终端显示第三UI。在该实施方式中,第二操作和第三操作可以均由第二传感器输入,例如,该第二传感器是触摸传感器。用户通过触摸传感器可以连续地输入第三操作和第二操作,用户在解锁前后通过一个连贯的触摸动作即可打开终端的一个特定功能,操作效率高,用户体验 好。
在第三UI以上述方式2提供终端的功能的快捷入口时,第二操作可以是用户在第三UI上划出提示的手势图案,终端识别出用户的手势,并执行该手势对应的功能。例如,如图9B所示,若手势图案“S”对应“微信”应用,用户在终端呈现第三UI时,在触摸屏上用手指划出“S”形轨迹(如图9B所示),终端打开“日历”(如图9C所示)。
需要说明的是,在本发明的一个实施方式中,在显示第三UI后,若未检测到用户的第二操作,则可以直接解锁终端,显示解锁后的界面,例如,显示第二UI。举例而言,在第三UI以上述方式1提供终端的功能的快捷入口时,用户未能输入用于触发某个快捷方式的第二操作,比如,用户在显示第三UI后,直接终止了对终端的操作(例如,手指离开触摸屏),或者显示第三UI后的操作不符合终端的预定设置,例如,在一个实施例中,预定设置的第二操作为手指移动到快捷方式上并抬手,但终端仅检测到用户手指的移动,该移动并未到达第三UI上显示的快捷方式,此时,终端可以对屏幕进行解锁,显示第二UI。
又例如,在第三UI以上述方式2提供终端的功能的快捷入口时,终端检测到用户在第三UI上直接终止了对终端的操作,或者未检测到预定的手势时,终端可以对屏幕进行解锁,显示第二UI。
本申请实施例提供的用户界面显示方法可以应用于手机、平板电脑、可穿戴设备、车载设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)等任意可以显示界面的终端设备上,本申请实施例对此不作任何限制。
以本申请实施例中的终端为手机为例,对手机的通用硬件架构进行说明。如图2所示,手机200可以包括:通信模块220、存储器230、传感器模块240、交互设备250、显示屏260、音频模块280、处理器210、相机模块291、电源管理模块295等。这些部件之间可以以总线连接,也可以直连连接。本领域技术人员可以理解,图2中示出的手机结构并不构成对手机的限定,可以包括比图2更多的部件,或者组合某些部件,或者不同的部件布置。
其中,通信模块220用于与其它网络实体进行通信,例如从服务器接收信息或向服务器发送相关数据等。通信模块220可以包括射频(radio frequency,RF)模块229、蜂窝模块221、无线保真(wireless fidelity,WIFI)模块223、以及GPS模块227等。RF模块229可以用于收发信息或通话过程中,信号的接收和发送,特别地,将接收到的信息给处理器210处理;另外,将处理器210生成的信号发送出去。通常,RF电路21可以包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器(low noise amplifier,LNA)、双工器等。此外,RF电路21还可以通过无线通信与网络和其他设备通信。蜂窝模块221和WIFI模块223可以用于连接网络。GPS模块可以用于进行定位或导航。
处理器210是手机200的控制中心,利用各种接口和线路连接整个手机的各个部分,通过运行或执行存储在存储器230内的软件程序和/或模块,以及调用存储在存储器230内的数据,执行手机200的各种功能和处理数据,从而对手机200进行整体监控。在具体实 现中,作为一种实施例,处理器210可包括一个或多个处理单元;处理器210可集成应用处理器和调制解调处理器。其中,应用处理器主要处理操作系统、图形用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器210中。
存储器230可用于存储数据、软件程序以及模块,可以是易失性存储器(volatile memory),例如随机存取存储器(random-access memory,RAM);或者非易失性存储器(non-volatile memory),例如只读存储器(read-only memory,ROM),快闪存储器(flash memory),硬盘(hard disk drive,HDD)或固态硬盘(solid-state drive,SSD);或者上述种类的存储器的组合。具体的,存储器230内可存储程序代码,该程序代码用于使处理器210通过执行该程序代码,执行本申请实施例提供的用户界面方法。存储器230可以包括内部存储器232和外部存储器234。
本发明的实施例中,处理器210执行存储器230中的代码,判断通过传感器接收的第一操作是否满足预置的解锁条件(即,执行方法实施例部分的步骤S102,以及实现该步骤的各种具体实施方式),并在判断第一操作满足解锁条件的情况下,继续判断通过传感器接收的信号来判断用户输入第一操作之后是否终止了对终端的操作(即执行方法实施例部分的步骤S106,以及实现该步骤的各种具体实施方式);处理器210在实现步骤S106时,可设定计时器,来判断用户输入第一操作之后是否终止对终端的输入操作。对于终端而言,如果检测到了满足解锁条件的第一操作之后,开启该定时器,并在定时器超时之前,检测到用户终止了对终端的输入操作,例如,在定时器超时之前未持续检测到触摸信号,则意味着用户在输入第一操作之后,终止了对终端的输入操作。当然,处理器210可在其他时机设置定时器来判断用户是否在输入第一操作之后,终止了对终端的输入操作,以实现方法实施例部分的各种判断方式。此处不赘述。
处理器210执行存储器230中的代码,判断第一操作不满足解锁条件的情况下,终端设备保持熄屏或者显示锁屏界面(即执行方法实施例部分的步骤S104)。
处理器210执行存储器中230中的代码,判断用户输入第一操作后未终止对终端的操作时,例如通过传感器检测到用户的输入操作未中断过时,处理器通过显示设备(例如,显示屏260)显示第三UI(即,执行方法实施例部分的步骤S110,以及实现该步骤的各种具体实施方式)。方法实施例中第三UI的各种展现形式由UI数据来体现,UI数据存储在存储器230中,由处理器210读取出并交由显示屏260进行显示。
处理器210执行存储器中230中的代码,在接收到第二操作时,确定当前显示的UI为第三UI,执行与所述第二操作对应的功能(即,执行方法实施例部分的步骤S112,以及实现该步骤的各种具体实施方式)。具体而言,在实现方法实施例中的有些实现方式时,处理器210执行存储器中230中的代码,通过传感器接收到第二操作,首先确定当前显示的用户界面为第三UI,然后将第二操作的轨迹映射为在第三UI上的轨迹,以确定用户的第二操作触发了哪个快捷方式。例如,在触摸传感器240上检测到的触摸点,其在触摸传感器240上的坐标会转化成显示屏260的坐标,从而将触摸点形成的轨迹映射到第三UI上。在实现方法实施例中的有些实现方式时,处理器210执行存储器中230中的代码,通过传感器接收到第二操作,首先也确定当前显示的界面为第三UI,但可以不进行第二操作轨迹的映射,只需要判断出用户在触摸传感器252上触摸的形状即可,例如判断为手势“S”,则确定打开“日历”应用。当然也可以将第二操作的触摸坐标映射到第三UI上,再判断 第二操作是何种手势。
需要说明的是,本文中的处理器210,是指可以完成上述计算,处理和控制功能的单元,其可以是一个独立的器件,也可以由几个分离的器件构成。例如,在本发明的一个具体实施方式中,处理器210可以包括一个协处理器,一个主处理器。协处理器(也可称为sensor hub)用于集中控制多个传感器并处理该多个传感器采集的数据。在此架构下,用于解锁终端的传感器(如前文提到的指纹传感器,虹膜传感器,人脸识别传感器,触摸传感器等)所采集的数据的处理的功能可由协处理器完成,例如,协处理器可以执行将传感器采集到的用于解锁的数据与预置的解锁数据进行比较,确定是否匹配的操作,将操作结果传递给主处理器,主处理器根据该结果完成其他操作。在此架构下,协处理器工作时,主处理器可以休眠,不用频繁唤醒主处理器,有利于节省终端的功耗。
当然,也可以根据其他需要,在终端中设置多个协同工作的处理器,本发明不做限定。
传感器模块240可以包括气压传感器240C、磁传感器240D、温度/湿度传感器240J,照度传感器240K和UV传感器240M等。这些传感器可用于测量环境参数,供处理器按照一定预置的策略对手机进行控制,例如,可以显示这些参数值,也可以根据这些参数值来设置手机的工作模式。需要说明的是,手机200还可以包括其它传感器,比如距离传感器、RGB传感器等,在此不再赘述。
交互设备250检测到用户对终端的操作,包括键盘(图中未示出),也包括可以检测用户操作的传感器,例如陀螺仪传感器256,用于检测手机在各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别手机姿态的应用(比如手机倾斜的角度、横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等。加速度传感器258,用于检测手机的加速度。这两个传感器可以感知用户施加在手机上的动作,例如横竖屏切换,手机的甩动等。交互设备250还包括指纹传感器257,虹膜传感器259和接近传感器(图中未示出)等。其中,指纹传感器可以是与显示屏分离设置,即屏外指纹传感器,例如设置在home键上的指纹传感器或设置在后壳上的后置指纹传感器,也可以是屏内指纹传感器;接近传感器是一种具有感知物体接近能力的器件,它可以利用位移传感器对接近的物体具有敏感特性来识别物体的接近,并输出相应开关信号。距离传感器可以用于检测悬浮事件。
显示屏260可以包括显示面板262、全息设备264、以及投影仪266等设备。其中,显示面板262可以用于在手机上进行UI显示,例如,进行图形用户界面(graphical user interface,GUI)显示,图形用户界面上包括各种控件或各种应用界面等。
其中,触摸传感器252和显示面板262还可以称为触摸显示屏幕或触控屏,触控屏幕可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控屏幕上或在触控屏幕附近的操作),并根据预先设定的程式驱动相应的连接装置。还可用于显示由用户输入的信息或提供给用户的信息(如通过摄像头采集到的图像)以及手机的各种菜单。例如,可以采用电阻式、电容式、红外光感以及超声波等多种类型实现触控屏幕,本发明实施例对此不进行限定。其中,用户在触控屏幕附近的操作可以称之为悬浮触控,本文中的触摸操作,包括这种悬浮触控操作。能够进行悬浮触控的触控屏可以采用电容式、红外光感以及超声波等实现。本发明实施例中的触摸操作也包括这种悬浮触控的操作。
例如,当手指等目标靠近或远离电容式触控屏时,触控屏中的自电容和互电容的电流 会随之发生变化,从而使得电子设备可以检测到悬浮测控。再例如,红外光感触控屏可以利用红外LED和红外发光二极管发射光线,手机通过检测用户手指等目标反射回的屏幕光线,实现悬浮手势的识别和追踪。
相机模块291,可以用于采集图像从而进行拍照、录制视频或扫描二维码/条码等,还可以用于进行面部信息识别、用户表情识别、用户头部动作识别等。
音频模块280可以包括扬声器282、接收机284、耳机286或麦克风288等,用户采集或播放音频信号。
电源管理模块295,可以包括电池296,用于通过电源管理系统与处理器210逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。
此外,手机200还可以包括用户识别模块、指示器、电机等功能模块,在此不再一一赘述。
在上述实施例中,可以全部或部分的通过软件,硬件,固件或者其任意组合来实现。当使用软件程序实现时,可以全部或部分地以计算机程序产品的形式出现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。该可用介质可以是磁性介质,(例如,软盘,硬盘、磁带)、光介质(例如,DVD)或者半导体介质(例如固态硬盘Solid State Disk(SSD))等。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (32)

  1. 一种终端用户界面的显示方法,其特征在于,包括:
    所述终端在熄屏或显示第一界面时,接收用户输入的用于解锁终端的第一操作;所述第一界面为终端的锁屏界面;
    所述终端确认所述第一操作满足预置解锁条件,且用户输入所述第一操作后未终止对所述终端的操作时,显示第三用户界面;
    所述终端在所述第三用户界面下,接收用户输入的第二操作,执行与所述第二操作对应的功能。
  2. 根据权利要求1所述的方法,其特征在于,所述终端确认所述第一操作满足预置解锁条件具体为:
    所述终端确认所述第一操作为在触摸传感器上检测到的滑动操作;或
    所述终端确认所述第一操作为通过触摸传感器在所述解锁界面上画出图形的操作,且所述图形与预置图形相符;或
    所述终端确认所述第一操作为通过指纹传感器输入指纹的操作,且输入的指纹与预置指纹匹配;或
    所述终端确认所述第一操作为通过虹膜传感器输入虹膜的操作,且输入的虹膜与预置虹膜匹配;或
    所述终端确认所述第一操作为通过人脸传感器输入人脸图像的操作,且输入的人脸图像与预置人脸匹配。
  3. 根据权利要求1或2所述的方法,其特征在于,所述方法还包括:
    若所述第一操作满足预置解锁条件,且用户输入所述第一操作后终止对所述终端的操作,所述终端显示第二用户界面;所述第二用户界面为所述终端解锁后的用户界面。
  4. 根据权利要求1-3任一所述的方法,其特征在于,所述终端通过第一传感器接收所述第一操作,所述终端确认用户输入所述第一操作后未终止对所述终端的操作包括:
    所述终端通过所述第一传感器接收到满足预置解锁条件的所述第一操作之后,检测到用户继续保持操作所述第一传感器。
  5. 根据权利要求4所述的方法,其特征在于,所述第一传感器为指纹传感器,所述接收到满足预置解锁条件的第一操作具体为:
    通过所述指纹传感器采集到用户输入的指纹,所述用户输入的指纹与预置的指纹匹配;
    所述检测到用户继续保持操作所述第一传感器包括:
    终端判断用户对所述指纹传感器按压或者触摸操作的持续时间是否大于第一阈值时长,若大于,且采集的指纹与预置的指纹相匹配,则确定用户输入满足预置的解锁条件的所述第一操作后,检测到用户继续保持操作所述第一传感器;或
    终端判断用户在判断采集到的指纹和预存的指纹匹配后,用户对传感器的操作持续时间是否大于第二阈值时长,若大于,则确定用户在输入满足预置的解锁条件的所述第一操作后持续保持操作所述第一传感器。
  6. 根据权利要求1-3任一所述的方法,其特征在于,所述终端通过第一传感器接收所述第一操作,并通过第二传感器接收用户输入的第三操作;
    所述终端确认用户输入所述第一操作后未终止对所述终端的操作包括:
    若终端检测到满足预置解锁条件的所述第一操作之前,检测到用户开始输入所述第三操作,并且所述第三操作在所述第一操作之后的持续时长大于第三阈值时长,则用户输入第一操作后,未终止对该终端的输入操作。
  7. 根据权利要求6所述的方法,其特征在于,所述第一传感器为指纹传感器,所述第二传感器为触摸传感器,所述终端检测到满足预置解锁条件的所述第一操作之前,检测到用户开始输入所述第三操作具体为:
    所述终端通过所述指纹传感器检测到与预置指纹匹配的指纹之前,检测到用户通过所述触摸传感器输入的触摸操作;
    所述第三操作在所述第一操作之后的持续时长大于第三阈值时长具体为:
    所述终端通过所述指纹传感器检测到与预置指纹匹配的指纹之后,用户通过所述触摸传感器输入的触摸操作的持续时长超过所述第三阈值时长。
  8. 根据权利要求1,2,4-7任一所述的方法,其特征在于,所述第三用户界面上显示有一个或多个快捷方式;所述一个或多个快捷方式中的每个与所述终端中一个功能相对应;
    所述接收用户输入的第二操作,并执行与所述第二操作对应的功能具体为:
    所述终端接收用户输入的第二操作,确定所述第二操作用于触发所述一个或多个快捷方式中的第一快捷方式;
    执行与所述第一快捷方式对应的功能。
  9. 根据权利要求8所述的方法,其特征在于,所述显示第三用户界面包括:
    若所述终端确认所述第一操作满足预置解锁条件,且用户输入所述第一操作后未终止对所述终端的操作时,所述终端的触摸屏检测到用户的触摸操作,所述终端根据所述用户触摸操作在所述触摸屏上的触摸点的位置在所述第三界面上显示所述一个或多个快捷方式。
  10. 根据权利要求1,2,4-7任一所述的方法,其特征在于,所述接收用户输入的第二操作,并执行与所述第二操作对应的功能具体为:
    接收用户输入的第一手势,根据预置的手势与所述终端的功能的对应关系,确定所述用户输入的第一手势对应的第一功能;
    执行所述第一功能。
  11. 根据权利要求10所述的方法,其特征在于,所述第三用户界面上显示有映射信息,所述映射信息用于指示一个或多个功能所对应的手势。
  12. 根据权利要求8-11任一所述的方法,其特征在于,所述第三用户界面叠加显示显示在第二用户界面之上,显示在所述第三用户界面下的所述第二用户界面不响应用户操作;所述第二用户界面为所述终端解锁后的用户界面。
  13. 根据权利要求1-12任一所述的方法,其特征在于,所述终端执行与所述第二操作对应的功能之前,所述方法还包括:
    所述终端确定从检测到所述第一操作到检测到所述第二操作的时段内,持续检测到用户对终端的输入操作。
  14. 根据权利要求6所述的方法,其特征在于,所述第一传感器为指纹传感器,所述第二传感器为触摸传感器,所述第二操作和所述第三操作均通过所述触摸传感器输入。
  15. 根据权利要求14所述的方法,其特征在于,所述执行与所述第二操作对应的功 能之前,所述方法还包括:所述终端确定第三操作和第二操作期间,用户持续通过所述触摸传感器进行输入操作。
  16. 根据权利要求1所述的方法,其特征在于,所述第一操作为用户通过屏内指纹传感器输入指纹的操作;所述屏内指纹传感器用于检测屏幕显示区域内用户输入的指纹;
    所述确认所述第一操作满足预置解锁条件具体为:
    所述终端确认用户通过所述指纹传感器输入的指纹与预置指纹相匹配;
    所述终端确认用户输入所述第一操作后未终止对所述终端的操作具体为:
    所述终端确认用户输入匹配的指纹后手指仍保持触摸所述终端的屏幕;
    所述第二操作为用户在输入第一操作后,手指持续保持触摸所述屏幕的情况下输入的操作。
  17. 一种终端,其特征在于,包括:
    交互设备,用于在所述终端处于熄屏状态或显示第一界面时,接收用户输入的用于解锁终端的第一操作;所述第一界面为终端的锁屏界面;
    处理器,用于确认所述第一操作是否满足预置解锁条件,且用户输入所述第一操作后是否未终止对所述终端的操作;
    显示设备,用于在所述处理器确认所述第一操作满足预置解锁条件,且用户输入所述第一操作后未终止对所述终端的操作时,显示第三用户界面;
    所述交互设备还用于在所述终端显示所述第三用户界面时,接收用户输入的第二操作;
    所述处理器还用于,控制所述终端执行所述第二操作对应的功能。
  18. 根据权利要求17所述的终端,其特征在于,所述显示设备还用于,在所述处理器确认所述第一操作满足预置解锁条件,且用户输入所述第一操作后终止对所述终端的操作时,显示第二用户界面;所述第二用户界面为所述终端解锁后的用户界面。
  19. 根据权利要求17或18所述的终端,其特征在于,所述交互设备包括:
    第一传感器,用于接收所述第一操作;
    所述处理器确认用户输入所述第一操作后未终止对所述终端的操作包括:
    所述处理器确认所述第一传感器接收到满足预置解锁条件的所述第一操作之后,所述第一传感器继续检测到用户的操作。
  20. 根据权利要求19所述的终端,其特征在于,所述第一传感器为指纹传感器,具体用于采集到用户输入的指纹;
    所述处理器确认所述第一传感器接收到满足预置解锁条件的所述第一操作具体为:
    所述处理器确认所述指纹传感器采集的指纹与预置指纹相匹配;
    所述处理器确认所述第一传感器继续检测到用户的操作包括:
    所述处理器判断用户对所述指纹传感器按压或者触摸操作的持续时间是否大于第一阈值时长,若大于,且采集的指纹与预置的指纹相匹配,则确定用户输入满足预置的解锁条件的所述第一操作后,检测到用户继续保持操作所述第一传感器;或
    所述处理器判断用户在判断采集到的指纹和预存的指纹匹配后,用户对传感器的操作持续时间是否大于第二阈值时长,若大于,则确定用户在输入满足预置的解锁条件的所述第一操作后持续保持操作所述第一传感器。
  21. 根据权利要求17或18所述的终端,其特征在于,所述交互设备包括:
    第一传感器,用于接收所述第一操作;
    第二传感器,用于接收用户输入的第三操作;
    所述处理器确认用户输入所述第一操作后未终止对所述终端的操作包括:
    所述处理器确认所述第一传感器接收到满足预置解锁条件的所述第一操作之前,通过所述第二传感器检测到用户开始输入所述第三操作,并且所述第三操作在所述第一操作之后的持续时长大于第三阈值时长。
  22. 根据权利要求21所述的终端,其特征在于,所述第一传感器为指纹传感器,所述第二传感器为触摸传感器,所述处理器确认所述第一传感器接收到满足预置解锁条件的所述第一操作之前,通过所述第二传感器检测到用户开始输入所述第三操作具体为:
    所述处理器通过所述指纹传感器检测到与预置指纹匹配的指纹之前,检测到用户通过所述触摸传感器输入的触摸操作;
    所述处理器确认所述第三操作在所述第一操作之后的持续时长大于第三阈值时长具体为:
    所述处理器通过所述指纹传感器检测到与预置指纹匹配的指纹之后,检测到用户通过所述触摸传感器输入的触摸操作的持续时长超过所述第三阈值时长。
  23. 根据权利要求17-22任一所述的终端,其特征在于,所述第三用户界面上显示有一个或多个快捷方式;所述一个或多个快捷方式中的每个与所述终端中一个功能相对应;
    所述处理器控制所述终端执行与所述第二操作对应的功能具体为:
    所述处理器确定所述第二操作用于触发所述一个或多个快捷方式中的第一快捷方式;
    控制所述终端执行与所述第一快捷方式对应的功能。
  24. 根据权利要求23所述的终端,其特征在于,所述显示设备显示第三用户界面包括:
    若所述处理器确认所述第一操作满足预置解锁条件,且用户输入所述第一操作后未终止对所述终端的操作时,所述终端的触摸屏检测到用户的触摸操作,所述显示设备根据所述用户触摸操作在所述触摸屏上的触摸点在所述第三界面上显示所述一个或多个快捷方式。
  25. 根据权利要求17-22任一所述的终端,其特征在于,所述第二操作为用户输入的第一手势操作;所述处理器控制所述终端执行所述第二操作对应的功能具体为:
    所述处理器根据预置的手势与所述终端的功能的对应关系,确定所述用户输入的第一手势对应的第一功能;
    控制所述终端执行所述第一功能。
  26. 根据权利要求25所述的终端,其特征在于,所述第三用户界面上显示有映射信息,所述映射信息用于指示一个或多个功能所对应的手势。
  27. 根据权利要求17-26任一所述的终端,其特征在于,所述处理器还用于,在控制所述终端执行与所述第二操作对应的功能之前,确定所述交互设备从检测到所述第一操作到检测到所述第二操作的时段内,持续检测到用户对终端的输入操作。
  28. 根据权利要求21所述的终端,其特征在于,所述第一传感器为指纹传感器,所述第二传感器为触摸传感器,所述第二操作和所述第三操作均通过所述触摸传感器输入。
  29. 根据权利要求28所述的终端,其特征在于,所述处理器还用于,在控制所述终端执行与所述第二操作对应的功能之前,确定第三操作和第二操作期间,用户持续通过所 述触摸传感器进行输入操作。
  30. 根据权利要求17所述的终端,其特征在于,所述显示设备为集成了终端的屏幕及触摸传感器的触摸屏;所述交互设备为屏内指纹传感器,用于检测屏幕显示区域内用户输入的指纹,所述第一操作为用户手指触摸或按压屏幕显示区域以输入指纹的操作;
    所述处理器确认所述第一操作满足预置解锁条件具体为:
    所述处理器确认用户通过所述屏内指纹传感器输入的指纹与预置指纹相匹配;
    所述交互设备还包括触摸屏,用于检测用户的触摸操作;
    所述处理器确认用户输入所述第一操作后未终止对所述终端的操作具体为:
    所述处理器确认用户输入匹配的指纹后手指仍保持触摸所述触摸屏;
    所述第二操作为用户在输入第一操作后,手指持续保持触摸所述屏幕的情况下输入的操作。
  31. 一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,其特征在于,当所述指令在终端上运行时,使得所述终端执行如权利要求1-16中任一项所述的方法。
  32. 一种包含指令的计算机程序产品,其特征在于,当所述计算机程序产品在终端上运行时,使得所述终端执行如权利要求1-16中任一项所述的方法。
PCT/CN2018/092680 2018-06-25 2018-06-25 一种终端的用户界面显示方法和终端 WO2019091124A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN201880094953.5A CN112313623A (zh) 2018-06-25 2018-06-25 一种终端的用户界面显示方法和终端
EP18876036.7A EP3798829A4 (en) 2018-06-25 2018-06-25 DISPLAY METHOD FOR USER INTERFACE OF A TERMINAL DEVICE AND TERMINAL DEVICE
PCT/CN2018/092680 WO2019091124A1 (zh) 2018-06-25 2018-06-25 一种终端的用户界面显示方法和终端
US17/255,788 US11482037B2 (en) 2018-06-25 2018-06-25 User interface display method of terminal, and terminal
US17/948,463 US11941910B2 (en) 2018-06-25 2022-09-20 User interface display method of terminal, and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/092680 WO2019091124A1 (zh) 2018-06-25 2018-06-25 一种终端的用户界面显示方法和终端

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US17/255,788 A-371-Of-International US11482037B2 (en) 2018-06-25 2018-06-25 User interface display method of terminal, and terminal
US17/948,463 Continuation US11941910B2 (en) 2018-06-25 2022-09-20 User interface display method of terminal, and terminal

Publications (1)

Publication Number Publication Date
WO2019091124A1 true WO2019091124A1 (zh) 2019-05-16

Family

ID=66438133

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/092680 WO2019091124A1 (zh) 2018-06-25 2018-06-25 一种终端的用户界面显示方法和终端

Country Status (4)

Country Link
US (2) US11482037B2 (zh)
EP (1) EP3798829A4 (zh)
CN (1) CN112313623A (zh)
WO (1) WO2019091124A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD949177S1 (en) * 2019-09-24 2022-04-19 Beijing Xiaomi Mobile Software Co., Ltd. Mobile phone with graphical user interface
USD973065S1 (en) * 2020-01-31 2022-12-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
USD967838S1 (en) * 2020-01-31 2022-10-25 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
USD967127S1 (en) * 2020-01-31 2022-10-18 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104143060A (zh) * 2013-05-10 2014-11-12 中国电信股份有限公司 面向多应用的屏幕解锁方法和装置、以及移动终端
CN105447369A (zh) * 2015-11-27 2016-03-30 上海斐讯数据通信技术有限公司 一种基于虹膜特征打开应用的方法及移动终端
CN105893033A (zh) * 2016-03-29 2016-08-24 乐视控股(北京)有限公司 唤醒移动终端及应用程序的方法和系统
CN106484433A (zh) * 2016-11-02 2017-03-08 珠海格力电器股份有限公司 终端的解锁方法、终端的解锁装置及终端

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8881269B2 (en) * 2012-03-31 2014-11-04 Apple Inc. Device, method, and graphical user interface for integrating recognition of handwriting gestures with a screen reader
JP6023879B2 (ja) 2012-05-18 2016-11-09 アップル インコーポレイテッド 指紋センサ入力に基づくユーザインタフェースを操作するための機器、方法、及びグラフィカルユーザインタ−フェース
US20140055369A1 (en) * 2012-08-22 2014-02-27 Qualcomm Innovation Center, Inc. Single-gesture mobile computing device operations
JP6069117B2 (ja) * 2013-06-26 2017-02-01 京セラ株式会社 電子機器及び制御プログラム並びに動作方法
CN104717349A (zh) 2013-12-13 2015-06-17 中兴通讯股份有限公司 一种终端用户界面的显示方法及终端
US10379599B2 (en) * 2014-07-24 2019-08-13 Samsung Electronics Co., Ltd. Method for displaying items in an electronic device when the display screen is off
CN106415471A (zh) 2015-03-05 2017-02-15 华为技术有限公司 终端的用户界面的处理方法、用户界面和终端
CN105159677A (zh) * 2015-09-09 2015-12-16 深圳Tcl数字技术有限公司 终端用户界面的受控显示方法及装置
KR102253313B1 (ko) 2015-10-13 2021-05-20 후아웨이 테크놀러지 컴퍼니 리미티드 지문 식별을 사용하는 조작 방법 및 장치, 및 모바일 단말
CN109241714A (zh) * 2016-01-06 2019-01-18 阿里巴巴集团控股有限公司 一种信息图像显示方法及装置
CN107306311A (zh) 2016-04-25 2017-10-31 中兴通讯股份有限公司 一种屏幕解锁方法、装置及移动终端
US10379880B2 (en) * 2016-09-25 2019-08-13 International Business Machines Corporation Recovering missed display advertising
CN107992241A (zh) 2016-10-27 2018-05-04 中兴通讯股份有限公司 一种锁屏界面的控制方法及装置
CN107918506A (zh) 2017-09-14 2018-04-17 北京珠穆朗玛移动通信有限公司 一种指纹模组的触控方法、移动终端及存储介质
CN107943362B (zh) 2017-11-17 2021-03-30 深圳天珑无线科技有限公司 移动终端的用户界面显示方法以及装置
CN110554815B (zh) * 2018-05-30 2021-12-28 北京小米移动软件有限公司 图标唤醒方法、电子设备和存储介质
CN110058777B (zh) * 2019-03-13 2022-03-29 华为技术有限公司 快捷功能启动的方法及电子设备

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104143060A (zh) * 2013-05-10 2014-11-12 中国电信股份有限公司 面向多应用的屏幕解锁方法和装置、以及移动终端
CN105447369A (zh) * 2015-11-27 2016-03-30 上海斐讯数据通信技术有限公司 一种基于虹膜特征打开应用的方法及移动终端
CN105893033A (zh) * 2016-03-29 2016-08-24 乐视控股(北京)有限公司 唤醒移动终端及应用程序的方法和系统
CN106484433A (zh) * 2016-11-02 2017-03-08 珠海格力电器股份有限公司 终端的解锁方法、终端的解锁装置及终端

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3798829A4 *

Also Published As

Publication number Publication date
US20230017370A1 (en) 2023-01-19
EP3798829A1 (en) 2021-03-31
US20210124903A1 (en) 2021-04-29
EP3798829A4 (en) 2021-06-16
US11482037B2 (en) 2022-10-25
US11941910B2 (en) 2024-03-26
CN112313623A (zh) 2021-02-02

Similar Documents

Publication Publication Date Title
EP3252644B1 (en) Method for activating function using fingerprint and electronic device including touch display supporting the same
US11868459B2 (en) Operation method with fingerprint recognition, apparatus, and mobile terminal
KR102578253B1 (ko) 전자 장치 및 전자 장치의 지문 정보 획득 방법
KR102364420B1 (ko) 전자 장치 및 터치 입력에 기초하여 상기 전자 장치를 제어하는 방법
US11567637B2 (en) Display control method and device
WO2018157662A1 (zh) 一种移动终端的显示控制方法及移动终端
KR102080183B1 (ko) 전자 장치 및 전자 장치에서 잠금 해제 방법
US11941910B2 (en) User interface display method of terminal, and terminal
US20150138101A1 (en) Mobile terminal and control method thereof
US20150077362A1 (en) Terminal with fingerprint reader and method for processing user input through fingerprint reader
KR102614046B1 (ko) 생체 데이터를 획득하기 위한 방법 및 그 전자 장치
EP3764254B1 (en) Fingerprint unlocking method, and terminal
KR20190090260A (ko) 지문 인식을 위한 방법, 전자 장치 및 저장 매체
US11703996B2 (en) User input interfaces
WO2021114690A1 (zh) 触控笔、终端及其控制方法和计算机可读存储介质
KR102536148B1 (ko) 전자 장치의 동작 방법 및 장치
CN111338524A (zh) 应用程序控制方法及电子设备
EP3528103B1 (en) Screen locking method, terminal and screen locking device
KR102266191B1 (ko) 스크린을 제어하는 휴대 단말 및 방법
KR20140092106A (ko) 터치 스크린 상의 사용자 입력을 처리하는 방법 및 단말과 저장 매체
KR20120134469A (ko) 움직임 감지장치를 이용한 휴대 단말의 포토 앨범 이미지 표시 방법 및 장치
WO2019051648A1 (zh) 一种触摸操作的响应方法及终端

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18876036

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018876036

Country of ref document: EP

Effective date: 20201223