CN111258461A - Implementation of biometric authentication - Google Patents


Info

Publication number
CN111258461A
CN111258461A
Authority
CN
China
Prior art keywords
biometric
electronic device
displaying
display
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911199010.1A
Other languages
Chinese (zh)
Inventor
M·范欧斯
R·阿巴希恩
P·D·安东
A·贝扎蒂
J·T·伯恩斯坦
J·R·达斯科拉
N·德弗里斯
L·迪瓦恩
A·德瑞尔
A·C·戴伊
C·P·福斯
B·W·格里芬
J·P·艾夫
C·莱门斯
J·A·玛丽亚
P·马里
D·莫塞尼
J-P·M·穆耶索斯
C·穆赛特
G·保罗
D·T·普里斯通
C·E·皮尤
P·沙玛
W·M·泰勒
H·沃威吉
G·耶基斯
C·H·应
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from DKPA201770712A (DK201770712A1)
Priority claimed from PCT/US2018/015603 (WO2018226265A1)
Priority claimed from US15/894,221 (US10410076B2)
Priority claimed from DKPA201870370A (DK179714B1)
Application filed by Apple Inc filed Critical Apple Inc
Publication of CN111258461A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 - User authentication
    • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/60 - Static or dynamic means for assisting the user to position a body part for biometric acquisition
    • G06V 40/67 - Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016 - Input arrangements with force or tactile feedback as computer generated output to the user
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161 - Detection; Localisation; Normalisation
    • G06V 40/166 - Detection; Localisation; Normalisation using acquisition arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2221/00 - Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 2221/21 - Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 2221/2117 - User registration

Abstract

The present disclosure relates generally to implementing biometric authentication, including providing a user interface for: a biometric enrollment process tutorial, aligning biometric features for enrollment, enrolling biometric features, providing a prompt during a biometric enrollment process, biometric authentication based on an application, automatically populating a biometric secure field, unlocking a device using biometric authentication, retrying biometric authentication, managing transmissions using biometric authentication, an interposing user interface during biometric authentication, preventing retrying biometric authentication, caching biometric authentication, automatically populating a fillable field based on visibility criteria, automatically logging in using biometric authentication, retrying biometric authentication at a credential entry user interface, providing an indication of an error condition during biometric authentication, providing an indication of a biometric sensor during biometric authentication, and orienting the device to register a biometric feature.

Description

Implementation of biometric authentication
The present application is a divisional application of an invention patent application having an application date of September 1, 2018, an application number of 201880003211.7, and an invention name of "Implementation of biometric authentication".
Cross Reference to Related Applications
This application claims priority from the following patent applications: U.S. provisional patent application No. 62/556,413, "FACE ENROLLMENT AND AUTHENTICATION", filed on September 9, 2017; U.S. provisional patent application No. 62/557,130, "IMPLEMENTATION OF BIOMETRIC AUTHENTICATION", filed on September 11, 2017; Danish patent application No. PA 201770712, "IMPLEMENTATION OF BIOMETRIC AUTHENTICATION", filed on September 22, 2017; Danish patent application No. PA 201770713, "IMPLEMENTATION OF BIOMETRIC AUTHENTICATION", filed on September 22, 2017; Danish patent application No. PA 201770714, "IMPLEMENTATION OF BIOMETRIC AUTHENTICATION", filed on September 22, 2017; Danish patent application No. PA 201770715, "IMPLEMENTATION OF BIOMETRIC AUTHENTICATION", filed on September 22, 2017; U.S. provisional patent application No. 62/581,025, "IMPLEMENTATION OF BIOMETRIC AUTHENTICATION", filed on November 2, 2017; International application No. PCT/US2018/015603, "IMPLEMENTATION OF BIOMETRIC AUTHENTICATION", filed on January 26, 2018; U.S. patent application No. 15/894,221, "IMPLEMENTATION OF BIOMETRIC AUTHENTICATION", filed on February 12, 2018; U.S. patent application No. 15/903,456, "IMPLEMENTATION OF BIOMETRIC AUTHENTICATION", filed on February 23, 2018; U.S. provisional patent application No. 62/679,955, "IMPLEMENTATION OF BIOMETRIC AUTHENTICATION", filed on June 3, 2018; Danish patent application No. PA 201870370, "IMPLEMENTATION OF BIOMETRIC AUTHENTICATION", filed on June 12, 2018; and Danish patent application No. PA 201870371, "IMPLEMENTATION OF BIOMETRIC AUTHENTICATION", filed on June 12, 2018. All of these applications are incorporated herein by reference in their entirety.
Technical Field
The present disclosure relates generally to biometric authentication, and more particularly, to interfaces and techniques for enrollment and authentication of biometric features.
Background
Biometric authentication of, for example, a face, iris, or fingerprint using an electronic device is a convenient and efficient method of authenticating a user of the electronic device. Biometric authentication allows a device to quickly and easily verify the identity of any number of users.
Disclosure of Invention
However, some techniques for implementing biometric authentication using electronic devices are often cumbersome. For example, some prior art techniques, such as those involving facial recognition, require a user to align biometric features nearly perfectly in the same way during both enrollment and each iteration of authentication. Deviations from that alignment often result in false negative identification results. Thus, the user optionally needs to unnecessarily perform multiple iterations of biometric authentication, or optionally simply chooses not to use biometric authentication at all. As another example, some prior art techniques rely only on two-dimensional representations of biometric features. Thus, authentication of the user is optionally limited by the failure to analyze one or more three-dimensional characteristics of the biometric feature, and the user is also optionally required to unnecessarily perform additional iterations of biometric authentication. In view of the foregoing drawbacks, the prior art requires more time than necessary, wasting both the user's time and the device's energy. This latter consideration is particularly significant in the operation of battery-powered devices.
Thus, the present technology provides faster, more efficient methods and interfaces for electronic devices to enable biometric authentication. Such methods and interfaces optionally complement or replace other methods for implementing biometric authentication. Such methods and interfaces reduce the cognitive burden placed on the user and result in a more efficient human-machine interface. For battery-driven computing devices, such methods and interfaces conserve power and increase the time interval between battery charges. Such methods and interfaces also reduce the number of unnecessary, extraneous, or repetitive inputs required at computing devices, such as smartphones and smartwatches.
According to some examples, a method is described, the method comprising: at an electronic device having one or more input devices, one or more biometric sensors, and a display: displaying a first user interface on a display; while displaying the first user interface, detecting an occurrence of a condition corresponding to introducing a biometric enrollment process for enrolling a biometric characteristic; in response to detecting an occurrence of a condition corresponding to introducing a biometric enrollment process, displaying a biometric enrollment introduction interface, wherein displaying the biometric enrollment introduction interface includes concurrently displaying: a simulated representation of the biometric feature; and a simulated progress indicator; displaying a tutorial animation including displaying movement of the simulated representation of the biometric characteristic and incremental advancement of the simulated progress indicator while displaying the biometric enrollment introduction interface; after displaying at least a portion of the tutorial animation, detecting an occurrence of a condition corresponding to initiating a biometric enrollment process; and in response to detecting the occurrence of a condition corresponding to initiating a biometric enrollment process: displaying a progress indicator corresponding to the simulated progress indicator; and displaying the representation of the biometric characteristic of the user determined by the one or more biometric sensors of the device at a location previously occupied by the simulated representation of the biometric characteristic in the biometric enrollment introduction interface.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with one or more input devices, one or more biometric sensors, and a display, the one or more programs including instructions for: displaying a first user interface on a display; while displaying the first user interface, detecting an occurrence of a condition corresponding to introducing a biometric enrollment process for enrolling a biometric characteristic; in response to detecting an occurrence of a condition corresponding to introducing a biometric enrollment process, displaying a biometric enrollment introduction interface, wherein displaying the biometric enrollment introduction interface includes concurrently displaying: a simulated representation of the biometric feature; and a simulated progress indicator; displaying a tutorial animation including displaying movement of the simulated representation of the biometric characteristic and incremental advancement of the simulated progress indicator while displaying the biometric enrollment introduction interface; after displaying at least a portion of the tutorial animation, detecting an occurrence of a condition corresponding to initiating a biometric enrollment process; and in response to detecting the occurrence of a condition corresponding to initiating a biometric enrollment process: displaying a progress indicator corresponding to the simulated progress indicator; and displaying the representation of the biometric characteristic of the user determined by the one or more biometric sensors of the device at a location previously occupied by the simulated representation of the biometric characteristic in the biometric enrollment introduction interface.
According to some examples, a transitory computer-readable storage medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with one or more input devices, one or more biometric sensors, and a display, the one or more programs including instructions for: displaying a first user interface on a display; while displaying the first user interface, detecting an occurrence of a condition corresponding to introducing a biometric enrollment process for enrolling a biometric characteristic; in response to detecting an occurrence of a condition corresponding to introducing a biometric enrollment process, displaying a biometric enrollment introduction interface, wherein displaying the biometric enrollment introduction interface includes concurrently displaying: a simulated representation of the biometric feature; and a simulated progress indicator; displaying a tutorial animation including displaying movement of the simulated representation of the biometric characteristic and incremental advancement of the simulated progress indicator while displaying the biometric enrollment introduction interface; after displaying at least a portion of the tutorial animation, detecting an occurrence of a condition corresponding to initiating a biometric enrollment process; and in response to detecting the occurrence of a condition corresponding to initiating a biometric enrollment process: displaying a progress indicator corresponding to the simulated progress indicator; and displaying the representation of the biometric characteristic of the user determined by the one or more biometric sensors of the device at a location previously occupied by the simulated representation of the biometric characteristic in the biometric enrollment introduction interface.
According to some examples, an electronic device is described, the electronic device comprising: one or more input devices; one or more biometric sensors; a display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: displaying a first user interface on a display; while displaying the first user interface, detecting an occurrence of a condition corresponding to introducing a biometric enrollment process for enrolling a biometric characteristic; in response to detecting an occurrence of a condition corresponding to introducing a biometric enrollment process, displaying a biometric enrollment introduction interface, wherein displaying the biometric enrollment introduction interface includes concurrently displaying: a simulated representation of the biometric feature; and a simulated progress indicator; displaying a tutorial animation including displaying movement of the simulated representation of the biometric characteristic and incremental advancement of the simulated progress indicator while displaying the biometric enrollment introduction interface; after displaying at least a portion of the tutorial animation, detecting an occurrence of a condition corresponding to initiating a biometric enrollment process; and in response to detecting the occurrence of a condition corresponding to initiating a biometric enrollment process: displaying a progress indicator corresponding to the simulated progress indicator; and displaying the representation of the biometric characteristic of the user determined by the one or more biometric sensors of the device at a location previously occupied by the simulated representation of the biometric characteristic in the biometric enrollment introduction interface.
According to some examples, an electronic device is described, the electronic device comprising: one or more input devices; one or more biometric sensors; a display; means for displaying a first user interface on the display; means for performing the following: while displaying the first user interface, detecting an occurrence of a condition corresponding to introducing a biometric enrollment process for enrolling a biometric characteristic; means for performing the following: in response to detecting an occurrence of a condition corresponding to introducing a biometric enrollment process, displaying a biometric enrollment introduction interface, wherein displaying the biometric enrollment introduction interface includes concurrently displaying: a simulated representation of the biometric feature; and a simulated progress indicator; means for performing the following: displaying a tutorial animation including displaying movement of the simulated representation of the biometric characteristic and incremental advancement of the simulated progress indicator while displaying the biometric enrollment introduction interface; means for performing the following: after displaying at least a portion of the tutorial animation, detecting an occurrence of a condition corresponding to initiating a biometric enrollment process; and means for, in response to detecting an occurrence of a condition corresponding to initiating a biometric enrollment process: means for displaying a progress indicator corresponding to the simulated progress indicator; and means for performing the following: displaying the representation of the biometric characteristic of the user determined by the one or more biometric sensors of the device at a location previously occupied by the simulated representation of the biometric characteristic in the biometric enrollment introduction interface.
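The enrollment-introduction flow described above (concurrently showing a simulated biometric feature and a simulated progress indicator, playing a tutorial animation, then replacing the simulation with a live sensor-derived representation at the same on-screen location) can be modeled as a small state machine. The following Python sketch is illustrative only; none of the class, method, or attribute names come from the patent.

```python
# Illustrative sketch of the biometric enrollment introduction flow.
# All names here are hypothetical; positions are abstract screen slots.

class EnrollmentIntro:
    def __init__(self):
        self.state = "first_ui"
        self.elements = {}  # on-screen element -> (row, col) position

    def show_intro(self):
        # Concurrently display the simulated feature and the simulated
        # progress indicator in the introduction interface.
        self.elements["simulated_feature"] = (0, 0)
        self.elements["simulated_progress"] = (0, 1)
        self.state = "intro"

    def play_tutorial(self, steps=3):
        # Tutorial animation: movement of the simulated feature together
        # with incremental advancement of the simulated progress indicator.
        assert self.state == "intro"
        for _ in range(steps):
            pass  # one animation frame: move simulation, advance progress

    def begin_enrollment(self, sensor_frame):
        # On the condition that starts enrollment, the live representation
        # of the user's feature takes the position previously occupied by
        # the simulated feature, and a real progress indicator corresponding
        # to the simulated one is displayed.
        position = self.elements.pop("simulated_feature")
        self.elements["user_feature"] = position
        self.elements["progress"] = self.elements.pop("simulated_progress")
        self.state = "enrolling"
        return position
```

The key design point the claims emphasize is positional continuity: the live representation appears exactly where the simulation was, so the tutorial doubles as spatial guidance.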
According to some examples, a method is described, the method comprising: at an electronic device having one or more cameras and a display: displaying a first user interface on a display; while displaying the first user interface, detecting an occurrence of a condition corresponding to initiating a biometric enrollment process for enrolling a respective type of biometric feature; in response to detecting an occurrence of a condition corresponding to initiating a biometric enrollment process, displaying a digital viewfinder on the display that includes a preview of image data captured by the one or more cameras; and after initiating the biometric enrollment process: in accordance with a determination that respective types of biometric features that satisfy alignment criteria have been detected in the fields of view of the one or more cameras, highlighting a first portion of the fields of view of the one or more cameras relative to a second portion of the fields of view of the one or more cameras; and in accordance with a determination that a respective type of biometric feature that satisfies an alignment criterion has not been detected in the field of view of the one or more cameras, maintaining display of the digital viewfinder without highlighting a first portion of the field of view of the one or more cameras relative to a second portion of the field of view of the one or more cameras.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with one or more cameras and a display, the one or more programs including instructions for: displaying a first user interface on a display; while displaying the first user interface, detecting an occurrence of a condition corresponding to initiating a biometric enrollment process for enrolling a respective type of biometric feature; in response to detecting an occurrence of a condition corresponding to initiating a biometric enrollment process, displaying a digital viewfinder on the display that includes a preview of image data captured by the one or more cameras; and after initiating the biometric enrollment process: in accordance with a determination that respective types of biometric features that satisfy alignment criteria have been detected in the fields of view of the one or more cameras, highlighting a first portion of the fields of view of the one or more cameras relative to a second portion of the fields of view of the one or more cameras; and in accordance with a determination that a respective type of biometric feature that satisfies an alignment criterion has not been detected in the field of view of the one or more cameras, maintaining display of the digital viewfinder without highlighting a first portion of the field of view of the one or more cameras relative to a second portion of the field of view of the one or more cameras.
According to some examples, a transitory computer-readable storage medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with one or more cameras and a display, the one or more programs including instructions for: displaying a first user interface on a display; while displaying the first user interface, detecting an occurrence of a condition corresponding to initiating a biometric enrollment process for enrolling a respective type of biometric feature; in response to detecting an occurrence of a condition corresponding to initiating a biometric enrollment process, displaying a digital viewfinder on the display that includes a preview of image data captured by the one or more cameras; and after initiating the biometric enrollment process: in accordance with a determination that respective types of biometric features that satisfy alignment criteria have been detected in the fields of view of the one or more cameras, highlighting a first portion of the fields of view of the one or more cameras relative to a second portion of the fields of view of the one or more cameras; and in accordance with a determination that a respective type of biometric feature that satisfies an alignment criterion has not been detected in the field of view of the one or more cameras, maintaining display of the digital viewfinder without highlighting a first portion of the field of view of the one or more cameras relative to a second portion of the field of view of the one or more cameras.
According to some examples, an electronic device is described, the electronic device comprising: one or more cameras; a display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: displaying a first user interface on a display; while displaying the first user interface, detecting an occurrence of a condition corresponding to initiating a biometric enrollment process for enrolling a respective type of biometric feature; in response to detecting an occurrence of a condition corresponding to initiating a biometric enrollment process, displaying a digital viewfinder on the display that includes a preview of image data captured by the one or more cameras; and after initiating the biometric enrollment process: in accordance with a determination that respective types of biometric features that satisfy alignment criteria have been detected in the fields of view of the one or more cameras, highlighting a first portion of the fields of view of the one or more cameras relative to a second portion of the fields of view of the one or more cameras; and in accordance with a determination that a respective type of biometric feature that satisfies an alignment criterion has not been detected in the field of view of the one or more cameras, maintaining display of the digital viewfinder without highlighting a first portion of the field of view of the one or more cameras relative to a second portion of the field of view of the one or more cameras.
According to some examples, an electronic device is described, the electronic device comprising: one or more cameras; a display; one or more processors; means for displaying a first user interface on the display; means for performing the following: while displaying the first user interface, detecting an occurrence of a condition corresponding to initiating a biometric enrollment process for enrolling a respective type of biometric feature; means for performing the following: in response to detecting an occurrence of a condition corresponding to initiating a biometric enrollment process, displaying a digital viewfinder on the display that includes a preview of image data captured by the one or more cameras; and after initiating the biometric enrollment process: means for performing the following: in accordance with a determination that respective types of biometric features that satisfy alignment criteria have been detected in the fields of view of the one or more cameras, highlighting a first portion of the fields of view of the one or more cameras relative to a second portion of the fields of view of the one or more cameras; and means for performing the following: in accordance with a determination that a respective type of biometric feature that satisfies an alignment criterion has not been detected in the field of view of the one or more cameras, maintaining display of the digital viewfinder without highlighting a first portion of the field of view of the one or more cameras relative to a second portion of the field of view of the one or more cameras.
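The viewfinder behavior above reduces to a conditional: emphasize the first (central) portion of the camera preview relative to the periphery only when a detected face satisfies the alignment criteria; otherwise keep the plain viewfinder. Below is a minimal Python sketch under assumed criteria (roughly centered and sufficiently large in the frame); the thresholds and function names are hypothetical, not taken from the patent.

```python
# Hypothetical alignment check for a face bounding box in a camera frame.
def meets_alignment(face, frame_w, frame_h, max_offset=0.15, min_size=0.3):
    """face = (cx, cy, w, h) in pixels, or None if no face was detected."""
    if face is None:
        return False
    cx, cy, w, h = face
    # Assumed criterion 1: face center close to the frame center.
    centered = (abs(cx - frame_w / 2) <= max_offset * frame_w
                and abs(cy - frame_h / 2) <= max_offset * frame_h)
    # Assumed criterion 2: face occupies enough of the frame.
    large_enough = w >= min_size * frame_w and h >= min_size * frame_h
    return centered and large_enough

def render_viewfinder(face, frame_w=400, frame_h=400):
    # Emphasize the central portion only when alignment is satisfied;
    # otherwise maintain the plain digital viewfinder.
    if meets_alignment(face, frame_w, frame_h):
        return "center emphasized, periphery dimmed"
    return "plain viewfinder"
```

The actual criteria in a shipping system would also consider pose and distance from the depth sensor; this sketch only shows the branch structure the claims describe.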
According to some examples, a method is described, the method comprising: at an electronic device having a display and one or more biometric sensors: concurrently displaying a biometric enrollment interface on the display, wherein displaying the biometric enrollment interface includes concurrently displaying: a representation of the biometric feature, wherein the representation of the biometric feature has an orientation determined based on an alignment of the biometric feature with respect to one or more biometric sensors of the device; and a progress indicator comprising a first progress indicator portion at a first position on the display relative to the representation of the biometric feature and a second progress indicator portion at a second position on the display relative to the representation of the biometric feature, wherein the representation of the biometric feature is displayed on the display between the first position and the second position; while simultaneously displaying the representation of the biometric feature and the progress indicator, detecting a change in orientation of the biometric feature relative to the one or more biometric sensors; and in response to detecting a change in orientation of the biometric feature relative to the one or more biometric sensors: in accordance with a determination that the change in orientation of the biometric feature satisfies enrollment criteria for a first portion of the biometric feature corresponding to the first progress indicator portion, updating one or more visual features of the first progress indicator portion; and in accordance with a determination that the change in orientation of the biometric feature satisfies enrollment criteria for a second portion of the biometric feature corresponding to the second progress indicator portion, updating one or more visual features of the second progress indicator portion.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: concurrently displaying a biometric enrollment interface on the display, wherein displaying the biometric enrollment interface includes concurrently displaying: a representation of the biometric feature, wherein the representation of the biometric feature has an orientation determined based on an alignment of the biometric feature with respect to one or more biometric sensors of the device; and a progress indicator comprising a first progress indicator portion at a first position on the display relative to the representation of the biometric feature and a second progress indicator portion at a second position on the display relative to the representation of the biometric feature, wherein the representation of the biometric feature is displayed on the display between the first position and the second position; while simultaneously displaying the representation of the biometric feature and the progress indicator, detecting a change in orientation of the biometric feature relative to the one or more biometric sensors; and in response to detecting a change in orientation of the biometric feature relative to the one or more biometric sensors: in accordance with a determination that the change in orientation of the biometric feature satisfies enrollment criteria for a first portion of the biometric feature corresponding to the first progress indicator portion, updating one or more visual features of the first progress indicator portion; and in accordance with a determination that the change in orientation of the biometric feature satisfies enrollment criteria for a second portion of the biometric feature corresponding to the second progress indicator portion, updating one or more visual features of the second progress indicator portion.
According to some examples, a transitory computer-readable storage medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: concurrently displaying a biometric enrollment interface on the display, wherein displaying the biometric enrollment interface includes concurrently displaying: a representation of the biometric feature, wherein the representation of the biometric feature has an orientation determined based on an alignment of the biometric feature with respect to one or more biometric sensors of the device; and a progress indicator comprising a first progress indicator portion at a first position on the display relative to the representation of the biometric feature and a second progress indicator portion at a second position on the display relative to the representation of the biometric feature, wherein the representation of the biometric feature is displayed on the display between the first position and the second position; while simultaneously displaying the representation of the biometric feature and the progress indicator, detecting a change in orientation of the biometric feature relative to the one or more biometric sensors; and in response to detecting a change in orientation of the biometric feature relative to the one or more biometric sensors: in accordance with a determination that the change in orientation of the biometric feature satisfies enrollment criteria for a first portion of the biometric feature corresponding to the first progress indicator portion, updating one or more visual features of the first progress indicator portion; and in accordance with a determination that the change in orientation of the biometric feature satisfies enrollment criteria for a second portion of the biometric feature corresponding to the second progress indicator portion, updating one or more visual features of the second progress indicator portion.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: concurrently displaying a biometric enrollment interface on the display, wherein displaying the biometric enrollment interface includes concurrently displaying: a representation of the biometric feature, wherein the representation of the biometric feature has an orientation determined based on an alignment of the biometric feature with respect to one or more biometric sensors of the device; and a progress indicator comprising a first progress indicator portion at a first position on the display relative to the representation of the biometric feature and a second progress indicator portion at a second position on the display relative to the representation of the biometric feature, wherein the representation of the biometric feature is displayed on the display between the first position and the second position; while simultaneously displaying the representation of the biometric feature and the progress indicator, detecting a change in orientation of the biometric feature relative to the one or more biometric sensors; and in response to detecting a change in orientation of the biometric feature relative to the one or more biometric sensors: in accordance with a determination that the change in orientation of the biometric feature satisfies enrollment criteria for a first portion of the biometric feature corresponding to the first progress indicator portion, updating one or more visual features of the first progress indicator portion; and in accordance with a determination that the change in orientation of the biometric feature satisfies enrollment criteria for a second portion of the biometric feature corresponding to the second progress indicator portion, updating one or more visual features of the second progress indicator portion.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; means for performing the following: concurrently displaying a biometric enrollment interface on the display, wherein displaying the biometric enrollment interface includes concurrently displaying: a representation of the biometric feature, wherein the representation of the biometric feature has an orientation determined based on an alignment of the biometric feature with respect to one or more biometric sensors of the device; and a progress indicator comprising a first progress indicator portion at a first position on the display relative to the representation of the biometric feature and a second progress indicator portion at a second position on the display relative to the representation of the biometric feature, wherein the representation of the biometric feature is displayed on the display between the first position and the second position; means for performing the following: while simultaneously displaying the representation of the biometric feature and the progress indicator, detecting a change in orientation of the biometric feature relative to the one or more biometric sensors; and means for, in response to detecting a change in orientation of the biometric feature relative to the one or more biometric sensors: means for performing the following: in accordance with a determination that the change in orientation of the biometric feature satisfies enrollment criteria for a first portion of the biometric feature corresponding to the first progress indicator portion, updating one or more visual features of the first progress indicator portion; and means for performing the following: in accordance with a determination that the change in orientation of the biometric feature satisfies enrollment criteria for a second portion of the biometric feature corresponding to the second progress indicator portion, updating one or more visual features of the second progress indicator portion.
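The progress-indicator logic claimed above can be illustrated with a short sketch. This is only an illustrative model, not the patented implementation: the class name `ProgressPortion`, the use of angular ranges as the enrollment criteria, and the `"pending"`/`"enrolled"` visual states are all assumed for the example.

```python
from dataclasses import dataclass

@dataclass
class ProgressPortion:
    # Angular range (degrees) of feature orientation that enrolls this portion
    # (a hypothetical stand-in for the claim's per-portion enrollment criteria).
    angle_range: tuple
    enrolled: bool = False
    visual_state: str = "pending"   # a visual feature, e.g. tick color or length

def on_orientation_change(portions, orientation_deg):
    """Update each progress-indicator portion whose enrollment criteria are
    satisfied by the new orientation of the biometric feature."""
    for portion in portions:
        lo, hi = portion.angle_range
        if lo <= orientation_deg < hi and not portion.enrolled:
            portion.enrolled = True
            portion.visual_state = "enrolled"   # update its visual features

# The first and second portions sit at different positions around the
# displayed representation of the biometric feature.
first = ProgressPortion(angle_range=(0, 90))
second = ProgressPortion(angle_range=(90, 180))

on_orientation_change([first, second], 45)    # satisfies the first portion's criteria
on_orientation_change([first, second], 120)   # satisfies the second portion's criteria
```

Only the portion whose criteria are met by a given orientation change is updated, mirroring the two "in accordance with a determination" branches above.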
According to some examples, a method is described, the method comprising: at an electronic device having a display and one or more biometric sensors: displaying a biometric enrollment user interface for enrolling a biometric feature on the display, wherein displaying the biometric enrollment user interface includes displaying a representation of the biometric feature, wherein an appearance of the representation of the biometric feature changes as an orientation of the biometric feature relative to the one or more biometric sensors changes; while displaying the biometric enrollment user interface, detecting that enrollment prompt criteria have been met for one or more portions of the biometric feature; and in response to detecting that the enrollment prompt criteria have been met for one or more portions of the biometric feature, outputting a respective prompt for moving the biometric feature in a respective manner, wherein the respective prompt is selected based on an enrollment state of the one or more portions of the biometric feature, including: in accordance with a determination that the enrollment prompt criteria have been met for a first portion of the biometric feature that is enrollable by moving the biometric feature in a first manner, outputting the respective prompt includes outputting a prompt for moving the biometric feature in the first manner; and in accordance with a determination that the enrollment prompt criteria have been met for a second portion of the biometric feature that is enrollable by moving the biometric feature in a second manner different from the first manner, outputting the respective prompt includes outputting a prompt for moving the biometric feature in the second manner.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: displaying a biometric enrollment user interface for enrolling a biometric feature on the display, wherein displaying the biometric enrollment user interface includes displaying a representation of the biometric feature, wherein an appearance of the representation of the biometric feature changes as an orientation of the biometric feature relative to the one or more biometric sensors changes; while displaying the biometric enrollment user interface, detecting that enrollment prompt criteria have been met for one or more portions of the biometric feature; and in response to detecting that the enrollment prompt criteria have been met for one or more portions of the biometric feature, outputting a respective prompt for moving the biometric feature in a respective manner, wherein the respective prompt is selected based on an enrollment state of the one or more portions of the biometric feature, including: in accordance with a determination that the enrollment prompt criteria have been met for a first portion of the biometric feature that is enrollable by moving the biometric feature in a first manner, outputting the respective prompt includes outputting a prompt for moving the biometric feature in the first manner; and in accordance with a determination that the enrollment prompt criteria have been met for a second portion of the biometric feature that is enrollable by moving the biometric feature in a second manner different from the first manner, outputting the respective prompt includes outputting a prompt for moving the biometric feature in the second manner.
According to some examples, a transitory computer-readable storage medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: displaying a biometric enrollment user interface for enrolling a biometric feature on the display, wherein displaying the biometric enrollment user interface includes displaying a representation of the biometric feature, wherein an appearance of the representation of the biometric feature changes as an orientation of the biometric feature relative to the one or more biometric sensors changes; while displaying the biometric enrollment user interface, detecting that enrollment prompt criteria have been met for one or more portions of the biometric feature; and in response to detecting that the enrollment prompt criteria have been met for one or more portions of the biometric feature, outputting a respective prompt for moving the biometric feature in a respective manner, wherein the respective prompt is selected based on an enrollment state of the one or more portions of the biometric feature, including: in accordance with a determination that the enrollment prompt criteria have been met for a first portion of the biometric feature that is enrollable by moving the biometric feature in a first manner, outputting the respective prompt includes outputting a prompt for moving the biometric feature in the first manner; and in accordance with a determination that the enrollment prompt criteria have been met for a second portion of the biometric feature that is enrollable by moving the biometric feature in a second manner different from the first manner, outputting the respective prompt includes outputting a prompt for moving the biometric feature in the second manner.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: displaying a biometric enrollment user interface for enrolling a biometric feature on the display, wherein displaying the biometric enrollment user interface includes displaying a representation of the biometric feature, wherein an appearance of the representation of the biometric feature changes as an orientation of the biometric feature relative to the one or more biometric sensors changes; while displaying the biometric enrollment user interface, detecting that enrollment prompt criteria have been met for one or more portions of the biometric feature; and in response to detecting that the enrollment prompt criteria have been met for one or more portions of the biometric feature, outputting a respective prompt for moving the biometric feature in a respective manner, wherein the respective prompt is selected based on an enrollment state of the one or more portions of the biometric feature, including: in accordance with a determination that the enrollment prompt criteria have been met for a first portion of the biometric feature that is enrollable by moving the biometric feature in a first manner, outputting the respective prompt includes outputting a prompt for moving the biometric feature in the first manner; and in accordance with a determination that the enrollment prompt criteria have been met for a second portion of the biometric feature that is enrollable by moving the biometric feature in a second manner different from the first manner, outputting the respective prompt includes outputting a prompt for moving the biometric feature in the second manner.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; means for performing the following: displaying a biometric enrollment user interface for enrolling a biometric feature on the display, wherein displaying the biometric enrollment user interface includes displaying a representation of the biometric feature, wherein an appearance of the representation of the biometric feature changes as an orientation of the biometric feature relative to the one or more biometric sensors changes; means for performing the following: while displaying the biometric enrollment user interface, detecting that enrollment prompt criteria have been met for one or more portions of the biometric feature; and means for performing the following: in response to detecting that the enrollment prompt criteria have been met for one or more portions of the biometric feature, outputting a respective prompt for moving the biometric feature in a respective manner, wherein the respective prompt is selected based on an enrollment state of the one or more portions of the biometric feature, comprising: means for performing the following: in accordance with a determination that the enrollment prompt criteria have been met for a first portion of the biometric feature that is enrollable by moving the biometric feature in a first manner, outputting the respective prompt includes outputting a prompt for moving the biometric feature in the first manner; and means for performing the following: in accordance with a determination that the enrollment prompt criteria have been met for a second portion of the biometric feature that is enrollable by moving the biometric feature in a second manner different from the first manner, outputting the respective prompt includes outputting a prompt for moving the biometric feature in the second manner.
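The prompt-selection rule described above can be sketched as a lookup over the enrollment state of the feature's portions. The portion names and prompt strings below are invented for illustration; the patent does not specify them.

```python
# Hypothetical mapping from each portion of the biometric feature to the
# movement that would enroll it (e.g. for a face-enrollment flow).
MOVEMENT_FOR_PORTION = {
    "top": "tilt your head up",
    "bottom": "tilt your head down",
    "left": "turn your head left",
    "right": "turn your head right",
}

def select_prompt(enrollment_state):
    """Select a prompt based on the enrollment state of the one or more
    portions: output the movement for the first portion that is still
    enrollable, or None when every portion is already enrolled."""
    for portion, prompt in MOVEMENT_FOR_PORTION.items():
        if not enrollment_state.get(portion, False):
            return prompt
    return None

print(select_prompt({"top": True, "bottom": False}))  # prints "tilt your head down"
```

A first portion enrollable in a first manner yields the first manner's prompt; a second portion enrollable in a different manner yields that manner's prompt instead.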
According to some examples, a method is described, the method comprising: at an electronic device having a display and one or more biometric sensors: simultaneously displaying on the display: an application interface corresponding to an application; and a biometric authentication interface controlled by an operating system of the electronic device, wherein the biometric authentication interface is displayed over a portion of the application interface; while displaying the biometric authentication interface, obtaining biometric data corresponding to at least a portion of a biometric feature from the one or more biometric sensors; and in accordance with a determination, based on the biometric data, that the at least a portion of the biometric feature satisfies biometric authentication criteria: providing authentication information to the application indicating that the biometric authentication criteria have been met for the at least a portion of the biometric feature; and maintaining display of the biometric authentication interface for a predetermined amount of time after providing the authentication information to the application.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: simultaneously displaying on the display: an application interface corresponding to an application; and a biometric authentication interface controlled by an operating system of the electronic device, wherein the biometric authentication interface is displayed over a portion of the application interface; while displaying the biometric authentication interface, obtaining biometric data corresponding to at least a portion of a biometric feature from the one or more biometric sensors; and in accordance with a determination, based on the biometric data, that the at least a portion of the biometric feature satisfies biometric authentication criteria: providing authentication information to the application indicating that the biometric authentication criteria have been met for the at least a portion of the biometric feature; and maintaining display of the biometric authentication interface for a predetermined amount of time after providing the authentication information to the application.
According to some examples, a transitory computer-readable storage medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: simultaneously displaying on the display: an application interface corresponding to an application; and a biometric authentication interface controlled by an operating system of the electronic device, wherein the biometric authentication interface is displayed over a portion of the application interface; while displaying the biometric authentication interface, obtaining biometric data corresponding to at least a portion of a biometric feature from the one or more biometric sensors; and in accordance with a determination, based on the biometric data, that the at least a portion of the biometric feature satisfies biometric authentication criteria: providing authentication information to the application indicating that the biometric authentication criteria have been met for the at least a portion of the biometric feature; and maintaining display of the biometric authentication interface for a predetermined amount of time after providing the authentication information to the application.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: simultaneously displaying on the display: an application interface corresponding to an application; and a biometric authentication interface controlled by an operating system of the electronic device, wherein the biometric authentication interface is displayed over a portion of the application interface; while displaying the biometric authentication interface, obtaining biometric data corresponding to at least a portion of a biometric feature from the one or more biometric sensors; and in accordance with a determination, based on the biometric data, that the at least a portion of the biometric feature satisfies biometric authentication criteria: providing authentication information to the application indicating that the biometric authentication criteria have been met for the at least a portion of the biometric feature; and maintaining display of the biometric authentication interface for a predetermined amount of time after providing the authentication information to the application.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; means for concurrently displaying, on the display: an application interface corresponding to an application; and a biometric authentication interface controlled by an operating system of the electronic device, wherein the biometric authentication interface is displayed over a portion of the application interface; means for obtaining biometric data corresponding to at least a portion of a biometric feature from the one or more biometric sensors while displaying the biometric authentication interface; and means for, in accordance with a determination, based on the biometric data, that the at least a portion of the biometric feature satisfies biometric authentication criteria: means for performing the following: providing authentication information to the application indicating that the biometric authentication criteria have been met for the at least a portion of the biometric feature; and means for performing the following: maintaining display of the biometric authentication interface for a predetermined amount of time after providing the authentication information to the application.
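The system-controlled overlay behavior can be modeled with a small event trace. Everything here is an assumption for illustration: the class name, the event strings, and the 1.5-second linger time (the patent only says "a predetermined amount of time"); real UI and timer calls are replaced by list appends.

```python
class BiometricAuthOverlay:
    """Sketch of an OS-controlled authentication interface shown over part of
    an application's interface. The events list stands in for real UI calls."""

    def __init__(self, linger_seconds=1.5):
        self.linger_seconds = linger_seconds  # assumed predetermined display time
        self.events = []

    def authenticate(self, notify_app, feature_matches):
        self.events.append("overlay_shown")   # drawn over a portion of the app UI
        if feature_matches:
            # Provide authentication information to the application first...
            notify_app("authenticated")
            # ...then keep the interface on screen for the predetermined time
            # so the user sees the success state, before dismissing it.
            self.events.append(f"linger_{self.linger_seconds}s")
            self.events.append("overlay_dismissed")

received = []
overlay = BiometricAuthOverlay()
overlay.authenticate(received.append, feature_matches=True)
```

The ordering matters: the app is notified before the overlay is dismissed, which is exactly the "maintaining the display... after providing the authentication information" clause above.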
According to some examples, a method is described, the method comprising: at an electronic device having a display and one or more biometric sensors: displaying, on the display, an application interface that includes a fillable field; while displaying the application interface, receiving a request to automatically populate the fillable field of the application interface; and in response to receiving the request to automatically populate the fillable field of the application interface: in accordance with a determination that the fillable field of the application interface is associated with a first type of data, automatically filling the fillable field with the first type of data; and in accordance with a determination that the fillable field of the application interface is associated with a second type of data, and in accordance with a determination, based on data corresponding to a biometric feature obtained from the one or more biometric sensors, that at least a portion of the biometric feature satisfies biometric authentication criteria, automatically filling the fillable field with the second type of data.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: displaying, on the display, an application interface that includes a fillable field; while displaying the application interface, receiving a request to automatically populate the fillable field of the application interface; and in response to receiving the request to automatically populate the fillable field of the application interface: in accordance with a determination that the fillable field of the application interface is associated with a first type of data, automatically filling the fillable field with the first type of data; and in accordance with a determination that the fillable field of the application interface is associated with a second type of data, and in accordance with a determination, based on data corresponding to a biometric feature obtained from the one or more biometric sensors, that at least a portion of the biometric feature satisfies biometric authentication criteria, automatically filling the fillable field with the second type of data.
According to some examples, a transitory computer-readable storage medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: displaying, on the display, an application interface that includes a fillable field; while displaying the application interface, receiving a request to automatically populate the fillable field of the application interface; and in response to receiving the request to automatically populate the fillable field of the application interface: in accordance with a determination that the fillable field of the application interface is associated with a first type of data, automatically filling the fillable field with the first type of data; and in accordance with a determination that the fillable field of the application interface is associated with a second type of data, and in accordance with a determination, based on data corresponding to a biometric feature obtained from the one or more biometric sensors, that at least a portion of the biometric feature satisfies biometric authentication criteria, automatically filling the fillable field with the second type of data.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: displaying, on the display, an application interface that includes a fillable field; while displaying the application interface, receiving a request to automatically populate the fillable field of the application interface; and in response to receiving the request to automatically populate the fillable field of the application interface: in accordance with a determination that the fillable field of the application interface is associated with a first type of data, automatically filling the fillable field with the first type of data; and in accordance with a determination that the fillable field of the application interface is associated with a second type of data, and in accordance with a determination, based on data corresponding to a biometric feature obtained from the one or more biometric sensors, that at least a portion of the biometric feature satisfies biometric authentication criteria, automatically filling the fillable field with the second type of data.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; means for displaying, on the display, an application interface that includes a fillable field; means for receiving a request to automatically populate the fillable field of the application interface while the application interface is displayed; and means for, in response to receiving the request to automatically populate the fillable field of the application interface: means for performing the following: in accordance with a determination that the fillable field of the application interface is associated with a first type of data, automatically filling the fillable field with the first type of data; and means for performing the following: in accordance with a determination that the fillable field of the application interface is associated with a second type of data, and in accordance with a determination, based on data corresponding to a biometric feature obtained from the one or more biometric sensors, that at least a portion of the biometric feature satisfies biometric authentication criteria, automatically filling the fillable field with the second type of data.
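The two-tier autofill rule above reduces to a simple gate: the first type of data fills immediately, while the second type fills only after a successful biometric check. The stored values, the field classification, and the `biometric_check` callable below are all hypothetical examples, not part of the patent.

```python
# Illustrative stored values and field classes (first type: non-sensitive,
# filled without authentication; second type: gated on biometric auth).
STORED = {"email": "user@example.com", "password": "hunter2"}
FIRST_TYPE = {"email", "name"}
SECOND_TYPE = {"password", "card"}

def autofill(field, biometric_check):
    """Return the value to place in the fillable field, or None if the field
    cannot be filled (unknown field, or biometric authentication failed)."""
    if field in FIRST_TYPE:
        return STORED.get(field)          # first type: fill directly
    if field in SECOND_TYPE and biometric_check():
        return STORED.get(field)          # second type: fill only after auth
    return None

autofill("email", lambda: False)      # no auth needed for the first type
autofill("password", lambda: True)    # second type, auth succeeded
autofill("password", lambda: False)   # second type, auth failed: field stays empty
```

Note that `biometric_check` is only invoked when the field is of the second type, matching the claim's conditional structure.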
According to some examples, a method is described, the method comprising: at an electronic device having a display and one or more biometric sensors: detecting that device wake-up criteria have been met; in response to detecting that the device wake-up criteria have been met, transitioning the electronic device from a first visual state to a second visual state; and after transitioning the device to the second visual state: in accordance with a determination that biometric authentication criteria have been met based on biometric data provided by the one or more biometric sensors, transitioning the electronic device from the second visual state to a third visual state, wherein the transition from the second visual state to the third visual state is a continuation of the transition from the first visual state to the second visual state; and in accordance with a determination that the biometric authentication criteria have not been met based on the biometric data provided by the one or more biometric sensors, maintaining the electronic device in the second visual state.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: detecting that device wake-up criteria have been met; in response to detecting that the device wake-up criteria have been met, transitioning the electronic device from a first visual state to a second visual state; and after transitioning the device to the second visual state: in accordance with a determination that biometric authentication criteria have been met based on biometric data provided by the one or more biometric sensors, transitioning the electronic device from the second visual state to a third visual state, wherein the transition from the second visual state to the third visual state is a continuation of the transition from the first visual state to the second visual state; and in accordance with a determination that the biometric authentication criteria have not been met based on the biometric data provided by the one or more biometric sensors, maintaining the electronic device in the second visual state.
According to some examples, a transitory computer-readable storage medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: detecting that device wake-up criteria have been met; in response to detecting that the device wake-up criteria have been met, transitioning the electronic device from a first visual state to a second visual state; and after transitioning the device to the second visual state: in accordance with a determination that biometric authentication criteria have been met based on biometric data provided by the one or more biometric sensors, transitioning the electronic device from the second visual state to a third visual state, wherein the transition from the second visual state to the third visual state is a continuation of the transition from the first visual state to the second visual state; and in accordance with a determination that the biometric authentication criteria have not been met based on the biometric data provided by the one or more biometric sensors, maintaining the electronic device in the second visual state.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: detecting that device wake-up criteria have been met; in response to detecting that the device wake-up criteria have been met, transitioning the electronic device from a first visual state to a second visual state; and after transitioning the device to the second visual state: in accordance with a determination that biometric authentication criteria have been met based on biometric data provided by the one or more biometric sensors, transitioning the electronic device from the second visual state to a third visual state, wherein the transition from the second visual state to the third visual state is a continuation of the transition from the first visual state to the second visual state; and in accordance with a determination that the biometric authentication criteria have not been met based on the biometric data provided by the one or more biometric sensors, maintaining the electronic device in the second visual state.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; means for detecting that device wake-up criteria have been met; means for transitioning the electronic device from a first visual state to a second visual state in response to detecting that the device wake-up criteria have been met; and means for, after transitioning the device to the second visual state: means for performing the following: in accordance with a determination that biometric authentication criteria have been met based on biometric data provided by the one or more biometric sensors, transitioning the electronic device from the second visual state to a third visual state, wherein the transition from the second visual state to the third visual state is a continuation of the transition from the first visual state to the second visual state; and means for performing the following: in accordance with a determination that the biometric authentication criteria have not been met based on the biometric data provided by the one or more biometric sensors, maintaining the electronic device in the second visual state.
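The wake-and-unlock flow above can be sketched as a small state machine: wake criteria begin the transition into an intermediate visual state, and a successful biometric check continues that same transition into the unlocked appearance. This is an illustrative sketch only, not the claimed implementation; the names `Device`, `VisualState`, and `sensor_ok` are invented for the example.

```python
from enum import Enum, auto

class VisualState(Enum):
    FIRST = auto()   # e.g. display dimmed or asleep
    SECOND = auto()  # waking, pre-authentication
    THIRD = auto()   # fully unlocked appearance

class Device:
    """Illustrative device; `sensor_ok` stands in for the biometric match result."""
    def __init__(self, sensor_ok):
        self.state = VisualState.FIRST
        self._sensor_ok = sensor_ok

    def biometric_criteria_met(self):
        # Stand-in for matching captured biometric data against enrollment.
        return self._sensor_ok

    def on_wake_criteria_met(self):
        # Wake criteria met (e.g. lift-to-wake): begin first -> second.
        if self.state is VisualState.FIRST:
            self.state = VisualState.SECOND
        # The second -> third transition is a continuation of the same
        # visual transition, gated on successful biometric authentication;
        # on failure the device remains in the second visual state.
        if self.biometric_criteria_met():
            self.state = VisualState.THIRD
```

Keeping the failed case in the second state (rather than reverting to the first) is what lets a later successful check resume the animation as one continuous transition.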
According to some examples, a method is described, the method comprising: at an electronic device having a display and one or more biometric sensors: detecting a condition associated with performing a biometric authentication check using a biometric sensor without an explicit input from a user requesting biometric authentication while the electronic device is in a locked state; and in response to detecting the condition, performing a first biometric authentication check, comprising: capturing first biometric data using the one or more biometric sensors; after capturing the first biometric data: in accordance with a determination that the first biometric data satisfies biometric authentication criteria, transitioning the device from the locked state to an unlocked state; and in accordance with a determination that the first biometric data does not satisfy the biometric authentication criteria, maintaining the device in the locked state; after performing the first biometric authentication check, detecting, via the device, a request to perform a respective operation without receiving further authentication information from the user; and in response to detecting the request to perform the respective operation: in accordance with a determination that the respective operation does not require authentication, performing the respective operation; in accordance with a determination that the respective operation requires authentication and the device is in the unlocked state, performing the respective operation; and in accordance with a determination that the respective operation requires authentication and the device is in the locked state: capturing second biometric data using the one or more biometric sensors without explicit input from the user requesting a second biometric authentication check; and after capturing the second biometric data, performing a second biometric authentication check, comprising: in accordance with a determination that the second biometric data satisfies the biometric authentication criteria, performing the respective operation; and in accordance with a determination that the second biometric data does not satisfy the biometric authentication criteria, forgoing performance of the respective operation.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: detecting a condition associated with performing a biometric authentication check using a biometric sensor without an explicit input from a user requesting biometric authentication while the electronic device is in a locked state; and in response to detecting the condition, performing a first biometric authentication check, comprising: capturing first biometric data using the one or more biometric sensors; after capturing the first biometric data: in accordance with a determination that the first biometric data satisfies biometric authentication criteria, transitioning the device from the locked state to an unlocked state; and in accordance with a determination that the first biometric data does not satisfy the biometric authentication criteria, maintaining the device in the locked state; after performing the first biometric authentication check, detecting, via the device, a request to perform a respective operation without receiving further authentication information from the user; and in response to detecting the request to perform the respective operation: in accordance with a determination that the respective operation does not require authentication, performing the respective operation; in accordance with a determination that the respective operation requires authentication and the device is in the unlocked state, performing the respective operation; and in accordance with a determination that the respective operation requires authentication and the device is in the locked state: capturing second biometric data using the one or more biometric sensors without explicit input from the user requesting a second biometric authentication check; and after capturing the second biometric data, performing a second biometric authentication check, comprising: in accordance with a determination that the second biometric data satisfies the biometric authentication criteria, performing the respective operation; and in accordance with a determination that the second biometric data does not satisfy the biometric authentication criteria, forgoing performance of the respective operation.
According to some examples, a transitory computer-readable storage medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: detecting a condition associated with performing a biometric authentication check using a biometric sensor without an explicit input from a user requesting biometric authentication while the electronic device is in a locked state; and in response to detecting the condition, performing a first biometric authentication check, comprising: capturing first biometric data using the one or more biometric sensors; after capturing the first biometric data: in accordance with a determination that the first biometric data satisfies biometric authentication criteria, transitioning the device from the locked state to an unlocked state; and in accordance with a determination that the first biometric data does not satisfy the biometric authentication criteria, maintaining the device in the locked state; after performing the first biometric authentication check, detecting, via the device, a request to perform a respective operation without receiving further authentication information from the user; and in response to detecting the request to perform the respective operation: in accordance with a determination that the respective operation does not require authentication, performing the respective operation; in accordance with a determination that the respective operation requires authentication and the device is in the unlocked state, performing the respective operation; and in accordance with a determination that the respective operation requires authentication and the device is in the locked state: capturing second biometric data using the one or more biometric sensors without explicit input from the user requesting a second biometric authentication check; and after capturing the second biometric data, performing a second biometric authentication check, comprising: in accordance with a determination that the second biometric data satisfies the biometric authentication criteria, performing the respective operation; and in accordance with a determination that the second biometric data does not satisfy the biometric authentication criteria, forgoing performance of the respective operation.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: detecting a condition associated with performing a biometric authentication check using a biometric sensor without an explicit input from a user requesting biometric authentication while the electronic device is in a locked state; and in response to detecting the condition, performing a first biometric authentication check, comprising: capturing first biometric data using the one or more biometric sensors; after capturing the first biometric data: in accordance with a determination that the first biometric data satisfies biometric authentication criteria, transitioning the device from the locked state to an unlocked state; and in accordance with a determination that the first biometric data does not satisfy the biometric authentication criteria, maintaining the device in the locked state; after performing the first biometric authentication check, detecting, via the device, a request to perform a respective operation without receiving further authentication information from the user; and in response to detecting the request to perform the respective operation: in accordance with a determination that the respective operation does not require authentication, performing the respective operation; in accordance with a determination that the respective operation requires authentication and the device is in the unlocked state, performing the respective operation; and in accordance with a determination that the respective operation requires authentication and the device is in the locked state: capturing second biometric data using the one or more biometric sensors without explicit input from the user requesting a second biometric authentication check; and after capturing the second biometric data, performing a second biometric authentication check, comprising: in accordance with a determination that the second biometric data satisfies the biometric authentication criteria, performing the respective operation; and in accordance with a determination that the second biometric data does not satisfy the biometric authentication criteria, forgoing performance of the respective operation.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; means for performing the following: detecting a condition associated with performing a biometric authentication check using a biometric sensor without an explicit input from a user requesting biometric authentication while the electronic device is in a locked state; and means for performing the following: in response to detecting the condition, performing a first biometric authentication check, comprising: means for capturing first biometric data using the one or more biometric sensors; means for, after capturing the first biometric data: means for performing the following: in accordance with a determination that the first biometric data satisfies biometric authentication criteria, transitioning the device from the locked state to an unlocked state; and means for performing the following: in accordance with a determination that the first biometric data does not satisfy the biometric authentication criteria, maintaining the device in the locked state; means for performing the following: after performing the first biometric authentication check, detecting, via the device, a request to perform a respective operation without receiving further authentication information from the user; and means for, in response to detecting the request to perform the respective operation: means for performing the following: in accordance with a determination that the respective operation does not require authentication, performing the respective operation; means for performing the following: in accordance with a determination that the respective operation requires authentication and the device is in the unlocked state, performing the respective operation; and means for, in accordance with a determination that the respective operation requires authentication and the device is in the locked state: means for performing the following: capturing second biometric data using the one or more biometric sensors without explicit input from the user requesting a second biometric authentication check; and means for performing the following: after capturing the second biometric data, performing a second biometric authentication check, comprising: means for performing the following: in accordance with a determination that the second biometric data satisfies the biometric authentication criteria, performing the respective operation; and means for performing the following: in accordance with a determination that the second biometric data does not satisfy the biometric authentication criteria, forgoing performance of the respective operation.
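The two-stage flow described in the paragraphs above — an implicit biometric check on wake, then a second implicit check only when a later operation requires authentication while the device is still locked — can be sketched as follows. This is an illustrative sketch, not the claimed implementation; `MockDevice`, `handle_wake`, and `handle_request` are invented names.

```python
class MockDevice:
    """Illustrative stand-in for a device with biometric sensors."""
    def __init__(self, enrolled):
        self.enrolled = enrolled  # authorized biometric template
        self.present = None       # what the sensors currently observe
        self.unlocked = False

    def capture_biometrics(self):
        return self.present

    def matches(self, data):
        return data is not None and data == self.enrolled

def handle_wake(device):
    # First implicit check: triggered by a wake condition, with no
    # explicit user request for biometric authentication.
    if device.matches(device.capture_biometrics()):
        device.unlocked = True

def handle_request(device, operation, requires_auth):
    # Operations that need no authentication run unconditionally.
    if not requires_auth:
        return operation()
    # Already unlocked by the first check: run the operation.
    if device.unlocked:
        return operation()
    # Locked and authentication required: capture again implicitly
    # (second biometric authentication check).
    if device.matches(device.capture_biometrics()):
        return operation()
    return None  # forgo performing the operation
```

The point of the second implicit capture is that a failed wake-time check does not force the user to explicitly re-request authentication before a protected operation.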
According to some examples, a method is described, the method comprising: at an electronic device having a display, a button, and one or more biometric sensors separate from the button: detecting one or more activations of a button while the electronic device is in a first state in which respective functions of the device are disabled; and in response to detecting the one or more activations of the button: capturing biometric data with the one or more biometric sensors separate from the button; in accordance with a determination that the biometric data meets the biometric authentication criteria, transitioning the electronic device to a second state in which a respective function of the device is enabled; and in accordance with a determination that the biometric data does not satisfy the biometric authentication criteria, maintaining the electronic device in the first state and displaying on the display an indication that the biometric authentication has failed.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured to be executed by one or more processors of an electronic device with a display, a button, and one or more biometric sensors separate from the button, the one or more programs including instructions for: detecting one or more activations of a button while the electronic device is in a first state in which respective functions of the device are disabled; and in response to detecting the one or more activations of the button: capturing biometric data with the one or more biometric sensors separate from the button; in accordance with a determination that the biometric data meets the biometric authentication criteria, transitioning the electronic device to a second state in which a respective function of the device is enabled; and in accordance with a determination that the biometric data does not satisfy the biometric authentication criteria, maintaining the electronic device in the first state and displaying on the display an indication that the biometric authentication has failed.
According to some examples, a transitory computer-readable storage medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display, a button, and one or more biometric sensors separate from the button, the one or more programs including instructions for: detecting one or more activations of a button while the electronic device is in a first state in which respective functions of the device are disabled; and in response to detecting the one or more activations of the button: capturing biometric data with the one or more biometric sensors separate from the button; in accordance with a determination that the biometric data meets the biometric authentication criteria, transitioning the electronic device to a second state in which a respective function of the device is enabled; and in accordance with a determination that the biometric data does not satisfy the biometric authentication criteria, maintaining the electronic device in the first state and displaying on the display an indication that the biometric authentication has failed.
According to some examples, an electronic device is described, the electronic device comprising: a display; a button; one or more biometric sensors separate from the button; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: detecting one or more activations of a button while the electronic device is in a first state in which respective functions of the device are disabled; and in response to detecting the one or more activations of the button: capturing biometric data with the one or more biometric sensors separate from the button; in accordance with a determination that the biometric data meets the biometric authentication criteria, transitioning the electronic device to a second state in which a respective function of the device is enabled; and in accordance with a determination that the biometric data does not satisfy the biometric authentication criteria, maintaining the electronic device in the first state and displaying on the display an indication that the biometric authentication has failed.
According to some examples, an electronic device is described, the electronic device comprising: a display; a button; one or more biometric sensors separate from the button; means for detecting one or more activations of a button while the electronic device is in a first state in which respective functions of the device are disabled; and means for, in response to detecting the one or more activations of a button: means for capturing biometric data with the one or more biometric sensors separate from the button; means for performing the following: in accordance with a determination that the biometric data meets the biometric authentication criteria, transitioning the electronic device to a second state in which a respective function of the device is enabled; and means for performing the following: in accordance with a determination that the biometric data does not satisfy the biometric authentication criteria, the electronic device is maintained in the first state and an indication is displayed on the display that the biometric authentication has failed.
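The button-triggered variant above — where the button starts a capture but the biometric sensors are physically separate from it (e.g. a face scan triggered by a side-button press) — might be sketched like this. All names (`ButtonDevice`, `on_button_activation`) are illustrative, not from the patent.

```python
class ButtonDevice:
    """Illustrative device whose biometric sensors are separate from its button."""
    def __init__(self, enrolled, present):
        self.enrolled = enrolled   # authorized biometric template
        self.present = present     # what the separate sensors observe
        self.respective_function_enabled = False  # first state: disabled
        self.indication = None

    def capture_biometrics(self):
        return self.present

    def matches(self, data):
        return data == self.enrolled

def on_button_activation(device):
    # The button press only starts the capture; unlike a fingerprint
    # button, it carries no biometric data itself.
    if device.matches(device.capture_biometrics()):
        device.respective_function_enabled = True   # second state
        return True
    # Remain in the first state and surface the failure on the display.
    device.respective_function_enabled = False
    device.indication = "biometric authentication failed"
    return False
```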
According to some examples, a method is described, the method comprising: at an electronic device having a display and one or more biometric sensors: detecting a request to perform a respective operation that requires authentication; and in response to detecting the request to perform the respective operation that requires authentication: in accordance with a determination that the device is unlocked, performing the respective operation; and in accordance with a determination that the device is locked and that a first form of authentication is available, displaying on the display an authentication indicator for the first form of authentication without displaying one or more affordances for using a second form of authentication, wherein the first form of authentication is a form of biometric authentication based on data obtained by the one or more biometric sensors.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: detecting a request to perform a respective operation that requires authentication; and in response to detecting the request to perform the respective operation that requires authentication: in accordance with a determination that the device is unlocked, performing the respective operation; and in accordance with a determination that the device is locked and that a first form of authentication is available, displaying on the display an authentication indicator for the first form of authentication without displaying one or more affordances for using a second form of authentication, wherein the first form of authentication is a form of biometric authentication based on data obtained by the one or more biometric sensors.
According to some examples, a transitory computer-readable storage medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: detecting a request to perform a respective operation that requires authentication; and in response to detecting the request to perform the respective operation that requires authentication: in accordance with a determination that the device is unlocked, performing the respective operation; and in accordance with a determination that the device is locked and that a first form of authentication is available, displaying on the display an authentication indicator for the first form of authentication without displaying one or more affordances for using a second form of authentication, wherein the first form of authentication is a form of biometric authentication based on data obtained by the one or more biometric sensors.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: detecting a request to perform a respective operation that requires authentication; and in response to detecting the request to perform the respective operation that requires authentication: in accordance with a determination that the device is unlocked, performing the respective operation; and in accordance with a determination that the device is locked and that a first form of authentication is available, displaying on the display an authentication indicator for the first form of authentication without displaying one or more affordances for using a second form of authentication, wherein the first form of authentication is a form of biometric authentication based on data obtained by the one or more biometric sensors.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; means for detecting a request to perform a respective operation that requires authentication; and means for, in response to detecting the request to perform the respective operation that requires authentication: means for performing the following: in accordance with a determination that the device is unlocked, performing the respective operation; and means for performing the following: in accordance with a determination that the device is locked and that a first form of authentication is available, displaying on the display an authentication indicator for the first form of authentication without displaying one or more affordances for using a second form of authentication, wherein the first form of authentication is a form of biometric authentication based on data obtained by the one or more biometric sensors.
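The display logic in the paragraphs above — show only the biometric indicator while the first (biometric) form of authentication is available, and suppress affordances for the second form such as a passcode field — can be sketched as below. This is an assumption-laden illustration; `LockableDevice` and `respond_to_request` are invented names, and "passcode" is only one plausible second form.

```python
class LockableDevice:
    """Illustrative device tracking lock state and biometric availability."""
    def __init__(self, unlocked, biometric_available):
        self.unlocked = unlocked
        self.biometric_available = biometric_available
        self.shown = []  # what the display currently presents

def respond_to_request(device, operation):
    # Unlocked: just perform the operation that required authentication.
    if device.unlocked:
        return operation()
    if device.biometric_available:
        # First form available: show only its indicator, with no
        # affordances for the second form (e.g. no passcode field).
        device.shown = ["biometric indicator"]
    else:
        device.shown = ["passcode affordance"]
    return None
```

Suppressing the fallback affordance keeps the common path (hands-free biometric authentication) visually uncluttered.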
According to some examples, a method is described, the method comprising: at an electronic device having one or more biometric sensors: receiving a first request to perform a respective operation that requires authentication; in response to receiving the first request to perform the respective operation: determining, using the one or more biometric sensors, whether biometric authentication criteria are satisfied, wherein the biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the respective operation be detected by the biometric sensor; in accordance with a determination that the biometric authentication criteria are satisfied, performing the respective operation; and in accordance with a determination that the biometric authentication criteria are not satisfied, forgoing performance of the respective operation; subsequent to determining, in response to receiving the first request, that the biometric authentication criteria are not satisfied, receiving a second request to perform the respective operation; and in response to receiving the second request to perform the respective operation: in accordance with a determination, in response to the first request, that the biometric authentication criteria were not satisfied because the one or more biometric sensors did not detect the presence of the respective type of biometric feature, determining, in response to the second request, whether the biometric authentication criteria are satisfied using the one or more biometric sensors; and in accordance with a determination, in response to the first request, that the biometric authentication criteria were not satisfied because the one or more biometric sensors detected a respective type of biometric feature that does not correspond to an authorized biometric feature, forgoing determining, in response to the second request, whether the biometric authentication criteria are satisfied using the one or more biometric sensors.
According to some examples, a non-transitory computer-readable storage medium is described that stores one or more programs configured for execution by one or more processors of an electronic device with one or more biometric sensors, the one or more programs including instructions for: receiving a first request to perform a respective operation that requires authentication; in response to receiving the first request to perform the respective operation: determining, using the one or more biometric sensors, whether biometric authentication criteria are satisfied, wherein the biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the respective operation be detected by the biometric sensor; in accordance with a determination that the biometric authentication criteria are satisfied, performing the respective operation; and in accordance with a determination that the biometric authentication criteria are not satisfied, forgoing performance of the respective operation; subsequent to determining, in response to receiving the first request, that the biometric authentication criteria are not satisfied, receiving a second request to perform the respective operation; and in response to receiving the second request to perform the respective operation: in accordance with a determination, in response to the first request, that the biometric authentication criteria were not satisfied because the one or more biometric sensors did not detect the presence of the respective type of biometric feature, determining, in response to the second request, whether the biometric authentication criteria are satisfied using the one or more biometric sensors; and in accordance with a determination, in response to the first request, that the biometric authentication criteria were not satisfied because the one or more biometric sensors detected a respective type of biometric feature that does not correspond to an authorized biometric feature, forgoing determining, in response to the second request, whether the biometric authentication criteria are satisfied using the one or more biometric sensors.
According to some examples, a transitory computer-readable storage medium is described that stores one or more programs configured for execution by one or more processors of an electronic device with one or more biometric sensors, the one or more programs including instructions for: receiving a first request to perform a respective operation that requires authentication; in response to receiving the first request to perform the respective operation: determining, using the one or more biometric sensors, whether biometric authentication criteria are satisfied, wherein the biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the respective operation be detected by the biometric sensor; in accordance with a determination that the biometric authentication criteria are satisfied, performing the respective operation; and in accordance with a determination that the biometric authentication criteria are not satisfied, forgoing performance of the respective operation; subsequent to determining, in response to receiving the first request, that the biometric authentication criteria are not satisfied, receiving a second request to perform the respective operation; and in response to receiving the second request to perform the respective operation: in accordance with a determination, in response to the first request, that the biometric authentication criteria were not satisfied because the one or more biometric sensors did not detect the presence of the respective type of biometric feature, determining, in response to the second request, whether the biometric authentication criteria are satisfied using the one or more biometric sensors; and in accordance with a determination, in response to the first request, that the biometric authentication criteria were not satisfied because the one or more biometric sensors detected a respective type of biometric feature that does not correspond to an authorized biometric feature, forgoing determining, in response to the second request, whether the biometric authentication criteria are satisfied using the one or more biometric sensors.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: receiving a first request to perform a respective operation that requires authentication; in response to receiving the first request to perform the respective operation: determining, using the one or more biometric sensors, whether biometric authentication criteria are satisfied, wherein the biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the respective operation be detected by the biometric sensor; in accordance with a determination that the biometric authentication criteria are satisfied, performing the respective operation; and in accordance with a determination that the biometric authentication criteria are not satisfied, forgoing performance of the respective operation; subsequent to determining, in response to receiving the first request, that the biometric authentication criteria are not satisfied, receiving a second request to perform the respective operation; and in response to receiving the second request to perform the respective operation: in accordance with a determination, in response to the first request, that the biometric authentication criteria were not satisfied because the one or more biometric sensors did not detect the presence of the respective type of biometric feature, determining, in response to the second request, whether the biometric authentication criteria are satisfied using the one or more biometric sensors; and in accordance with a determination, in response to the first request, that the biometric authentication criteria were not satisfied because the one or more biometric sensors detected a respective type of biometric feature that does not correspond to an authorized biometric feature, forgoing determining, in response to the second request, whether the biometric authentication criteria are satisfied using the one or more biometric sensors.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; means for receiving a first request to perform a respective operation that requires authentication; means for, in response to receiving the first request to perform the respective operation: determining, using the one or more biometric sensors, whether biometric authentication criteria are satisfied, wherein the biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the respective operation be detected by the biometric sensor; in accordance with a determination that the biometric authentication criteria are satisfied, performing the respective operation; and in accordance with a determination that the biometric authentication criteria are not satisfied, forgoing performance of the respective operation; means for receiving a second request to perform the respective operation subsequent to determining, in response to receiving the first request, that the biometric authentication criteria are not satisfied; and means for, in response to receiving the second request to perform the respective operation: in accordance with a determination, in response to the first request, that the biometric authentication criteria were not satisfied because the one or more biometric sensors did not detect the presence of the respective type of biometric feature, determining, in response to the second request, whether the biometric authentication criteria are satisfied using the one or more biometric sensors; and in accordance with a determination, in response to the first request, that the biometric authentication criteria were not satisfied because the one or more biometric sensors detected a respective type of biometric feature that does not correspond to an authorized biometric feature, forgoing determining, in response to the second request, whether the biometric authentication criteria are satisfied using the one or more biometric sensors.
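The retry policy described in the paragraphs above distinguishes *why* the first check failed: if no biometric feature of the right type was detected at all, a second request triggers a fresh check; if a feature was detected but did not match an authorized one, the second request skips biometrics entirely. A minimal sketch, with invented names (`RetryDevice`, `request_operation`) and scripted sensor outcomes:

```python
# Possible outcomes of a single biometric scan (illustrative labels).
NO_FEATURE = "no_feature"      # sensors saw nothing of the right type
UNAUTHORIZED = "unauthorized"  # a feature was seen but did not match
MATCH = "match"

class RetryDevice:
    """Illustrative device whose sensor outcomes are scripted for the example."""
    def __init__(self, results):
        self._results = iter(results)
        self.last_result = None

    def scan(self):
        self.last_result = next(self._results)
        return self.last_result

def request_operation(device, operation):
    # First request: always attempt a biometric check.
    if device.scan() == MATCH:
        return operation()
    return None  # forgo the operation

def second_request_operation(device, operation):
    if device.last_result == NO_FEATURE:
        # Nothing was detected last time (e.g. the user was not facing
        # the sensor), so a retry is worthwhile.
        return request_operation(device, operation)
    # A non-matching feature was detected: forgo re-checking, so repeated
    # requests do not burn failed attempts for an unauthorized user.
    return None
```

Skipping the retry after an unauthorized match is a guard against both attempt-limit lockouts and repeated failure prompts.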
According to some examples, a method is described, the method comprising: at an electronic device having one or more biometric sensors: receiving a first request to perform a first operation requiring authentication; in response to receiving the first request to perform the first operation: determining, using the one or more biometric sensors, whether first biometric authentication criteria are satisfied, wherein the first biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the first operation be detected by the biometric sensor; in accordance with a determination that the first biometric authentication criteria are satisfied, performing the first operation; and in accordance with a determination that the first biometric authentication criteria are not satisfied, forgoing performance of the first operation; after performing the first operation, receiving a second request to perform a second operation requiring authentication; and in response to receiving the second request: in accordance with a determination that re-authentication criteria have been met, determining, using the one or more biometric sensors, whether second biometric authentication criteria are met, wherein the second biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the second operation be detected by the biometric sensor; and in accordance with a determination that the re-authentication criteria have not been met, performing the second operation without performing biometric authentication and forgoing using the one or more biometric sensors to determine whether the second biometric authentication criteria are met.
According to some examples, a non-transitory computer-readable storage medium is described, the storage medium storing one or more programs configured for execution by one or more processors of an electronic device with one or more biometric sensors, the one or more programs including instructions for: receiving a first request to perform a first operation requiring authentication; in response to receiving the first request to perform the first operation: determining, using the one or more biometric sensors, whether first biometric authentication criteria are satisfied, wherein the first biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the first operation be detected by the biometric sensor; in accordance with a determination that the first biometric authentication criteria are satisfied, performing the first operation; and in accordance with a determination that the first biometric authentication criteria are not satisfied, forgoing performance of the first operation; after performing the first operation, receiving a second request to perform a second operation requiring authentication; and in response to receiving the second request: in accordance with a determination that re-authentication criteria have been met, determining, using the one or more biometric sensors, whether second biometric authentication criteria are met, wherein the second biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the second operation be detected by the biometric sensor; and in accordance with a determination that the re-authentication criteria have not been met, performing the second operation without performing biometric authentication and forgoing using the one or more biometric sensors to determine whether the second biometric authentication criteria are met.
According to some examples, a transitory computer-readable storage medium is described, the storage medium storing one or more programs configured for execution by one or more processors of an electronic device with one or more biometric sensors, the one or more programs including instructions for: receiving a first request to perform a first operation requiring authentication; in response to receiving the first request to perform the first operation: determining, using the one or more biometric sensors, whether first biometric authentication criteria are satisfied, wherein the first biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the first operation be detected by the biometric sensor; in accordance with a determination that the first biometric authentication criteria are satisfied, performing the first operation; and in accordance with a determination that the first biometric authentication criteria are not satisfied, forgoing performance of the first operation; after performing the first operation, receiving a second request to perform a second operation requiring authentication; and in response to receiving the second request: in accordance with a determination that re-authentication criteria have been met, determining, using the one or more biometric sensors, whether second biometric authentication criteria are met, wherein the second biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the second operation be detected by the biometric sensor; and in accordance with a determination that the re-authentication criteria have not been met, performing the second operation without performing biometric authentication and forgoing using the one or more biometric sensors to determine whether the second biometric authentication criteria are met.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: receiving a first request to perform a first operation requiring authentication; in response to receiving the first request to perform the first operation: determining, using the one or more biometric sensors, whether first biometric authentication criteria are satisfied, wherein the first biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the first operation be detected by the biometric sensor; in accordance with a determination that the first biometric authentication criteria are satisfied, performing the first operation; and in accordance with a determination that the first biometric authentication criteria are not satisfied, forgoing performance of the first operation; after performing the first operation, receiving a second request to perform a second operation requiring authentication; and in response to receiving the second request: in accordance with a determination that re-authentication criteria have been met, determining, using the one or more biometric sensors, whether second biometric authentication criteria are met, wherein the second biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the second operation be detected by the biometric sensor; and in accordance with a determination that the re-authentication criteria have not been met, performing the second operation without performing biometric authentication and forgoing using the one or more biometric sensors to determine whether the second biometric authentication criteria are met.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; means for receiving a first request to perform a first operation requiring authentication; means for performing the following in response to receiving the first request to perform the first operation: determining, using the one or more biometric sensors, whether first biometric authentication criteria are satisfied, wherein the first biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the first operation be detected by the biometric sensor; in accordance with a determination that the first biometric authentication criteria are satisfied, performing the first operation; and in accordance with a determination that the first biometric authentication criteria are not satisfied, forgoing performance of the first operation; means for receiving, after performing the first operation, a second request to perform a second operation requiring authentication; and means for, in response to receiving the second request: in accordance with a determination that re-authentication criteria have been met, determining, using the one or more biometric sensors, whether second biometric authentication criteria are met, wherein the second biometric authentication criteria include a requirement that a respective type of biometric feature authorized to perform the second operation be detected by the biometric sensor; and in accordance with a determination that the re-authentication criteria have not been met, performing the second operation without performing biometric authentication and forgoing using the one or more biometric sensors to determine whether the second biometric authentication criteria are met.
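The re-authentication gate recited in the paragraphs above can be sketched with a staleness test: a second operation re-runs biometric authentication only when the re-authentication criteria are met, and otherwise proceeds on the strength of the earlier authentication. This is an illustrative Python sketch; the function names and the 60-second window are assumptions, not values from this disclosure.

```python
# Assumed policy value: how long a prior successful authentication remains
# fresh enough to skip a new biometric check.
REAUTH_INTERVAL = 60.0  # seconds

def perform_second_operation(now, last_auth_time, biometric_check):
    """Perform a second authenticated operation.

    If the re-authentication criteria are met (the prior authentication is
    stale), the outcome is gated on a fresh biometric check; otherwise the
    operation proceeds without using the biometric sensors again.
    """
    if now - last_auth_time >= REAUTH_INTERVAL:  # re-authentication criteria met
        return biometric_check()                 # determine via the sensors
    return True                                  # forgo the biometric check
```

For example, a second purchase made ten seconds after a successful authentication would proceed directly, while one made two minutes later would trigger a fresh biometric check.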
According to some examples, a method is described, the method comprising: at an electronic device having a display: receiving a request to display a first portion of respective content; and in response to the request to display the first portion of the respective content: displaying at least the first portion of the respective content on the display, the respective content including an element associated with an authentication operation; in accordance with a determination that the element associated with the authentication operation satisfies visibility criteria, initiating biometric authentication; and in accordance with a determination that the element associated with the authentication operation does not satisfy the visibility criteria, forgoing initiating biometric authentication.
According to some examples, a non-transitory computer-readable storage medium is described, the storage medium storing one or more programs configured for execution by one or more processors of an electronic device with a display, the one or more programs including instructions for: receiving a request to display a first portion of respective content; and in response to the request to display the first portion of the respective content: displaying at least the first portion of the respective content on the display, the respective content including an element associated with an authentication operation; in accordance with a determination that the element associated with the authentication operation satisfies visibility criteria, initiating biometric authentication; and in accordance with a determination that the element associated with the authentication operation does not satisfy the visibility criteria, forgoing initiating biometric authentication.
According to some examples, a transitory computer-readable storage medium is described, the storage medium storing one or more programs configured for execution by one or more processors of an electronic device with a display, the one or more programs including instructions for: receiving a request to display a first portion of respective content; and in response to the request to display the first portion of the respective content: displaying at least the first portion of the respective content on the display, the respective content including an element associated with an authentication operation; in accordance with a determination that the element associated with the authentication operation satisfies visibility criteria, initiating biometric authentication; and in accordance with a determination that the element associated with the authentication operation does not satisfy the visibility criteria, forgoing initiating biometric authentication.
According to some examples, an electronic device is described, the electronic device comprising: a display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: receiving a request to display a first portion of respective content; and in response to the request to display the first portion of the respective content: displaying at least the first portion of the respective content on the display, the respective content including an element associated with an authentication operation; in accordance with a determination that the element associated with the authentication operation satisfies visibility criteria, initiating biometric authentication; and in accordance with a determination that the element associated with the authentication operation does not satisfy the visibility criteria, forgoing initiating biometric authentication.
According to some examples, an electronic device is described, the electronic device comprising: a display; means for receiving a request to display a first portion of respective content; and means for performing the following in response to the request to display the first portion of the respective content: displaying at least the first portion of the respective content on the display, the respective content including an element associated with an authentication operation; in accordance with a determination that the element associated with the authentication operation satisfies visibility criteria, initiating biometric authentication; and in accordance with a determination that the element associated with the authentication operation does not satisfy the visibility criteria, forgoing initiating biometric authentication.
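The visibility gate recited above can be sketched as a fractional-overlap test between the authentication element and the visible viewport: biometric authentication is initiated only when enough of the element is on screen. This is a minimal Python illustration; the function name and the 50% threshold are assumptions, not values from this disclosure.

```python
def should_initiate_biometric_auth(element_top, element_bottom,
                                   viewport_top, viewport_bottom,
                                   min_visible_fraction=0.5):
    """True when the auth element satisfies the (assumed) visibility criteria.

    The criteria here: at least `min_visible_fraction` of the element's
    height lies within the currently displayed portion of the content.
    """
    height = element_bottom - element_top
    if height <= 0:
        return False
    # Vertical overlap between the element and the visible viewport.
    overlap = max(0.0, min(element_bottom, viewport_bottom)
                  - max(element_top, viewport_top))
    return overlap / height >= min_visible_fraction
```

Under this sketch, a login button near the top of a page triggers authentication immediately, while the same button far below the fold does not until the user scrolls it into view.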
According to some examples, a method is described, the method comprising: at an electronic device having a display and one or more biometric sensors: detecting a predefined operation corresponding to a credential submission user interface having a credential submission user interface element; and in response to detecting the predefined operation: in accordance with a determination that biometric authentication via the one or more biometric sensors is available, displaying on the display a credential submission user interface with a visual indication that presenting the one or more biometric sensors with biometric features that satisfy the biometric authentication criteria will cause the credential to be submitted via the credential submission user interface element.
According to some examples, a non-transitory computer-readable storage medium is described, the storage medium storing one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: detecting a predefined operation corresponding to a credential submission user interface having a credential submission user interface element; and in response to detecting the predefined operation: in accordance with a determination that biometric authentication via the one or more biometric sensors is available, displaying on the display a credential submission user interface with a visual indication that presenting the one or more biometric sensors with biometric features that satisfy the biometric authentication criteria will cause the credential to be submitted via the credential submission user interface element.
According to some examples, a transitory computer-readable storage medium is described, the storage medium storing one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: detecting a predefined operation corresponding to a credential submission user interface having a credential submission user interface element; and in response to detecting the predefined operation: in accordance with a determination that biometric authentication via the one or more biometric sensors is available, displaying on the display a credential submission user interface with a visual indication that presenting the one or more biometric sensors with biometric features that satisfy the biometric authentication criteria will cause the credential to be submitted via the credential submission user interface element.
According to some examples, an electronic device is described, the electronic device comprising: a display; one or more biometric sensors; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: detecting a predefined operation corresponding to a credential submission user interface having a credential submission user interface element; and in response to detecting the predefined operation: in accordance with a determination that biometric authentication via the one or more biometric sensors is available, displaying on the display a credential submission user interface with a visual indication that presenting the one or more biometric sensors with biometric features that satisfy the biometric authentication criteria will cause the credential to be submitted via the credential submission user interface element.
According to some examples, an electronic device is described, the electronic device comprising: a display; one or more biometric sensors; means for detecting a predefined operation corresponding to a credential submission user interface having a credential submission user interface element; and means for, in response to detecting the predefined operation: in accordance with a determination that biometric authentication via the one or more biometric sensors is available, displaying on the display a credential submission user interface with a visual indication that presenting the one or more biometric sensors with biometric features that satisfy the biometric authentication criteria will cause the credential to be submitted via the credential submission user interface element.
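The availability-dependent credential UI recited above can be sketched as a single branch: when biometric authentication is available, the credential form carries a visual indication that a matching biometric will submit the credential; otherwise it falls back to manual submission. This Python sketch is purely illustrative; the dictionary keys and hint text are assumptions, not part of the disclosure.

```python
def build_credential_submission_ui(biometric_available):
    """Describe the credential submission UI for the current device state."""
    ui = {"fields": ["username", "password"], "submit": "Sign In"}
    if biometric_available:
        # Visual indication: a biometric hint attached to the submit element.
        ui["submit_hint"] = "Face ID will submit your credentials"
    else:
        ui["submit_hint"] = None  # manual submission only
    return ui
```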
According to some examples, a method is described, the method comprising: at an electronic device with a touch-sensitive display and one or more biometric sensors: displaying a credential input user interface having a plurality of character input keys on the touch-sensitive display; while displaying the credential input user interface, receiving a touch gesture input via the touch-sensitive display, the touch gesture input comprising movement of a contact on the touch-sensitive display; and in response to receiving the touch gesture input comprising movement of a contact on the touch-sensitive display: in accordance with a determination that a first set of one or more criteria is satisfied, attempting to biometrically authenticate a user of the electronic device based on biometric information captured using the one or more biometric sensors, wherein the first set of one or more criteria includes a requirement that biometric authentication is currently enabled on the electronic device.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a touch-sensitive display and one or more biometric sensors, the one or more programs including instructions for: displaying a credential input user interface having a plurality of character input keys on the touch-sensitive display; while displaying the credential input user interface, receiving a touch gesture input via the touch-sensitive display, the touch gesture input comprising movement of a contact on the touch-sensitive display; and in response to receiving the touch gesture input comprising movement of a contact on the touch-sensitive display: in accordance with a determination that a first set of one or more criteria is satisfied, attempting to biometrically authenticate a user of the electronic device based on biometric information captured using the one or more biometric sensors, wherein the first set of one or more criteria includes a requirement that biometric authentication is currently enabled on the electronic device.
According to some examples, a transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a touch-sensitive display and one or more biometric sensors, the one or more programs including instructions for: displaying a credential input user interface having a plurality of character input keys on the touch-sensitive display; while displaying the credential input user interface, receiving a touch gesture input via the touch-sensitive display, the touch gesture input comprising movement of a contact on the touch-sensitive display; and in response to receiving the touch gesture input comprising movement of a contact on the touch-sensitive display: in accordance with a determination that a first set of one or more criteria is satisfied, attempting to biometrically authenticate a user of the electronic device based on biometric information captured using the one or more biometric sensors, wherein the first set of one or more criteria includes a requirement that biometric authentication is currently enabled on the electronic device.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a touch-sensitive display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: displaying a credential input user interface having a plurality of character input keys on the touch-sensitive display; while displaying the credential input user interface, receiving a touch gesture input via the touch-sensitive display, the touch gesture input comprising movement of a contact on the touch-sensitive display; and in response to receiving the touch gesture input comprising movement of a contact on the touch-sensitive display: in accordance with a determination that a first set of one or more criteria is satisfied, attempting to biometrically authenticate a user of the electronic device based on biometric information captured using the one or more biometric sensors, wherein the first set of one or more criteria includes a requirement that biometric authentication is currently enabled on the electronic device.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a touch-sensitive display; means for displaying a credential input user interface having a plurality of character input keys on the touch-sensitive display; means for receiving, via the touch-sensitive display, a touch gesture input while displaying the credential input user interface, the touch gesture input comprising movement of a contact on the touch-sensitive display; and means for, in response to receiving the touch gesture input comprising movement of a contact on the touch-sensitive display: in accordance with a determination that a first set of one or more criteria is satisfied, attempting to biometrically authenticate a user of the electronic device based on biometric information captured using the one or more biometric sensors, wherein the first set of one or more criteria includes a requirement that biometric authentication is currently enabled on the electronic device.
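The gesture-triggered authentication attempt recited above reduces to a conjunction: a moving-contact gesture on the credential input UI triggers a biometric attempt only when biometric authentication is currently enabled. This Python sketch is illustrative; `capture_and_match` is a hypothetical stand-in for the device's biometric capture-and-compare step.

```python
def handle_touch_gesture(has_movement, biometric_enabled, capture_and_match):
    """Attempt biometric authentication for a moving-contact touch gesture.

    The first set of criteria requires (among other things) that biometric
    authentication is currently enabled on the device; if the criteria are
    not satisfied, no attempt is made.
    """
    if has_movement and biometric_enabled:
        return capture_and_match()  # attempt to authenticate the user
    return None                     # no authentication attempt is made
```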
According to some examples, a method is described, the method comprising: at an electronic device having a display and one or more input devices: receiving, via the one or more input devices, a request to perform an operation requiring authentication; and in response to the request to perform the operation requiring authentication: in accordance with a determination that the authentication was successful, performing the operation; and in accordance with a determination that the authentication was not successful and that a set of error condition criteria is satisfied: displaying an indication of the error condition on the display, wherein the indication includes information about a cause of the error condition; and forgoing performance of the operation.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for: receiving, via the one or more input devices, a request to perform an operation requiring authentication; and in response to the request to perform the operation requiring authentication: in accordance with a determination that the authentication was successful, performing the operation; and in accordance with a determination that the authentication was not successful and that a set of error condition criteria is satisfied: displaying an indication of the error condition on the display, wherein the indication includes information about a cause of the error condition; and forgoing performance of the operation.
According to some examples, a transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more input devices, the one or more programs including instructions for: receiving, via the one or more input devices, a request to perform an operation requiring authentication; and in response to the request to perform the operation requiring authentication: in accordance with a determination that the authentication was successful, performing the operation; and in accordance with a determination that the authentication was not successful and that a set of error condition criteria is satisfied: displaying an indication of the error condition on the display, wherein the indication includes information about a cause of the error condition; and forgoing performance of the operation.
According to some examples, an electronic device is described, the electronic device comprising: one or more input devices; a display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: receiving, via the one or more input devices, a request to perform an operation requiring authentication; and in response to the request to perform the operation requiring authentication: in accordance with a determination that the authentication was successful, performing the operation; and in accordance with a determination that the authentication was not successful and that a set of error condition criteria is satisfied: displaying an indication of the error condition on the display, wherein the indication includes information about a cause of the error condition; and forgoing performance of the operation.
According to some examples, an electronic device is described, the electronic device comprising: one or more input devices; a display; means for receiving, via the one or more input devices, a request to perform an operation requiring authentication; and means for, in response to the request to perform the operation requiring authentication: in accordance with a determination that the authentication was successful, performing the operation; and in accordance with a determination that the authentication was not successful and that a set of error condition criteria is satisfied: displaying an indication of the error condition on the display, wherein the indication includes information about a cause of the error condition; and forgoing performance of the operation.
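The error-handling flow recited above can be sketched as a three-way branch: perform on success; on failure, display a cause-bearing indication and forgo the operation when the error-condition criteria are met. This Python sketch is illustrative only; the criteria here (a known cause exists) and the message text are assumptions.

```python
def process_auth_request(auth_succeeded, error_cause=None):
    """Return a (status, message) pair for a request requiring authentication.

    On success the operation is performed. On failure, when the set of
    error-condition criteria is satisfied (modeled here as a known cause
    being available), an indication including the cause is displayed and
    the operation is forgone.
    """
    if auth_succeeded:
        return "performed", None
    if error_cause is not None:  # the set of error-condition criteria
        return "forgone", f"Authentication failed: {error_cause}"
    return "forgone", None       # failure without a displayable cause
```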
According to some examples, a method is described, the method comprising: at an electronic device having a display and a biometric sensor located at a first portion of the electronic device: detecting whether an error condition exists that prevents the biometric sensor from obtaining biometric information about a user of the device; in response to detecting the presence of the error condition, displaying an error indication on the display, wherein the error indication is displayed at a location proximate to the first portion of the electronic device, and wherein displaying the error indication includes: in accordance with a determination that the user interface of the electronic device is in a first orientation relative to the biometric sensor, displaying the error indication at a first location in the user interface proximate to the first portion of the electronic device; and in accordance with a determination that the user interface of the electronic device is in a second orientation relative to the biometric sensor, displaying the error indication at a second location in the user interface proximate to the first portion of the electronic device, wherein the first orientation is different from the second orientation.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device having a display and a biometric sensor located at a first portion of the electronic device, the one or more programs including instructions for: detecting whether an error condition exists that prevents the biometric sensor from obtaining biometric information about a user of the device; in response to detecting the presence of the error condition, displaying an error indication on the display, wherein the error indication is displayed at a location proximate to the first portion of the electronic device, and wherein displaying the error indication includes: in accordance with a determination that the user interface of the electronic device is in a first orientation relative to the biometric sensor, displaying the error indication at a first location in the user interface proximate to the first portion of the electronic device; and in accordance with a determination that the user interface of the electronic device is in a second orientation relative to the biometric sensor, displaying the error indication at a second location in the user interface proximate to the first portion of the electronic device, wherein the first orientation is different from the second orientation.
According to some examples, a transitory computer-readable storage medium is described that includes one or more programs configured for execution by one or more processors of an electronic device having a display and a biometric sensor located at a first portion of the electronic device, the one or more programs including instructions for: detecting whether an error condition exists that prevents the biometric sensor from obtaining biometric information about a user of the device; in response to detecting the presence of the error condition, displaying an error indication on the display, wherein the error indication is displayed at a location proximate to the first portion of the electronic device, and wherein displaying the error indication includes: in accordance with a determination that the user interface of the electronic device is in a first orientation relative to the biometric sensor, displaying the error indication at a first location in the user interface proximate to the first portion of the electronic device; and in accordance with a determination that the user interface of the electronic device is in a second orientation relative to the biometric sensor, displaying the error indication at a second location in the user interface proximate to the first portion of the electronic device, wherein the first orientation is different from the second orientation.
According to some examples, an electronic device is described, the electronic device comprising: a biometric sensor located at a first portion of the electronic device; a display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: detecting whether an error condition exists that prevents the biometric sensor from obtaining biometric information about a user of the device; in response to detecting the presence of the error condition, displaying an error indication on the display, wherein the error indication is displayed at a location proximate to the first portion of the electronic device, and wherein displaying the error indication includes: in accordance with a determination that the user interface of the electronic device is in a first orientation relative to the biometric sensor, displaying the error indication at a first location in the user interface proximate to the first portion of the electronic device; and in accordance with a determination that the user interface of the electronic device is in a second orientation relative to the biometric sensor, displaying the error indication at a second location in the user interface proximate to the first portion of the electronic device, wherein the first orientation is different from the second orientation.
According to some examples, an electronic device is described, the electronic device comprising: a biometric sensor at a first portion of the electronic device; a display; means for detecting the presence of an error condition that prevents the biometric sensor from obtaining biometric information about the device user; means for displaying an error indication on the display in response to detecting the presence of the error condition, wherein the error indication is displayed in a position proximate to the first portion of the electronic device, the displaying the error indication on the display comprising: in accordance with a determination that the user interface of the electronic device is in a first orientation relative to the biometric sensor, displaying an error indication at a first location in the user interface proximate to the first portion of the electronic device; and in accordance with a determination that the user interface of the electronic device is in a second orientation relative to the biometric sensor, displaying an error indication at a second location in the user interface proximate to the first portion of the electronic device, wherein the first orientation is different from the second orientation.
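The orientation-dependent placement described in the examples above can be sketched as follows. This is a minimal illustrative sketch, not the claimed implementation: the `Orientation` enum, the normalized coordinates, and the specific positions are all assumptions introduced for illustration.

```python
from enum import Enum

class Orientation(Enum):
    PORTRAIT = "portrait"            # UI oriented so the biometric sensor is at its top
    PORTRAIT_INVERTED = "inverted"   # UI rotated 180 degrees; sensor is now at the UI's bottom

def error_indication_position(ui_orientation):
    """Choose where in the user interface to display the error indication so that
    it remains proximate to the physical sensor regardless of UI orientation.
    Coordinates are illustrative (0.0 = top of the UI, 1.0 = bottom)."""
    if ui_orientation is Orientation.PORTRAIT:
        return 0.05   # first location: near the top edge, adjacent to the sensor
    elif ui_orientation is Orientation.PORTRAIT_INVERTED:
        return 0.95   # second location: UI is upside-down, sensor sits at the UI's bottom
    raise ValueError("unsupported orientation")
```

The point of the sketch is that the indication tracks the sensor's physical position: the two orientations map to two different locations in the user interface, as the examples require.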
According to some examples, a method is described, the method comprising: at an electronic device having a display and one or more biometric sensors: displaying a biometric enrollment user interface on the display for initiating a biometric enrollment with the one or more biometric sensors; receiving an input corresponding to a request to initiate a biometric enrollment while displaying the biometric enrollment user interface; and in response to receiving the input: in accordance with a determination that the orientation of the electronic device satisfies a set of enrollment criteria, initiating a process of enrolling biometric features with the one or more biometric sensors; and in accordance with a determination that the orientation of the electronic device does not satisfy the set of enrollment criteria, outputting one or more prompts to change the orientation of the electronic device to a different orientation that satisfies the set of enrollment criteria.
According to some examples, a non-transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: displaying a biometric enrollment user interface on the display for initiating a biometric enrollment with the one or more biometric sensors; receiving an input corresponding to a request to initiate a biometric enrollment while displaying the biometric enrollment user interface; and in response to receiving the input: in accordance with a determination that the orientation of the electronic device satisfies a set of enrollment criteria, initiating a process of enrolling biometric features with the one or more biometric sensors; and in accordance with a determination that the orientation of the electronic device does not satisfy the set of enrollment criteria, outputting one or more prompts to change the orientation of the electronic device to a different orientation that satisfies the set of enrollment criteria.
According to some examples, a transitory computer-readable medium is described that includes one or more programs configured for execution by one or more processors of an electronic device with a display and one or more biometric sensors, the one or more programs including instructions for: displaying a biometric enrollment user interface on the display for initiating a biometric enrollment with the one or more biometric sensors; receiving an input corresponding to a request to initiate a biometric enrollment while displaying the biometric enrollment user interface; and in response to receiving the input: in accordance with a determination that the orientation of the electronic device satisfies a set of enrollment criteria, initiating a process of enrolling biometric features with the one or more biometric sensors; and in accordance with a determination that the orientation of the electronic device does not satisfy the set of enrollment criteria, outputting one or more prompts to change the orientation of the electronic device to a different orientation that satisfies the set of enrollment criteria.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; one or more processors; and memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for: displaying a biometric enrollment user interface on the display for initiating a biometric enrollment with the one or more biometric sensors; receiving an input corresponding to a request to initiate a biometric enrollment while displaying the biometric enrollment user interface; and in response to receiving the input: in accordance with a determination that the orientation of the electronic device satisfies a set of enrollment criteria, initiating a process of enrolling biometric features with the one or more biometric sensors; and in accordance with a determination that the orientation of the electronic device does not satisfy the set of enrollment criteria, outputting one or more prompts to change the orientation of the electronic device to a different orientation that satisfies the set of enrollment criteria.
According to some examples, an electronic device is described, the electronic device comprising: one or more biometric sensors; a display; means for displaying a biometric enrollment user interface on the display for initiating a biometric enrollment with the one or more biometric sensors; means for receiving an input corresponding to a request to initiate a biometric enrollment while displaying the biometric enrollment user interface; and means responsive to receiving the input for: in accordance with a determination that the orientation of the electronic device satisfies a set of enrollment criteria, initiating a process of enrolling biometric features with the one or more biometric sensors; and in accordance with a determination that the orientation of the electronic device does not satisfy the set of enrollment criteria, outputting one or more prompts to change the orientation of the electronic device to a different orientation that satisfies the set of enrollment criteria.
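The conditional logic of the enrollment examples above can be sketched as follows; the orientation names, the `enrollment_criteria` set, and the returned action strings are illustrative assumptions, not the actual device behavior.

```python
def handle_enrollment_request(device_orientation, enrollment_criteria=frozenset({"portrait"})):
    """On a request to initiate biometric enrollment: start enrollment if the device
    orientation satisfies the set of enrollment criteria, otherwise output a prompt
    to change the orientation. Names and return values are illustrative."""
    if device_orientation in enrollment_criteria:
        return "start_enrollment"   # begin the process of enrolling with the biometric sensors
    return "prompt_reorient"        # e.g., "Rotate your device to portrait to continue"
```

For example, `handle_enrollment_request("landscape")` falls into the second branch, modeling the prompt to change the device to an orientation that satisfies the criteria.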
Executable instructions for performing these functions are optionally included in a non-transitory computer-readable storage medium or other computer program product configured for execution by one or more processors. Executable instructions for performing these functions are optionally included in a transitory computer-readable storage medium or other computer program product configured for execution by one or more processors.
Thus, faster, more efficient methods and interfaces are provided for devices for implementing biometric authentication, thereby improving the effectiveness, efficiency, and user satisfaction of such devices. Such methods and interfaces optionally complement or replace other methods for implementing biometric authentication.
Drawings
For a better understanding of the various described examples, reference should be made to the following detailed description, taken in conjunction with the following drawings, wherein like reference numerals designate corresponding parts throughout the figures.
FIG. 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments.
Fig. 1B is a block diagram illustrating exemplary components for event processing, according to some embodiments.
Fig. 1C is a block diagram illustrating exemplary components for generating a haptic output, according to some embodiments.
FIG. 2 illustrates a portable multifunction device with a touch screen in accordance with some embodiments.
Fig. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments.
Figure 4A illustrates an exemplary user interface for a menu of applications on a portable multifunction device according to some embodiments.
FIG. 4B illustrates an exemplary user interface for a multifunction device with a touch-sensitive surface separate from a display, in accordance with some embodiments.
Fig. 4C-4H illustrate exemplary haptic output patterns with specific waveforms, according to some embodiments.
Fig. 5A illustrates a personal electronic device, according to some embodiments.
Fig. 5B is a block diagram illustrating a personal electronic device, according to some embodiments.
Fig. 5C-5D illustrate exemplary components of a personal electronic device with a touch-sensitive display and an intensity sensor, according to some embodiments.
Fig. 5E-5H illustrate exemplary components and user interfaces of a personal electronic device, according to some embodiments.
Fig. 6 illustrates an exemplary device connected via one or more communication channels according to some embodiments.
Fig. 7A-7S illustrate exemplary user interfaces for a biometric enrollment process tutorial according to some embodiments.
Fig. 8A-8C are flow diagrams illustrating a method of providing a biometric enrollment process tutorial.
Fig. 9A-9AE illustrate exemplary user interfaces for aligning biometric features for enrollment.
Fig. 10A to 10F are flowcharts showing a method of aligning biometric features for enrollment.
Fig. 11A-11O illustrate exemplary user interfaces for enrolling biometric features.
Fig. 12A to 12B are flowcharts showing a method of enrolling biometric features.
Fig. 13A-13R illustrate exemplary user interfaces for providing reminders during a biometric enrollment process.
Fig. 14A-14C are flow diagrams illustrating a method of providing reminders during a biometric enrollment process.
Fig. 15A-15T illustrate exemplary user interfaces for application-based biometric authentication.
Fig. 16A to 16E are flowcharts showing a method of biometric authentication based on an application program.
Fig. 17A-17 AI illustrate exemplary user interfaces for automatically populating fields for biometric security.
Fig. 18A-18D are flow diagrams illustrating a method of automatically populating fields for biometric security.
Fig. 19A-19AB illustrate exemplary user interfaces for unlocking a device using biometric authentication.
Fig. 20A to 20F are flowcharts illustrating a method of unlocking a device using biometric authentication.
Fig. 21A through 21AQ illustrate exemplary user interfaces for retrying biometric authentication.
Fig. 22A to 22F are flowcharts showing a method of retrying biometric authentication.
Fig. 23A-23Q illustrate exemplary user interfaces for managing transmissions using biometric authentication.
Fig. 24A-24BC illustrate exemplary user interfaces for managing transmissions using biometric authentication.
Fig. 25A to 25C are flowcharts illustrating a method of managing transmissions using biometric authentication.
Fig. 26A-26AS illustrate exemplary user interfaces for providing an intrusive user interface during biometric authentication.
Fig. 27A to 27E are flowcharts illustrating a method of providing an intrusive user interface during biometric authentication.
Fig. 28A-28AA illustrate exemplary user interfaces for avoiding retry of biometric authentication.
Fig. 29A to 29B are flowcharts showing a method of avoiding retry of biometric authentication.
Fig. 30A-30AL illustrate exemplary user interfaces for cached biometric authentication.
Fig. 31A to 31B are flowcharts illustrating a method of cached biometric authentication.
Fig. 32A-32W illustrate exemplary user interfaces for automatically populating a fillable field based on visibility criteria.
Fig. 33 is a flow diagram illustrating a method of automatically populating a fillable field based on visibility criteria.
Fig. 34A to 34N show exemplary user interfaces for automatic login using biometric authentication.
Fig. 35 is a flowchart illustrating a method of automatic login using biometric authentication.
Fig. 36A-36L illustrate example user interfaces for retrying biometric authentication at a credential entry user interface, according to some examples.
Fig. 37A-37B are flow diagrams illustrating methods for retrying biometric authentication at a credential entry user interface, according to some examples.
Fig. 38A-38AD illustrate exemplary user interfaces for providing an indication of an error condition during biometric authentication, according to some examples.
Fig. 39A-39B are flow diagrams illustrating methods for providing an indication of an error condition during biometric authentication, according to some examples.
Fig. 40A-40U illustrate exemplary user interfaces for providing an indication of a biometric sensor during biometric authentication, according to some examples.
Fig. 41A-41C are flow diagrams illustrating methods for providing an indication of a biometric sensor during biometric authentication, according to some examples.
Fig. 42A-42P illustrate exemplary user interfaces for orienting a device to enroll biometric features according to some examples.
Fig. 43A-43C are flow diagrams illustrating methods for orienting a device to enroll biometric features according to some examples.
Detailed Description
The following description sets forth exemplary methods, parameters, and the like. It should be recognized, however, that such description is not intended as a limitation on the scope of the present disclosure, but is instead provided as a description of exemplary embodiments.
Electronic devices need to provide efficient methods and interfaces for enabling biometric authentication of biometric features. For example, electronic devices need to provide a convenient and efficient method for registering one or more portions of a biometric feature. As another example, electronic devices need to provide fast and intuitive techniques for selectively accessing secure data based on biometric authentication. As another example, electronic devices need to provide fast and intuitive techniques for enabling the functionality of the device based on biometric authentication. Such techniques may reduce the cognitive burden on users using devices to register biometric features and/or perform biometric authentication, thereby improving overall productivity. Moreover, such techniques may reduce processor power and battery power that would otherwise be wasted on redundant user inputs.
Fig. 1A-1C, 2, 3, 4A-4B, and 5A-5H below provide descriptions of exemplary devices for performing techniques for implementing biometric authentication. Fig. 6 illustrates an exemplary device connected via one or more communication channels according to some embodiments. Fig. 7A-7S illustrate exemplary user interfaces for a biometric enrollment process tutorial according to some embodiments. Fig. 8A-8C are flow diagrams illustrating a method of providing a biometric enrollment process tutorial. The user interfaces in fig. 7A to 7S are used to illustrate the processes described below, including the processes in fig. 8A to 8C. Fig. 9A-9AE illustrate exemplary user interfaces for aligning biometric features for enrollment. Fig. 10A to 10F are flowcharts showing a method of aligning biometric features for enrollment. The user interfaces in fig. 9A to 9AE are used to illustrate the processes described below, including the processes in fig. 10A to 10F. Fig. 11A-11O illustrate exemplary user interfaces for enrolling biometric features. Fig. 12A to 12B are flowcharts showing a method of enrolling biometric features. The user interfaces in fig. 11A to 11O are used to illustrate the processes described below, including the processes in fig. 12A to 12B. Fig. 13A-13R illustrate exemplary user interfaces for providing reminders during a biometric enrollment process. Fig. 14A-14C are flow diagrams illustrating a method of providing reminders during a biometric enrollment process. The user interfaces in fig. 13A to 13R are used to illustrate the processes described below, including the processes in fig. 14A to 14C. Fig. 15A-15T illustrate exemplary user interfaces for application-based biometric authentication. Fig. 16A to 16E are flowcharts showing a method of application-based biometric authentication. The user interfaces in fig. 15A to 15T are used to illustrate the processes described below, including the processes in fig. 16A to 16E.
Fig. 17A-17AI illustrate exemplary user interfaces for automatically populating fields for biometric security. Fig. 18A-18D are flow diagrams illustrating a method of automatically populating fields for biometric security. The user interfaces in fig. 17A to 17AI are used to illustrate the processes described below, including the processes in fig. 18A to 18D. Fig. 19A-19AB illustrate exemplary user interfaces for unlocking a device using biometric authentication. Fig. 20A to 20F are flowcharts illustrating a method of unlocking a device using biometric authentication. The user interfaces in fig. 19A to 19AB are used to illustrate the processes described below, including the processes in fig. 20A to 20F. Fig. 21A through 21AQ illustrate exemplary user interfaces for retrying biometric authentication. Fig. 22A to 22F are flowcharts showing a method of retrying biometric authentication. The user interfaces in fig. 21A-21AQ are used to illustrate the processes described below, including the processes in fig. 22A-22F. Fig. 23A-23Q illustrate exemplary user interfaces for managing transmissions using biometric authentication. Fig. 24A-24BC illustrate exemplary user interfaces for managing transmissions using biometric authentication. Fig. 25A to 25C are flowcharts illustrating a method of managing transmissions using biometric authentication. The user interfaces in fig. 23A to 23Q and fig. 24A to 24BC are used to illustrate the processes described below, including the processes in fig. 25A to 25C. Fig. 26A-26AS illustrate exemplary user interfaces for providing an intrusive user interface during biometric authentication. Fig. 27A to 27E are flowcharts illustrating a method of providing an intrusive user interface during biometric authentication. The user interfaces in fig. 26A to 26AS are used to illustrate the processes described below, including the processes in fig. 27A to 27E. Fig. 28A-28AA illustrate exemplary user interfaces for avoiding retry of biometric authentication.
Fig. 29A to 29B are flowcharts showing a method of avoiding retry of biometric authentication. The user interfaces in fig. 28A to 28AA are used to illustrate the processes described below, including the processes in fig. 29A to 29B. Fig. 30A-30AL illustrate exemplary user interfaces for cached biometric authentication. Fig. 31A to 31B are flowcharts illustrating a method of cached biometric authentication. The user interfaces in fig. 30A to 30AL are used to illustrate the processes described below, including the processes in fig. 31A to 31B. Fig. 32A-32W illustrate exemplary user interfaces for automatically populating a fillable field based on visibility criteria. Fig. 33 is a flow diagram illustrating a method of automatically populating a fillable field based on visibility criteria. The user interfaces in fig. 32A to 32W are used to illustrate the processes described below, including the process in fig. 33. Fig. 34A to 34N show exemplary user interfaces for automatic login using biometric authentication. Fig. 35 is a flowchart illustrating a method of automatic login using biometric authentication. The user interfaces in fig. 34A to 34N are used to illustrate the processes described below, including the process in fig. 35. Fig. 36A-36L illustrate example user interfaces for retrying biometric authentication at a credential entry user interface, according to some examples. Fig. 37A-37B are flow diagrams illustrating methods for retrying biometric authentication at a credential entry user interface using an electronic device, according to some examples. Fig. 38A-38AD illustrate exemplary user interfaces for providing an indication of an error condition during biometric authentication, according to some examples. Fig. 39A-39B are flow diagrams illustrating methods for providing an indication of an error condition during biometric authentication, according to some examples.
Fig. 40A-40U illustrate example user interfaces for providing an indication of a biometric sensor during biometric authentication, according to some examples. Fig. 41A-41C are flow diagrams illustrating methods for providing an indication of a biometric sensor during biometric authentication, according to some examples. Fig. 42A-42P illustrate exemplary user interfaces for orienting a device to enroll biometric features, according to some examples.
Although the following description uses the terms "first," "second," etc. to describe various elements, these elements should not be limited by the terms. These terms are only used to distinguish one element from another. For example, a first touch may be named a second touch and similarly a second touch may be named a first touch without departing from the scope of various described embodiments. The first touch and the second touch are both touches, but they are not the same touch.
The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Depending on the context, the term "if" is optionally interpreted to mean "when," "upon," "in response to determining," or "in response to detecting." Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" is optionally interpreted to mean "upon determining," "in response to determining," "upon detecting [the stated condition or event]," or "in response to detecting [the stated condition or event]," depending on the context.
Embodiments of electronic devices, user interfaces for such devices, and related processes for using such devices are described herein. In some embodiments, the device is a portable communication device, such as a mobile phone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, but are not limited to, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. (Cupertino, California). Other portable electronic devices are optionally used, such as laptops or tablets with touch-sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that, in some embodiments, the device is not a portable communication device, but rather a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touchpad).
In the following discussion, an electronic device including a display and a touch-sensitive surface is described. However, it should be understood that the electronic device optionally includes one or more other physical user interface devices, such as a physical keyboard, mouse, and/or joystick.
The device typically supports various applications, such as one or more of the following: a mapping application, a rendering application, a word processing application, a website creation application, a disc editing application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, a fitness support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications executing on the device optionally use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the device are optionally adjusted and/or varied for different applications and/or within respective applications. In this way, a common physical architecture of the device (such as a touch-sensitive surface) optionally supports various applications with a user interface that is intuitive and clear to the user.
Attention is now directed to embodiments of portable devices having touch sensitive displays. FIG. 1A is a block diagram illustrating a portable multifunction device 100 with a touch-sensitive display system 112 in accordance with some embodiments. Touch-sensitive display 112 is sometimes referred to as a "touch screen" for convenience, and is sometimes referred to or called a "touch-sensitive display system". Device 100 includes memory 102 (which optionally includes one or more computer-readable storage media), a memory controller 122, one or more processing units (CPUs) 120, a peripheral interface 118, RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, an input/output (I/O) subsystem 106, other input control devices 116, and an external port 124. The device 100 optionally includes one or more optical sensors 164. Device 100 optionally includes one or more contact intensity sensors 165 for detecting the intensity of contacts on device 100 (e.g., a touch-sensitive surface such as touch-sensitive display system 112 of device 100). Device 100 optionally includes one or more tactile output generators 167 for generating tactile outputs on device 100 (e.g., generating tactile outputs on a touch-sensitive surface such as touch-sensitive display system 112 of device 100 or touchpad 355 of device 300). These components optionally communicate over one or more communication buses or signal lines 103.
As used in this specification and claims, the term "intensity" of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (e.g., a finger contact) on the touch-sensitive surface, or to a substitute (surrogate) for the force or pressure of a contact on the touch-sensitive surface. The intensity of the contact has a range of values that includes at least four different values and more typically includes hundreds of different values (e.g., at least 256). The intensity of the contact is optionally determined (or measured) using various methods and various sensors or combinations of sensors. For example, one or more force sensors below or adjacent to the touch-sensitive surface are optionally used to measure forces at different points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., a weighted average) to determine the estimated contact force. Similarly, the pressure sensitive tip of the stylus is optionally used to determine the pressure of the stylus on the touch-sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or changes thereof, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereof, and/or the resistance of the touch-sensitive surface proximate to the contact and/or changes thereof are optionally used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the surrogate measurement of contact force or pressure is used directly to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the surrogate measurement). 
In some implementations, the surrogate measurement of contact force or pressure is converted into an estimated force or pressure, and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure). The intensity of the contact is used as a property of the user input, allowing the user to access additional device functionality that in some cases would not be accessible to the user on a smaller-sized device with limited physical area for displaying affordances (e.g., on a touch-sensitive display) and/or receiving user input (e.g., via a touch-sensitive display, a touch-sensitive surface, or physical/mechanical controls, such as knobs or buttons).
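The weighted-average combination of force-sensor measurements and the threshold comparison described above can be sketched as follows; the function names and the unitless surrogate values are illustrative assumptions.

```python
def estimated_contact_force(sensor_readings, weights):
    """Combine measurements from multiple force sensors beneath or adjacent to
    the touch-sensitive surface into a single estimated contact force via a
    weighted average, as described in the passage above."""
    total_weight = sum(weights)
    return sum(r * w for r, w in zip(sensor_readings, weights)) / total_weight

def exceeds_intensity_threshold(sensor_readings, weights, threshold):
    """Compare the estimated force against an intensity threshold expressed in
    the same (surrogate) units, so the surrogate measurement is used directly."""
    return estimated_contact_force(sensor_readings, weights) >= threshold
```

For instance, two equally weighted readings of 1.0 and 3.0 yield an estimated force of 2.0, which meets a threshold of 2.0 but not a threshold of 2.5.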
As used in this specification and claims, the term "haptic output" refers to a physical displacement of a device relative to a previous position of the device, a physical displacement of a component of the device (e.g., a touch-sensitive surface) relative to another component of the device (e.g., a housing), or a displacement of a component relative to a center of mass of the device that is to be detected by a user with the user's sense of touch. For example, where a device or component of a device is in contact with a surface of a user that is sensitive to touch (e.g., a finger, palm, or other portion of a user's hand), the haptic output generated by the physical displacement will be interpreted by the user as a haptic sensation corresponding to a perceived change in a physical characteristic of the device or component of the device. For example, movement of the touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is optionally interpreted by the user as a "down click" or "up click" of a physical actuation button. In some cases, the user will feel a tactile sensation, such as a "down click" or "up click," even when the physical actuation button associated with the touch-sensitive surface, which is physically pressed (e.g., displaced) by the user's movement, does not move. As another example, even when there is no change in the smoothness of the touch-sensitive surface, the movement of the touch-sensitive surface is optionally interpreted or sensed by the user as "roughness" of the touch-sensitive surface. While such interpretation of touch by a user will be limited by the user's individualized sensory perception, many sensory perceptions of touch are common to most users. Thus, when a haptic output is described as corresponding to a particular sensory perception of a user (e.g., a "down click," an "up click," "roughness"), unless otherwise stated, the generated haptic output corresponds to a physical displacement of the device or a component thereof that would generate the sensory perception of a typical (or ordinary) user. Providing haptic feedback to a user using haptic output enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide appropriate input and reducing user error in operating/interacting with the device), which additionally reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some embodiments, the haptic output pattern specifies a characteristic of the haptic output, such as a magnitude of the haptic output, a shape of a motion waveform of the haptic output, a frequency of the haptic output, and/or a duration of the haptic output.
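A haptic output pattern of the kind described above can be modeled, purely as an illustrative sketch, with a small data structure; the field names, units, and example values are assumptions, not the patterns used by any actual device.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HapticOutputPattern:
    """The four characteristics named in the passage above; names and units
    are illustrative assumptions."""
    amplitude: float      # magnitude of the haptic output (e.g., normalized 0..1)
    waveform: str         # shape of the motion waveform, e.g. "sine"
    frequency_hz: float   # oscillation frequency of the movable mass
    duration_ms: float    # how long the haptic output lasts

# Two patterns differing in amplitude, frequency, and duration would be felt
# as distinct sensations (e.g., a brief tap versus a longer, softer buzz).
tap = HapticOutputPattern(amplitude=1.0, waveform="sine", frequency_hz=230.0, duration_ms=10.0)
buzz = HapticOutputPattern(amplitude=0.6, waveform="sine", frequency_hz=80.0, duration_ms=120.0)
```

A frozen dataclass is used so that each pattern is an immutable value that can be compared and reused, mirroring the idea that a pattern fully specifies the output's characteristics.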
When a device generates haptic outputs having different haptic output patterns (e.g., via one or more haptic output generators that move a movable mass to generate the haptic outputs), the haptic outputs may produce different haptic sensations for a user holding or touching the device. While the user's sensation is based on the user's perception of the haptic output, most users will be able to recognize changes in the waveform, frequency, and amplitude of the haptic outputs generated by the device. Thus, the waveform, frequency, and amplitude may be adjusted to indicate to the user that different operations have been performed. As such, haptic outputs with haptic output patterns that are designed, selected, and/or engineered to simulate characteristics (e.g., size, material, weight, stiffness, smoothness, etc.); behaviors (e.g., oscillation, displacement, acceleration, rotation, stretching, etc.); and/or interactions (e.g., collision, adhesion, repulsion, attraction, friction, etc.) of objects in a given environment (e.g., a user interface including graphical features and objects, a simulated physical environment having virtual boundaries and virtual objects, a real physical environment having physical boundaries and physical objects, and/or a combination of any of the above) will in some cases provide helpful feedback to the user that reduces input errors and improves the efficiency of the user's operation of the device. Additionally, haptic output is optionally generated to correspond to feedback that is independent of a simulated physical characteristic (such as an input threshold or object selection). Such haptic output will in some cases provide helpful feedback to the user, which reduces input errors and improves the efficiency of the user's operation of the device.
In some embodiments, a haptic output with an appropriate haptic output pattern serves as a cue that an event of interest has occurred in the user interface or behind the screen in the device. Examples of events of interest include activation of an affordance (e.g., a real or virtual button, or a toggle switch) provided on the device or in the user interface, success or failure of a requested operation, reaching or crossing a boundary in the user interface, entering a new state, switching input focus between objects, activating a new mode, reaching or crossing an input threshold, detecting or recognizing a type of input or gesture, and so forth. In some embodiments, a haptic output is provided to serve as a warning or cue as to an impending event or result that may occur unless a change of direction or an interrupt input is detected in time. Haptic output is also used in other contexts to enrich the user experience, improve accessibility to the device by users having visual or motor difficulties or other accessibility needs, and/or improve the efficiency and functionality of the user interface and/or device. Optionally accompanying the haptic output with audio output and/or visible user interface changes further enhances the user's experience when interacting with the user interface and/or device and facilitates better conveyance of information about the state of the user interface and/or device, and this reduces input errors and improves the efficiency of the user's operation of the device.
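The pairing of events of interest with haptic output patterns can be sketched as a simple lookup table. The event names and pattern assignments below are illustrative assumptions for exposition, not the actual mapping used by any device:

```python
# Hypothetical mapping from user-interface events of interest to
# (waveform, characteristic frequency in Hz) pairs.
EVENT_HAPTICS = {
    "operation_success": ("FullTap", 200),
    "operation_failure": ("MiniTap", 80),
    "boundary_reached":  ("MicroTap", 150),
    "toggle_activated":  ("MiniTap", 200),
}

def haptic_for_event(event, default=("MicroTap", 80)):
    """Return the haptic output pattern cueing the given event, or a
    subtle default pattern for events with no dedicated cue."""
    return EVENT_HAPTICS.get(event, default)
```

Centralizing the mapping this way keeps event-to-feedback choices consistent across an application, which supports the accessibility and error-reduction goals described above.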
Figs. 4C-4E provide a set of sample haptic output patterns that may be used, individually or in combination, as-is or through one or more transformations (e.g., modulation, amplification, truncation, etc.), to generate suitable haptic feedback for various purposes in various scenarios, such as those described above and with respect to the user interfaces and methods discussed herein. This palette of example haptic outputs shows how a set of three waveforms and eight frequencies can be used to generate an array of haptic output patterns. In addition to the haptic output patterns shown in this figure, each of these haptic output patterns is optionally adjusted in amplitude by changing a gain value for the haptic output pattern, as shown, for example, for FullTap 80Hz, FullTap 200Hz, MiniTap 80Hz, MiniTap 200Hz, MicroTap 80Hz, and MicroTap 200Hz in figs. 4F-4H, each shown with variants having gains of 1.0, 0.75, 0.5, and 0.25. As shown in figs. 4F-4H, changing the gain of a haptic output pattern changes the amplitude of the pattern without changing the frequency of the pattern or the shape of the waveform. In some embodiments, changing the frequency of a haptic output pattern also results in a lower amplitude, because some haptic output generators are limited in how much force can be applied to the movable mass; the higher-frequency movement of the mass is therefore constrained to a lower amplitude to ensure that the acceleration required to generate the waveform does not require forces outside the operating force range of the haptic output generator (e.g., the peak amplitudes of FullTaps at 230Hz, 270Hz, and 300Hz are lower than the amplitudes of FullTaps at 80Hz, 100Hz, 125Hz, and 200Hz).
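The gain behavior described above (amplitude scales, frequency and waveform shape do not) can be verified on a toy displacement waveform. This is an illustrative model of a FullTap-style oscillation, not actual generator firmware:

```python
import math

def fulltap_samples(freq_hz, gain=1.0, sample_rate=8000):
    """Two full sine cycles scaled by a gain factor (a sketch of a
    FullTap displacement waveform; sampled at midpoints so no sample
    lands exactly on the neutral position)."""
    n = int(sample_rate * 2 / freq_hz)  # samples spanning two cycles
    return [gain * math.sin(2 * math.pi * freq_hz * (i + 0.5) / sample_rate)
            for i in range(n)]

def sign_changes(wave):
    """Count passes through the neutral position (a proxy for frequency)."""
    return sum(1 for a, b in zip(wave, wave[1:]) if a * b < 0)

full_gain = fulltap_samples(200.0, gain=1.0)
half_gain = fulltap_samples(200.0, gain=0.5)
# Halving the gain halves peak displacement but leaves the number of
# neutral-position crossings (and hence the perceived pitch) unchanged.
```
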
Figs. 4C-4H show haptic output patterns having specific waveforms. The waveform of a haptic output pattern represents the pattern of physical displacement, relative to a neutral position (e.g., x_zero), through which the movable mass passes to generate a haptic output having that haptic output pattern. For example, the first set of haptic output patterns shown in fig. 4C (e.g., the "FullTap" haptic output patterns) each has a waveform that includes an oscillation having two full cycles (e.g., an oscillation that begins and ends at a neutral position and passes through the neutral position three times). The second set of haptic output patterns shown in fig. 4D (e.g., the "MiniTap" haptic output patterns) each has a waveform that includes an oscillation having one full cycle (e.g., an oscillation that begins and ends at a neutral position and crosses the neutral position once). The third set of haptic output patterns shown in fig. 4E (e.g., the "MicroTap" haptic output patterns) each has a waveform that includes an oscillation having half a full cycle (e.g., an oscillation that begins and ends at a neutral position and does not pass through the neutral position). The waveform of a haptic output pattern further includes start and end buffers that represent the gradual acceleration and deceleration of the movable mass at the start and end of the haptic output. The example waveforms shown in figs. 4C-4H include x_min and x_max values representing the maximum and minimum degrees of movement of the movable mass. For larger electronic devices with a larger movable mass, the minimum and maximum degrees of movement of the mass may be greater or smaller. The examples shown in figs. 4C-4H describe the movement of a mass in one dimension, but similar principles can be applied to the movement of a movable mass in two or three dimensions.
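The three waveform families differ only in how many oscillation cycles they contain, which determines how often the mass passes through the neutral position. A minimal sketch (sampled at midpoints so no sample lands exactly on the neutral position; start/end buffers are omitted for brevity):

```python
import math

def tap_waveform(kind, samples_per_cycle=100):
    """Idealized oscillation for the three waveform families described
    above (illustrative only; real patterns also include start and end
    buffers for mass acceleration and deceleration)."""
    cycles = {"FullTap": 2.0, "MiniTap": 1.0, "MicroTap": 0.5}[kind]
    n = int(cycles * samples_per_cycle)
    return [math.sin(2 * math.pi * (i + 0.5) / samples_per_cycle)
            for i in range(n)]

def neutral_crossings(wave):
    # Count strict sign changes, i.e., passes through the neutral position.
    return sum(1 for a, b in zip(wave, wave[1:]) if a * b < 0)
```

Counting crossings reproduces the description above: a FullTap (two full cycles) crosses the neutral position three times, a MiniTap (one full cycle) crosses once, and a MicroTap (half a cycle) never crosses.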
As shown in figs. 4C-4E, each haptic output pattern also has a corresponding characteristic frequency that affects the "pitch" of the tactile sensation felt by the user from a haptic output having that characteristic frequency. For continuous haptic output, the characteristic frequency represents the number of cycles (e.g., cycles per second) that a movable mass of the haptic output generator completes within a given time period. For discrete haptic output, a discrete output signal (e.g., having 0.5, 1, or 2 cycles) is generated, and the characteristic frequency value specifies how fast the movable mass needs to move to generate the haptic output having that characteristic frequency. As shown in figs. 4C-4H, for each type of haptic output (e.g., defined by a respective waveform, such as FullTap, MiniTap, or MicroTap), a higher frequency value corresponds to faster movement of the movable mass, and thus, in general, a shorter haptic output completion time (e.g., a time including the number of cycles required to complete the discrete haptic output plus start and end buffer times). For example, a FullTap with a characteristic frequency of 80Hz takes longer to complete than a FullTap with a characteristic frequency of 100Hz (e.g., 35.4 ms vs. 28.3 ms in fig. 4F). Further, for a given frequency, a haptic output having more cycles in its waveform at the corresponding frequency takes longer to complete than a haptic output having fewer cycles in its waveform at the same corresponding frequency. For example, a 150Hz FullTap takes longer to complete than a 150Hz MiniTap (e.g., 19.4 ms vs. 12.8 ms), and a 150Hz MiniTap takes longer to complete than a 150Hz MicroTap (e.g., 12.8 ms vs. 9.4 ms). However, for haptic output patterns having different frequencies, this rule may not apply (e.g., a haptic output with more cycles but a higher frequency may take a shorter amount of time to complete than a haptic output with fewer cycles but a lower frequency, or vice versa).
For example, at 300Hz, a FullTap takes as long to complete as a MiniTap (e.g., 9.9 ms).
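The completion-time relationships above can be summarized with a rough model: oscillation time (cycles divided by frequency) plus fixed start and end buffers. The 10 ms combined buffer below is an illustrative assumption, not a measured value from the figures:

```python
def completion_time_ms(cycles, freq_hz, buffer_ms=10.0):
    """Rough completion-time model for a discrete haptic output:
    time to complete the cycles at the characteristic frequency, plus
    assumed start/end buffer time (buffer_ms is a hypothetical value)."""
    return 1000.0 * cycles / freq_hz + buffer_ms

# FullTap = 2 cycles, MiniTap = 1 cycle, MicroTap = 0.5 cycles.
fulltap_80 = completion_time_ms(2, 80)    # lower frequency -> longer
fulltap_100 = completion_time_ms(2, 100)
fulltap_300 = completion_time_ms(2, 300)  # many cycles, high frequency
minitap_60 = completion_time_ms(1, 60)    # few cycles, low frequency
```

This captures both regularities in the text: at a fixed frequency, more cycles means a longer output; across frequencies, a higher-frequency output with more cycles can still finish sooner than a lower-frequency output with fewer cycles.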
As shown in figs. 4C-4E, a haptic output pattern also has a characteristic amplitude that affects the amount of energy contained in the haptic signal, or the "intensity" of the tactile sensation that the user can feel through a haptic output having that characteristic amplitude. In some embodiments, the characteristic amplitude of a haptic output pattern refers to an absolute or normalized value representing the maximum displacement of the movable mass relative to a neutral position when generating the haptic output. In some embodiments, the characteristic amplitude of a haptic output pattern may be adjusted according to various conditions (e.g., customized based on user interface context and behavior) and/or preconfigured metrics (e.g., input-based metrics and/or user-interface-based metrics), such as by a fixed or dynamically determined gain factor (e.g., a value between 0 and 1). In some embodiments, a characteristic of an input (e.g., a rate of change of a characteristic intensity of a contact in a press input, or a rate of movement of a contact across a touch-sensitive surface) during the input that triggers generation of a haptic output is measured based on a metric of the input (e.g., an intensity-change metric or an input-speed metric). In some embodiments, a characteristic of a user interface element (e.g., the speed of movement of the element across a hidden or visible boundary in the user interface) during a user interface change that triggers generation of a haptic output is measured based on a metric of the user interface (e.g., a cross-boundary speed metric).
In some embodiments, the characteristic amplitude of the tactile output pattern may be "envelope" modulated, and the peaks of adjacent cycles may have different amplitudes, with one of the waveforms shown above being further modified by multiplying with an envelope parameter that varies over time (e.g., from 0 to 1) to gradually adjust the amplitude of portions of the tactile output over time as the tactile output is generated.
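Envelope modulation as described above is a sample-by-sample multiplication of the waveform by a time-varying parameter. A minimal sketch (the specific waveform and envelope shapes are illustrative assumptions):

```python
import math

def envelope_modulate(wave, env_fn):
    """Scale each sample by an envelope value in [0, 1] that varies with
    normalized time t in [0, 1], gradually adjusting the amplitude of
    portions of the haptic output as it is generated."""
    n = len(wave)
    return [s * env_fn(i / (n - 1)) for i, s in enumerate(wave)]

# Four sine cycles, modulated by a linear ramp from 0 to 1: the peaks of
# adjacent cycles now have different (growing) amplitudes.
wave = [math.sin(2 * math.pi * 4 * i / 400) for i in range(400)]
ramp = envelope_modulate(wave, lambda t: t)
```

With the ramp envelope, early cycles are strongly attenuated while later cycles approach full amplitude, which is exactly the "peaks of adjacent cycles may have different amplitudes" behavior described above.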
Although specific frequencies, amplitudes, and waveforms are shown in the sample haptic output patterns in fig. 4C-4E for illustrative purposes, haptic output patterns having other frequencies, amplitudes, and waveforms may be used for similar purposes. For example, a waveform having between 0.5 and 4 cycles may be used. Other frequencies in the range of 60Hz to 400Hz may also be used. Table 1 provides examples of specific tactile feedback behaviors, configurations, and examples of their use.
It should be understood that device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of these components. The various components shown in fig. 1A are implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
The memory 102 optionally includes high-speed random access memory, and also optionally includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Memory controller 122 optionally controls access to memory 102 by other components of device 100.
Peripheral interface 118 may be used to couple the input and output peripherals of the device to CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in the memory 102 to perform various functions of the device 100 and to process data. In some embodiments, peripherals interface 118, CPU 120, and memory controller 122 are optionally implemented on a single chip, such as chip 104. In some other embodiments, they are optionally implemented on separate chips.
RF (radio frequency) circuitry 108 receives and transmits RF signals, also referred to as electromagnetic signals. The RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communication networks and other communication devices via electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a codec chipset, a Subscriber Identity Module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates with networks, such as the internet, also known as the World Wide Web (WWW), intranets, and/or wireless networks, such as cellular telephone networks, wireless Local Area Networks (LANs), and/or Metropolitan Area Networks (MANs), and other devices via wireless communication. RF circuitry 108 optionally includes well-known circuitry for detecting Near Field Communication (NFC) fields, such as by a short-range communication radio. 
The wireless communication optionally uses any of a number of communication standards, protocols, and techniques, including, but not limited to, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), High-Speed Downlink Packet Access (HSDPA), High-Speed Uplink Packet Access (HSUPA), Evolution, Data-Only (EV-DO), HSPA+, Dual-Cell HSPA (DC-HSPDA), Long Term Evolution (LTE), Near Field Communication (NFC), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Bluetooth Low Energy (BTLE), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or IEEE 802.11ac), Voice over Internet Protocol (VoIP), Wi-MAX, email protocols (e.g., Internet Message Access Protocol (IMAP) and/or Post Office Protocol (POP)), instant messaging (e.g., Extensible Messaging and Presence Protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. The audio circuitry 110 receives audio data from the peripheral interface 118, converts the audio data to electrical signals, and transmits the electrical signals to the speaker 111. The speaker 111 converts the electrical signals into human audible sound waves. The audio circuit 110 also receives electrical signals converted from sound waves by the microphone 113. The audio circuit 110 converts the electrical signals to audio data and transmits the audio data to the peripheral interface 118 for processing. Audio data is optionally retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripheral interface 118. In some embodiments, the audio circuit 110 also includes a headset jack (e.g., 212 in fig. 2). The headset jack provides an interface between the audio circuitry 110 and a removable audio input/output peripheral such as an output-only headphone or a headset having both an output (e.g., a monaural headphone or a binaural headphone) and an input (e.g., a microphone).
The I/O subsystem 106 couples input/output peripheral devices on the device 100, such as a touch screen 112 and other input control devices 116, to a peripheral interface 118. The I/O subsystem 106 optionally includes a display controller 156, an optical sensor controller 158, an intensity sensor controller 159, a haptic feedback controller 161, a depth camera controller 169, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/transmit electrical signals from/to other input control devices 116. Other input control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slide switches, joysticks, click wheels, and the like. In some alternative embodiments, input controller 160 is optionally coupled to (or not coupled to) any of: a keyboard, an infrared port, a USB port, and a pointing device such as a mouse. The one or more buttons (e.g., 208 in fig. 2) optionally include an up/down button for volume control of the speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., 206 in fig. 2).
A quick press of the push button optionally unlocks the touch screen 112 or optionally begins the process of unlocking the device using gestures on the touch screen, as described in U.S. patent application 11/322,549 (i.e., U.S. patent No. 7,657,849) entitled "Unlocking a Device by Performing Gestures on an Unlock Image," filed on 23.12.2005, which is hereby incorporated by reference in its entirety. A longer press of the push button (e.g., 206) optionally turns the device 100 on or off. The functionality of one or more of the buttons is optionally customizable by the user. The touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
Touch-sensitive display 112 provides an input interface and an output interface between the device and a user. Display controller 156 receives and/or transmits electrical signals to and/or from touch screen 112. Touch screen 112 displays visual output to a user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively "graphics"). In some embodiments, some or all of the visual output optionally corresponds to a user interface object.
Touch screen 112 has a touch-sensitive surface, sensor, or group of sensors that accept input from a user based on tactile sensation and/or tactile contact. Touch screen 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on touch screen 112 and convert the detected contact into interaction with user interface objects (e.g., one or more soft keys, icons, web pages, or images) displayed on touch screen 112. In an exemplary embodiment, the point of contact between touch screen 112 and the user corresponds to a finger of the user.
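Converting a detected contact into interaction with a displayed user interface object amounts to hit-testing the point of contact against object bounds. A minimal sketch (the object representation and frame layout are illustrative assumptions, not the device's actual data model):

```python
def hit_test(objects, x, y):
    """Return the id of the topmost object whose frame (x, y, w, h)
    contains the contact point, or None if no object is hit.
    Objects are given in drawing order, so the last drawn is topmost."""
    for obj in reversed(objects):
        ox, oy, w, h = obj["frame"]
        if ox <= x < ox + w and oy <= y < oy + h:
            return obj["id"]
    return None

# Hypothetical UI: a full-screen background with a soft key on top of it.
objects = [
    {"id": "background", "frame": (0, 0, 320, 480)},
    {"id": "ok_button", "frame": (100, 200, 120, 44)},
]
```

In practice the display controller reports the contact point and the hit object then receives the interaction (tap, drag, etc.); the sketch only shows the point-to-object mapping step.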
Touch screen 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch screen 112 and display controller 156 optionally detect contact and any movement or breaking thereof using any of a variety of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 112. In one exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone® and iPod Touch® from Apple Inc. (Cupertino, California).
The touch-sensitive display in some embodiments of touch screen 112 is optionally similar to the multi-touch sensitive trackpads described in the following U.S. patents: 6,323,846 (Westerman et al.), 6,570,557 (Westerman et al.), and/or 6,677,932 (Westerman et al.), and/or U.S. patent publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety. However, touch screen 112 displays visual output from device 100, whereas touch-sensitive trackpads do not provide visual output.
In some embodiments, the touch-sensitive display of touch screen 112 is as described in the following applications: (1) U.S. patent application No. 11/381,313 entitled "Multipoint Touch Surface Controller," filed on 2.5.2006; (2) U.S. patent application No. 10/840,862 entitled "Multipoint Touchscreen," filed on 6.5.2004; (3) U.S. patent application No. 10/903,964 entitled "Gestures For Touch Sensitive Input Devices," filed on 30.7.2004; (4) U.S. patent application No. 11/048,264 entitled "Gestures For Touch Sensitive Input Devices," filed on 31.1.2005; (5) U.S. patent application No. 11/038,590 entitled "Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices," filed on 18.1.2005; (6) U.S. patent application No. 11/228,758 entitled "Virtual Input Device Placement On A Touch Screen User Interface," filed on 16.9.2005; (7) U.S. patent application No. 11/228,700 entitled "Operation Of A Computer With A Touch Screen Interface," filed on 16.9.2005; (8) U.S. patent application No. 11/228,737 entitled "Activating Virtual Keys Of A Touch-Screen Virtual Keyboard," filed on 16.9.2005; and (9) U.S. patent application No. 11/367,749 entitled "Multi-Functional Hand-Held Device," filed on 3.3.2006. All of these applications are incorporated herein by reference in their entirety.
The touch screen 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touch screen has a video resolution of about 160 dpi. The user optionally makes contact with touch screen 112 using any suitable object or appendage, such as a stylus, finger, or the like. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which may not be as accurate as stylus-based input due to the larger contact area of the finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the action desired by the user.
In some embodiments, in addition to a touch screen, device 100 optionally includes a touch pad (not shown) for activating or deactivating particular functions. In some embodiments, the trackpad is a touch-sensitive area of the device that, unlike a touchscreen, does not display visual output. The touchpad is optionally a touch-sensitive surface separate from touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.
The device 100 also includes a power system 162 for powering the various components. Power system 162 optionally includes a power management system, one or more power sources (e.g., battery, Alternating Current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a Light Emitting Diode (LED)), and any other components associated with the generation, management, and distribution of power in a portable device.
The device 100 optionally further includes one or more optical sensors 164. FIG. 1A shows an optical sensor coupled to an optical sensor controller 158 in the I/O subsystem 106. The optical sensor 164 optionally includes a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The optical sensor 164 receives light from the environment projected through one or more lenses and converts the light into data representing an image. In conjunction with imaging module 143 (also called a camera module), optical sensor 164 optionally captures still images or video. In some embodiments, an optical sensor is located on the back of device 100, opposite touch screen display 112 on the front of the device, so that the touch screen display can be used as a viewfinder for still and/or video image acquisition. In some embodiments, an optical sensor is located on the front of the device so that images of the user are optionally acquired for the video conference while the user views other video conference participants on the touch screen display. In some implementations, the position of the optical sensor 164 can be changed by the user (e.g., by rotating a lens and sensor in the device housing) such that a single optical sensor 164 is used with the touch screen display for both video conferencing and still image and/or video image capture.
Device 100 optionally further comprises one or more contact intensity sensors 165. FIG. 1A shows a contact intensity sensor coupled to an intensity sensor controller 159 in the I/O subsystem 106. Contact intensity sensor 165 optionally includes one or more piezoresistive strain gauges, capacitive force sensors, electrical force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other intensity sensors (e.g., sensors for measuring the force (or pressure) of a contact on a touch-sensitive surface). Contact intensity sensor 165 receives contact intensity information (e.g., pressure information or a proxy for pressure information) from the environment. In some implementations, at least one contact intensity sensor is collocated with or proximate to a touch-sensitive surface (e.g., touch-sensitive display system 112). In some embodiments, at least one contact intensity sensor is located on the back of device 100, opposite touch screen display 112, which is located on the front of device 100.
The device 100 optionally further includes one or more proximity sensors 166. Fig. 1A shows a proximity sensor 166 coupled to the peripheral interface 118. Alternatively, the proximity sensor 166 is optionally coupled to the input controller 160 in the I/O subsystem 106. The proximity sensor 166 optionally performs as described in the following U.S. patent applications: No. 11/241,839 entitled "Proximity Detector In Handheld Device"; No. 11/240,788 entitled "Proximity Detector In Handheld Device"; No. 11/620,702 entitled "Using Ambient Light Sensor To Augment Proximity Sensor Output"; No. 11/586,862 entitled "Automated Response To And Sensing Of User Activity In Portable Devices"; and No. 11/638,251 entitled "Methods And Systems For Automatic Configuration Of Peripherals," which are hereby incorporated by reference in their entirety. In some embodiments, the proximity sensor turns off and disables the touch screen 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call).
Device 100 optionally further comprises one or more tactile output generators 167. FIG. 1A shows a haptic output generator coupled to a haptic feedback controller 161 in the I/O subsystem 106. Tactile output generator 167 optionally includes one or more electro-acoustic devices such as speakers or other audio components, and/or electromechanical devices that convert energy into linear motion such as motors, solenoids, electroactive polymers, piezoelectric actuators, electrostatic actuators, or other tactile output generating components (e.g., components that convert electrical signals into tactile output on the device). Tactile output generator 167 receives haptic feedback generation instructions from haptic feedback module 133 and generates haptic output on device 100 that can be felt by a user of device 100. In some embodiments, at least one tactile output generator is juxtaposed or adjacent to a touch-sensitive surface (e.g., touch-sensitive display system 112), and optionally generates tactile output by moving the touch-sensitive surface vertically (e.g., into/out of the surface of device 100) or laterally (e.g., back and forth in the same plane as the surface of device 100). In some embodiments, at least one tactile output generator sensor is located on the back of device 100, opposite touch screen display 112, which is located on the front of device 100.
Device 100 optionally also includes one or more accelerometers 168. Fig. 1A shows accelerometer 168 coupled to peripheral interface 118. Alternatively, accelerometer 168 is optionally coupled to input controller 160 in I/O subsystem 106. Accelerometer 168 optionally performs as described in the following U.S. patent publications: U.S. patent publication 20050190059 entitled "Acceleration-Based Theft Detection System For Portable Electronic Devices" and U.S. patent publication 20060017692 entitled "Methods And Apparatuses For Operating A Portable Device Based On An Accelerometer," both of which are incorporated herein by reference in their entirety. In some embodiments, information is displayed in a portrait view or a landscape view on the touch screen display based on analysis of data received from the one or more accelerometers. Device 100 optionally includes a magnetometer (not shown) and a GPS (or GLONASS or other global navigation system) receiver (not shown) in addition to accelerometer 168 for obtaining information about the position and orientation (e.g., portrait or landscape) of device 100.
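Choosing portrait versus landscape from accelerometer data reduces, in the simplest case, to comparing the gravity components along the device's axes. A simplified sketch (the axis convention and thresholding are assumptions; a real implementation would also filter noise and apply hysteresis):

```python
def display_orientation(ax, ay):
    """Choose a display orientation from gravity components along the
    device's x (short) and y (long) axes, in units of g. If gravity is
    mostly along the long axis, the device is held upright (portrait);
    otherwise it is held sideways (landscape)."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

For example, a device held upright reports most of gravity on the y axis (e.g., ax≈0.1, ay≈-0.99), while a device turned sideways reports it on the x axis.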
In some embodiments, the device 100 also includes (or is in communication with) one or more fingerprint sensors. The one or more fingerprint sensors are coupled to the peripheral interface 118. Alternatively, the one or more fingerprint sensors are optionally coupled to an input controller 160 in the I/O system 106. However, in one common embodiment, the fingerprinting operation is performed using secure dedicated computing hardware (e.g., one or more processors, memory, and/or communication buses) with additional security features in order to enhance the security of the fingerprint information determined by the one or more fingerprint sensors. As used herein, a fingerprint sensor is a sensor that is capable of distinguishing fingerprint features (sometimes called "minutiae") of ridges and valleys of skin, such as those found on human fingers and toes. The fingerprint sensor may use any of a variety of techniques to distinguish fingerprint features, including but not limited to: optical fingerprint imaging, ultrasonic fingerprint imaging, active capacitance fingerprint imaging and passive capacitance fingerprint imaging. In some embodiments, in addition to distinguishing fingerprint features in a fingerprint, the one or more fingerprint sensors are also capable of tracking movement of fingerprint features over time, thereby determining/characterizing movement of the fingerprint over time on the one or more fingerprint sensors. While the one or more fingerprint sensors may be separate from the touch-sensitive surface (e.g., touch-sensitive display system 112), it should be understood that in some implementations, the touch-sensitive surface (e.g., touch-sensitive display system 112) has a spatial resolution high enough to detect fingerprint features formed by the various fingerprint ridges, and is used as a fingerprint sensor instead of or in addition to the one or more fingerprint sensors. 
In some embodiments, device 100 includes a set of one or more orientation sensors for determining the orientation of a finger or hand on or proximate to the device (e.g., the orientation of the finger above one or more fingerprint sensors). Additionally, in some embodiments, the set of one or more orientation sensors is used to detect rotation of the contact that is interacting with the device in addition to or in place of the fingerprint sensor (e.g., in one or more of the methods described below, instead of using the fingerprint sensor to detect rotation of the fingerprint/contact, the set of one or more orientation sensors is used to detect rotation of the contact that includes the fingerprint, with or without detecting features of the fingerprint).
In some embodiments, the determination of the features of the fingerprint, and the comparison between the detected features of the fingerprint and the stored features of the fingerprint, are performed by secure dedicated computing hardware (e.g., one or more processors, memory, and/or a communication bus) separate from the processor 120, in order to improve the security of the fingerprint data generated, stored, and processed by the one or more fingerprint sensors. In some embodiments, the determination of the features of the fingerprint, and the comparison between the features of the detected fingerprint and the features of the enrolled fingerprint, are performed by the processor 120 using a fingerprint analysis module.
In some embodiments, during an enrollment process, a device (e.g., a fingerprint analysis module or a separate security module in communication with the one or more fingerprint sensors) collects biometric information about one or more fingerprints of a user (e.g., identifies the relative locations of a plurality of nodes in the user's fingerprint). After the enrollment process has been completed, the biometric information is stored at the device (e.g., in a secure fingerprint module) for later use in authenticating the detected fingerprint. In some embodiments, the biometric information stored at the device excludes an image of the fingerprint, and also excludes information from which an image of the fingerprint can be reconstructed, so that the image of the fingerprint does not inadvertently become available if the security of the device is compromised. In some embodiments, during an authentication process, a device (e.g., a fingerprint analysis module or a separate security module in communication with the one or more fingerprint sensors) determines whether a finger input detected by the one or more fingerprint sensors includes a fingerprint that matches a fingerprint previously enrolled by collecting biometric information about the fingerprint detected on the one or more fingerprint sensors (e.g., identifies relative locations of a plurality of nodes in the fingerprint detected on the one or more fingerprint sensors) and compares the biometric information corresponding to the detected fingerprint to the biometric information corresponding to the enrolled fingerprint. In some embodiments, comparing the biometric information corresponding to the detected fingerprint to the biometric information corresponding to the enrolled fingerprint includes comparing the type and location of the node in the biometric information corresponding to the detected fingerprint to the type and location of the node in the biometric information corresponding to the enrolled fingerprint. 
However, the determination as to whether the finger input includes a fingerprint that matches a previously enrolled fingerprint that was enrolled using the device is optionally performed using any of a variety of well-known fingerprint authentication techniques to determine whether the detected fingerprint matches the enrolled fingerprint.
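As a rough illustration of the node ("minutiae") comparison described above, the following sketch matches detected nodes to an enrolled template by type and location. It is a toy example under invented node names, tolerances, and a hypothetical match threshold, not the actual matching algorithm of any device.

```python
from math import dist

# Each "node" (minutia) is (type, x, y); the names and values below are invented.
ENROLLED = [
    ("ridge_ending", 12.0, 40.5),
    ("bifurcation", 30.2, 18.9),
    ("bifurcation", 55.7, 62.3),
]

def nodes_match(a, b, tolerance=2.0):
    """Two nodes match if they share a type and lie within `tolerance` units."""
    return a[0] == b[0] and dist(a[1:], b[1:]) <= tolerance

def fingerprint_matches(detected, enrolled=ENROLLED, required_fraction=0.8):
    """Compare the type and location of each detected node against the
    enrolled template; succeed if enough nodes find a counterpart."""
    matched = sum(any(nodes_match(d, e) for e in enrolled) for d in detected)
    return matched >= required_fraction * len(enrolled)
```

Note that, as in the paragraphs above, only derived node data is compared; no fingerprint image is needed at authentication time.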
The device 100 optionally also includes one or more depth camera sensors 175. FIG. 1A shows a depth camera sensor coupled to a depth camera controller 169 in I/O subsystem 106. The depth camera sensor 175 receives data from the environment through its sensor. In conjunction with imaging module 143 (also referred to as a camera module), depth camera sensor 175 is optionally used to determine depth maps of different portions of the images captured by imaging module 143. In some embodiments, the depth camera sensor is located on the front of the device 100 so that a user image with depth information is available for different functions of the device, such as capturing self-portrait images with depth map data during video conferencing and authenticating the user of the device. In some implementations, the position of the depth camera sensor 175 can be changed by the user (e.g., by rotating a lens and sensor in the device housing) such that the depth camera sensor 175 is used with the touch screen display for both video conferencing and still image and/or video image capture.
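As a loose illustration of how a per-pixel depth map for a captured image might be used, the sketch below marks pixels nearer than a cutoff as the foreground subject (e.g., for a self-portrait effect). The 4x4 depth values and the 1.0 m cutoff are invented for illustration.

```python
# Depth in meters for each pixel of a tiny 4x4 capture (values invented).
depth_map = [
    [0.6, 0.6, 2.5, 2.5],
    [0.6, 0.5, 2.4, 2.5],
    [0.7, 0.6, 2.5, 2.6],
    [2.4, 2.5, 2.5, 2.6],
]

def subject_mask(depth_map, cutoff_m=1.0):
    """Mark pixels closer than `cutoff_m` as belonging to the foreground subject."""
    return [[d < cutoff_m for d in row] for row in depth_map]

mask = subject_mask(depth_map)
foreground_pixels = sum(sum(row) for row in mask)
```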
In some embodiments, the software components stored in memory 102 include an operating system 126, a communication module (or set of instructions) 128, a contact/motion module (or set of instructions) 130, a graphics module (or set of instructions) 132, a text input module (or set of instructions) 134, a Global Positioning System (GPS) module (or set of instructions) 135, and an application program (or set of instructions) 136. Further, in some embodiments, memory 102 (fig. 1A) or memory 370 (fig. 3) stores device/global internal state 157, as shown in fig. 1A and 3. Device/global internal state 157 includes one or more of: an active application state indicating which applications (if any) are currently active; display state indicating what applications, views, or other information occupy various areas of the touch screen display 112; sensor status, including information obtained from the various sensors of the device and the input control device 116; and location information regarding the location and/or pose of the device.
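The categories of device/global internal state 157 enumerated above can be sketched as a simple record; all field names below are illustrative assumptions, not the patent's data layout.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DeviceGlobalState:
    """Sketch of the state categories listed above; field names are invented."""
    active_applications: list = field(default_factory=list)  # which apps are active
    display_state: dict = field(default_factory=dict)        # what occupies screen areas
    sensor_state: dict = field(default_factory=dict)         # latest sensor/input readings
    location: Optional[tuple] = None                         # device location and/or pose

state = DeviceGlobalState(active_applications=["browser"])
```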
Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, iOS, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
Communications module 128 facilitates communications with other devices through one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used on iPod (trademark of Apple Inc.) devices.
Contact/motion module 130 optionally detects contact with touch screen 112 (in conjunction with display controller 156) and other touch-sensitive devices (e.g., a trackpad or physical click wheel). The contact/motion module 130 includes various software components for performing various operations related to the detection of contact, such as determining whether contact has occurred (e.g., detecting a finger-down event), determining the intensity of the contact (e.g., the force or pressure of the contact, or a substitute for the force or pressure of the contact), determining whether there is movement of the contact and tracking movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining whether contact has ceased (e.g., detecting a finger-up event or a break in contact). The contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or acceleration (a change in magnitude and/or direction) of the point of contact. These operations are optionally applied to single contacts (e.g., one-finger contacts) or to multiple simultaneous contacts (e.g., "multi-touch"/multiple-finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 detect contact on a touchpad.
In some embodiments, the contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by the user (e.g., determine whether the user has "clicked" on an icon). In some embodiments, at least a subset of the intensity thresholds are determined as a function of software parameters (e.g., the intensity thresholds are not determined by the activation thresholds of particular physical actuators and may be adjusted without changing the physical hardware of device 100). For example, the mouse "click" threshold of the trackpad or touchscreen can be set to any one of a wide range of predefined thresholds without changing the trackpad or touchscreen display hardware. Additionally, in some implementations, a user of the device is provided with software settings for adjusting one or more intensity thresholds of a set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting multiple intensity thresholds at once with a system-level click on an "intensity" parameter).
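The software-adjustable intensity thresholds described above can be sketched as follows. The 0.0-1.0 intensity scale, the threshold names, and their values are assumptions for illustration only; the point is that raising or lowering a threshold changes behavior without any change to the trackpad or touchscreen hardware.

```python
# User-adjustable software settings; no hardware dependency (values invented).
settings = {"light_press": 0.25, "deep_press": 0.75}

def classify_press(intensity, thresholds=settings):
    """Map a raw contact intensity (0.0-1.0) to an interaction level."""
    if intensity >= thresholds["deep_press"]:
        return "deep press"
    if intensity >= thresholds["light_press"]:
        return "light press"
    return "no press"

# Adjusting one threshold in software changes classification; hardware untouched.
settings["deep_press"] = 0.9
```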
The contact/motion module 130 optionally detects gesture input by the user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts). Thus, the gesture is optionally detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event, and then detecting a finger-up (lift-off) event at the same location (or substantially the same location) as the finger-down event (e.g., at the location of the icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event, then detecting one or more finger-dragging events, and then subsequently detecting a finger-up (lift-off) event.
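The contact-pattern idea above (a tap lifts off at substantially the same location as the finger-down event; a swipe does not) can be sketched as a toy classifier. The event tuple format and the 10-unit movement cutoff are assumptions, not values from the patent.

```python
from math import dist

def classify_gesture(events, max_tap_movement=10.0):
    """`events` is a list of ("down" | "move" | "up", x, y) tuples."""
    if not events or events[0][0] != "down" or events[-1][0] != "up":
        return "unrecognized"
    start, end = events[0][1:], events[-1][1:]
    # Lift-off at (substantially) the same location => tap; otherwise swipe.
    return "tap" if dist(start, end) <= max_tap_movement else "swipe"
```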
Graphics module 132 includes various known software components for rendering and displaying graphics on touch screen 112 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast, or other visual attributes) of the displayed graphics. As used herein, the term "graphic" includes any object that may be displayed to a user, including without limitation text, web pages, icons (such as user interface objects including soft keys), digital images, videos, animations and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is optionally assigned a corresponding code. The graphics module 132 receives one or more codes specifying graphics to be displayed, if necessary together with coordinate data and other graphics attribute data, from an application program or the like, and then generates screen image data to output to the display controller 156.
Haptic feedback module 133 includes various software components for generating instructions for use by haptic output generator 167 in generating haptic outputs at one or more locations on device 100 in response to user interaction with device 100.
Text input module 134, which is optionally a component of graphics module 132, provides a soft keyboard for entering text in various applications (e.g., contacts 137, email 140, IM 141, browser 147, and any other application that requires text input).
The GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to the phone 138 for use in location-based dialing; to the camera 143 as picture/video metadata; and to applications that provide location-based services, such as weather desktop applets, local yellow pages desktop applets, and map/navigation desktop applets).
Application 136 optionally includes the following modules (or sets of instructions), or a subset or superset thereof:
contacts module 137 (sometimes referred to as an address book or contact list);
a phone module 138;
a video conferencing module 139;
an email client module 140;
an Instant Messaging (IM) module 141;
fitness support module 142;
a camera module 143 for still and/or video images;
an image management module 144;
a video player module;
a music player module;
a browser module 147;
a calendar module 148;
desktop applet module 149, optionally including one or more of: a weather desktop applet 149-1, a stock market desktop applet 149-2, a calculator desktop applet 149-3, an alarm desktop applet 149-4, a dictionary desktop applet 149-5, and other desktop applets acquired by the user, and a user created desktop applet 149-6;
a desktop applet creator module 150 for forming a user-created desktop applet 149-6;
a search module 151;
a video and music player module 152 that incorporates a video player module and a music player module;
a notepad module 153;
a map module 154; and/or
Online video module 155.
Examples of other applications 136 optionally stored in memory 102 include other word processing applications, other image editing applications, drawing applications, rendering applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, contacts module 137 is optionally used to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding one or more names to the address book; deleting one or more names from the address book; associating a telephone number, email address, physical address, or other information with a name; associating an image with a name; sorting and categorizing names; providing a telephone number or email address to initiate and/or facilitate communication through telephone 138, video conferencing module 139, email 140, or IM 141; and so on.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, phone module 138 is optionally used to enter a sequence of characters corresponding to a phone number, access one or more phone numbers in contacts module 137, modify an entered phone number, dial a corresponding phone number, conduct a conversation, and disconnect or hang up when the conversation is complete. As noted above, the wireless communication optionally uses any of a variety of communication standards, protocols, and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact/motion module 130, graphics module 132, text input module 134, contacts module 137, and telephony module 138, video conference module 139 includes executable instructions for initiating, conducting, and terminating video conferences between the user and one or more other participants according to user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, email client module 140 includes executable instructions for creating, sending, receiving, and managing emails in response to user instructions. In conjunction with the image management module 144, the email client module 140 makes it very easy to create and send an email with a still image or a video image captured by the camera module 143.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, instant messaging module 141 includes executable instructions for: entering a sequence of characters corresponding to an instant message, modifying previously entered characters, transmitting a corresponding instant message (e.g., using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for phone-based instant messages or using XMPP, SIMPLE, or IMPS for internet-based instant messages), receiving an instant message, and viewing the received instant message. In some embodiments, the transmitted and/or received instant messages optionally include graphics, photos, audio files, video files, and/or MMS and/or other attachments supported in an Enhanced Messaging Service (EMS). As used herein, "instant message" refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module, workout support module 142 includes executable instructions to create a workout (e.g., having time, distance, and/or calorie burning goals); communicating with fitness sensors (sports equipment); receiving fitness sensor data; calibrating a sensor for monitoring fitness; selecting and playing music for fitness; and displaying, storing and transmitting fitness data.
In conjunction with touch screen 112, display controller 156, one or more optical sensors 164, optical sensor controller 158, contact/motion module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions for: capturing still images or video (including video streams) and storing them in the memory 102, modifying features of the still images or video, or deleting the still images or video from the memory 102.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions for: arrange, modify (e.g., edit), or otherwise manipulate, tag, delete, present (e.g., in a digital slide or album), and store still and/or video images.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions for browsing the internet (including searching for, linking to, receiving, and displaying web pages or portions thereof, and attachments and other files linked to web pages) according to user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, email client module 140, and browser module 147, calendar module 148 includes executable instructions for creating, displaying, modifying, and storing calendars and data associated with calendars (e.g., calendar entries, to-do items, etc.) according to user instructions.
In conjunction with the RF circuitry 108, the touch screen 112, the display system controller 156, the contact/motion module 130, the graphics module 132, the text input module 134, and the browser module 147, the desktop applet modules 149 are mini-applications that are optionally downloaded and used by a user (e.g., a weather desktop applet 149-1, a stock market desktop applet 149-2, a calculator desktop applet 149-3, an alarm clock desktop applet 149-4, and a dictionary desktop applet 149-5) or created by the user (e.g., a user-created desktop applet 149-6). In some embodiments, a desktop applet includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a desktop applet includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! desktop applets).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, the desktop applet creator module 150 is optionally used by a user to create a desktop applet (e.g., to turn a user-specified portion of a web page into a desktop applet).
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions for searching memory 102 for text, music, sound, images, videos, and/or other files that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speakers 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow a user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, as well as executable instructions for displaying, presenting, or otherwise playing back videos (e.g., on touch screen 112 or on an external display connected via external port 124). In some embodiments, the device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, notepad module 153 includes executable instructions to create and manage notepads, backlogs, and the like according to user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 is optionally used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions, data related to stores and other points of interest at or near a particular location, and other location-based data) according to user instructions.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, email client module 140, and browser module 147, online video module 155 includes instructions that allow a user to access, browse, receive (e.g., by streaming and/or downloading), play back (e.g., on the touch screen or on an external display connected via external port 124), send an email with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, a link to a particular online video is sent using instant messaging module 141 rather than email client module 140. Additional description of the online video application can be found in U.S. Provisional Patent Application No. 60/936,562, "Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos," filed June 20, 2007, and U.S. Patent Application No. 11/968,067, "Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos," filed December 31, 2007, both of which are hereby incorporated by reference in their entirety.
Each of the modules and applications described above corresponds to a set of executable instructions for performing one or more of the functions described above as well as the methods described in this patent application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (e.g., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. For example, the video player module is optionally combined with the music player module into a single module (e.g., the video and music player module 152 in fig. 1A). In some embodiments, memory 102 optionally stores a subset of the modules and data structures described above. Further, memory 102 optionally stores additional modules and data structures not described above.
In some embodiments, device 100 is a device on which the operation of a predefined set of functions is performed exclusively through a touch screen and/or a trackpad. By using a touch screen and/or trackpad as the primary input control device for operating the device 100, the number of physical input control devices (e.g., push buttons, dials, and the like) on the device 100 is optionally reduced.
The predefined set of functions performed exclusively through the touchscreen and/or trackpad optionally includes navigation between user interfaces. In some embodiments, the trackpad, when touched by a user, navigates device 100 from any user interface displayed on device 100 to a main, home, or root menu. In such embodiments, a touchpad is used to implement a "menu button". In some other embodiments, the menu button is a physical push button or other physical input control device, rather than a touchpad.
Fig. 1B is a block diagram illustrating exemplary components for event handling, according to some embodiments. In some embodiments, memory 102 (FIG. 1A) or memory 370 (FIG. 3) includes event sorter 170 (e.g., in operating system 126) and a respective application 136-1 (e.g., any of the aforementioned applications 137-151, 155, 380-390).
Event sorter 170 receives event information and determines the application 136-1, and the application view 191 of application 136-1, to which the event information is to be delivered. The event sorter 170 includes an event monitor 171 and an event dispatcher module 174. In some embodiments, application 136-1 includes an application internal state 192 that indicates the current application view(s) displayed on touch-sensitive display 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) are currently active, and application internal state 192 is used by event sorter 170 to determine the application view 191 to which to deliver event information.
In some embodiments, the application internal state 192 includes additional information, such as one or more of: resume information to be used when the application 136-1 resumes execution, user interface state information indicating information being displayed by the application 136-1 or ready for display by the application, a state queue for enabling a user to return to a previous state or view of the application 136-1, and a repeat/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripheral interface 118. The event information includes information about a sub-event (e.g., a user touch on touch-sensitive display 112 as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or sensors such as proximity sensor 166, accelerometer 168, and/or microphone 113 (through audio circuitry 110). Information received by peripheral interface 118 from I/O subsystem 106 includes information from touch-sensitive display 112 or a touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to peripheral interface 118 at predetermined intervals. In response, peripheral interface 118 transmits the event information. In other embodiments, peripheral interface 118 transmits event information only when there is a significant event (e.g., receiving input above a predetermined noise threshold and/or receiving input for more than a predetermined duration).
In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
When touch-sensitive display 112 displays more than one view, hit view determination module 172 provides a software process for determining where within one or more views a sub-event has occurred. The view consists of controls and other elements that the user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes referred to herein as application views or user interface windows, in which information is displayed and touch-based gestures occur. The application view (of the respective application) in which the touch is detected optionally corresponds to a programmatic level within a programmatic or view hierarchy of applications. For example, the lowest level view in which a touch is detected is optionally referred to as a hit view, and the set of events identified as correct inputs is optionally determined based at least in part on the hit view of the initial touch that initiated the touch-based gesture.
Hit view determination module 172 receives information related to sub-events of the touch-based gesture. When the application has multiple views organized in a hierarchy, hit view determination module 172 identifies the hit view as the lowest view in the hierarchy that should handle the sub-event. In most cases, the hit view is the lowest level view in which the initiating sub-event (e.g., the first sub-event in the sequence of sub-events that form an event or potential event) occurs. Once the hit view is identified by hit view determination module 172, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
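Hit-view determination as described above (identifying the lowest view in the hierarchy that contains the initiating sub-event) can be sketched with a toy view tree. The `View` class, its frame format, and the sample hierarchy below are illustrative assumptions, not the patent's API.

```python
class View:
    def __init__(self, name, frame, subviews=()):
        self.name = name
        self.frame = frame          # (x, y, width, height)
        self.subviews = list(subviews)

    def contains(self, x, y):
        fx, fy, fw, fh = self.frame
        return fx <= x < fx + fw and fy <= y < fy + fh

def hit_view(view, x, y):
    """Depth-first search for the lowest view whose frame contains the point."""
    if not view.contains(x, y):
        return None
    for sub in view.subviews:       # prefer a deeper view if one matches
        found = hit_view(sub, x, y)
        if found is not None:
            return found
    return view

root = View("window", (0, 0, 320, 480), [
    View("toolbar", (0, 0, 320, 44), [View("button", (10, 5, 60, 34))]),
])
```

Once identified, the hit view would then receive all sub-events of the same touch, as the paragraph above describes.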
The active event recognizer determination module 173 determines which view or views within the view hierarchy should receive a particular sequence of sub-events. In some implementations, the active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of the sub-event are actively participating views, and thus determines that all actively participating views should receive a particular sequence of sub-events. In other embodiments, even if the touch sub-event is completely confined to the area associated with a particular view, the higher views in the hierarchy will remain actively participating views.
The event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments that include active event recognizer determination module 173, event dispatcher module 174 delivers event information to event recognizers determined by active event recognizer determination module 173. In some embodiments, the event dispatcher module 174 stores event information in an event queue, which is retrieved by the respective event receiver 182.
In some embodiments, the operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, the event sorter 170 is a stand-alone module or is part of another module stored in the memory 102 (such as the contact/motion module 130).
In some embodiments, the application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of the event recognizers 180 are part of a separate module, such as a user interface toolkit (not shown) or a higher-level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or calls data updater 176, object updater 177, or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 include one or more respective event handlers 190. Additionally, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (e.g., event data 179) from the event sorter 170 and identifies an event from the event information. The event recognizer 180 includes an event receiver 182 and an event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which optionally include sub-event delivery instructions).
The event receiver 182 receives event information from the event sorter 170. The event information includes information about a sub-event, such as a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as the location of the sub-event. When the sub-event concerns motion of a touch, the event information optionally also includes the speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation of the device (also referred to as the device attitude).
Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), such as event 1 (187-1), event 2 (187-2), and other events. In some embodiments, sub-events in an event (187) include, for example, touch start, touch end, touch move, touch cancel, and multi-touch. In one example, the definition of event 1 (187-1) is a double tap on a displayed object. The double tap, for example, includes a first touch (touch start) on the displayed object for a predetermined length of time, a first lift-off (touch end) for a predetermined length of time, a second touch (touch start) on the displayed object for a predetermined length of time, and a second lift-off (touch end) for a predetermined length of time. In another example, the definition of event 2 (187-2) is a drag on a displayed object. The drag, for example, includes a touch (or contact) on the displayed object for a predetermined length of time, movement of the touch across the touch-sensitive display 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
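The double-tap definition above can be illustrated as matching a recorded stream of sub-events against a predefined sequence. This is only a minimal sketch under assumed names (the patent does not specify an implementation); timing thresholds for the "predetermined length of time" are omitted for brevity.

```python
# Hypothetical sketch of an event definition as a predefined sub-event
# sequence, as described in the text above. Names are illustrative.
DOUBLE_TAP = ["touch_start", "touch_end", "touch_start", "touch_end"]

def matches_double_tap(sub_events):
    """Return True if the recorded sub-events form a double tap."""
    kinds = [e["kind"] for e in sub_events]
    return kinds == DOUBLE_TAP

gesture = [
    {"kind": "touch_start"},
    {"kind": "touch_end"},
    {"kind": "touch_start"},
    {"kind": "touch_end"},
]
print(matches_double_tap(gesture))  # True
```

A real event comparator would additionally check that each touch and lift-off stays within its predetermined duration and that both taps land on the same displayed object.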
In some embodiments, event definition 187 includes definitions of events for respective user interface objects. In some embodiments, event comparator 184 performs a hit test to determine which user interface object is associated with a sub-event. For example, in an application view that displays three user interface objects on touch-sensitive display 112, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit-test to determine which of the three user interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the results of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects the event handler associated with the sub-event and the object that triggered the hit test.
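The hit test described above can be sketched as a point-in-rectangle search over the displayed objects, returning the object (and thus the handler) associated with the touch location. The object layout and handler names here are hypothetical, used only to illustrate the mechanism.

```python
# Illustrative hit test: find which displayed object's frame contains
# the touch point, then use it to select the associated event handler.
def hit_test(objects, x, y):
    # Iterate front-to-back; the first object containing the point wins.
    for obj in objects:
        ox, oy, w, h = obj["frame"]
        if ox <= x < ox + w and oy <= y < oy + h:
            return obj
    return None

views = [
    {"name": "button", "frame": (0, 0, 100, 40), "handler": "tap_handler"},
    {"name": "slider", "frame": (0, 50, 100, 40), "handler": "drag_handler"},
]
hit = hit_test(views, 20, 60)
print(hit["handler"])  # drag_handler
```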
In some embodiments, the definition of a respective event 187 also includes a delayed action that delays delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event type of the event recognizer.
When a respective event recognizer 180 determines that the sequence of sub-events does not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers (if any) that remain active for the hit view continue to track and process sub-events of the ongoing touch-based gesture.
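The failure behavior above amounts to a small state machine: a recognizer stays "possible" while the sub-event stream matches its definition, and once it diverges it enters a terminal state and ignores everything after. The following sketch is illustrative only; the class, state names, and definition format are assumptions, not the patent's implementation.

```python
class EventRecognizer:
    """Minimal sketch of a per-gesture recognizer: once the sub-event
    sequence diverges from the definition, the recognizer enters a
    'failed' state and disregards subsequent sub-events."""

    def __init__(self, definition):
        self.definition = definition
        self.index = 0
        self.state = "possible"

    def feed(self, sub_event):
        if self.state != "possible":
            return self.state  # terminal state: ignore further sub-events
        if sub_event == self.definition[self.index]:
            self.index += 1
            if self.index == len(self.definition):
                self.state = "recognized"
        else:
            self.state = "failed"
        return self.state

r = EventRecognizer(["touch_start", "touch_end"])
r.feed("touch_start")
print(r.feed("touch_move"))  # failed
```

Other recognizers attached to the same hit view would each run their own instance of such a state machine over the same sub-event stream, which is how the still-active recognizers continue tracking the gesture after one fails.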
In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
In some embodiments, a respective event recognizer 180 activates the event handler 190 associated with an event when one or more particular sub-events of the event are recognized. In some embodiments, the respective event recognizer 180 delivers event information associated with the event to the event handler 190. Activating an event handler 190 is distinct from sending (and deferred sending of) sub-events to a respective hit view. In some embodiments, the event recognizer 180 throws a flag associated with the recognized event, and the event handler 190 associated with the flag catches the flag and performs a predefined process.
In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about sub-events without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the sequence of sub-events or to actively participating views. Event handlers associated with the sequence of sub-events or with actively participating views receive the event information and perform a predetermined process.
In some embodiments, data updater 176 creates and updates data used in application 136-1. For example, the data updater 176 updates a phone number used in the contacts module 137 or stores a video file used in the video player module. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user interface object or updates the location of a user interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends the display information to graphics module 132 for display on the touch-sensitive display.
In some embodiments, event handler 190 includes, or has access to, data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
Fig. 1C is a block diagram illustrating a haptic output module according to some embodiments. In some embodiments, I/O subsystem 106 (e.g., haptic feedback controller 161 (fig. 1A) and/or other input controller 160 (fig. 1A)) includes at least some of the example components shown in fig. 1C. In some embodiments, peripheral interface 118 includes at least some of the example components shown in fig. 1C.
In some embodiments, the haptic output module includes a haptic feedback module 133. In some embodiments, the haptic feedback module 133 aggregates and combines tactile outputs for user interface feedback from software applications on the electronic device (e.g., feedback that is responsive to user inputs corresponding to displayed user interfaces, as well as alerts and other notifications that indicate the performance of operations or the occurrence of events in user interfaces of the electronic device). The haptic feedback module 133 includes one or more of: a waveform module 123 (for providing waveforms used for generating tactile outputs), a mixer 125 (for mixing waveforms, such as waveforms in different channels), a compressor 127 (for reducing or compressing the dynamic range of the waveforms), a low-pass filter 129 (for filtering out high-frequency signal components in the waveforms), and a thermal controller 131 (for adjusting the waveforms in accordance with thermal conditions). In some embodiments, the haptic feedback module 133 is included in the haptic feedback controller 161 (fig. 1A). In some embodiments, a separate unit of the haptic feedback module 133 (or a separate implementation of the haptic feedback module 133) is also included in an audio controller (e.g., the audio circuit 110 in fig. 1A) and used to generate audio signals. In some embodiments, a single haptic feedback module 133 is used both to generate audio signals and to generate waveforms for tactile outputs.
In some embodiments, the haptic feedback module 133 also includes a trigger module 121 (e.g., a software application, operating system, or other software module that determines that a tactile output is to be generated and initiates the process for generating the corresponding tactile output). In some embodiments, the trigger module 121 generates trigger signals for initiating the generation of waveforms (e.g., by the waveform module 123). For example, the trigger module 121 generates trigger signals based on preset timing criteria. In some embodiments, the trigger module 121 receives trigger signals from outside the haptic feedback module 133 (e.g., in some embodiments, the haptic feedback module 133 receives trigger signals from a hardware input processing module 146 located outside the haptic feedback module 133) and relays the trigger signals to other components within the haptic feedback module 133 (e.g., the waveform module 123) or to software applications that trigger operations (e.g., with the trigger module 121) based on activation of a user interface element (e.g., an application icon or an affordance within an application) or a hardware input device (e.g., a home button or an intensity-sensitive input surface, such as an intensity-sensitive touch screen). In some embodiments, the trigger module 121 also receives tactile feedback generation instructions (e.g., from the haptic feedback module 133, figs. 1A and 3). In some embodiments, the trigger module 121 generates trigger signals in response to the haptic feedback module 133 (or the trigger module 121 in the haptic feedback module 133) receiving tactile feedback generation instructions (e.g., from the haptic feedback module 133, figs. 1A and 3).
The waveform module 123 receives as input a trigger signal (e.g., from the trigger module 121) and provides waveforms for generating one or more tactile outputs (e.g., waveforms selected from a predefined set of waveforms assigned for use by the waveform module 123, such as the waveforms described in more detail below with reference to fig. 4C-4D) in response to receiving the trigger signal.
The mixer 125 receives waveforms as input (e.g., from the waveform module 123) and mixes the waveforms together. For example, when the mixer 125 receives two or more waveforms (e.g., a first waveform in a first channel and a second waveform in a second channel that at least partially overlaps the first waveform), the mixer 125 outputs a combined waveform corresponding to the sum of the two or more waveforms. In some embodiments, the mixer 125 also modifies one or more of the two or more waveforms to emphasize a particular waveform relative to the rest of the two or more waveforms (e.g., by increasing the scale of the particular waveform and/or decreasing the scale of the other of the waveforms). In some cases, mixer 125 selects one or more waveforms to remove from the combined waveform (e.g., when there are waveforms from more than three sources that have been requested to be output by tactile output generator 167 simultaneously, the waveform from the oldest source is discarded).
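The mixing behavior described above, summing overlapping waveforms into one combined waveform, can be sketched as a per-sample sum across channels. This is an illustrative model only (waveforms as plain sample lists); the patent does not specify a representation.

```python
# Illustrative mixer: output the per-sample sum of two or more
# waveforms, padding shorter waveforms with silence (0.0).
def mix(waveforms):
    length = max(len(w) for w in waveforms)
    return [sum(w[i] if i < len(w) else 0.0 for w in waveforms)
            for i in range(length)]

a = [0.5, 0.5, 0.5]    # first channel
b = [0.25, 0.25]       # second channel, partially overlapping
print(mix([a, b]))      # [0.75, 0.75, 0.5]
```

Emphasizing one waveform over the rest, as the text also describes, would amount to multiplying the chosen channel by a gain greater than 1 (and/or the others by a gain less than 1) before summing.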
Compressor 127 receives as input waveforms (e.g., the combined waveforms from mixer 125) and modifies these waveforms. In some embodiments, compressor 127 reduces the waveforms (e.g., according to the physical specifications of tactile output generator 167 (fig. 1A) or 357 (fig. 3)) such that the tactile outputs corresponding to the waveforms are reduced. In some embodiments, the compressor 127 limits the waveform, such as by imposing a predefined maximum amplitude for the waveform. For example, the compressor 127 reduces the amplitude of the waveform portions that exceed a predefined amplitude threshold, while maintaining the amplitude of the waveform portions that do not exceed the predefined amplitude threshold. In some implementations, the compressor 127 reduces the dynamic range of the waveform. In some embodiments, compressor 127 dynamically reduces the dynamic range of the waveform such that the combined waveform remains within the performance specifications (e.g., force and/or movable mass displacement limits) of tactile output generator 167.
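The amplitude-limiting behavior described for compressor 127, reducing portions that exceed a predefined threshold while leaving the rest untouched, can be sketched as a hard clamp. A real compressor would typically apply a smoother gain curve; this sketch shows only the limiting case named in the text.

```python
# Illustrative limiter: clamp samples whose magnitude exceeds the
# predefined maximum amplitude; samples within range pass unchanged.
def compress(waveform, max_amplitude=1.0):
    return [max(-max_amplitude, min(max_amplitude, s)) for s in waveform]

print(compress([0.4, 1.6, -2.0], max_amplitude=1.0))  # [0.4, 1.0, -1.0]
```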
Low-pass filter 129 receives waveforms as input (e.g., compressed waveforms from compressor 127) and filters (e.g., smooths) the waveforms (e.g., removes or reduces high-frequency signal components in the waveforms). For example, in some instances, compressor 127 introduces extraneous signals (e.g., high-frequency signal components) into the compressed waveforms that interfere with the generation of tactile outputs and/or exceed the performance specifications of haptic output generator 167 when tactile outputs are generated in accordance with the compressed waveforms. Low-pass filter 129 reduces or removes such extraneous signals in the waveforms.
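A low-pass filter of the kind described can be sketched as a one-pole smoother: each output sample blends the new input with the previous output, which attenuates rapid (high-frequency) changes. The filter form and the coefficient `alpha` are assumptions for illustration; the patent does not specify the filter design.

```python
# Illustrative one-pole low-pass filter: high-frequency components
# (rapid sample-to-sample swings) are attenuated, smoothing the waveform.
def low_pass(waveform, alpha=0.5):
    out, prev = [], 0.0
    for s in waveform:
        prev = alpha * s + (1 - alpha) * prev  # blend input with history
        out.append(prev)
    return out

# A rapidly alternating (high-frequency) input is visibly damped:
print(low_pass([1.0, -1.0, 1.0, -1.0], alpha=0.5))
# [0.5, -0.25, 0.375, -0.3125]
```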
Thermal controller 131 receives as input a waveform (e.g., a filtered waveform from low pass filter 129) and adjusts the waveform according to a thermal condition of apparatus 100 (e.g., based on an internal temperature detected within apparatus 100, such as a temperature of tactile feedback controller 161, and/or an external temperature detected by apparatus 100). For example, in some cases, the output of the tactile feedback controller 161 varies as a function of temperature (e.g., in response to receiving the same waveform, the tactile feedback controller 161 generates a first tactile output when the tactile feedback controller 161 is at a first temperature and a second tactile output when the tactile feedback controller 161 is at a second temperature different from the first temperature). For example, the magnitude of the haptic output may vary as a function of temperature. To reduce the effects of temperature variations, the waveform is modified (e.g., the amplitude of the waveform is increased or decreased based on temperature).
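The temperature compensation described above, scaling the waveform amplitude to counteract temperature-dependent actuator output, can be sketched as a simple gain model. The linear model and its coefficients here are assumptions purely for illustration; the patent describes only that the waveform is modified based on temperature, not how.

```python
# Illustrative thermal compensation under an ASSUMED linear model:
# actuator output is taken to drop ~0.4% per degree C above nominal,
# so the waveform gain is raised correspondingly. Coefficients are
# hypothetical, not from the patent.
def thermal_adjust(waveform, temperature_c, nominal_c=25.0,
                   gain_per_deg=0.004):
    gain = 1.0 + gain_per_deg * (temperature_c - nominal_c)
    return [s * gain for s in waveform]

# At the nominal temperature the waveform passes through unchanged;
# at higher temperatures its amplitude is boosted to compensate.
print(thermal_adjust([1.0, -1.0], temperature_c=25.0))
```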
In some embodiments, the haptic feedback module 133 (e.g., trigger module 121) is coupled to the hardware input processing module 146. In some embodiments, the other input controller 160 in fig. 1A includes the hardware input processing module 146. In some embodiments, the hardware input processing module 146 receives inputs from a hardware input device 145 (e.g., the other input or control device 116 in fig. 1A, such as a home button, or an intensity-sensitive input surface, such as an intensity-sensitive touch screen). In some embodiments, the hardware input device 145 is any input device described herein, such as one of the touch-sensitive display system 112 (fig. 1A), the keyboard/mouse 350 (fig. 3), the trackpad 355 (fig. 3), the other input or control device 116 (fig. 1A), or an intensity-sensitive home button. In some embodiments, hardware input device 145 consists of an intensity-sensitive home button, and not touch-sensitive display system 112 (fig. 1A), keyboard/mouse 350 (fig. 3), or trackpad 355 (fig. 3). In some embodiments, in response to inputs from the hardware input device 145 (e.g., an intensity-sensitive home button or a touch screen), the hardware input processing module 146 provides one or more trigger signals to the haptic feedback module 133 to indicate that a user input satisfying predefined input criteria has been detected, such as an input corresponding to a home-button "click" (e.g., a "down click" or an "up click"). In some embodiments, the haptic feedback module 133 provides a waveform corresponding to the home-button "click" in response to an input corresponding to the home-button "click", simulating the haptic feedback of pressing a physical home button.
In some embodiments, the haptic output module includes a haptic feedback controller 161 (e.g., haptic feedback controller 161 in fig. 1A) that controls the generation of haptic outputs. In some embodiments, the tactile feedback controller 161 is coupled to a plurality of tactile output generators and selects one or more of the plurality of tactile output generators and sends a waveform to the selected one or more tactile output generators for generating a tactile output. In some embodiments, the haptic feedback controller 161 coordinates haptic output requests corresponding to activating the hardware input device 145 and haptic output requests corresponding to software events (e.g., haptic output requests from the haptic feedback module 133) and modifies one or more of the two or more waveforms to emphasize a particular waveform relative to the rest of the two or more waveforms (e.g., by increasing the scale of the particular waveform and/or decreasing the scale of the rest of the waveforms to preferentially process haptic output corresponding to activating the hardware input device 145 over haptic output corresponding to software events).
In some embodiments, as shown in fig. 1C, the output of haptic feedback controller 161 is coupled to an audio circuit of device 100 (e.g., audio circuit 110 in fig. 1A) and provides an audio signal to the audio circuit of device 100. In some embodiments, the tactile feedback controller 161 provides both a waveform for generating a tactile output and an audio signal for providing an audio output in conjunction with generating the tactile output. In some embodiments, the tactile feedback controller 161 modifies the audio signal and/or waveform (used to generate the haptic output) such that the audio output and the haptic output are synchronized (e.g., by delaying the audio signal and/or waveform). In some embodiments, the tactile feedback controller 161 includes a digital-to-analog converter for converting a digital waveform to an analog signal, which is received by the amplifier 163 and/or the haptic output generator 167.
In some embodiments, the haptic output module includes an amplifier 163. In some embodiments, amplifier 163 receives a waveform (e.g., from haptic feedback controller 161) and amplifies the waveform and then sends the amplified waveform to haptic output generator 167 (e.g., either of haptic output generators 167 (fig. 1A) or 357 (fig. 3)). For example, amplifier 163 amplifies the received waveform to a signal level that meets the physical specifications of tactile output generator 167 (e.g., to a voltage and/or current required by tactile output generator 167 to generate a tactile output such that the signal sent to tactile output generator 167 generates a tactile output corresponding to the waveform received from tactile feedback controller 161) and sends the amplified waveform to tactile output generator 167. In response, tactile output generator 167 generates a tactile output (e.g., by shifting the movable mass back and forth in one or more dimensions relative to a neutral position of the movable mass).
In some embodiments, the haptic output module includes a sensor 169 coupled to a haptic output generator 167. Sensor 169 detects a state or change in state (e.g., mechanical position, physical displacement, and/or movement) of tactile output generator 167 or one or more components of tactile output generator 167 (e.g., one or more moving components, such as a membrane, used to generate tactile output). In some embodiments, the sensor 169 is a magnetic field sensor (e.g., a hall effect sensor) or other displacement and/or motion sensor. In some embodiments, sensor 169 provides information (e.g., position, displacement, and/or movement of one or more components in tactile output generator 167) to tactile feedback controller 161, and tactile feedback controller 161 adjusts the waveform output from tactile feedback controller 161 (e.g., the waveform optionally sent to tactile output generator 167 via amplifier 163) based on the information provided by sensor 169 regarding the state of tactile output generator 167.
It should be understood that the above discussion of event handling of user touches on touch-sensitive displays also applies to other forms of user input used to operate multifunction device 100 with an input device, not all of which are initiated on a touch screen. For example, mouse movements and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements on a touchpad, such as taps, drags, scrolls, etc.; stylus inputs; movements of the device; verbal instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally used as inputs corresponding to sub-events that define an event to be recognized.
Fig. 2 illustrates a portable multifunction device 100 with a touch screen 112 in accordance with some embodiments. The touch screen optionally displays one or more graphics within the User Interface (UI) 200. In this embodiment, as well as other embodiments described below, a user can select one or more of these graphics by making gestures on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figures) or with one or more styluses 203 (not drawn to scale in the figures). In some embodiments, selection of one or more graphics will occur when the user breaks contact with the one or more graphics. In some embodiments, the gesture optionally includes one or more taps, one or more swipes (left to right, right to left, up, and/or down), and/or a rolling of a finger (right to left, left to right, up, and/or down) that has made contact with device 100. In some implementations, or in some cases, inadvertent contact with a graphic does not select the graphic. For example, when the gesture corresponding to the selection is a tap, a swipe gesture that sweeps over an application icon optionally does not select the corresponding application.
Device 100 optionally also includes one or more physical buttons, such as a "home" or menu button 204. As previously described, the menu button 204 is optionally used to navigate to any application 136 in a set of applications that are optionally executed on the device 100. Alternatively, in some embodiments, the menu buttons are implemented as soft keys in a GUI displayed on touch screen 112.
In some embodiments, device 100 includes touch screen 112, menu button 204, push button 206 for powering the device on/off and for locking the device, one or more volume adjustment buttons 208, a Subscriber Identity Module (SIM) card slot 210, a headset jack 212, and docking/charging external port 124. Push button 206 is optionally used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, device 100 also accepts verbal input through microphone 113 for activation or deactivation of some functions. Device 100 also optionally includes one or more contact intensity sensors 165 for detecting the intensity of contacts on touch screen 112, and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100.
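The push-button behavior described, where the same button either toggles power or locks/unlocks depending on how long it is held, can be sketched as a hold-duration check. The threshold value and action names below are hypothetical, used only to illustrate the branching described in the text.

```python
# Illustrative sketch of the push-button logic described above: a hold
# past a predefined interval toggles power; a shorter press locks the
# device (or begins unlocking it if already locked). The 2-second
# threshold is an assumption, not a value from the patent.
def button_action(hold_seconds, threshold=2.0, device_locked=True):
    if hold_seconds >= threshold:
        return "toggle_power"
    return "unlock" if device_locked else "lock"

print(button_action(3.0))                        # toggle_power
print(button_action(0.3, device_locked=False))   # lock
```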
Fig. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments. The device 300 need not be portable. In some embodiments, the device 300 is a laptop computer, desktop computer, tablet computer, multimedia player device, navigation device, educational device (such as a child learning toy), gaming system, or control device (e.g., a home controller or industrial controller). Device 300 typically includes one or more processing units (CPUs) 310, one or more network or other communication interfaces 360, memory 370, and one or more communication buses 320 for interconnecting these components. The communication bus 320 optionally includes circuitry (sometimes referred to as a chipset) that interconnects and controls communication between system components. Device 300 includes an input/output (I/O) interface 330 with a display 340, typically a touch screen display. I/O interface 330 also optionally includes a keyboard and/or mouse (or other pointing device) 350 and a touchpad 355, a tactile output generator 357 (e.g., similar to tactile output generator 167 described above with reference to fig. 1A) for generating tactile outputs on device 300, sensors 359 (e.g., optical sensors, acceleration sensors, proximity sensors, touch-sensitive sensors, and/or contact intensity sensors (similar to contact intensity sensors 165 described above with reference to fig. 1A)). Memory 370 includes high speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices; and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. Memory 370 optionally includes one or more storage devices located remotely from CPU 310. 
In some embodiments, memory 370 stores programs, modules, and data structures similar to or a subset of the programs, modules, and data structures stored in memory 102 of portable multifunction device 100 (fig. 1A). Further, memory 370 optionally stores additional programs, modules, and data structures not present in memory 102 of portable multifunction device 100. For example, memory 370 of device 300 optionally stores drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk editing module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (FIG. 1A) optionally does not store these modules.
Each of the above elements in fig. 3 is optionally stored in one or more of the previously mentioned memory devices. Each of the above modules corresponds to a set of instructions for performing a function described above. The modules or programs (e.g., sets of instructions) described above need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. In some embodiments, memory 370 optionally stores a subset of the modules and data structures described above. Further, memory 370 optionally stores additional modules and data structures not described above.
Attention is now directed to embodiments of user interfaces optionally implemented on, for example, portable multifunction device 100.
Fig. 4A illustrates an exemplary user interface for a menu of applications on portable multifunction device 100, according to some embodiments. A similar user interface is optionally implemented on device 300. In some embodiments, the user interface 400 includes the following elements, or a subset or superset thereof:
one or more signal strength indicators 402 for one or more wireless communications (such as cellular signals and Wi-Fi signals);
time 404;
a bluetooth indicator 405;
a battery status indicator 406;
tray 408 with icons of common applications, such as:
an icon 416 of the telephony module 138 labeled "telephony", optionally including an indicator 414 of the number of missed calls or voice messages;
an icon 418 of the email client module 140 labeled "mail", optionally including an indicator 410 of the number of unread emails;
icon 420 of browser module 147, labeled "browser"; and
an icon 422 labeled "iPod" of video and music player module 152 (also referred to as iPod (trademark of Apple inc.) module 152); and
icons for other applications, such as:
icon 424 of IM module 141 labeled "message";
icon 426 of calendar module 148 labeled "calendar";
icon 428 of image management module 144 labeled "photo";
icon 430 of camera module 143 labeled "camera";
icon 432 of online video module 155 labeled "online video";
an icon 434 of the stock market desktop applet 149-2 labeled "stock market";
icon 436 of map module 154 labeled "map";
icon 438 labeled "weather" for weather desktop applet 149-1;
icon 440 of alarm clock desktop applet 149-4 labeled "clock";
icon 442 labeled "fitness support" for fitness support module 142;
icon 444 of notepad module 153 labeled "notepad"; and
an icon 446 labeled "settings" for setting applications or modules, which provides access to the settings of device 100 and its various applications 136.
It should be noted that the icon labels shown in fig. 4A are merely exemplary. For example, icon 422 of video and music player module 152 is optionally labeled "music" or "music player". Other labels are optionally used for the various application icons. In some embodiments, the label of a respective application icon includes the name of the application corresponding to the respective application icon. In some embodiments, the label of a particular application icon is distinct from the name of the application corresponding to the particular application icon.
Fig. 4B illustrates an exemplary user interface on a device (e.g., device 300 of fig. 3) having a touch-sensitive surface 451 (e.g., tablet or trackpad 355 of fig. 3) separate from a display 450 (e.g., touchscreen display 112). Device 300 also optionally includes one or more contact intensity sensors (e.g., one or more of sensors 359) to detect the intensity of contacts on touch-sensitive surface 451 and/or one or more tactile output generators 357 to generate tactile outputs for a user of device 300.
Although some of the examples below will be given with reference to input on touch screen display 112 (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects input on a touch-sensitive surface that is separate from the display, as shown in fig. 4B. In some implementations, the touch-sensitive surface (e.g., 451 in fig. 4B) has a primary axis (e.g., 452 in fig. 4B) that corresponds to a primary axis (e.g., 453 in fig. 4B) on the display (e.g., 450). In accordance with these embodiments, the device detects contacts (e.g., 460 and 462 in fig. 4B) with the touch-sensitive surface 451 at locations that correspond to respective locations on the display (e.g., in fig. 4B, 460 corresponds to 468 and 462 corresponds to 470). As such, when the touch-sensitive surface (e.g., 451 in fig. 4B) is separated from the display (450 in fig. 4B) of the multifunction device, user inputs (e.g., contacts 460 and 462 and their movements) detected by the device on the touch-sensitive surface are used by the device to manipulate the user interface on the display. It should be understood that similar methods are optionally used for the other user interfaces described herein.
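The mapping from a separate touch-sensitive surface to the display, aligning locations along the corresponding primary axes as described above, can be sketched as scaling each coordinate by the ratio of display size to surface size. The concrete sizes below are made-up illustration values.

```python
# Illustrative mapping of a contact on a separate touch-sensitive
# surface (e.g., 451 in fig. 4B) to the corresponding location on the
# display (e.g., 450): scale along each primary axis.
def map_to_display(touch_pt, touch_size, display_size):
    tx, ty = touch_pt
    tw, th = touch_size
    dw, dh = display_size
    return (tx * dw / tw, ty * dh / th)

# A contact at the center of a 160x80 surface lands at the center of a
# 320x240 display:
print(map_to_display((80, 40), (160, 80), (320, 240)))  # (160.0, 120.0)
```

This is why contacts 460 and 462 in fig. 4B manipulate user interface objects at the corresponding display locations 468 and 470 even though the surface and the display have different dimensions.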
Additionally, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, single-finger tap gestures, finger swipe gestures), it should be understood that, in some embodiments, one or more of these finger inputs are replaced by input from another input device (e.g., a mouse-based input or a stylus input). For example, a swipe gesture is optionally replaced with a mouse click (e.g., instead of a contact), followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is optionally replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are detected simultaneously, it should be understood that multiple computer mice are optionally used simultaneously, or a mouse and finger contacts are optionally used simultaneously.
Fig. 5A illustrates an exemplary personal electronic device 500. The device 500 includes a body 502. In some embodiments, device 500 may include some or all of the features described with respect to devices 100 and 300 (e.g., figs. 1A-4B). In some embodiments, the device 500 has a touch-sensitive display screen 504, hereinafter referred to as a touch screen 504. Instead of, or in addition to, the touch screen 504, the device 500 may have a display and a touch-sensitive surface. As with devices 100 and 300, in some embodiments, touch screen 504 (or the touch-sensitive surface) optionally includes one or more intensity sensors for detecting the intensity of applied contacts (e.g., touches). The one or more intensity sensors of the touch screen 504 (or touch-sensitive surface) may provide output data representing the intensity of a touch. The user interface of device 500 may respond to a touch based on its intensity, meaning that touches of different intensities may invoke different user interface operations on device 500.
Exemplary techniques for detecting and processing touch intensity are found, for example, in the following related patent applications: International Patent Application Serial No. PCT/US2013/040061, entitled "Device, Method, and Graphical User Interface for Displaying User Interface Objects Corresponding to an Application," filed May 8, 2013, published as WIPO Patent Publication No. WO/2013/169849; and International Patent Application Serial No. PCT/US2013/069483, entitled "Device, Method, and Graphical User Interface for Transitioning Between Touch Input to Display Output Relationships," filed November 11, 2013, published as WIPO Patent Publication No. WO/2014/105276, each of which is hereby incorporated by reference in its entirety.
In some embodiments, the device 500 has one or more input mechanisms 506 and 508. The input mechanisms 506 and 508 (if included) may be in physical form. Examples of physical input mechanisms include push buttons and rotatable mechanisms. In some embodiments, device 500 has one or more attachment mechanisms. Such attachment mechanisms, if included, may allow for attachment of the device 500 with, for example, a hat, glasses, earrings, necklace, shirt, jacket, bracelet, watch strap, chain, pants, belt, shoe, purse, backpack, and the like. These attachment mechanisms allow the user to wear the device 500.
Fig. 5B illustrates an exemplary personal electronic device 500. In some embodiments, device 500 may include some or all of the components described with respect to figs. 1A, 1B, and 3. The device 500 has a bus 512 that operatively couples an I/O portion 514 with one or more computer processors 516 and a memory 518. The I/O portion 514 may be connected to the display 504, which may have a touch-sensitive member 522 and optionally an intensity sensor 524 (e.g., a contact intensity sensor). Further, I/O portion 514 may interface with communication unit 530 for receiving application and operating system data using Wi-Fi, Bluetooth, Near Field Communication (NFC), cellular, and/or other wireless communication techniques. Device 500 may include input mechanisms 506 and/or 508. For example, the input mechanism 506 is optionally a rotatable input device, or a depressible and rotatable input device. In some examples, the input mechanism 508 is optionally a button.
In some examples, the input mechanism 508 is optionally a microphone. Personal electronic device 500 optionally includes various sensors, such as a GPS sensor 532, an accelerometer 534, an orientation sensor 540 (e.g., a compass), a gyroscope 536, a motion sensor 538, and/or combinations thereof, all of which may be operatively connected to I/O portion 514.
Memory 518 of personal electronic device 500 may include one or more non-transitory computer-readable storage media for storing computer-executable instructions that, when executed by one or more computer processors 516, may cause the computer processors to perform techniques including, for example, process 800 (fig. 8), process 1000 (fig. 10), process 1200 (fig. 12), process 1400 (fig. 14), process 1600 (fig. 16), process 1800 (fig. 18), process 2000 (fig. 20), process 2200 (fig. 22), process 2500 (fig. 25), process 2700 (fig. 27), process 2900 (fig. 29), process 3100 (fig. 31), process 3300 (fig. 33), process 3500 (fig. 35), process 3700 (fig. 37), process 3900 (fig. 39), process 4100 (fig. 41), and process 4300 (fig. 43). A computer-readable storage medium may be any medium that can tangibly contain or store computer-executable instructions for use by or in connection with an instruction execution system, apparatus, or device. In some examples, the storage medium is a transitory computer-readable storage medium. In some examples, the storage medium is a non-transitory computer-readable storage medium. The non-transitory computer-readable storage medium may include, but is not limited to, magnetic storage devices, optical storage devices, and/or semiconductor storage devices. Examples of such storage devices include magnetic disks, optical disks based on CD, DVD, or Blu-ray technology, and persistent solid state memory such as flash memory, solid state drives, and the like. The personal electronic device 500 is not limited to the components and configuration of fig. 5B, but may include other components or additional components in a variety of configurations.
As used herein, the term "affordance" refers to a user-interactive graphical user interface object that is optionally displayed on a display screen of device 100,300, and/or 500 (fig. 1A, 3, and 5A-5B). For example, images (e.g., icons), buttons, and text (e.g., hyperlinks) optionally each constitute an affordance.
As used herein, the term "focus selector" refers to an input element that indicates the current portion of the user interface with which the user is interacting. In some implementations that include a cursor or other position marker, the cursor acts as a "focus selector" such that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touchpad 355 in fig. 3 or touch-sensitive surface 451 in fig. 4B) while the cursor is over a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations that include a touch screen display (e.g., touch-sensitive display system 112 in fig. 1A or touch screen 112 in fig. 4A) that enables direct interaction with user interface elements on the touch screen display, a detected contact on the touch screen acts as a "focus selector" such that when an input (e.g., a press input by the contact) is detected at a location of a particular user interface element (e.g., a button, window, slider, or other user interface element) on the touch screen display, the particular user interface element is adjusted in accordance with the detected input. In some implementations, the focus is moved from one area of the user interface to another area of the user interface without corresponding movement of a cursor or movement of a contact on the touch screen display (e.g., by moving the focus from one button to another using tab or arrow keys); in these implementations, the focus selector moves according to movement of the focus between different regions of the user interface. 
Regardless of the particular form taken by the focus selector, the focus selector is typically a user interface element (or contact on a touch screen display) that is controlled by the user to deliver the user's intended interaction with the user interface (e.g., by indicating to the device the element with which the user of the user interface desires to interact). For example, upon detection of a press input on a touch-sensitive surface (e.g., a touchpad or touchscreen), the location of a focus selector (e.g., a cursor, contact, or selection box) over a respective button will indicate that the user desires to activate the respective button (as opposed to other user interface elements shown on the device display).
As used in the specification and in the claims, the term "characteristic intensity" of a contact refers to a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on a plurality of intensity samples. The characteristic intensity is optionally based on a predefined number of intensity samples or a set of intensity samples acquired during a predetermined time period (e.g., 0.05 seconds, 0.1 seconds, 0.2 seconds, 0.5 seconds, 1 second, 2 seconds, 5 seconds, 10 seconds) relative to a predefined event (e.g., after detecting contact, before detecting contact liftoff, before or after detecting contact start movement, before or after detecting contact end, before or after detecting an increase in intensity of contact, and/or before or after detecting a decrease in intensity of contact). The characteristic intensity of the contact is optionally based on one or more of: a maximum value of the intensity of the contact, a mean value of the intensity of the contact, an average value of the intensity of the contact, a value at the top 10% of the intensity of the contact, a half-maximum value of the intensity of the contact, a 90% maximum value of the intensity of the contact, and the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (e.g., when the characteristic intensity is an average of the intensity of the contact over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether the user has performed an operation. For example, the set of one or more intensity thresholds optionally includes a first intensity threshold and a second intensity threshold. 
In this example, a contact whose characteristic intensity does not exceed the first threshold results in a first operation, a contact whose characteristic intensity exceeds the first intensity threshold but does not exceed the second intensity threshold results in a second operation, and a contact whose characteristic intensity exceeds the second threshold results in a third operation. In some embodiments, a comparison between the feature strengths and one or more thresholds is used to determine whether to perform one or more operations (e.g., whether to perform the respective operation or to forgo performing the respective operation) rather than to determine whether to perform the first operation or the second operation.
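The logic above can be sketched roughly as follows. Threshold values, function names, and the choice of the mean as the reduction (one of several reductions the text permits, alongside the maximum, the top-10% value, and so on) are all illustrative assumptions, not the claimed implementation:

```python
# Illustrative sketch: reduce intensity samples to a "characteristic intensity"
# and select among three operations using two thresholds, as described above.

def characteristic_intensity(samples, mode="mean"):
    """Reduce a sequence of intensity samples to a single characteristic value."""
    if mode == "max":
        return max(samples)
    if mode == "mean":
        return sum(samples) / len(samples)
    if mode == "top10":  # value at the top 10% of the sampled intensities
        ranked = sorted(samples)
        return ranked[int(0.9 * (len(ranked) - 1))]
    raise ValueError(f"unknown mode: {mode}")

def select_operation(samples, first_threshold=1.0, second_threshold=2.0):
    ci = characteristic_intensity(samples)
    if ci <= first_threshold:       # does not exceed the first threshold
        return "first operation"
    if ci <= second_threshold:      # exceeds the first but not the second
        return "second operation"
    return "third operation"        # exceeds the second threshold
```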
FIG. 5C illustrates the detection of multiple contacts 552A-552E on the touch-sensitive display screen 504 using multiple intensity sensors 524A-524D. FIG. 5C also includes an intensity map that shows current intensity measurements of the intensity sensors 524A-524D relative to intensity units. In this example, the intensity measurements of intensity sensors 524A and 524D are each 9 intensity units, and the intensity measurements of intensity sensors 524B and 524C are each 7 intensity units. In some implementations, the cumulative intensity is the sum of the intensity measurements of the plurality of intensity sensors 524A-524D, which in this example is 32 intensity units. In some embodiments, each contact is assigned a respective intensity, i.e., a fraction of the cumulative intensity. FIG. 5D illustrates assigning the cumulative intensity to the contacts 552A-552E based on their distances from the center of force 554. In this example, each of contacts 552A, 552B, and 552E is assigned a contact intensity of 8 intensity units of the cumulative intensity, and each of contacts 552C and 552D is assigned a contact intensity of 4 intensity units of the cumulative intensity. More generally, in some implementations, each contact j is assigned a respective intensity Ij, which is a portion of the cumulative intensity A, according to a predefined mathematical function Ij = A·(Dj/ΣDi), where Dj is the distance of the respective contact j from the center of force, and ΣDi is the sum of the distances of all respective contacts (e.g., i = 1 to last) from the center of force. The operations described with reference to figs. 5C-5D may be performed using an electronic device similar or identical to device 100, 300, or 500. In some embodiments, the characteristic intensity of the contact is based on one or more intensities of the contact. In some embodiments, an intensity sensor is used to determine a single characteristic intensity (e.g., a single characteristic intensity of a single contact).
It should be noted that the intensity map is not part of the displayed user interface, but is included in fig. 5C-5D to assist the reader.
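The per-contact assignment Ij = A·(Dj/ΣDi) can be sketched directly. The distances in the test case below are an assumption chosen to reproduce the worked example above (cumulative intensity of 32 units split as 8/8/4/4/8); the document itself does not give the distances:

```python
# Sketch of the distance-based intensity assignment described above:
# Ij = A * (Dj / sum(Di)), where A is the cumulative intensity and Dj is the
# distance of contact j from the center of force. Names are illustrative.

def assign_intensities(cumulative_intensity, distances):
    total = sum(distances)
    return [cumulative_intensity * d / total for d in distances]
```

Note that, as the formula is stated, each contact's share scales with its distance Dj, and the shares always sum back to the cumulative intensity A.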
In some implementations, a portion of the gesture is recognized for determining the feature intensity. For example, the touch-sensitive surface optionally receives a continuous swipe contact that transitions from a starting location and reaches an ending location where the contact intensity increases. In this example, the characteristic intensity of the contact at the end location is optionally based on only a portion of the continuous swipe contact, rather than the entire swipe contact (e.g., only the portion of the swipe contact at the end location). In some embodiments, a smoothing algorithm is optionally applied to the intensity of the swipe contact before determining the characteristic intensity of the contact. For example, the smoothing algorithm optionally includes one or more of: a non-weighted moving average smoothing algorithm, a triangular smoothing algorithm, a median filter smoothing algorithm, and/or an exponential smoothing algorithm. In some cases, these smoothing algorithms eliminate narrow spikes or dips in the intensity of the swipe contact for the purpose of determining the feature intensity.
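Of the smoothing options listed above, the unweighted moving average is the simplest to sketch. The window size is an assumption; the document does not specify one:

```python
# Unweighted moving-average smoothing applied to swipe-contact intensity
# samples before the characteristic intensity is determined, as one of the
# smoothing algorithms mentioned above. Window size is illustrative.

def moving_average(samples, window=3):
    smoothed = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed
```

A narrow spike such as a single sample of 9 among zeros is flattened to 3 with a window of 3, illustrating how such smoothing eliminates narrow spikes or dips for the purpose of determining the characteristic intensity.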
Contact intensity on the touch-sensitive surface is optionally characterized relative to one or more intensity thresholds, such as a contact detection intensity threshold, a light press intensity threshold, a deep press intensity threshold, and/or one or more other intensity thresholds. In some embodiments, the light press intensity threshold corresponds to an intensity at which the device will perform the operations typically associated with clicking a button of a physical mouse or touchpad. In some embodiments, the deep press intensity threshold corresponds to an intensity at which the device will perform an operation different from that typically associated with clicking a button of a physical mouse or trackpad. In some embodiments, when a contact is detected whose characteristic intensity is below the light press intensity threshold (e.g., and above a nominal contact detection intensity threshold, below which the contact is no longer detected), the device will move the focus selector in accordance with movement of the contact on the touch-sensitive surface without performing an operation associated with the light press intensity threshold or the deep press intensity threshold. Generally, unless otherwise stated, these intensity thresholds are consistent between different sets of user interface drawings.
Increasing the contact characteristic intensity from an intensity below the light press intensity threshold to an intensity between the light press intensity threshold and the deep press intensity threshold is sometimes referred to as a "light press" input. Increasing the contact characteristic intensity from an intensity below the deep press intensity threshold to an intensity above the deep press intensity threshold is sometimes referred to as a "deep press" input. Increasing the contact characteristic intensity from an intensity below the contact detection intensity threshold to an intensity between the contact detection intensity threshold and the light press intensity threshold is sometimes referred to as detecting a contact on the touch surface. The decrease in the characteristic intensity of the contact from an intensity above the contact detection intensity threshold to an intensity below the contact detection intensity threshold is sometimes referred to as detecting lift-off of the contact from the touch surface. In some embodiments, the contact detection intensity threshold is zero. In some embodiments, the contact detection intensity threshold is greater than zero.
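The named threshold crossings above can be summarized in a small classifier. The threshold values are placeholders; the text defines only their ordering (contact detection below light press below deep press):

```python
# Hypothetical classifier for the threshold crossings named above. The
# constants are illustrative stand-ins for the contact-detection, light
# press ("ITL"), and deep press ("ITD") intensity thresholds.

IT_0, IT_L, IT_D = 0.05, 1.0, 2.0

def classify_transition(prev, curr):
    if prev < IT_L <= curr < IT_D:
        return "light press"       # rose past ITL but stayed below ITD
    if prev < IT_D <= curr:
        return "deep press"        # rose past ITD
    if prev < IT_0 <= curr < IT_L:
        return "contact detected"  # rose past the contact detection threshold
    if prev >= IT_0 > curr:
        return "liftoff"           # fell below the contact detection threshold
    return None
```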
In some embodiments described herein, one or more operations are performed in response to detecting a gesture that includes a respective press input or in response to detecting a respective press input performed with a respective contact (or contacts), wherein the respective press input is detected based at least in part on detecting an increase in intensity of the contact (or contacts) above a press input intensity threshold. In some embodiments, the respective operation is performed in response to detecting an increase in intensity of the respective contact above a press input intensity threshold (e.g., a "down stroke" of the respective press input). In some embodiments, the press input includes an increase in intensity of the respective contact above a press input intensity threshold and a subsequent decrease in intensity of the contact below the press input intensity threshold, and the respective operation is performed in response to detecting a subsequent decrease in intensity of the respective contact below the press input threshold (e.g., an "up stroke" of the respective press input).
Figs. 5E-5H illustrate detection of a gesture that includes a press input corresponding to an increase in the intensity of contact 562 from an intensity below the light press intensity threshold (e.g., "ITL") in fig. 5E to an intensity above the deep press intensity threshold (e.g., "ITD") in fig. 5H. The gesture performed with contact 562 is detected on the touch-sensitive surface 560 while a cursor 576 is displayed over the application icon 572B corresponding to application 2, on a displayed user interface 570 that includes the application icons 572A-572D displayed in a predefined area 574. In some implementations, the gesture is detected on the touch-sensitive display 504. The intensity sensors detect the intensity of contacts on the touch-sensitive surface 560. The device determines that the intensity of contact 562 peaked above the deep press intensity threshold (e.g., "ITD"). Contact 562 is maintained on the touch-sensitive surface 560. In response to detecting the gesture, and in accordance with the intensity of contact 562 rising above the deep press intensity threshold (e.g., "ITD") during the gesture, reduced-scale representations 578A-578C (e.g., thumbnails) of recently opened documents for application 2 are displayed, as shown in figs. 5F-5H. In some embodiments, the intensity compared to the one or more intensity thresholds is a characteristic intensity of the contact. It should be noted that the intensity map for contact 562 is not part of the displayed user interface, but is included in figs. 5E-5H to assist the reader.
In some embodiments, the display of representations 578A-578C includes an animation. For example, representation 578A is initially displayed in proximity to application icon 572B, as shown in fig. 5F. As the animation progresses, representation 578A moves upward and representation 578B is displayed in proximity to application icon 572B, as shown in fig. 5G. Representation 578A then moves upward, 578B moves upward toward representation 578A, and representation 578C is displayed in proximity to application icon 572B, as shown in fig. 5H. Representations 578A-578C form an array above icon 572B. In some embodiments, the animation progresses in accordance with the intensity of contact 562, as shown in figs. 5F-5G, where representations 578A-578C appear and move upward as the intensity of contact 562 increases toward the deep press intensity threshold (e.g., "ITD"). In some embodiments, the intensity on which the progress of the animation is based is a characteristic intensity of the contact. The operations described with reference to figs. 5E-5H may be performed using an electronic device similar or identical to device 100, 300, or 500.
In some embodiments, the device employs intensity hysteresis to avoid accidental input sometimes referred to as "jitter," where the device defines or selects a hysteresis intensity threshold having a predefined relationship to the press input intensity threshold (e.g., the hysteresis intensity threshold is X intensity units lower than the press input intensity threshold, or the hysteresis intensity threshold is 75%, 90%, or some reasonable proportion of the press input intensity threshold). Thus, in some embodiments, the press input includes an increase in intensity of the respective contact above a press input intensity threshold and a subsequent decrease in intensity of the contact below a hysteresis intensity threshold corresponding to the press input intensity threshold, and the respective operation is performed in response to detecting a subsequent decrease in intensity of the respective contact below the hysteresis intensity threshold (e.g., an "upstroke" of the respective press input). Similarly, in some embodiments, a press input is detected only when the device detects an increase in contact intensity from an intensity at or below the hysteresis intensity threshold to an intensity at or above the press input intensity threshold and optionally a subsequent decrease in contact intensity to an intensity at or below the hysteresis intensity, and a corresponding operation is performed in response to detecting the press input (e.g., depending on the circumstances, the increase in contact intensity or the decrease in contact intensity).
For ease of explanation, optionally, a description of an operation performed in response to a press input associated with a press input intensity threshold or in response to a gesture that includes a press input is triggered in response to detection of any of the following: the contact intensity increases above the press input intensity threshold, the contact intensity increases from an intensity below the hysteresis intensity threshold to an intensity above the press input intensity threshold, the contact intensity decreases below the press input intensity threshold, and/or the contact intensity decreases below the hysteresis intensity threshold corresponding to the press input intensity threshold. Additionally, in examples in which operations are described as being performed in response to detecting that the intensity of the contact decreases below the press input intensity threshold, the operations are optionally performed in response to detecting that the intensity of the contact decreases below a hysteresis intensity threshold that corresponds to and is less than the press input intensity threshold.
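The hysteresis behavior described above can be sketched as a minimal state machine: a press is recognized when intensity rises to the press input threshold, and released only when it falls to the lower hysteresis threshold, suppressing "jitter" near the press threshold. Threshold values and the 75% ratio are taken from the examples above; the class and method names are illustrative:

```python
# Minimal sketch of hysteresis-based press detection. A down-stroke fires at
# the press-input intensity threshold; the matching up-stroke fires only at
# the lower hysteresis threshold, so small fluctuations near the press
# threshold do not generate spurious events.

class PressDetector:
    def __init__(self, press_threshold=1.0, hysteresis_ratio=0.75):
        self.press_threshold = press_threshold
        self.hysteresis_threshold = press_threshold * hysteresis_ratio
        self.pressed = False

    def feed(self, intensity):
        """Return 'down-stroke', 'up-stroke', or None for one intensity sample."""
        if not self.pressed and intensity >= self.press_threshold:
            self.pressed = True
            return "down-stroke"
        if self.pressed and intensity <= self.hysteresis_threshold:
            self.pressed = False
            return "up-stroke"
        return None
```

For example, a dip from 1.1 to 0.9 intensity units after a down-stroke produces no event, because 0.9 is still above the 0.75 hysteresis threshold.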
As used herein, an "installed application" refers to a software application that has been downloaded to an electronic device (e.g., device 100, 300, and/or 500) and is ready to be launched (e.g., become open) on the device. In some embodiments, a downloaded application becomes an installed application by way of an installer that extracts program portions from the downloaded software package and integrates the extracted portions with the operating system of the computer system.
As used herein, the term "open application" or "executing application" refers to a software application that has maintained state information (e.g., as part of device/global internal state 157 and/or application internal state 192). The open or executing application is optionally any of the following types of applications:
an active application, which is currently displayed on the display screen of the device on which the application is being used;
a background application (or background process) that is not currently displayed but one or more processes of the application are being processed by one or more processors; and
a suspended or dormant application that is not running but has state information stored in memory (volatile and non-volatile, respectively) and that can be used to resume execution of the application.
As used herein, the term "closed application" refers to a software application that does not have retained state information (e.g., the state information of the closed application is not stored in the memory of the device). Thus, closing an application includes stopping and/or removing the application's application process and removing the application's state information from the device's memory. Generally, while in a first application, opening a second application does not close the first application. The first application becomes the background application when the second application is displayed and the first application stops being displayed.
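The application states distinguished above can be summarized schematically. This enumeration is a reading aid, not a platform API:

```python
# Schematic summary of the application states described above. An "open"
# (executing) application is any state other than closed.

from enum import Enum

class AppState(Enum):
    ACTIVE = "currently displayed on the display screen"
    BACKGROUND = "not displayed, but processes are being executed"
    SUSPENDED = "not running; state information retained in volatile memory"
    DORMANT = "not running; state information retained in non-volatile memory"
    CLOSED = "not running; no retained state information"

def is_open(state):
    # "Open" covers active, background, and suspended/dormant applications.
    return state is not AppState.CLOSED
```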
Attention is now directed to embodiments of a user interface ("UI") and associated processes implemented on an electronic device, such as portable multifunction device 100, device 300, or device 500.
Fig. 6 illustrates exemplary devices connected via one or more communication channels to participate in a transaction, according to some embodiments. One or more exemplary electronic devices (e.g., devices 100, 300, and 500) are configured to optionally detect an input (e.g., a specific user input, an NFC field), and optionally transmit payment information (e.g., using NFC). The one or more electronic devices optionally include NFC hardware and are configured to support NFC.
The electronic device (e.g., devices 100, 300, and 500) is optionally configured to store payment account information associated with each of the one or more payment accounts. The payment account information includes, for example, one or more of: a person or company name, a billing address, a login name, a password, an account number, an expiration date, a security code, a phone number, a bank associated with a payment account (e.g., an issuing bank), and a card network identifier. In some examples, the payment account information includes an image, such as a photograph of the payment card (e.g., a photograph taken by and/or received by the device). In some examples, the electronic device receives user input including at least some payment account information (e.g., receives a user-entered credit card number, debit card number, account number, or gift card number, and an expiration date). In some examples, the electronic device detects at least some payment account information from an image (e.g., of a payment card captured by a camera sensor of the device). In some examples, the electronic device receives at least some payment account information from another device (e.g., another user device or a server). In some examples, the electronic device receives payment account information from a server associated with another service (e.g., an application for renting or selling audio and/or video files) for which the account of the user or user device has previously made a purchase or identified payment account data.
In some embodiments, a payment account is added to an electronic device (e.g., devices 100,300, and 500) such that payment account information is securely stored on the electronic device. In some examples, after a user initiates such a process, the electronic device transmits information of the payment account to a transaction coordination server, which then communicates with a server (e.g., a payment server) operated by the payment network of the account to ensure the validity of the information. The electronic device is optionally configured to receive the script from a server that allows the electronic device to program payment information for the account onto the secure element.
In some embodiments, communication between devices 100,300, and 500 facilitates a transaction (e.g., a general transaction or a specific transaction). For example, a first electronic device (e.g., 100) may function as a configuration device or a management device and may send new or updated payment account data (e.g., information for a new account, updated information for an existing account, and/or alerts regarding an existing account) to a second electronic device (e.g., 500). As another example, a first electronic device (e.g., 100) may transmit data to a second electronic device, where the data reflects information about a payment transaction facilitated at the first electronic device. The information optionally includes one or more of: payment amount, account used, time of purchase, and whether to change the default account. The second device (e.g., 500) optionally uses such information to update the default payment account (e.g., based on a learning algorithm or explicit user input).
Electronic devices (e.g., 100,300,500) are configured to communicate with each other over any of a variety of networks. For example, the devices communicate using a bluetooth connection 608 (e.g., which includes a conventional bluetooth connection or a bluetooth low energy connection) or using a WiFi network 606. Communications between user devices are optionally adjusted to reduce the likelihood of improper sharing of information between devices. For example, communication regarding payment information requires that the communication devices be paired (e.g., associated with each other via explicit user interaction) or associated with the same user account.
In some embodiments, the electronic device (e.g., 100,300,500) is used to communicate with a point of sale (POS) payment terminal 600, which optionally supports NFC. The communication is optionally conducted using various communication channels and/or techniques. In some examples, the electronic device (e.g., 100,300,500) uses the NFC channel 610 to communicate with the payment terminal 600. In some embodiments, the payment terminal 600 uses a peer-to-peer NFC mode to communicate with an electronic device (e.g., 100,300, 500). The electronic device (e.g., 100,300,500) is optionally configured to transmit a signal to the payment terminal 600 that includes payment information for the payment account (e.g., a default account or an account selected for a particular transaction).
In some embodiments, proceeding with the transaction includes transmitting a signal including payment information for an account (such as a payment account). In some embodiments, proceeding with the transaction includes reconfiguring the electronic device (e.g., 100,300,500) to respond as a contactless payment card (such as an NFC-enabled contactless payment card) and then transmitting credentials of the account via NFC, such as to the payment terminal 600. In some embodiments, after transmitting credentials of the account via NFC, the electronic device is reconfigured not to respond as a contactless payment card (e.g., authorization is required before being reconfigured again to respond as a contactless payment card via NFC).
In some embodiments, the generation and/or transmission of the signal is controlled by a secure element in the electronic device (e.g., 100,300, 500). The secure element optionally requires specific user input before releasing the payment information. For example, the secure element optionally requires: detecting an electronic device being worn, detecting a button press, detecting a password input, detecting a touch, detecting one or more option selections (e.g., an option selection received while interacting with an application), detecting a fingerprint signature, detecting a voice or voice command, and/or detecting a gesture or movement (e.g., rotation or acceleration). In some examples, if a communication channel (e.g., NFC communication channel) is established with another device (e.g., payment terminal 600) within a defined period of time from the detection of the input, the secure element releases the payment information to be transmitted to the other device (e.g., payment terminal 600). In some examples, the secure element is a hardware component that controls the release of the secure information. In some examples, the secure element is a software component that controls the release of secure information.
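The timing condition on release of payment information, as described above, reduces to a simple window check. The function name and window length are assumptions for illustration, and this sketch stands in for logic the text attributes to the secure element:

```python
# Hypothetical sketch of the release condition described above: payment
# information is released only if the communication channel (e.g., NFC) is
# established within a defined period after the authorizing input is detected.

def may_release(input_time, channel_time, window_seconds=60):
    return 0 <= channel_time - input_time <= window_seconds
```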
In some embodiments, the protocol associated with transaction participation depends on, for example, the type of device. For example, the conditions under which payment information is generated and/or transmitted may be different for a wearable device (e.g., device 500) and a phone (e.g., device 100). For example, the generation condition and/or the transmission condition for the wearable device includes detecting that a button has been depressed (e.g., after security verification), while the corresponding condition for the phone does not require button depression, but rather requires detection of a particular interaction with the application. In some embodiments, the conditions for transmitting and/or releasing payment information include receiving a specific input on each of the plurality of devices. For example, release of payment information optionally requires detection of a fingerprint and/or password at a device (e.g., device 100) and detection of a mechanical input (e.g., a button press) on another device (e.g., device 500).
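The device-type-dependent release conditions in the paragraph above can be expressed as a small dispatch; input names here are illustrative assumptions, since the disclosure describes the conditions only at the level of input categories.

```python
def release_conditions_met(device_type, inputs):
    """Sketch of the per-device-type release conditions described above:
    a wearable requires a button press after security verification, while a
    phone instead requires a biometric/passcode plus an application
    interaction. Input labels are hypothetical."""
    if device_type == "wearable":
        return "button_press" in inputs and "security_verified" in inputs
    if device_type == "phone":
        return (("fingerprint" in inputs or "passcode" in inputs)
                and "app_interaction" in inputs)
    return False
```

For the multi-device case described last (a fingerprint on one device and a button press on another), release would require `release_conditions_met` to hold on each participating device.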
The payment terminal 600 optionally uses the payment information to generate a signal for transmission to the payment server 604 to determine whether the payment is authorized. The payment server 604 optionally includes any device or system configured to receive payment information associated with a payment account and determine whether a proposed purchase is authorized. In some examples, the payment server 604 comprises a server of an issuing bank. The payment terminal 600 communicates with the payment server 604 directly or indirectly via one or more devices or systems (e.g., servers of an acquiring bank and/or servers of a card network).
The payment server 604 optionally uses at least some of the payment information to identify the user account from among a database of user accounts (e.g., 602). For example, each user account includes payment information. The account is optionally located by locating an account having specific payment information that matches the information from the POS communication. In some examples, payment is denied when the provided payment information is inconsistent (e.g., the expiration date does not correspond to a credit card number, a debit card number, or a shopping card number) or when no account includes payment information that matches the information from the POS communication.
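The account-lookup-and-consistency rule above can be sketched as follows; field names are hypothetical, and a real payment server would of course perform far richer validation.

```python
def locate_account(accounts, card_number, expiry):
    """Sketch of the lookup described above: locate the account whose stored
    payment information matches the POS-provided information, denying (None)
    when the information is inconsistent or no account matches."""
    for account in accounts:
        if account["card_number"] == card_number:
            # deny if the provided expiration date does not correspond
            # to the card number on file
            return account if account["expiry"] == expiry else None
    return None   # no account includes matching payment information
```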
In some embodiments, the data of the user account also identifies one or more restrictions (e.g., credit limits); a current balance or a previous balance; previous transaction date, location and/or amount; account status (e.g., active or frozen) and/or authorization instructions. In some examples, a payment server (e.g., 604) uses such data to determine whether to authorize a payment. For example, the payment server denies payment when a purchase amount added to the current balance would result in exceeding an account limit, when an account is frozen, when a previous transaction amount exceeds a threshold, or when a previous number or frequency of transactions exceeds a threshold.
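The denial rules listed above reduce to a short authorization check; the thresholds and field names below are illustrative assumptions, not values from the disclosure.

```python
def authorize_payment(account, amount):
    """Sketch of the authorization rules described above: deny when the
    account is frozen, when the purchase amount added to the current balance
    would exceed the account limit, or when recent-transaction thresholds
    are exceeded."""
    if account["status"] == "frozen":
        return False
    if account["balance"] + amount > account["limit"]:
        return False
    if account.get("recent_transaction_count", 0) > account.get("max_recent_transactions", 10):
        return False
    return True
```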
In some embodiments, the payment server 604 responds to the POS payment terminal 600 with an indication of whether the proposed purchase is authorized or denied. In some examples, the POS payment terminal 600 transmits a signal to the electronic device (e.g., 100,300,500) to identify the result. For example, when a purchase is authorized (e.g., via a transaction coordination server managing a transaction application on a user device), POS payment terminal 600 sends a receipt to an electronic device (e.g., 100,300, 500). In some cases, POS payment terminal 600 presents an output (e.g., a visual output or an audio output) indicative of the result. The payment may be transmitted to the merchant as part of the authorization process, or may be transmitted at a later time.
In some embodiments, the electronic device (e.g., 100,300,500) participates in a transaction that is completed without involving the POS payment terminal 600. For example, upon detecting that mechanical input has been received, a secure element in the electronic device (e.g., 100,300,500) releases payment information to allow an application on the electronic device to access the information (e.g., and transmit the information to a server associated with the application).
In some embodiments, the electronic device (e.g., 100,300,500) is in a locked state or an unlocked state. In the locked state, the electronic device is powered on and operable, but is prevented from performing a predefined set of operations in response to user input. The predefined set of operations optionally includes navigating between user interfaces, activating or deactivating a predefined set of functions, and activating or deactivating certain applications. The locked state may be used to prevent unintended or unauthorized use of some functions of the electronic device, or to activate or deactivate some functions on the electronic device. In the unlocked state, the electronic device 100 is powered on and operable and is not prevented from performing at least a portion of a predefined set of operations that cannot be performed while in the locked state.
When a device is in a locked state, the device is said to be locked. In some embodiments, the device in the locked state is optionally responsive to a limited set of user inputs including an input corresponding to an attempt to transition the device to the unlocked state or an input corresponding to closing the device.
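The locked-state behavior of the two paragraphs above amounts to filtering user inputs against a small allowed set; the sketch below is a minimal illustration with assumed input names.

```python
# Sketch of the locked-state behavior described above: in the locked state
# the device responds only to a limited set of inputs, such as an attempt
# to transition to the unlocked state or an input to power off the device.
LOCKED_STATE_ALLOWED = {"attempt_unlock", "power_off"}

def respond_to_input(device_state, user_input):
    if device_state == "locked" and user_input not in LOCKED_STATE_ALLOWED:
        return "ignored"     # prevented from performing the operation
    return "performed"
```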
In some examples, the secure element (e.g., 115) is a hardware component (e.g., a secure microcontroller chip) configured to securely store data or algorithms such that the device cannot access the securely stored data without proper authentication information from a user of the device. Maintaining the securely stored data in a secure element separate from other storage on the device prevents access to the securely stored data even if other storage locations on the device are compromised (e.g., by malicious code or other attempts to compromise information stored on the device). In some examples, the secure element provides (or issues) payment information (e.g., account number and/or transaction-specific dynamic security code). In some examples, the secure element provides (or releases) payment information in response to the device receiving an authorization, such as user authentication (e.g., fingerprint authentication; password authentication; detecting two presses of a hardware button while the device is in an unlocked state and optionally while the device has been continuously on the user's wrist since unlocking the device by providing authentication credentials to the device, wherein the continuous presence of the device on the user's wrist is determined by periodically checking that the device is in contact with the user's skin). For example, the device detects a fingerprint at a fingerprint sensor of the device (e.g., a fingerprint sensor integrated into a button). The device determines whether the fingerprint is consistent with the enrolled fingerprint. Upon determining that the fingerprint is consistent with the enrolled fingerprint, the secure element provides (or issues) payment information. Upon determining that the fingerprint does not correspond to the enrolled fingerprint, the secure element foregoes providing (or issuing) the payment information.
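The fingerprint gate at the end of the paragraph above (provide payment information on a match, forgo it otherwise) can be sketched as below. This is purely illustrative: real secure-element matching is performed inside dedicated hardware and is opaque to application code; treating fingerprints as simple feature vectors with a distance threshold is an assumption for the sketch.

```python
def fingerprint_matches(scanned, enrolled, tolerance=0.1):
    """Illustrative matcher: treats fingerprints as feature vectors and
    checks whether the scanned sample is within a tolerance of the
    enrolled template."""
    if len(scanned) != len(enrolled):
        return False
    distance = sum((a - b) ** 2 for a, b in zip(scanned, enrolled)) ** 0.5
    return distance <= tolerance

def provide_payment_info(scanned, enrolled, payment_info):
    # the secure element forgoes providing payment information on a mismatch
    return payment_info if fingerprint_matches(scanned, enrolled) else None
```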
Attention is now directed to embodiments of a user interface ("UI") and associated processes implemented on an electronic device, such as portable multifunction device 100, device 300, or device 500.
Fig. 7A-7S illustrate an example user interface for providing an instructional tutorial for registering biometric features on an electronic device (e.g., device 100, device 300, or device 500) according to some examples. The user interfaces in these figures are used to illustrate the processes described below, including the process in FIG. 8.
Fig. 7A shows an electronic device 700 (e.g., portable multifunction device 100, device 300, or device 500). In the illustrative example shown in fig. 7A-7S, the electronic device 700 is a smartphone. In other examples, the electronic device 700 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 700 has a display 701, one or more input devices (e.g., a touch screen of the display 701, buttons, a microphone), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the electronic device includes one or more biometric sensors (e.g., biometric sensor 703), which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, one or more of the biometric sensors are biometric sensors (e.g., facial recognition sensors), such as those described in U.S. Serial No. 14/341,860, entitled "Overlaying Pattern Projector," filed July 14, 2014 (U.S. Publication No. 2016/0025993), and U.S. Serial No. 13/810,451, entitled "Scanning Projectors and Image Capture Modules For 3D Mapping" (U.S. Patent No. 9,098,931), which are hereby incorporated by reference in their entirety for any purpose. In some examples, the electronic device includes a depth camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof.
The light-emitting device is optionally used to illuminate the subject during capture of images by a visible light camera and a depth camera (e.g., an IR camera), and information from the depth camera and the visible light camera is used to determine depth maps of different portions of the subject captured by the visible light camera. In some examples, the lighting effects described herein are displayed using disparity information from two cameras (e.g., two visible light cameras) for rear-facing images, and using depth information from a depth camera combined with image data from a visible light camera for front-facing images (e.g., selfie images). In some examples, using the same user interface provides a consistent experience for the user whether depth information is determined using two visible light cameras or using a depth camera, even though distinct techniques are used to determine the information used when generating the lighting effects. In some examples, while displaying the camera user interface with one of the lighting effects applied, the device detects selection of a camera-switching affordance and switches from the front-facing cameras (e.g., a depth camera and a visible light camera) to the rear-facing cameras (e.g., two visible light cameras spaced apart from each other), or vice versa, while maintaining display of the user interface controls for applying the lighting effect and replacing display of the field of view of the front-facing cameras with the field of view of the rear-facing cameras (or vice versa).
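The two-visible-light-camera case above presumably relies on the standard pinhole-stereo relationship between disparity and depth; the sketch below shows that textbook formula (depth = focal length × baseline ÷ disparity) and is not a description of the actual depth pipeline in the disclosure.

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Textbook stereo relationship: a point seen with disparity d (pixels)
    by two cameras with focal length f (pixels) separated by baseline B
    (meters) lies at depth f * B / d."""
    if disparity_px <= 0:
        return float("inf")   # zero disparity corresponds to a point at infinity
    return focal_length_px * baseline_m / disparity_px
```

For example, with a 1000 px focal length and a 1 cm baseline, a 10 px disparity corresponds to a depth of 1 m, and larger disparities correspond to nearer points.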
As shown in fig. 7A, the device 700 displays a device settings user interface 702 on the display 701. In some examples, the device settings user interface 702 is displayed when the device 700 is first powered on by a user (e.g., when a factory-sealed device is first powered on). In some examples, the device settings user interface 702 is displayed when the device 700 is reset to factory settings. The device settings user interface 702 includes one or more prompts 704. In the example of fig. 7A, the prompt 704 is text that prompts the user to proceed with initial device setup (e.g., language selection, authentication measures, etc.). The device settings interface 702 includes one or more affordances, such as a continue affordance 706 and a skip affordance 708. In some examples, in response to detecting a user input corresponding to activation of the skip affordance 708, the device 700 optionally displays a main user interface, such as the user interface of fig. 4A, without setting up one or more features.
As shown in FIG. 7B, while displaying the settings interface 702, the electronic device 700 detects activation (e.g., selection) of the continuation affordance 706. In some examples, the activation is a tap gesture at contact area 710 on the continuation affordance 706. In some examples in which the display 701 is a touch-sensitive display, the activation of the continuation affordance is a touch, swipe, or other gesture on the display surface at contact area 710. In some examples in which the display 701 is not touch sensitive, the user input is a keyboard input or an activation of the continuation affordance 706 with a focus selector (e.g., a mouse cursor).
In response to detecting activation of the continuation affordance 706, the device displays a facial authentication settings interface 712, as shown in FIG. 7C. In some examples, the facial authentication settings interface 712 is displayed in response to completing a previous stage of the device settings process or in response to selecting a facial authentication enrollment option in a settings user interface. The facial authentication settings interface 712 includes one or more prompts 714, a continuation affordance 716, and a later affordance 718. In the example of FIG. 7C, prompt 714 is text indicating that the user has the option of setting up facial authentication in place of a numeric password. The facial authentication settings interface 712 also includes a graphical representation of a face (e.g., glyph 720) displayed within a framing element 722. In the example of fig. 7C, the framing element 722 is rectangular in shape, surrounding the glyph 720 such that only the corners of the rectangle are displayed. In some examples, the framing element is optionally a solid rectangle or any other shape (e.g., a circle or an ellipse) surrounding the glyph 720. In some examples, the framing element 722, in conjunction with additional features described below, helps indicate to the user how to properly position their face relative to the biometric sensor 703.
Turning to fig. 7D, device 700 detects activation (e.g., selection) of continuation affordance 716. In some examples, the activation is a tap gesture 724 at the continuation affordance 716. In some examples in which display 701 is a touch-sensitive display, the activation of the continuation affordance is a touch, swipe, or other gesture on the display surface at contact area 724. In some examples in which the display 701 is not touch sensitive, the user input is a keyboard input or an activation of the continuation affordance 716 with a focus selector (e.g., a mouse cursor).
In response to detecting selection of the continuation affordance 716, the device 700 displays a prompt 726 (e.g., replacing the display of prompt 714), as shown in fig. 7E. In addition, the device replaces the display of continuation affordance 716 with start affordance 728. Upon selection of the continuation affordance 716, the device 700 maintains (e.g., continues) display of the glyph 720 and the framing element 722.
Turning to fig. 7F, device 700 detects activation (e.g., selection) of start affordance 728. In some examples, the activation is a tap gesture 730 at the start affordance 728. Activation of start affordance 728 optionally indicates a user request to initiate the facial authentication enrollment (e.g., setup) process.
As shown in fig. 7H-7Q, in response to detecting selection of the start affordance 728, the device 700 displays a facial authentication tutorial interface 732. At the same time, the device displays a tutorial animation (e.g., tutorial) that indicates to the user how to properly position and move his or her face relative to biometric sensor 703 so that device 700 will be able to acquire sufficient biometric (e.g., facial imaging) data needed for secure (e.g., biometric) authentication. Details of the tutorial interface and tutorial animation are described below.
As shown in fig. 7G-7H, the device 700 changes the display of the framing element 722 into a single continuous framing element 723 surrounding the glyph 720. As shown in fig. 7G, the device 700 optionally rounds each corner of the framing element 722 into a portion of a circle and merges and/or shrinks the portions to form a continuous circle around the glyph 720 (e.g., the framing element 723 shown in fig. 7H).
As shown in fig. 7H, the device 700 simultaneously displays a teaching progress meter 734 adjacent to and/or surrounding the glyph 720. In the example of fig. 7H, the teaching progress meter 734 contains a set of progress elements (e.g., progress ticks 734a, 734b, and 734c) that are evenly distributed around the glyph 720. In the example of fig. 7H, the progress ticks 734a, 734b, and 734c are equidistant and extend radially outward from the glyph 720, e.g., forming a circle around the glyph. In some examples, these progress elements are optionally points, circles, line segments, or any other suitable discrete elements. In some examples, the progress elements are optionally arranged around the glyph 720 in a square, rectangle, ellipse, or any other suitable pattern.
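The layout just described (ticks equidistant around the glyph, extending radially outward) is straightforward to compute; the sketch below derives the endpoints of each tick and is an illustrative geometry helper, not code from the disclosure.

```python
import math

def tick_endpoints(n_ticks, inner_radius, outer_radius, center=(0.0, 0.0)):
    """Compute (inner, outer) endpoints for n_ticks progress ticks evenly
    distributed on a circle around the glyph, each extending radially
    outward from inner_radius to outer_radius."""
    cx, cy = center
    ticks = []
    for i in range(n_ticks):
        angle = 2 * math.pi * i / n_ticks   # equidistant angular spacing
        ticks.append((
            (cx + inner_radius * math.cos(angle), cy + inner_radius * math.sin(angle)),
            (cx + outer_radius * math.cos(angle), cy + outer_radius * math.sin(angle)),
        ))
    return ticks
```

Elongating a tick (as the animation does when the face tilts toward it) amounts to increasing its `outer_radius` while keeping its angle fixed.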
While displaying the facial authentication tutorial interface 732 (e.g., the glyph 720 surrounded by the framing element 723 and the teaching progress meter 734), the device 700 begins displaying a tutorial animation showing the process of enrolling the user's facial data, as shown in fig. 7I. As described in more detail below with reference to fig. 7I-7P, the device 700 displays circular movement of the glyph 720 and a corresponding advancement of the teaching progress meter 734 to simulate successful facial authentication.
At the start of the tutorial animation, device 700 overlays an orientation guide 736 on the display of the glyph 720. In the example of fig. 7I, the orientation guide 736 is a pair of intersecting curves (e.g., crosshairs) that extend from the framing element 723 across the glyph 720 such that they appear to protrude outward (e.g., in the simulated z-direction) from the plane of the display. In some examples, in conjunction with the circular framing element 723, the arcs of the orientation guide 736 give the otherwise two-dimensional glyph 720 a three-dimensional appearance, as if it were located on the surface of a sphere. In general, the tutorial animation maintains the orientation guide 736 at a fixed position relative to the center of the glyph 720 such that the orientation guide appears to rotate and tilt with (e.g., in the same direction as) the face representation. In some examples, the glyph 720 is itself a three-dimensional representation of a face, such as a three-dimensional line drawing with lines at different simulated z-heights. In such examples, orientation guide 736 is optionally omitted. In this case, when the face representation is tilted in different directions, the lines at different z-heights appear to move relative to each other based on a simulated parallax effect to present the appearance of three-dimensional movement.
The device 700 begins the tutorial animation on the facial authentication tutorial interface 732 by displaying movement (e.g., rotation and/or tilt) of the glyph 720 and the orientation guide 736 in a first direction (e.g., up, down, left, or right). In the example of fig. 7I, the glyph 720 and the overlaid orientation guide 736 are tilted to the right relative to a vertical axis in the plane of the display 701. Tilting the glyph 720 in this manner optionally reveals a portion of the simulated face (e.g., the left side of the face) and conceals another portion of the simulated face (e.g., the right side of the face), further presenting the appearance of a three-dimensional head tilting or rotating in a particular direction.
As shown in fig. 7I, the device 700 changes the appearance of a subset of the progress elements as the glyph 720 (and/or the orientation guide 736) is tilted toward them. In particular, the progress elements in meter portion 738 optionally elongate from their initial state and/or change color as the face graphic is tilted toward them. This elongation and/or color change is optionally more pronounced as the glyph 720 is tilted further in that direction. In some examples, the progress elements in meter portion 738 also optionally change appearance in other ways. For example, additionally and/or alternatively, the line thickness, number, or pattern of the progress elements is optionally varied. Changing the appearance of the progress elements in this manner indicates to the user that biometric sensor 703 is configured to capture image data of a corresponding portion of the face when oriented in that direction. While displaying the tutorial animation, the device 700 keeps displaying, in an initial state, the progress elements (e.g., the elements of meter portion 740) toward which the face graphic has not yet been tilted. In the example of fig. 7I, the device 700 displays the progress elements in the initial state as unfilled outlines.
In some examples, the device 700 thereafter rotates the glyph 720 about a second axis parallel to the plane of the display such that the simulated face appears to be tilted up or down. In the example of fig. 7J, the glyph 720 appears to be tilted upward from its position in fig. 7I such that the simulated face is pointing up and to the right. Upon rotating the glyph 720 in this manner, the device 700 changes the appearance of the corresponding meter portion 740 (shown in FIG. 7I) that was previously in the initial state. The device changes the appearance of meter portion 740 in the same manner as described above with respect to fig. 7I (e.g., by elongating the progress elements in that portion of the teaching progress meter and/or changing their color). At the same time, the device 700 transitions the progress elements in meter portion 738, corresponding to the portion of the face representation enrolled in fig. 7I, to a success state. Progress elements in the success state differ from progress elements in the initial state in shape, color, line width, and the like. In the example of fig. 7J, a progress element in the success state is displayed with the same size and width as a progress element in the initial state (e.g., the progress elements in meter portion 742), but is darkened and/or filled in to indicate that the face representation has been oriented in that direction.
Fig. 7K illustrates further tilting and/or rotation of the glyph 720 and orientation guide 736 until the simulated face appears to be looking up. As described above, when the glyph 720 is oriented in that direction, the device 700 changes the appearance of the progress elements in meter portion 742 (e.g., elongates the progress elements and/or changes their color). Meanwhile, the device 700 transitions the progress elements in meter portion 740, toward which the simulated face was previously (but is no longer) oriented, to the success state. The progress elements in meter portion 738 remain in the success state. Generally, the appearance of a progress element that has transitioned to the success state is not modified thereafter. Thus, in response to displaying movement of the glyph 720, the device 700 changes the appearance of the elements of the teaching progress meter 734.
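The three-state progression just described (initial → highlighted while the face points at a meter portion → success thereafter, with success sticky) can be sketched as a small state-transition function; element indices and state names below are assumptions for illustration.

```python
def update_meter(states, facing_index):
    """Sketch of the teaching-progress-meter transitions described above:
    the element the simulated face currently points toward becomes
    'highlighted' (elongated / color-changed); a previously highlighted
    element becomes 'success'; elements in the success state are never
    modified thereafter."""
    new_states = []
    for i, state in enumerate(states):
        if state == "success":
            new_states.append("success")       # success state is sticky
        elif i == facing_index:
            new_states.append("highlighted")   # face is oriented toward it
        elif state == "highlighted":
            new_states.append("success")       # no longer oriented toward it
        else:
            new_states.append("initial")       # not yet tilted toward
    return new_states
```

Sweeping `facing_index` once around all elements, as the full-circle animation does, leaves every element in the success state.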
In some examples, during the tutorial animation, the device 700 optionally continues to display the rotation and/or tilt of the glyph 720 until it has displayed a full circular (e.g., clockwise or counterclockwise) motion of the simulated face (e.g., until the glyph 720 returns to the right-tilted orientation shown in fig. 7I). Likewise, as the glyph 720 rotates past them, the device 700 incrementally transitions the elements of the teaching progress meter 734 to the success state. After displaying the full rotation of the simulated face, the device displays all progress elements of the teaching progress meter 734 in the success state, as shown in fig. 7L. In some examples, the device 700 stops displaying the orientation guide 736 and returns the glyph 720 to its initial position after displaying a full rotation.
After all progress elements of the teaching progress meter 734 have transitioned to the success state, the device 700 transitions the progress meter 734 (e.g., the progress meter itself) to an authentication-success state, such as a solid circle around the glyph 720. Displaying the progress meter 734 in the authentication-success state optionally indicates a successful facial authentication setup. Referring to fig. 7L-7O, the device 700 transitions the display of the discrete progress ticks of the teaching progress meter 734 to the authentication-success state by shortening each of the progress ticks and merging them together into a continuous solid circle (e.g., success state meter 744) around the glyph 720. In the example of fig. 7O and 7P, the circle shrinks around the glyph 720 until the radius of the success state meter 744 is substantially the same as the radius of the framing element 723 (e.g., as shown in fig. 7P).
As shown in fig. 7I-7Q, the face authentication tutorial interface 732 also includes a start affordance 746 that is optionally displayed throughout the face authentication tutorial. In some examples, start affordance 746 is enabled for activation after the tutorial animation is complete (e.g., after device 700 displays tutorial progress meter 734 in the authentication success state of fig. 7Q). In other examples, the start affordance 746 is enabled for activation at any time during display of the facial authentication tutorial animation prior to completion of the tutorial animation.
Turning now to fig. 7Q, device 700 detects an activation (e.g., selection) of start affordance 746. In some examples, the activation is a user input corresponding to a request to begin facial authentication setup. In response to detecting activation of start affordance 746, device 700 replaces the display of the glyph 720 with an image 750 of the user's face captured by biometric sensor 703, as shown in fig. 7R. In some examples, the image 750 is a real-time preview of the field of view of the biometric sensor 703. In other examples, the image 750 is a wire-frame representation of the user's face based on movement of the user's face in the field of view of the optical sensor. Thus, the image 750 changes (e.g., continuously updates) as the position and orientation of the user's face relative to the biometric sensor change.
As shown in fig. 7R, device 700 also displays a positioning element 752 around the user image 750. In some examples, the positioning element 752 optionally has visual properties similar or identical to those of the framing element 722 initially positioned around the glyph 720 in fig. 7C-7F. In some examples, the positioning element is displayed to highlight a predetermined portion of the display of the electronic device, indicating where the user should position his or her face relative to the biometric sensor for the subsequent facial authentication setup. In some examples, the positioning element is a shape (e.g., a square) that at least partially separates the predetermined portion of the display from other portions of the display. Device 700 also displays a prompt 754, which is text prompting the user to move his or her face relative to the optical sensor so that the user image 750 appears within the positioning element 752.
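The "properly positioned within the positioning element" check can be modeled, at its simplest, as bounding-box containment; the sketch below is an illustrative approximation (real face-alignment logic presumably also considers distance and orientation, per figs. 9A-9AE).

```python
def face_within_element(face_box, element_box):
    """Sketch of the alignment check described above: the detected face
    bounding box (x0, y0, x1, y1) must lie entirely within the positioning
    element's bounding box."""
    fx0, fy0, fx1, fy1 = face_box
    ex0, ey0, ex1, ey1 = element_box
    return fx0 >= ex0 and fy0 >= ey0 and fx1 <= ex1 and fy1 <= ey1
```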
Turning now to fig. 7S, in response to detecting that the user image 750 has been properly positioned within the positioning element 752 (e.g., the user's face is properly aligned with the biometric sensor 703), the device 700 displays a facial authentication registration interface 756. In the example of fig. 7S, the facial authentication registration interface 756 includes a progress meter 758 and a user image 760. In some examples, the registration interface 756 includes an orientation guide 762, which is a set of curves (e.g., crosshairs) that appear to extend out of the plane of the display. The progress meter 758 optionally has some or all of the features of the teaching progress meter 734 displayed during the facial authentication tutorial animation. In the example of fig. 7S, progress meter 758 also includes a set of progress elements (e.g., progress ticks 758a, 758b, and 758c) distributed around the user image 760. Further description of the alignment of the user's face with respect to the optical sensor may be found below with respect to figs. 9A-9AE and 11A-11O.
Figures 8A-8C are flow diagrams illustrating a method for providing a tutorial for registering biometric features on an electronic device, according to some examples. Method 800 is performed at a device (e.g., 100,300,500, 700) having a display, one or more input devices (e.g., a touchscreen, a microphone, a camera), and a wireless communication radio (e.g., a Bluetooth connection, a WiFi connection, a mobile broadband connection such as a 4G LTE connection). In some examples, the display is a touch-sensitive display. In some embodiments, the display is not a touch-sensitive display. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the device includes one or more biometric sensors, which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the device further comprises a light-emitting device, such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors. Some operations in method 800 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 800 provides an intuitive way to provide a tutorial for registering biometric features on an electronic device. The method reduces the cognitive burden on the user when registering biometric features on the device, thereby creating a more efficient human-machine interface. For battery-driven computing devices, enabling users to more quickly and efficiently enroll biometric features conserves power and increases the interval between battery charges.
The device displays (802) a first user interface (e.g., facial authentication settings interface 712) on the display. While displaying the first user interface, the device detects (806) an occurrence of a condition corresponding to introducing a biometric enrollment process for enrolling a biometric feature (e.g., face, finger, eye, voice, etc.). In some examples, the occurrence of the condition is an input corresponding to a request to initiate a biometric enrollment process, such as completing a previous stage of a device setup user interface process or selecting a biometric enrollment option in a setup user interface. In some examples, the biometric feature is used for authentication at the device.
In response to detecting the occurrence of a condition corresponding to introducing a biometric enrollment process, the device displays (808) a biometric enrollment introduction interface (e.g., facial authentication tutorial interface 732). Displaying the biometric enrollment introduction interface includes simultaneously displaying (810) a simulated representation of the biometric feature (e.g., 720, an animation of the biometric feature, such as a video of the face/head or biometric feature displayed in an animated manner) and a simulated progress indicator (e.g., an instructional progress meter 734, a display element indicating progress of enrollment). In some examples, the simulated progress indicator is positioned proximate to the simulated representation of the biometric feature. In some examples, the simulated progress indicator includes a plurality of progress elements (e.g., progress elements 734a, 734b, and 734c) optionally distributed around the simulation of the biometric feature, such as a set of tick marks extending outward (e.g., radially) from the simulation of the biometric feature and forming an elliptical shape, such as a circle.
In some examples, the simulated representation of the biometric feature is a simulated representation (e.g., 720) of at least a portion of a face (812). In some examples, the representation is a simulated representation of a portion of a face. In some examples, the representation is a simulated representation of the entire face. In some examples, the simulation of biometric features is a representation of a generic face, such as a line drawing including eyes, nose, and mouth. In some examples, the simulated representation of the biometric feature is a three-dimensional representation (814). For example, the simulated representation of the biometric feature is a three-dimensional rendered object. Alternatively, the instructional animation is optionally instead a 2D animation.
In some examples, the simulated representation of the biometric feature is a line drawing (e.g., a 3D representation of 720) with lines at different simulated z-heights (816). For example, when a line drawing of a face is tilted in different directions, lines at different simulated z-heights appear to move relative to each other based on simulated parallax effects. In some examples, the biometric enrollment introduction interface includes (820) an orientation guide (e.g., orientation guide 736, which is a curve that curves backwards in the simulated z-direction, as described in more detail below with reference to method 1200) that is overlaid on the representation (e.g., 720) of the simulated biometric feature and that tilts in different directions as the representation of the simulated biometric feature tilts in the different directions.
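The parallax effect described above (lines at greater simulated z-heights appearing to move more as the drawing tilts) follows directly from rotating points about an axis in the display plane. This sketch models only the lateral shift of a point lying on a vertical rotation axis, for which rotation by an angle theta displaces it sideways by z·sin(theta); it is an assumed simplification, not the patent's rendering method:

```python
import math

def parallax_shift(z_height, tilt_deg):
    """On-screen lateral shift of a line-drawing point at simulated depth
    `z_height` when the drawing tilts by `tilt_deg` about a vertical axis
    through the point. Points at larger simulated z-heights shift more,
    producing the apparent relative motion between lines."""
    return z_height * math.sin(math.radians(tilt_deg))
```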
While displaying the biometric enrollment introduction interface, the device displays (824) a tutorial animation (e.g., movement of 720 shown in figs. 7H-7L and advancement of tutorial progress indicator 734), including displaying movement (e.g., tilting and/or rotation) of the simulated representation of the biometric feature and incremental advancement of the progress indicator (e.g., the progress elements of the progress indicator change color and/or shape in response to displayed movement of the simulated representation of the biometric feature). Displaying a tutorial animation that includes movement of the simulated representation of the biometric feature and incremental advancement of the simulated progress indicator shows in advance the user inputs needed for the subsequent biometric enrollment process (e.g., method 1200 and/or method 1400) and thus helps the user intuitively identify how to quickly and correctly enroll their biometric features, reduces the duration for which the device must display the biometric enrollment interfaces (e.g., 756) during that process, and reduces the number of user inputs performed at those interfaces. Reducing the number of inputs and the amount of time required to perform an enrollment operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable inputs and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the device displays (826) the simulated movement of the biometric feature tilted relative to a plane of a display of the device. For example, the movement of the simulated representation includes a rotation of the simulated representation along an axis perpendicular to a plane of a display of the device. In some examples, the tilt relative to the biometric sensor and/or the field of view of the sensor defines a plane of the display. In another example, the device displays (828) movement of rotating the simulated representation of the biometric feature about a first axis (e.g., an axis perpendicular to the display 700) and rotating the simulated representation of the biometric feature about a second axis different from the first axis (e.g., an axis in the plane of the display 700). In some examples, the first axis is a vertical axis such that the movement of the representation is from left to right and/or from right to left. In some examples, the first axis is perpendicular to the second axis. For example, the second axis is optionally a horizontal axis such that the movement of the representation is downward and/or upward. In some examples, the first axis is any axis other than an axis perpendicular to a display of the device (e.g., representing rotation in any direction), and the second axis is an axis perpendicular to the display of the device. In this example, the simulated head is optionally moved in a circular pattern about the second axis. 
Displaying movement that tilts the simulated representation of the biometric feature relative to the plane of the display shows in advance the user inputs required for the subsequent biometric enrollment process (e.g., method 1200 and/or method 1400) and thus helps the user intuitively identify how to quickly and correctly enroll their biometric features, reduces the duration for which the device must display the biometric enrollment interfaces (e.g., 756) during that process, and reduces the number of user inputs performed at those interfaces. Reducing the number of inputs and the amount of time required to perform an enrollment operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable inputs and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, when displaying the tutorial animation, the device optionally displays (830) the simulated representation of the biometric feature in a first pose so as to reveal a first portion of the representation (e.g., a first side of 720) and not a second portion of the representation (e.g., a second, different side of 720). The device then optionally displays the simulated representation of the biometric feature in a second pose different from the first pose so as to reveal the second portion of the representation without revealing the first portion. In examples where the biometric feature is a face, the simulated face is optionally tilted in a first direction to reveal a first portion of the simulated face, and then tilted in a second direction to reveal a second portion of the simulated face. Displaying the simulated biometric feature in a first orientation and subsequently in a second, different orientation shows in advance the user inputs required for the subsequent biometric enrollment process (e.g., method 1200 and/or method 1400), and thus helps the user intuitively identify how to quickly and correctly enroll their biometric features, reducing the duration for which the device must display the biometric enrollment interfaces (e.g., 756) during that process and reducing the number of user inputs performed at those interfaces. Reducing the number of inputs and the amount of time required to perform an enrollment operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable inputs and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the device displays the simulated progress indicator (e.g., 734) surrounding the simulated representation (e.g., 720) of the biometric feature. For example, the simulated progress indicator is displayed such that it surrounds (or substantially surrounds) a portion or all of the simulated representation of the biometric feature. In some examples, the simulated progress indicator is centered on the simulated representation of the biometric feature. In some examples, displaying the simulated progress indicator includes displaying (832) a plurality of progress elements (e.g., points, circles, or line segments such as progress ticks 734a, 734b, and 734c) proximate to the simulated representation of the biometric feature (e.g., the face graphic 720). In some examples, the progress elements are equidistant from the representation and/or extend radially outward from the representation. In some examples, the progress elements are arranged in a circular, square, rectangular, or elliptical pattern.
In some examples, the device transitions (834) one or more of the plurality of progress elements from a first state to a second state different from the first state when displaying the incremental advancement of the simulated progress indicator. For example, in the first state, a progress element optionally has a first color and/or a first length, and in the second state, the progress element optionally has a second color different from the first color and/or a second length different from the first length. In some examples, a progress element optionally also changes appearance in other ways, such as line thickness, number, or pattern. Changing the display of a portion of the simulated progress indicator allows the user to recognize that the orientation of the simulated biometric feature shown in the tutorial animation must change in order to properly enroll his or her biometric feature. This helps to show in advance the user inputs required for the subsequent biometric enrollment process (e.g., method 1200 and/or method 1400), reduces the duration for which the device must display the biometric enrollment interfaces (e.g., 756) during that process, and reduces the number of user inputs performed at those interfaces. Reducing the number of inputs and the amount of time required to perform an enrollment operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable inputs and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the simulated progress indicator includes (836) display of at least a portion (e.g., some or all) of a symbol (e.g., success state progress meter 744) indicating successful biometric enrollment. In some examples, the progress elements of the simulated progress indicator are all updated to the second state (e.g., green and lengthened, or the state of meter portion 738 in fig. 7J) and are not modified thereafter. In some examples, the simulated progress indicator transitions to a success state (e.g., success state progress meter 744) once each progress element has been updated to the second state. In some examples, transitioning the simulated progress indicator to the success state includes transitioning the simulated progress indicator to a solid circle surrounding the simulated representation of the biometric feature.
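The element states and the success transition described above amount to a small state machine: each element advances from a first state to a second state, and the meter as a whole enters a success state once every element has advanced. The following is a minimal illustrative model under assumed state values (the colors, lengths, and class names are not from the patent):

```python
# Assumed example values for the two element states described in the text.
FIRST_STATE = {"color": "gray", "length": 1.0}
SECOND_STATE = {"color": "green", "length": 1.5}

class SimulatedProgressMeter:
    """Minimal model of the simulated progress indicator: elements advance
    from a first state to a second state (e.g., as the tutorial head tilts
    toward them) and, once every element has advanced, the meter enters a
    success state (e.g., rendered as a solid circle)."""

    def __init__(self, n_elements):
        self.elements = [dict(FIRST_STATE) for _ in range(n_elements)]
        self.success = False

    def advance(self, index):
        # Advanced elements are not modified thereafter; re-advancing is a no-op.
        self.elements[index] = dict(SECOND_STATE)
        if all(e == SECOND_STATE for e in self.elements):
            self.success = True
```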
After displaying at least a portion of the tutorial animation, the device detects (838) an occurrence of a condition corresponding to initiating the biometric enrollment process. In some examples, the condition corresponding to initiating the biometric enrollment process includes (840) a selection of an affordance for initiating the biometric enrollment process. For example, the condition is an input corresponding to a request to start enrollment (e.g., a user input at contact area 748), such as a tap on a "start enrollment" or "next" affordance (e.g., start affordance 746), optionally followed by aligning the biometric feature of the user with the one or more biometric sensors. The biometric enrollment process is described in more detail herein with reference to method 900. In some examples, the electronic device provides a tactile and/or audible output in response to selection of the affordance.
In response to (842) detecting an occurrence of a condition corresponding to initiating a biometric enrollment process, the device displays (844) a representation of a biometric feature of the user (e.g., user image 750, the user's face, the user's finger, the user's eye, the user's head) determined by the one or more biometric sensors of the device at a location previously occupied by the simulated representation of the biometric feature in the biometric enrollment introduction interface (e.g., facial authentication tutorial interface 732). In some examples, the device optionally displays a registration progress user interface (e.g., 756) after the representation of the biometric characteristic of the user (e.g., 750, 760) has been aligned with the one or more biometric sensors (e.g., 703).
In some examples, the representation is a representation of a portion of a user's face (846), e.g., a portion of user image 750. In some examples, the representation is a representation of the entire face of the user. In some examples, the representation of the biometric characteristic of the user is a user-specific representation of the user. For example, the representation of the user is an image of the user's face or a wireframe that matches the outline of the user's face.
In some examples, the biometric enrollment user interface includes (848) orientation guides (e.g., orientation guides 736, 762) overlaid on a representation of a biometric feature (e.g., user image 750). The orientation guide is optionally tilted as the biometric feature is tilted in different directions. Displaying the orientation guide that moves with the user's biometric feature provides feedback to the user regarding the orientation of his or her biometric feature in three-dimensional space relative to the device's biometric sensor, enabling the user to more quickly place his or her biometric feature in the appropriate orientation during subsequent enrollment processes (e.g., method 1200 and/or method 1400). Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the representation (e.g., 750) of the biometric features of the user is based (850) on image data captured by the one or more cameras (e.g., 703) of the electronic device. For example, the representation of the user's biometric features is optionally a continuous image of the user captured by the one or more cameras (e.g., 703), or a wire frame based on movement of the user's features in the field of view of the one or more cameras. In some examples, the representation of the biometric feature changes as an orientation of the biometric feature relative to the one or more biometric sensors changes (852). Updating the orientation of the displayed representation of the biometric feature provides the user with feedback regarding the orientation of his or her biometric feature relative to the biometric sensor of the device, enabling the user to more quickly place his or her biometric feature in the appropriate orientation during a subsequent enrollment process (e.g., method 1200 and/or method 1400). Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In response to detecting the occurrence of a condition corresponding to initiating the biometric enrollment process, the device also displays (854) a progress indicator (e.g., 756) corresponding to the simulated progress indicator (e.g., a progress indicator having some or all of the features of the simulated progress indicator displayed around the simulated biometric feature, such as a plurality of progress elements distributed around a representation of the biometric feature of the user). In some examples, displaying the progress indicator includes maintaining (856) display of the simulated progress indicator. For example, the simulated progress indicator returns to an initial state (e.g., the state of progress elements 734a, 734b, and 734c in fig. 7H) and is used to indicate the user's incremental enrollment progress in the same or a similar manner as was used to illustrate the incremental enrollment progress of the simulated biometric feature. Displaying an enrollment progress indicator corresponding to (e.g., similar to) the simulated progress indicator allows a user to quickly associate the changes in orientation of the simulated biometric feature shown during the tutorial animation, and the corresponding advancement of the simulated progress indicator, with the inputs needed during the subsequent enrollment process (e.g., method 1200 and/or method 1400). This in turn enables the user to complete the enrollment process more quickly, reducing the duration for which the device must display the biometric enrollment interfaces (e.g., 756) during that process and reducing the number of user inputs performed at those interfaces.
Reducing the number of inputs and the amount of time required to perform an enrollment operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable inputs and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the device displays (858) a progress indicator (e.g., 758) surrounding the representation (e.g., 760) of the biometric feature of the user. For example, this progress indicator optionally has some or all of the features of the simulated progress indicator displayed around the simulated representation of the biometric feature. These features optionally include a plurality of progress elements (e.g., 758a, 758b, 758c) distributed around the representation of the biometric feature of the user. For example, the progress indicator is displayed such that it surrounds (or substantially surrounds) a portion or all of the representation of the biometric feature of the user. In some examples, the progress indicator is centered on the representation of the biometric feature of the user.
In some examples, in response to detecting an occurrence of a condition corresponding to initiating the biometric enrollment process, the device displays (860) a positioning element (e.g., positioning element 752) on the display of the electronic device. In some examples, the positioning element is displayed so as to highlight a predetermined portion (e.g., 756, 758) of the display of the electronic device. In some examples, the positioning element indicates where the user should position the representation of the user's biometric feature (e.g., 750) for subsequent biometric feature enrollment. In some examples, the positioning element is an object that visually at least partially separates a first portion and a second portion of the display (e.g., display portion 756 and display portion 758). In some examples, the positioning element is a shape, such as a square, and is optionally segmented. Displaying a positioning element that frames a particular portion of the digital viewfinder allows a user to quickly determine whether the position and/or orientation of his or her biometric feature within the field of view of the biometric sensor is suitable for the subsequent biometric enrollment process (e.g., method 1200 and/or method 1400), enabling the user to more quickly place his or her biometric feature in an appropriate orientation. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
It is noted that the details of the processes described above with respect to method 800 (e.g., figs. 8A-8C) may also be applied in a similar manner to the methods described below. For example, method 800 optionally includes one or more of the features of the various methods described below with reference to methods 1000, 1200, 1400, 1600, 1800, 2000, 2200, 2500, and 2700. As another example, the orientation guidance described in method 1200 may be applied with respect to the tutorial animation displayed on the facial authentication tutorial interface (e.g., 732). As another example, one or more aspects of the biometric enrollment described in method 1200 may be applied with respect to the enrollment interface (e.g., 756). As another example, one or more aspects of the hints described in method 1400 may be applied to the display of the facial authentication tutorial interface (e.g., 732).
The operations in the above-described information processing method are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general-purpose processor (e.g., as described with respect to fig. 1A, 3, 5A) or an application-specific chip. Further, the operations described above with reference to fig. 8A-8C are optionally implemented by the components depicted in fig. 1A-1B. For example, display operation 802, detect operation 806, display operation 810, display operation 824, detect operation 838, display operation 844, and display operation 854 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604 and event dispatcher module 174 delivers the event information to application 136-1. Respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as a selection of an object on the user interface. When a respective predefined event or sub-event is detected, the event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update application internal state 192. In some examples, event handlers 190 access respective GUI updaters 178 to update content displayed by the application. Similarly, one of ordinary skill in the art will clearly know how other processes may be implemented based on the components depicted in fig. 1A-1B.
Figs. 9A-9AE illustrate example user interfaces for aligning a biometric feature for enrollment on an electronic device (e.g., device 100, device 300, device 500, or device 700), according to some examples. The user interfaces in these figures are used to illustrate the processes described below, including the process in fig. 10.
Fig. 9A shows an electronic device 900 (e.g., portable multifunction device 100, device 300, device 500, or device 700). In the illustrative example shown in figs. 9A-9AE, the electronic device 900 is a smartphone. In other examples, the electronic device 900 can be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 900 has a display 901, one or more input devices (e.g., a touch screen of display 901, buttons, a microphone), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the electronic device includes one or more biometric sensors (e.g., biometric sensor 903), which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the one or more biometric sensors 903 are the one or more biometric sensors 703. In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate the biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
As shown in fig. 9A, the device 900 displays a facial authentication introduction interface 905. In some examples, the facial authentication introduction interface 905 is similar to the facial authentication tutorial interface 732 described above in connection with fig. 7S. By way of example, the facial authentication introduction interface 905 includes a face graphic 902, which is optionally the same as or similar to the face graphic 720 described above with respect to the facial authentication tutorial interface 732. Additionally or alternatively, device 900 optionally also displays a success state instructional progress meter 907, which is optionally the same as or similar to the success state progress meter 744 in figs. 7P-7Q. The facial authentication introduction interface 905 also includes a start button 904 (e.g., a start affordance). As shown in fig. 9A, device 900 detects activation (e.g., selection) of the start affordance 904. For example, the activation is optionally a user input at contact area 906 on the start affordance 904. In some cases, the user input corresponds to a request to start facial authentication setup (e.g., to start face enrollment).
In some examples, in response to detecting user selection of start button 904, device 900 displays the face alignment interface 908 shown in fig. 9B. The face alignment interface 908 includes a positioning element 910, which in some examples is a framing circle or bracket that indicates an alignment boundary. In some examples, the positioning element 910 identifies an inner display portion 912 and an outer display portion 914. In some examples, the electronic device determines that the user's biometric feature is properly aligned when it is generally positioned in the inner display portion 912 in a predetermined manner. In some examples, positioning element 910 separates the inner display portion 912 from the outer display portion 914. Generally, if the user's face is positioned relative to the biometric sensor 903 such that a portion of the user's image appears in the outer display portion 914, in some cases the user's face will not be properly aligned with the camera. As such, the face alignment interface 908 also includes a text prompt 916 instructing the user to position his or her face within the positioning element 910 (e.g., within the inner display portion 912).
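The patent does not disclose how the device decides that the face image has spilled into the outer display portion. As a rough geometric sketch, assuming both the face image and the positioning element are approximated as circles (an assumption made only for this example), the containment test could look like this:

```python
import math

def face_within_element(face_center, face_radius, elem_center, elem_radius):
    """True when a circle-approximated face image lies entirely inside the
    circular positioning element (i.e., within the inner display portion);
    False when part of the face would appear in the outer display portion."""
    dx = face_center[0] - elem_center[0]
    dy = face_center[1] - elem_center[1]
    # The face circle fits inside the element circle iff the distance between
    # centers plus the face radius does not exceed the element radius.
    return math.hypot(dx, dy) + face_radius <= elem_radius
```

A device could show a prompt like text prompt 916 whenever this check fails.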
Referring to FIG. 9C, in some examples, the user positions the electronic device 900 substantially in front of the user's face 917 during the alignment process. In some examples, the user holds the device 900 at about the same height as his or her face so that the face is in the field of view of the biometric sensor 903.
As shown in fig. 9D, once the user has initiated the alignment process (recall that the user optionally initiated the enrollment process by activating affordance 904), the device displays the face alignment interface 908. The face alignment interface 908 includes a digital viewfinder showing a preview of the image data captured by the biometric sensor 903. In some examples, the preview of the image data is a live preview that is continuously updated (e.g., changes over time) as the field of view of the biometric sensor changes (e.g., if device 900 is moved, or if the user moves closer to or farther from the sensor). The digital viewfinder includes the user's face image 918 and the positioning element 910 superimposed on the field of view. As described above, the positioning element 910 separates the inner display portion 912 from the surrounding outer display portion 914. To provide further visual separation between the inner display portion 912 (where the user's face image 918 will be located) and the outer display portion 914, the device 900 visually obscures (e.g., shades, darkens, or blurs) the outer display portion 914, as shown in fig. 9D.
In general, properly registering a user's facial features for authentication requires that the user's face be located in a predetermined manner and/or within a predetermined range of distances from the camera of device 900. In some examples, alignment of the user's face with the camera of device 900 requires that the user be neither too close nor too far from the device. Thus, if the electronic device 900 determines that the user's face is too close or too far away, the electronic device displays a text prompt 920 in the face alignment interface 908 instructing the user to position their face at an acceptable distance (e.g., 20 to 40mm) from the device 900. In the example of fig. 9D, device 900 detects that the user's face is too far away from the camera on the device (e.g., user face image 918 is within positioning element 910, but does not substantially fill inner display portion 912). In some examples, the electronic device prompts the user to move his or her face closer to the device. In some examples, the device generates one or more outputs, such as an audio output 922 (e.g., a series of beeps or other audio outputs) and a tactile output 924 (e.g., a series of vibrations or other tactile outputs) to notify the user of the improper alignment. In some examples, audio output 922 and/or tactile output 924 have a magnitude and repetition rate (e.g., frequency) that varies based on a distance between device 900 and the user's face. For example, the output frequency and/or magnitude optionally increases as the user's face moves closer to an acceptable range of distances from the device (e.g., 20 to 40 mm). Conversely, the output frequency and/or magnitude optionally decreases as the user's face moves further away from the acceptable distance range. 
In this case, when device 900 detects a change in the distance between the user's face and biometric sensor 903, it continuously changes (e.g., updates) the frequency and/or magnitude of audio output 922 and/or tactile output 924. In some examples, device 900 provides these outputs as long as the user's face is outside the acceptable range of distances from the device. In some examples, audio output 922 and tactile output 924 are accompanied by corresponding visual output on display 901. These ongoing audio, tactile, and/or visual outputs optionally provide an intuitive indication of how the user can properly align his or her face with the camera, reducing the time required to perform a successful facial alignment.
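The distance-to-feedback mapping described above (silent within the acceptable band; repetition rate rising as the face approaches the band from either side) can be sketched as a simple function. The numeric defaults below reuse the 20 to 40 mm band quoted in the text, but the rate values, units, and decay shape are illustrative assumptions, not the disclosed behavior:

```python
def feedback_rate(distance_mm, ok_min=20.0, ok_max=40.0, max_rate=8.0, min_rate=1.0):
    """Repetition rate (e.g., beeps or taps per second, illustrative units)
    for the ongoing audio/tactile alignment output: zero inside the
    acceptable distance band, and higher the closer the face gets to the
    band from either side (too close or too far)."""
    if ok_min <= distance_mm <= ok_max:
        return 0.0  # properly aligned: no corrective output
    # Deviation from the nearest edge of the acceptable band.
    deviation = ok_min - distance_mm if distance_mm < ok_min else distance_mm - ok_max
    # Rate falls off with deviation, so it rises as the user closes in.
    return max(min_rate, max_rate / (1.0 + deviation / 10.0))
```

A device loop could re-evaluate this on every distance update, continuously retuning the output frequency as described.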
Fig. 9E shows the face alignment interface 908 in the event that the user's face is positioned too close to the device 900 (e.g., a substantial portion of the user's face image 918 falls within the outer display portion 914). In this case, the alignment interface 908 also includes a text prompt 920 that instructs the user to position his or her face at an acceptable distance from the device 900. In some examples, the electronic device instructs the user to move his or her face farther from the device. As described above in connection with fig. 9D, device 900 optionally generates the ongoing audio output 922 and/or tactile output 924 in response to detecting that the user's face is too close to the camera. In particular, when the device 900 detects a change in the distance between the user's face and the camera, it changes the frequency and/or magnitude of these outputs.
Fig. 9F shows the face alignment interface 908 in the case where the user's face is at an acceptable distance from the device 900, but outside the frame (e.g., too far to the right or left). For example, face 918 is optionally positioned such that a major portion of face 918 is located outside of positioning element 910 within outer display portion 914. In this case, device 900 optionally displays a text prompt 926 on alignment interface 908 instructing the user to position his or her face within positioning element 910 (e.g., such that user image 918 is displayed within inner display area 912).
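The framing check that drives prompt 926 (detecting that a major portion of the face lies outside positioning element 910) can be approximated with a bounding-box overlap test. The function names, the rectangle representation, and the 0.75 threshold are illustrative assumptions, not details from the disclosure:

```python
def overlap_fraction(face, region):
    """Fraction of the face bounding box (x, y, w, h) lying inside region."""
    fx, fy, fw, fh = face
    rx, ry, rw, rh = region
    ix = max(0, min(fx + fw, rx + rw) - max(fx, rx))
    iy = max(0, min(fy + fh, ry + rh) - max(fy, ry))
    return (ix * iy) / float(fw * fh)

def face_out_of_frame(face, inner_region, threshold=0.75):
    """True when a major portion of the face falls outside the inner
    display portion, which would trigger a recentering prompt."""
    return overlap_fraction(face, inner_region) < threshold
```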
Referring to fig. 9G-9L, in some examples, the electronic device 900 displays the face alignment interface 908 in response to determining that the user's face is outside a predetermined range of angles relative to the electronic device. As shown in fig. 9G, the electronic device 900 is positioned at a low angle relative to the user's face (e.g., the electronic device is aligned with the chin of the user) such that the electronic device cannot properly obtain (e.g., capture) biometric data. Referring to fig. 9H, in response to determining that the electronic device 900 is outside of the predetermined range of angles, the electronic device 900 obscures at least a portion of the face alignment interface 908, such as the inner display portion 912 and the outer display portion 914. In some examples, the electronic device also outputs a prompt 986 instructing the user to position his or her face within the positioning element 910 (e.g., such that the user image 918 is displayed at an appropriate angle within the inner display region 912). In fig. 9I and 9K, the user lifts the device 900 until the electronic device is within the predetermined angular range. As the user raises the electronic device, referring to fig. 9J and 9K, the electronic device 900 gradually reduces the blur of the displayed elements. In this way, the electronic device indicates to the user that the angle of the electronic device relative to the user is approaching the acceptable range of angles. In some examples, the electronic device is instead positioned too high relative to the user such that the electronic device is not within the predetermined range of angles. Similar to the described examples, the electronic device optionally reduces or increases the blur of the displayed objects as the electronic device is moved relative to the user.
In some examples, if the device detects an alignment error for a predetermined amount of time, the device 900 optionally displays an accessibility option affordance 928 on the facial alignment interface 908, as shown in fig. 9G. For example, if device 900 does not detect a user's face at an acceptable distance from the device and/or within the positioning element within a predetermined time after starting alignment (e.g., after selecting start button 904), it optionally displays accessibility option affordance 928. In some examples, the predetermined amount of time is optionally 10 seconds, 15 seconds, 30 seconds, or any other suitable amount of time. Similarly, device 900 optionally displays the accessibility option affordance after a certain number of registration attempts have failed. As discussed in more detail below, in response to detecting selection of accessibility option affordance 928, device 900 optionally displays additional options or reminders and/or initiates an alternative face registration process. In some examples, activation of the accessibility option affordance 928 enables the user to proceed with biometric enrollment without first correcting the alignment errors.
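The gating of accessibility option affordance 928 could be expressed as a simple predicate over the elapsed error time and the number of failed attempts; the 15-second and three-attempt thresholds below are only placeholders for the "predetermined" values mentioned above:

```python
def should_show_accessibility_option(seconds_misaligned, failed_attempts,
                                     timeout_s=15.0, max_attempts=3):
    """Show the accessibility affordance after a sustained alignment error
    or after repeated failed enrollment attempts."""
    return seconds_misaligned >= timeout_s or failed_attempts >= max_attempts
```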
In general, the quality of facial feature registration for the facial authentication methods described herein depends, at least in part, on the lighting conditions under which the user's facial data is captured. For example, in some cases, strong backlighting or direct light on the user's face can adversely affect registration quality. Turning now to fig. 9H, in response to detecting an adverse lighting condition, device 900 optionally displays a text prompt 930 on alignment interface 908 indicating the adverse lighting to the user. The text prompt 930 is optionally accompanied by an audio, visual, and/or tactile output 932. Output 932 is optionally the same as output 922 and/or output 924 described in connection with the alignment error discussed above. In some examples, the output is error-specific; output 932 is thus optionally a different audio, visual, and/or tactile output than output 922 and output 924.
Generally, the quality of the facial feature registration also depends in part on the angle at which the user's face is oriented relative to one or more cameras (e.g., biometric sensor 903) of device 900. In particular, the one or more optical sensors of device 900 must be capable of capturing image data of the user's face at a particular angle or within a predetermined range of angles. Even assuming that the user's face is within the acceptable distance range described above, facial authentication registration may be adversely affected if the device 900 is positioned too far above or below the user's face. Thus, in some examples, for a successful alignment condition to be detected, device 900 requires that the user's face be positioned within a predetermined range of angles relative to one or more of its cameras.
In some examples, in response to detecting that the user's face is outside the predetermined range of angles relative to biometric sensor 903, device 900 blurs image data displayed in the digital viewfinder of alignment interface 908. In some examples, the amount of blur optionally depends on a difference between the detected elevation angle of the user's face relative to the camera and one or more threshold angles that constrain the predetermined range of angles. For example, the higher or lower the device 900 is positioned relative to the user's face, the greater the degree to which the device 900 blurs the preview image. If device 900 detects a change in elevation angle such that its camera is more closely aligned with the user's face, it optionally reduces the amount of blur as the elevation angle changes (e.g., in a continuous gradient). In some examples, the preview image is not blurred if the elevation angle between the device 900 and the user's face is actively changing (e.g., the user is moving the device 900 relative to his or her face). Blurring is optionally delayed until device 900 determines that the angle between the user's face and one or more of its cameras has been outside of the predetermined range of angles for a set period of time (e.g., 1 second, 2 seconds, 5 seconds, or any suitable period of time). In some examples, only a portion of the preview image (e.g., the outer display portion 914) is obscured, while in other examples, the entire preview image is optionally obscured. Blurring the preview image in this manner optionally prompts the user to more quickly position the device 900 at a desired angle relative to his or her face, reducing the amount of time spent during the alignment process. In some examples, device 900 optionally generates a tactile output to inform the user that his or her face is positioned at a suitable angle relative to biometric sensor 903.
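The angle-dependent blurring, including the continuous gradient and the delay before blurring begins, might be modeled by a small controller like the following. The class, its thresholds, and the linear blur ramp are all hypothetical, chosen only to illustrate the described behavior:

```python
class AngleBlurController:
    """Blur radius grows with the angular error outside [min_deg, max_deg],
    but only after the error has persisted for delay_s while the angle
    is not actively changing."""

    def __init__(self, min_deg=-10.0, max_deg=10.0, delay_s=1.0,
                 blur_per_deg=0.5, max_blur=25.0):
        self.min_deg, self.max_deg = min_deg, max_deg
        self.delay_s, self.blur_per_deg = delay_s, blur_per_deg
        self.max_blur = max_blur
        self._out_of_range_since = None

    def blur_radius(self, elevation_deg, now, moving=False):
        in_range = self.min_deg <= elevation_deg <= self.max_deg
        if in_range:
            self._out_of_range_since = None
            return 0.0  # aligned: no blur
        if moving:
            return 0.0  # angle actively changing: defer blurring
        if self._out_of_range_since is None:
            self._out_of_range_since = now
        if now - self._out_of_range_since < self.delay_s:
            return 0.0  # error has not yet persisted long enough
        if elevation_deg < self.min_deg:
            error = self.min_deg - elevation_deg
        else:
            error = elevation_deg - self.max_deg
        # blur increases continuously with the angular error, capped
        return min(self.max_blur, error * self.blur_per_deg)
```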
In fig. 9N, the user's face is correctly positioned with respect to the biometric sensor 903. In this case, face 918 is displayed substantially within alignment element 910 and inner display portion 912. As shown in fig. 9N, face 918 also occupies a major portion of inner display portion 912, indicating that the user's face is within a threshold distance from device 900. In response to detecting a face that meets the alignment criteria described above, device 900 emits audio output 934 and haptic output 936 to indicate successful alignment of the user's face with the camera. In general, output 934 and output 936 are different than outputs 922, 924, and 932 issued in response to detecting an alignment error. In some examples, upon successful alignment with the camera, device 900 captures and stores one or more images of the user's face.
In some examples, after detecting a successful alignment, device 900 visually highlights inner display portion 912 in which face 918 is displayed. In the example of fig. 9P, device 900 further obscures the outer display portion 914 by blacking out or further blurring the image in the outer portion of the digital viewfinder preview while continuing to display the portion of the digital viewfinder preview in the inner display portion 912 (e.g., within positioning element 910). In some examples, device 900 further visually highlights the content of inner display portion 912 by enlarging or magnifying the image within inner display portion 912.
In some examples, the device further highlights the inner display portion 912 by changing the appearance of the positioning element 910. In particular, device 900 optionally changes the appearance of the alignment element by "rounding" the corners of the alignment element as shown in fig. 9P and/or by merging the corners of the alignment element 910 into a circular positioning element 941 around the face 918 as shown in fig. 9Q.
Turning now to the example of fig. 9R, in response to detecting that the user's face is oriented such that the alignment criteria referenced above are satisfied, device 900 initiates a face authentication registration process by displaying face registration interface 938 (e.g., replacing the display of alignment interface 908 therewith). In some examples, the face registration interface 938 has similar or identical visual features to the face authentication registration interface 756 described above in connection with fig. 7S or the registration interface 1104 described below in connection with fig. 11A. In the example of fig. 9R, the face registration interface 938 includes a user face image 939 displayed within a positioning element 941. In the example of fig. 9R, the user facial image 939 is a real-time preview of the image data captured by the biometric sensor 903. Face registration interface 938 also optionally includes a registration progress meter 940 surrounding the user's facial image 939 and positioning element 941. As described above in connection with fig. 7S and 11A-11H, registration progress meter 940 includes a set of progress elements (e.g., 940a, 940b, and 940c) that extend radially outward from user facial image 939 and, in some examples, enclose it in a circular pattern. The face registration interface 938 optionally includes orientation guides 942. In some examples, the orientation guide includes a set of curves (e.g., crosshairs) that intersect over the center of the user facial image 939 and appear to extend out of the plane of the display 901 in the virtual z-dimension. In some examples, the orientation guide provides a sense of three-dimensional orientation of the user's face even though the face image 939 is two-dimensional. In this case, orientation guides 942 assist the user in the face enrollment process by making the rotation and/or tilt of the user's head visually more pronounced relative to device 900.
The face registration interface 938 also includes a text prompt 944 that optionally instructs the user to begin tilting their head (e.g., in a circle) to perform registration.
Generally, once the registration process is initiated, if the device 900 moves too much relative to the user's face, the registration quality may degrade (e.g., the device should remain stationary while the user moves to slowly rotate/tilt his or her face). In the example of fig. 9S, device 900 detects excessive movement of its one or more cameras relative to the user's face. This excessive movement is optionally a significant change in the orientation and/or position of the user's face relative to the device 900 that is consistent with movement of the device itself and prevents reliable alignment and/or registration. In response, device 900 issues a visual prompt 946 on registration interface 938 instructing the user to reduce the movement of the device (e.g., prompting the user to hold the device stationary during the registration process). Device 900 optionally also simultaneously generates visual and/or audible output 948. In some examples, the movement of the device itself is measured by the accelerometer 168 rather than the biometric sensor 903. The movement of the device is also optionally measured by a magnetometer, inertial measurement unit, etc., of the device 900.
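The excessive-movement check (measured by accelerometer 168 rather than biometric sensor 903) can be sketched as a threshold on how far recent acceleration magnitudes deviate from 1 g; the threshold value and the sample-window representation are invented for illustration:

```python
def device_moving_excessively(accel_magnitudes_g, threshold_g=0.15):
    """True when any recent accelerometer magnitude (in g) deviates from
    1 g (gravity alone, i.e., the device held still) by more than the
    threshold, suggesting the device itself is moving too much."""
    return any(abs(a - 1.0) > threshold_g for a in accel_magnitudes_g)
```

A production implementation would more likely filter the signal and fuse gyroscope or inertial-measurement-unit data, as the paragraph above notes; this predicate only captures the gating idea.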
Successful registration typically requires maintaining alignment of the user's face with respect to a camera on the device 900 throughout the registration process. Thus, in some examples, device 900 optionally exits the face registration process if one or more alignment errors are detected during registration. In some examples, if device 900 detects one or more alignment errors during the registration process, the electronic device exits the registration process (e.g., stops displaying face registration interface 938) and initiates (e.g., transitions to) an alignment process in which the device optionally displays alignment interface 908-2. In the example of fig. 9T-9U, the alignment interface 908-2 and its components optionally have similar or identical visual features to the initial alignment interface 908 described above with respect to fig. 9B-9O. In the example of fig. 9T-U, device 900 has determined that the user's face is outside the frame, and therefore, device 900 displays user face image 918-2 out of position within inner display portion 912-2, as compared to the successful alignment depicted in fig. 9O. In some examples, the device outputs an indication of the misalignment, such as a text prompt 950, indicating that the user facial image 918-2 is not properly aligned within the positioning element 910. This example is merely illustrative. In some examples, the alignment error is optionally a failure to meet any of the other alignment criteria discussed above (e.g., distance from the device, orientation angle, adverse lighting, etc.). In such cases, the text prompt 950 instructs the user to move the device and/or their face into an acceptable distance range, or to correct the orientation angle. In other examples, the alignment error is optionally different from the above criteria, such that a small change in alignment does not cause the device to exit the face registration process.
In response to detecting the one or more alignment errors, the device stops visually highlighting the inner display portion 912-2, revealing the portion of the image preview displayed in the outer display portion 914-2 and displaying the positioning element 910-2 as shown in FIG. 9U. For example, device 900 brightens or sharpens the preview image in outer display portion 914-2 to assist the user in realigning their face with respect to biometric sensor 903. In the example of FIG. 9U, the inner display portion 912-2 is no longer highlighted, indicating that a substantial portion of the user's facial image 918-2 is located outside of the positioning element 910-2 and within the outer display portion 914-2.
In some examples, the device 900 again detects that the user's face is properly aligned with the biometric sensor 903. In response, device 900 outputs audio output 934-2 and/or haptic output 936-2 indicating successful alignment. In some examples, audio output 934-2 and haptic output 936-2 have similar features to audio output 934 and haptic output 936, respectively, as described with reference to fig. 9O. In some examples, the device 900 then resumes the registration process. For example, device 900 highlights inner display portion 912-2 and facial image 918-2 in the manner discussed above with respect to inner display portion 912 and facial image 918 in figs. 9P-9Q. In some examples, device 900 resumes the registration process at the point where the electronic device detected the alignment error (e.g., face registration interface 938 is displayed again, with registration progress meter 940 advanced to the same state as when the alignment error was detected).
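The pause-and-resume behavior, where enrollment exits to realignment on an alignment error and later resumes with the progress meter in the same state, amounts to keeping per-segment progress while gating further capture on alignment. A hypothetical sketch (class and method names are not from the disclosure):

```python
class EnrollmentSession:
    """Keeps progress-meter state across alignment interruptions."""

    def __init__(self, n_segments=12):
        self.progress = [False] * n_segments  # one flag per meter segment
        self.aligned = False

    def set_aligned(self, aligned):
        self.aligned = aligned

    def register_segment(self, index):
        """Record a captured face orientation; rejected while misaligned."""
        if not self.aligned:
            return False  # enrollment suspended until realignment
        self.progress[index] = True
        return True

    def segments_completed(self):
        return sum(self.progress)
```

On realignment no progress is lost: the same `progress` list backs the redisplayed meter, matching the resume-where-it-left-off behavior described above.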
In some examples, if the device does not detect that proper alignment has been established (e.g., reestablished) within a predetermined period of time, the device 900 displays an accessibility option affordance 928-2, as shown in fig. 9V. In some examples, the accessibility option provides an option to continue the registration process without all alignment conditions being met, as described below. In some examples, the accessibility option provides an option to set biometric (e.g., facial) authentication with only partial enrollment (e.g., scanning of a portion of a user's face).
In response to detecting activation (e.g., selection) of accessibility option button 928-2 (e.g., by tap gesture 952), the device displays accessibility registration interface 954, as shown in FIG. 9W. One or more features of accessibility registration interface 954 have visual characteristics similar or identical to corresponding features of registration interface 938. For example, in FIG. 9W, accessibility registration interface 954 includes a user face image 939-2 displayed within a positioning element 941-2. In some examples, the user facial image 939-2 is a real-time preview of the image data captured by the biometric sensor 903-2. Accessibility registration interface 954 also optionally includes registration progress meter 940-2 surrounding user facial image 939-2 and positioning element 941-2. As described above in connection with fig. 7S and 11A-11H, registration progress meter 940-2 includes a set of progress elements (e.g., 940-2a, 940-2b, and 940-2c) that extend radially outward from user facial image 939-2 and, in some examples, enclose it in a circular pattern. Accessibility registration interface 954 optionally includes orientation guide 942-2. In some examples, the orientation guide includes a set of curves (e.g., crosshairs) that intersect over the center of the user facial image 939-2 and appear to extend out of the plane of the display 901 in the virtual z-dimension. Like face registration interface 938, accessibility registration interface 954 optionally includes text prompts (e.g., prompt 956) that provide written instructions for successful completion of the registration process. In some examples, accessibility registration interface 954 also includes a completion affordance 956, activation of which allows the user to exit the registration process and proceed to set up facial authentication using only a partial scan of their facial features.
In some examples, a partial scan may be helpful for a user with a condition that prevents the user from tilting his or her head in all of the directions that would otherwise be required for registration.
In response to activation (e.g., selection) of the completion affordance 956 (e.g., via the user input 958 shown in fig. 9X), the device displays the face registration confirmation interface 960 shown in fig. 9Y. The face registration confirmation interface includes a facial image 939-3, which in the example of fig. 9Y has visual characteristics similar to those of the user's facial image 939-2. Face image 939-3 is optionally surrounded by registration progress meter 962, which displays a successful authentication status as described above in connection with fig. 7P and 7Q. The face registration confirmation interface also includes a partial scan registration affordance 964 that allows the user to register the collected face data for device authentication. The face registration confirmation interface 960 also includes a return affordance 966 that allows the user to navigate back to the accessibility registration interface 954.
As shown in fig. 9Z, the device detects a user input 968 corresponding to activation (e.g., selection) of return affordance 966. In response to detecting the user input, device 900 displays accessibility registration interface 954 (e.g., again). While accessibility registration interface 954 is displayed, device 900 detects movement (e.g., rotation and/or tilting) of the user's face relative to biometric sensor 903. In the case of fig. 9AA, device 900 detects that the user's face has tilted in a particular direction (e.g., downward and/or rightward toward meter portion 970). As described in further detail below with respect to fig. 11B-11H, device 900 updates user facial image 939-2 based on the detected movement and updates the position of orientation guide 942-2 to indicate that the user's head has been tilted and/or rotated in three-dimensional space. In response to detecting movement of the user's face, device 900 captures image data of a portion of the user's face (e.g., the left side of the face) and simultaneously changes the appearance of a corresponding portion of registration progress meter 940-2 (e.g., meter portion 970). In some examples, device 900 changes the color of one or more progress elements in meter portion 970 to indicate that the portion of the user's face is currently being registered (as described in more detail with respect to fig. 7I-7K and 11B-11H). In some examples, the device 900 maintains the display of meter portion 972 (e.g., does not change its appearance) because meter portion 972 corresponds to a face orientation that has not yet been registered.
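Mapping a detected head tilt/rotation to the corresponding portion of the circular progress meter (e.g., meter portion 970) is essentially an angle-to-segment lookup. The segment count, the zero direction, and the use of yaw/pitch as the direction vector are arbitrary choices made here for illustration:

```python
import math

def meter_segment(yaw_deg, pitch_deg, n_segments=8):
    """Index of the progress-meter portion corresponding to the direction
    the face is tilted; segment 0 starts at the positive yaw axis and
    indices increase counterclockwise."""
    angle = math.atan2(pitch_deg, yaw_deg) % (2 * math.pi)
    return int(angle / (2 * math.pi / n_segments)) % n_segments
```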
As shown in fig. 9AB, in some examples, device 900 detects a change in orientation of the user's face relative to its camera(s) (e.g., the user's face has tilted upward) and updates user face image 939-2 and orientation guide 942-2 accordingly. By way of example, because the image data in the orientation of the face corresponding to meter portion 972 has been successfully registered, device 900 transitions the state of the progress elements in meter portion 972 to the "registered" state (e.g., by coloring the progress elements or changing their color and/or line width), as described in more detail below with respect to figs. 11B-11I. As shown in FIG. 9AB, device 900 again detects activation (e.g., selection) of completion affordance 956 (e.g., via user input 958-2).
In response to detecting activation of the completion affordance 956, the device 900 returns to displaying the face registration confirmation interface 960 as shown in fig. 9AC. Since a portion of the user's face has been successfully registered, the device 900 displays a registration success indicator 974, e.g., adjacent to the user's face image 939-3. In the example of fig. 9AC, registration success indicator 974 indicates an orientation in which the user's face has been successfully registered. In some examples, registration success indicator 974 is a circular bar. Thus, in some examples, registration success indicator 974 indicates (e.g., is located at) a location where the registration progress meter transitioned to a success state during registration.
In some examples, the partial scan registration affordance 964 is selectable because the face registration confirmation interface 960 allows users to set up facial authentication with only partial registration of their facial features. As shown in fig. 9AD, device 900 detects activation (e.g., selection) of partial scan registration affordance 964 (e.g., via user input 976). In response to detecting activation of partial scan registration affordance 964, device 900 displays the registration completion interface 978 shown in fig. 9AE. The registration completion interface 978 includes a text prompt 980 indicating to the user that the registration process is complete and that facial authentication has been securely set up. The registration completion interface 978 includes a generic facial graphic 982, optionally at a location previously occupied by the user facial image 939-3. In some examples, the registration completion interface 978 also includes a completion affordance, activation of which causes the electronic device to exit the facial authentication setup.
Fig. 10 is a flow diagram illustrating a method for aligning biometric features on a display of an electronic device according to some examples. The method 1000 is performed at a device (e.g., 100, 300, 500, 900) having a display, one or more input devices (e.g., a touchscreen, a microphone, a camera), and a wireless communication radio (e.g., a Bluetooth connection, a WiFi connection, a mobile broadband connection such as a 4G LTE connection). In some examples, the display is a touch-sensitive display. In some examples, the display is not a touch-sensitive display. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the device includes one or more biometric sensors, which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the device further comprises a light-emitting device, such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors. Some operations in method 1000 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 1000 provides an intuitive way for aligning biometric features on a display of an electronic device. The method reduces the cognitive burden on the user when registering biometric features on the device, thereby creating a more efficient human-machine interface. For battery-driven computing devices, enabling users to more quickly and efficiently enroll biometric features conserves power and increases the interval between battery charges.
The device displays (1002) a first user interface (e.g., 905) on the display. For example, the first user interface is optionally a registration introduction user interface as described above with respect to method 700.
While displaying the first user interface, the device detects (1004) an occurrence of a condition corresponding to initiating a biometric enrollment process for enrolling a respective type of biometric characteristic (e.g., 917). For example, the occurrence of the condition is optionally an input corresponding to a request to "start registration" (e.g., 906 on start affordance 904).
In response to detecting an occurrence of a condition corresponding to initiating a biometric enrollment process (e.g., selecting a user input to initiate enrollment), the device displays (1006) on the display a digital viewfinder (e.g., display portion 912 and display portion 914) that includes a preview of image data (e.g., user facial image 918) captured by the one or more cameras (e.g., 903). In some examples, the preview of image data includes a first portion of the field of view of the one or more cameras (e.g., an inner portion 912 of the field of view) and a second portion of the field of view of the one or more cameras (e.g., an outer portion 914 of the field of view). In some examples, the second portion (e.g., 914) of the field of view is a portion of the field of view that encompasses (1008) the first portion (e.g., 912) of the field of view. In some examples, the inner portion of the field of view is optionally separated from the outer portion by an alignment element (e.g., positioning element 910). In some examples, the preview of the image data optionally changes over time as the content in the field of view of the one or more cameras (e.g., 903) changes. Displaying a preview of the image captured by the biometric sensor provides the user with feedback regarding the location and orientation of his or her biometric feature relative to the biometric sensor of the device, enabling the user to more quickly and efficiently properly align his or her biometric feature with the sensor. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the device displays (1010) an alignment element (e.g., positioning element 910) concurrently with the preview of the image data, indicating the portion (e.g., 912) of the preview in which the user's face (e.g., 918) should be placed in order to proceed with the biometric enrollment. For example, the alignment element is optionally a circle or bracket displayed in a center portion (e.g., 912) of the preview image for prompting the user to move the device or their face into alignment with the center portion of the preview image. Displaying an alignment element that demarcates a particular portion of the digital viewfinder provides the user with feedback regarding the position of his or her biometric feature relative to the portion of the field of view of the biometric sensor that corresponds to proper alignment of the biometric feature. This in turn enables the user to more quickly and efficiently correctly locate his or her biometric features relative to the sensor. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, after initiating the biometric enrollment process (1012), the device determines (1014) whether a respective type of biometric feature (e.g., 917) that satisfies the alignment criteria has been detected in the field of view of the one or more cameras (e.g., 903). Determining whether the user's biometric feature is properly aligned with the biometric sensor improves the quality of subsequent biometric enrollment (e.g., according to method 1200 and/or method 1400) by ensuring that image data corresponding to a particular portion and/or orientation of the biometric feature is captured during enrollment. This in turn improves the ability of the device to match the user's biometric characteristics to the captured data during biometric authentication at the device. Performing an optimization operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, in response to (1016) detecting a respective type of biometric feature (e.g., 917) that satisfies the alignment criteria, the device outputs (1018) a first type of haptic output (e.g., 934, 936, 934-2, 936-2, e.g., the haptic output is an output corresponding to a successful alignment). Issuing a tactile output upon detecting that the biometric feature is properly aligned with the biometric sensor provides feedback to the user indicating successful alignment, which prompts the user to maintain the biometric feature in that alignment during a subsequent biometric enrollment process (e.g., method 1200 and/or method 1400). Providing improved tactile feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device during biometric enrollment), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, in response (1016) to detecting respective types of biometric features that satisfy the alignment criteria, the device stores (1020) image data corresponding to the biometric features (e.g., 917). In some examples, upon successful alignment, the device captures data associated with the biometric feature. Storing biometric (e.g., image) data in response to detecting successful alignment of the biometric feature allows the device to automatically capture data to be referenced during a subsequent biometric authorization attempt. Performing an operation when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the alignment criteria include (1024) a requirement that at least a portion of the biometric feature (e.g., 917) be within a first portion (e.g., inner display portion 912, 912-2) of the field of view of the one or more cameras. For example, in some examples, the electronic device determines whether the image data includes data corresponding to biometric features that satisfy the alignment criteria. In some examples, the alignment criteria include (1050) lighting condition criteria. In some examples, the alignment criteria require that the lighting conditions of the electronic device be suitable for capturing image data during biometric feature enrollment, including a requirement that at least a first threshold amount of light be detected and/or that no more than a second threshold amount of light be detected (e.g., by 903).
In some examples, the alignment criteria include (1052) a requirement that a portion of the biometric feature (e.g., a portion of 917) be oriented in a predetermined manner with respect to the electronic device. In examples where the biometric feature is a user's face, the alignment criteria optionally include a requirement that the user's gaze be directed at at least one of the one or more cameras (e.g., 903) of the electronic device or a display (e.g., 901) of the electronic device. In some examples, the requirement that a portion of the biometric feature (e.g., a portion of the user's facial image 918) be oriented in a predetermined manner relative to the electronic device is a requirement that the biometric feature (e.g., 917) be positioned within a threshold angle (e.g., of elevation) relative to the one or more biometric sensors (e.g., 903). In some examples, the alignment criteria require that the biometric feature (e.g., 917) be positioned in a predetermined manner relative to the biometric sensor (e.g., 903) such that the biometric sensor can capture biometric data corresponding to the biometric feature at a particular angle or over a range of angles. In some examples, the device obscures a display (e.g., display portion 912 and/or display portion 914) of the electronic device, e.g., based on the extent to which the biometric feature (e.g., 917) is outside a predefined range of angles relative to the one or more biometric sensors (e.g., 903).
In some examples, the alignment criteria include (1042) a requirement that the biometric feature (e.g., 917) be within a first threshold distance from the one or more biometric sensors (e.g., 903) (e.g., the biometric feature is not too far from the biometric sensors) and a requirement that the biometric feature not be within a second threshold distance from the one or more biometric sensors (e.g., the biometric feature is not too close to the biometric sensors) (1026).
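Taken together, the alignment criteria described above reduce to a conjunction of independent checks. The following is a minimal sketch in Python; the field names and all threshold values are hypothetical, since the patent specifies none of the actual numbers:

```python
from dataclasses import dataclass

@dataclass
class FaceObservation:
    """Hypothetical summary of what the camera(s) report about a detected face."""
    in_inner_region: bool   # face falls within the first (inner) display portion
    ambient_lux: float      # measured light level
    pitch_degrees: float    # elevation angle of the face relative to the sensors
    distance_cm: float      # estimated distance from the biometric sensors

# Illustrative thresholds only -- not taken from the patent.
MIN_LUX, MAX_LUX = 10.0, 10_000.0
MAX_PITCH_DEG = 20.0
MIN_DISTANCE_CM, MAX_DISTANCE_CM = 20.0, 50.0

def satisfies_alignment_criteria(obs: FaceObservation) -> bool:
    """Conjunction of the requirements sketched in steps 1024, 1050, 1052, and 1042."""
    return (
        obs.in_inner_region                                        # within first portion of field of view
        and MIN_LUX <= obs.ambient_lux <= MAX_LUX                  # lighting condition criteria
        and abs(obs.pitch_degrees) <= MAX_PITCH_DEG                # within predefined angle range
        and MIN_DISTANCE_CM <= obs.distance_cm <= MAX_DISTANCE_CM  # neither too far nor too close
    )
```

Failing any single requirement fails alignment as a whole, which is why the later error-handling steps can attribute an alignment failure to one specific cause.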
In some examples, when the biometric feature (e.g., 917) is located at a first distance from the electronic device that is not within the predetermined range of distances from the electronic device, the device detects (1044), by the one or more cameras (e.g., 903), a change in the distance of the biometric feature (e.g., 917) from the first distance to a second distance that is not within the predetermined range of distances from the electronic device. In response to detecting the change in distance, the device generates (1046) an output (e.g., audio, tactile, and/or visual output 922, 924) having a value (e.g., magnitude or amplitude, or frequency or repetition rate) of an output feature that varies based on the distance of the biometric feature from the predetermined distance range. In some examples, the electronic device emits an ongoing audio output (e.g., 924, e.g., a series of beeps) having a frequency that increases as the distance between the biometric characteristic (e.g., 917) and the electronic device approaches a target distance (or range of distances) from the electronic device. For example, the rate of the beeps is optionally increased. Conversely, the frequency of the audio output (e.g., 922) optionally decreases as the distance between the biometric feature and the electronic device moves further away from the target distance (or range of distances) from the electronic device. For example, the rate of the beeps is optionally reduced. In some examples, similar feedback is generated with a tactile output (e.g., output 924) or a visual output. Issuing an audio, tactile, and/or visual output based on the change in distance between the biometric characteristic and the device provides the user with ongoing feedback regarding the location of his or her biometric characteristic relative to a range of distances from the biometric sensor corresponding to proper alignment. 
This in turn reduces the amount of time to display the alignment interface and reduces the amount of user input required during the alignment process. Thus, providing the user with improved audio, tactile, and/or visual feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
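The distance feedback described above can be sketched as a mapping from distance error to the repetition rate of the ongoing output. The function below is illustrative only; the target range, rates, and slope are assumed values, not taken from the patent:

```python
def beep_rate_hz(distance_cm: float,
                 target_min_cm: float = 20.0,
                 target_max_cm: float = 50.0,
                 min_rate: float = 1.0,
                 max_rate: float = 8.0) -> float:
    """Repetition rate for the ongoing audio/tactile output.

    The closer the biometric feature is to the target distance range, the
    faster the beeps; inside the range the rate saturates at max_rate.
    """
    if target_min_cm <= distance_cm <= target_max_cm:
        return max_rate
    # Error = how far outside the target range the feature is, in cm.
    error = (target_min_cm - distance_cm) if distance_cm < target_min_cm \
        else (distance_cm - target_max_cm)
    # Rate falls off linearly with the error, clamped to min_rate.
    return max(min_rate, max_rate - 0.2 * error)
```

Driving a tactile output with the same value would give the parallel haptic behavior the passage mentions.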
After initiating the biometric enrollment process, in accordance with a determination that a respective type of biometric feature (e.g., 917) that satisfies the alignment criteria has been detected in the field of view of the one or more cameras (e.g., 903) (1022), the device highlights (1028) a first portion of the field of view of the one or more cameras (e.g., inner display portion 912 in fig. 9J) (e.g., darkens, blurs, and/or blackens a second portion of the field of view without darkening, blurring, and/or blackening the first portion of the field of view of the one or more cameras) relative to a second portion of the field of view of the one or more cameras (e.g., outer display portion 914 in fig. 9J). For example, the alignment criteria include a requirement that the user's face (e.g., 917) be aligned with the camera (e.g., 903) at a predetermined alignment or that the user's eyes be aligned with the camera at a predetermined alignment. Providing a visual effect of highlighting a portion of the display upon detecting successful alignment of the user's biometric feature with the biometric sensor allows the user to quickly identify that the current location of his or her biometric feature is optimal for a subsequent biometric enrollment process (e.g., in accordance with method 1200 and/or method 1400). Providing improved visual feedback when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the device obscures (1030) a portion of the digital viewfinder corresponding to a second portion (e.g., 914 in fig. 9J) of the field of view of the one or more cameras (e.g., 903). Obscuring in this manner includes dimming or reducing the brightness of the portion of the digital viewfinder that corresponds to the second portion of the field of view.
In some examples, the device stops displaying (1032) a portion of the digital viewfinder corresponding to a second portion of the field of view of the one or more cameras (e.g., second display portion 914). For example, ceasing to display the portion of the viewfinder corresponding to the second portion of the field of view includes blacking out the second portion of the field of view and/or replacing the display of the second portion of the field of view with a display of other content.
In some examples, the device enlarges (1034) the display of a first portion (e.g., the inner display portion 912) of the field of view of the one or more cameras on the display. In some examples, enlarging the display of the first portion includes enlarging the display of some or all of the first portion of the field of view. In some examples, highlighting a first portion (e.g., 912) of the field of view of the one or more cameras relative to a second portion (e.g., 914) of the field of view of the one or more cameras includes shrinking or hiding some or all of the second portion. In some examples, the device shrinks the second portion before enlarging the display of the first portion and/or shrinks the second portion after enlarging the display of the first portion (e.g., to provide a zoom effect).
In some examples, the device modifies (1036) the alignment element (e.g., 910). For example, in some examples, modifying the alignment element includes removing the alignment element. In some examples, modifying the alignment element includes changing a shape and/or color of the alignment element (e.g., from 910 to 910-2, fig. 9J to 9K). For example, the device modifies (1038) the shape of the alignment element from a first shape to a second shape. In some examples, the first shape (1040) is substantially rectangular and the second shape is substantially circular. Alternatively, the first shape and/or the second shape are optionally any other shape or portion of a shape. In some examples, the shape is optionally a segmented shape, such as a segmented rectangle (e.g., a rectangle that lacks a portion of one or more edges).
In some examples, after highlighting a first portion (e.g., 912) of the field of view of the one or more cameras relative to a second portion (e.g., 914) of the field of view of the one or more cameras (e.g., 903), the device detects (1054) that a respective type of biometric feature (e.g., 917) that satisfies the alignment criteria is no longer detected in the field of view of the one or more cameras. In response to detecting that a respective type of biometric feature that satisfies an alignment criterion is no longer detected in the field of view of the one or more cameras, the device outputs an indication of an alignment error (e.g., 950). For example, for correctable errors, the device identifies the error and prompts the user to correct the error. For uncorrectable errors, the device only identifies the error. Errors are identified through textual and tactile outputs (e.g., 950, 924, 925). In some examples, errors are identified using audible output, such as those provided for accessibility purposes. In some examples, the criteria for detecting that the biometric feature is no longer detected in the field of view of the one or more cameras is the same as the criteria for determining that the biometric feature meets the alignment criteria. In some examples, the alignment criteria are different than the criteria used to detect that the biometric feature is no longer detected in the field of view of the one or more cameras (e.g., once the biometric feature is aligned with the one or more cameras, the biometric feature may move slightly out of alignment without the device exiting the biometric enrollment process and outputting an indication of an alignment error). Outputting an indication that the user's biometric feature is no longer aligned with the biometric sensor provides feedback that allows the user to quickly identify that the position and/or orientation of his or her biometric feature has deviated from a previously established alignment. 
The feedback prompts the user to quickly reposition his or her biometric features to re-establish proper alignment with the biometric sensor, reduces the amount of time that alignment user interfaces are displayed, reduces the number of inputs required at these alignment user interfaces, and improves the quality of biometric feature enrollment. Thus, providing the user with improved audio, tactile, and/or visual feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, outputting the indication of the alignment error includes outputting (1056) a second type of tactile output (e.g., 951). For example, the haptic output is a missing output corresponding to a successful alignment. In some examples, the haptic output is error-specific, and in some examples, an audible output is additionally or alternatively provided.
In some examples, outputting the indication of the alignment error includes (1058) no longer highlighting the first portion (e.g., 912-2) of the field of view of the one or more cameras relative to the second portion (e.g., 914-2) of the field of view of the one or more cameras. For example, the device optionally brightens, unblurs, and/or reveals the second portion of the field of view relative to the first portion of the field of view of the one or more cameras. In some examples, the electronic device brightens and unblurs the second portion of the field of view so that the first portion is no longer highlighted relative to the second portion. In some examples, if the biometric feature (e.g., 917) is successfully aligned after the alignment error, the device resumes the biometric enrollment process from the point at which the enrollment process stood before the indication of the alignment error was output (e.g., enrollment progress is retained up to the point at which the alignment error was detected). In some examples, the progress indicator (e.g., 940) indicating the progress of enrollment disappears when the indication of the alignment error is output, but the progress indicator (e.g., 940-2) indicating the progress of enrollment is redisplayed when the biometric feature is properly aligned with the one or more biometric sensors. In some cases, when the progress indicator is redisplayed, it includes an indication of the progress made in enrolling the biometric feature before the indication of the alignment error was output. Providing a visual effect that no longer highlights a portion of the display upon detection of an alignment error allows the user to quickly recognize that the position and/or orientation of his or her biometric feature has deviated from a previously established alignment.
The feedback prompts the user to quickly reposition his or her biometric features to re-establish proper alignment with the biometric sensor, which reduces the amount of time that alignment user interfaces are displayed, reduces the number of inputs required at these alignment user interfaces, and improves the quality of subsequent biometric feature registrations (e.g., according to method 1200 and/or method 1400). Providing improved visual feedback when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, based on (1062) determining that the alignment error is a first type of alignment error (e.g., the biometric feature is too far or too close to the electronic device), the device outputs (1064) (e.g., displays) a prompt (e.g., 920) to move the biometric feature to correct the first type of alignment error. For example, the device prompts the user to move closer to or further away from the electronic device, respectively.
In some examples, in accordance with a determination (1062) that the alignment error is a second type of alignment error (e.g., the biometric feature is outside the first portion of the field of view), the device outputs (1066) (e.g., displays) a prompt (e.g., 950) to move the biometric feature to correct the second type of alignment error. For example, the device prompts the user to move the biometric feature into the first portion of the field of view. In this case, the device forgoes (1068) outputting a prompt (e.g., 926) to move the biometric feature to correct the first type of alignment error. In some examples, a second portion (e.g., 914-2) of the field of view is modified (e.g., blurred) in response to determining that the alignment error is of the second type of alignment error.
For example, a first type of alignment error is (1074) that a portion of the biometric feature (e.g., the portion of 917 shown in 939, 918-2) is outside a first portion (e.g., 912-2) of the field of view. In this case, the device outputs (1076) a prompt (e.g., 950) to move the portion of the biometric feature into the first portion of the field of view to prompt the user to correct the first type of alignment error. Providing a prompt with instructions on how to correct the alignment error provides feedback that allows the user to quickly identify how to reposition his or her biometric feature in order to reestablish proper alignment and proceed with the enrollment process. This in turn reduces the amount of time the device displays the alignment interfaces and reduces the amount of user input required at these alignment interfaces. Providing improved visual feedback when a set of conditions has been met enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In another example, the first type of alignment error is (1078) that the distance between a portion of the biometric feature (e.g., 917) and the one or more biometric sensors (e.g., 903) is within a threshold distance (e.g., the biometric feature is too close to the one or more biometric sensors). In this case, the device outputs (1080) a prompt (e.g., 920) to move the biometric feature away from the electronic device to prompt the user to correct the first type of alignment error.
In another example, the first type of alignment error is (1082) that the distance between a portion of the biometric feature (e.g., 917) and the one or more biometric sensors (e.g., 903) exceeds a threshold distance (e.g., the biometric feature is too far away from the one or more biometric sensors). In this case, the device outputs (1084) a prompt (e.g., 920) to move the biometric feature closer to the electronic device to prompt the user to correct the first type of alignment error.
In another example, the first type of alignment error is that the angle of the biometric feature (e.g., 917) with respect to the one or more biometric sensors (e.g., 903) is outside a predefined range of angles (e.g., of elevation) with respect to the one or more biometric sensors. For example, in some cases the biometric feature is positioned too high relative to the one or more biometric sensors, and in other cases too low. In this case, the device outputs a prompt to move the biometric feature to adjust the angle (e.g., elevation angle) of the biometric feature relative to the one or more biometric sensors.
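The per-error-type prompts in the preceding paragraphs amount to a dispatch table: identify the error, output the matching prompt, and forgo all the others. A minimal sketch, with hypothetical error names and prompt strings (the patent does not specify the prompt text):

```python
from enum import Enum, auto

class AlignmentError(Enum):
    OUT_OF_FRAME = auto()  # feature outside the first portion of the field of view (1074)
    TOO_CLOSE = auto()     # within the minimum threshold distance (1078)
    TOO_FAR = auto()       # beyond the maximum threshold distance (1082)
    BAD_ANGLE = auto()     # outside the predefined range of elevation angles

# Hypothetical prompt wording, one entry per correctable error type.
PROMPTS = {
    AlignmentError.OUT_OF_FRAME: "Move your face into the frame.",
    AlignmentError.TOO_CLOSE: "Move farther away.",
    AlignmentError.TOO_FAR: "Move closer.",
    AlignmentError.BAD_ANGLE: "Raise or lower the device to face it directly.",
}

def prompt_for(error: AlignmentError) -> str:
    """Output the prompt for the detected error type, forgoing all other prompts."""
    return PROMPTS[error]
```

Keying the prompt on the error type guarantees that only one corrective instruction is shown at a time, matching the "outputs one prompt, forgoes the other" behavior described above.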
In some examples, in accordance with a determination that the first type of error state persists for a threshold period of time (1086), the device displays (1088) an accessibility interface (e.g., 908) that enables the user to proceed with biometric enrollment without correcting the error state. For example, in some examples, the device enables the user to continue with biometric enrollment without moving the biometric feature relative to the device (e.g., 917) such that the error state is corrected or without tilting the biometric feature to capture images of different sides of the biometric feature. In some examples, the device enables the user to proceed with biometric enrollment in this manner if the biometric feature is improperly aligned for a predetermined amount of time and/or in response to a predetermined number of failed requests.
In some examples, after outputting the alignment error, in accordance with a determination that a respective type of biometric feature (e.g., 917) that satisfies the alignment criteria has been detected in the field of view of the one or more cameras (e.g., 903), the device again highlights (1070) a first portion (e.g., 912-2) of the field of view of the one or more cameras relative to a second portion (e.g., 914-2) of the field of view of the one or more cameras. For example, optionally, the device obscures, blurs, and/or blackens a second portion (e.g., 914-2) of the field of view of the one or more cameras, while not obscuring, blurring, and/or blackening a first portion (e.g., 912-2) of the field of view of the one or more cameras.
In some examples, after outputting the alignment error, and in accordance with a determination that a respective type of biometric feature (e.g., 917) that satisfies the alignment criteria has been detected in the field of view of the one or more cameras (e.g., 903), the device outputs (1072) a first type of tactile output (e.g., 936). However, in some examples, the device outputs a third type of haptic output that is different from the first type and the second type.
In accordance with a determination that a respective type of biometric feature (e.g., 917) that satisfies an alignment criterion has not been detected in the field of view of the one or more cameras (e.g., the face or eye of the user has not been detected in a predetermined alignment), the device maintains (1090) display of the digital viewfinder without highlighting a first portion (e.g., 912-2) of the field of view of the one or more cameras relative to a second portion (e.g., 914-2) of the field of view of the one or more cameras (e.g., 903).
In some examples, the device detects (1092) a change in orientation and/or position of the biometric feature (e.g., 917) relative to the one or more biometric sensors (e.g., 903). For example, the device optionally detects a change in position, a change in orientation, or a change in both orientation and position.
In some examples, in response to detecting (1094) a change in orientation and/or position of the biometric feature (e.g., 917) relative to the one or more biometric sensors (e.g., 903), and in accordance with a determination that device movement criteria have been met (e.g., the device is physically moving more than a threshold amount in a manner that prevents reliable alignment/registration), the device outputs (1096) a prompt (e.g., 946, 948, visual, tactile, or audible alert) to reduce movement of the electronic device. In some examples, the device detects a decreasing movement of the device, and in response to detecting the decreasing movement of the device, the device stops outputting the prompt. In some examples, movement of the device is determined based on the one or more biometric sensors (e.g., 903). For example, the change in orientation and/or position of the biometric feature relative to the one or more biometric sensors is consistent with movement of the device around the biometric feature, rather than movement of the biometric feature in a view of the one or more biometric sensors. In some examples, the movement of the device is determined based on one or more orientation sensors of the device, such as an accelerometer (e.g., 168), a magnetometer, an inertial measurement unit, or the like, separate from the one or more biometric sensors.
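One plausible reading of the device-movement criteria above is a thresholded accelerometer check with hysteresis: the "reduce movement" prompt appears once movement exceeds a high threshold and is dismissed only after movement falls below a lower one, so the prompt does not flicker near the boundary. The class below is a sketch under those assumptions; the threshold values and units are invented:

```python
class MovementMonitor:
    """Track whether the 'reduce movement' prompt (e.g., 946, 948) should show.

    Movement is judged from gravity-removed accelerometer magnitudes (in g),
    e.g., as reported by an accelerometer such as element 168.
    """
    def __init__(self, high: float = 0.15, low: float = 0.05):
        self.high, self.low = high, low
        self.prompt_visible = False

    def update(self, accel_magnitude_g: float) -> bool:
        if accel_magnitude_g > self.high:
            # Device is moving too much for reliable alignment/enrollment.
            self.prompt_visible = True
        elif accel_magnitude_g < self.low:
            # Movement decreased: stop outputting the prompt.
            self.prompt_visible = False
        return self.prompt_visible
```

The two-threshold design reflects the passage's pairing of "output a prompt" with "detect decreasing movement, then stop outputting the prompt".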
In some examples, the device displays an enrollment progress indicator (e.g., 940) for enrollment of the biometric feature when the biometric feature (e.g., 917) is within the first portion (e.g., 912) of the field of view of the one or more biometric sensors (e.g., 903) and within a threshold distance of the one or more biometric sensors, and in accordance with a determination that the biometric feature is within a predefined range of angles (e.g., of elevation relative to the one or more biometric sensors) (e.g., as described in more detail with reference to method 1200 and fig. 11A-11E). Displaying the enrollment progress indicator optionally includes first highlighting a first portion (e.g., 912-2) of the field of view of the one or more cameras relative to a second portion (e.g., 914-2) of the field of view of the one or more cameras as described above. Displaying the progress indicator during enrollment in this manner encourages the user to look at the display of the electronic device during enrollment, which improves the device's ability to detect when the user's gaze is directed at the display (and thus to detect whether the user is paying attention to the device). Encouraging the user to look at the display of the electronic device enhances the operability of the device and makes the user-device interface more efficient (e.g., by ensuring that the user's gaze is directed at the display, thereby ensuring that the user's biometric feature is properly enrolled), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, in accordance with a determination that the biometric feature (e.g., 917) is outside a predefined range of angles (e.g., of elevation relative to the one or more biometric sensors 903), the device obscures (e.g., blurs, darkens, or desaturates) at least a portion of the preview of the image data (e.g., display portion 912, 912-2, 914, and/or 914-2). In some examples, the device delays obscuring the portion of the preview of image data (e.g., for at least a predetermined period of time, such as 1 second, 2 seconds, or 5 seconds, after detecting that the biometric feature is within the first portion of the field of view and within the threshold distance of the one or more biometric sensors) so that the portion of the preview of image data is not obscured if the user is actively changing the orientation of the biometric feature relative to the one or more biometric sensors (e.g., 903). In some examples, the obscuring is delayed as long as the angle of the biometric feature is changing. In some examples, the obscuring is delayed until the angle of the biometric feature has remained outside the predefined range of angles for at least a predetermined period of time. In some examples, only a portion of the preview (e.g., 912 or 914, 912-2, or 914-2) is obscured. In some examples, all of the preview (e.g., 912 and 914, 912-2 and 914-2) is obscured. Obscuring the digital viewfinder when the one or more biometric sensors are positioned too far above or below the user's biometric feature allows the user to quickly recognize that his or her biometric feature is misaligned. This in turn prompts the user to change the elevation angle between the device and his or her biometric feature until proper alignment is established.
Providing improved visual feedback when a set of conditions has been met enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the device detects a change in the angle of the biometric feature (e.g., 917) relative to the one or more biometric sensors (e.g., 903) while a portion of the preview of image data (e.g., 912 or 914) is obscured. In response to detecting the change in the angle of the biometric feature relative to the one or more biometric sensors, and in accordance with a determination that the change in angle moves the biometric feature closer to the predefined range of angles without moving the biometric feature into the predefined range of angles, the device reduces the amount of obscuring of the portion of the preview of image data (e.g., 912 or 914, 912-2, or 914-2) while continuing to obscure the portion of the preview of image data. In some examples, the amount by which the obscuring of the portion of the preview of image data is reduced depends on the amount of change in the angle of the biometric feature relative to the one or more biometric sensors (e.g., the more the biometric feature moves toward the predefined range of angles, the greater the reduction in the amount of obscuring). In accordance with a determination that the change in angle moves the biometric feature into the predefined range of angles, the device ceases to obscure the portion of the preview of the image data. In some examples, when a change in the angle of the biometric feature moves the biometric feature into the predefined range of angles, the device generates a haptic and/or audio output (e.g., 934, 936) to inform the user that the angle of the biometric feature is within the predefined range of angles. Reducing the obscuring of the digital viewfinder as the user's biometric feature moves closer to the predefined range of angles allows the user to quickly identify a set of positions corresponding to successful alignment of the biometric feature. This in turn prompts the user to change the elevation angle between the device and his or her biometric feature until proper alignment is established.
Providing improved visual feedback when a set of conditions has been met enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, in response to detecting a change in the angle of the biometric feature (e.g., 917) relative to the one or more biometric sensors (e.g., 903), and in accordance with a determination that the change in angle moves the biometric feature further away from the predefined range of angles, the device increases the amount of obscuring of the portion of the preview of image data (e.g., 912 or 914, 912-2, or 914-2). In some examples, the increase in the amount of obscuring of the portion of the preview of image data depends on the amount of change in the angle of the biometric feature relative to the one or more biometric sensors (e.g., the further the biometric feature moves away from the predefined range of angles, the greater the increase in the amount of obscuring).
In some examples, the obscuring includes blurring the preview of the image data, and reducing the amount of obscuring of the portion of the preview of the image data includes reducing the amount of blurring of the preview of the image data (e.g., by reducing a blur radius or other blurring parameter). In some examples, increasing the amount of concealment of a portion of the preview of image data includes increasing a blur radius or other blur parameter.
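The angle-dependent blurring described above can be sketched as a single function from angular deviation to blur radius: zero blur inside the predefined range, and blur that grows with deviation (and shrinks as the feature moves back toward the range) outside it. The range limit, slope, and cap below are hypothetical:

```python
def blur_radius_px(pitch_degrees: float,
                   max_angle: float = 20.0,
                   max_blur: float = 24.0) -> float:
    """Blur radius for the preview, proportional to how far the feature's
    angle sits outside the predefined range of angles."""
    deviation = abs(pitch_degrees) - max_angle
    if deviation <= 0:
        return 0.0  # within the predefined range: cease obscuring
    # Blur grows linearly with the deviation, capped at max_blur.
    return min(max_blur, 1.5 * deviation)
```

Because the same function is evaluated on every angle update, moving toward the range automatically reduces the blur and moving away increases it, matching both behaviors in the passage with no extra state.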
Note that the details of the processes described above with respect to method 1000 (e.g., fig. 10A-10F) may also be applied in a similar manner to the methods described below. For example, method 1000 optionally includes one or more of the features of the various methods described below with reference to methods 800, 1200, 1400, 1600, 1800, 2000, 2200, 2400, and 2700. For example, the registration process as described in method 1200 may be applied with respect to a face registration interface (e.g., 954). As another example, a reminder as described in method 1400 can be applied with respect to a registration progress meter (e.g., 940). As another example, accessibility features as described in method 1400 may be applied in place of or in conjunction with accessibility options (e.g., 928-2). For the sake of brevity, these details are not repeated in the following.
The operations in the information processing methods described above are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general-purpose processor (e.g., as described with respect to fig. 1A, 3, and 5A) or an application-specific chip. Further, the operations described above with reference to fig. 9A-9I are optionally implemented by the components depicted in fig. 1A-1B. For example, display operation 1002, detect operation 1004, display operation 1006, highlight operation 1028, and hold operation 1090 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as selection of an object on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update the content displayed by the application. Similarly, it will be clear to one of ordinary skill in the art how other processes can be implemented based on the components depicted in fig. 1A-1B.
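The dispatch flow described above (event sorter delivers event information to the application, a recognizer compares it against event definitions, and a matching recognizer activates its handler) can be illustrated with a minimal sketch. All names here (Event, Recognizer, dispatch) are illustrative stand-ins, not the actual components of fig. 1A-1B.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str        # e.g. "tap" (cf. event definitions 186)
    location: tuple  # (x, y) on the touch-sensitive surface

class Recognizer:
    """Stand-in for an event recognizer: matches events of one kind
    within one on-screen region and holds the associated handler."""
    def __init__(self, kind, region, handler):
        self.kind, self.region, self.handler = kind, region, handler

    def matches(self, event):
        x0, y0, x1, y1 = self.region
        x, y = event.location
        return event.kind == self.kind and x0 <= x <= x1 and y0 <= y <= y1

def dispatch(event, recognizers):
    """Deliver the event to the first recognizer whose definition it
    matches, and invoke the associated handler (cf. event handler 190)."""
    for r in recognizers:
        if r.matches(event):
            return r.handler(event)
    return None  # no recognizer matched the event
```

For example, a tap recognizer registered over an on-screen object would return that object's selection result for taps inside its region and leave other events unhandled.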
Attention is now directed to fig. 11A-11O, which illustrate exemplary user interfaces for registering biometric features on an electronic device (e.g., device 100, device 300, device 500, device 700, or device 900), according to some examples. The user interfaces in these figures are used to illustrate the processes described below, including the process in fig. 12.
Fig. 11A shows an electronic device 1100 (e.g., portable multifunction device 100, device 300, device 500, device 700, or device 900). In the illustrative examples shown in fig. 11A-11O, the electronic device 1100 is a smartphone. In other examples, the electronic device 1100 can be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 1100 has a display 1102, one or more input devices (e.g., a touch screen of the display 1102, a button, a microphone), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the electronic device includes one or more biometric sensors (e.g., biometric sensor 1103), which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the one or more biometric sensors 1103 are the one or more biometric sensors 703. In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate the biometric feature (e.g., the face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
As shown in fig. 11A, device 1100 displays a face registration user interface 1104 on display 1102. In some examples, face registration user interface 1104 is displayed after device 1100 detects successful alignment of the user's face with respect to its one or more cameras, as described above in connection with fig. 9A-9AE. Face registration interface 1104 includes a user facial image 1106. In some examples, the user facial image 1106 is an image of the user captured by one or more cameras on device 1100. For example, the user facial image 1106 is optionally a live preview of image data captured by the one or more cameras (e.g., a digital viewfinder) that is continuously updated as the field of view and/or field-of-view content of the cameras changes. In some examples, background content is removed so that only the user's face is visible in the facial image 1106. The face registration interface also optionally includes an orientation guide 1108 superimposed (e.g., overlaid) on the user facial image 1106. As described above in connection with fig. 7I-7K, the orientation guide 1108 is optionally a set of curves that extend into a virtual z-dimension (e.g., along an axis perpendicular to the plane of the display) and intersect over the center of the user facial image 1106. Thus, the curves of the orientation guide 1108 appear to bow outward relative to the plane of the display 1102 to give the user a sense of the position of his or her head in three-dimensional space.
Face registration user interface 1104 also includes a registration progress meter 1110. Registration progress meter 1110 includes a set of display elements (e.g., progress elements 1110a, 1110b, and 1110c) arranged around the user facial image 1106 and orientation guide 1108. In the example of fig. 11A, the progress elements are a set of lines arranged in a circular pattern that extend radially outward from the user facial image 1106. In some examples, progress elements 1110a, 1110b, 1110c, and so on indicate the orientation of the user's face needed to register the corresponding facial features. For example, as the user's head tilts upward, the progress elements in the upper portion of progress meter 1110 optionally move, fill in, elongate, and/or change color, which allows the one or more cameras on device 1100 to capture image data of the underside of the user's face. This process is described in more detail below. In the example of fig. 11A, device 1100 displays the progress elements in registration progress meter 1110 in an unregistered state (e.g., the progress elements are grayed out).
The face registration interface 1104 also includes a text prompt 1112 that instructs the user to move (e.g., rotate and/or tilt) their head in a circular motion during the registration process. In some examples, text prompt 1112 is optionally accompanied by a tactile and/or audible prompt, depending on device settings and/or user selections. In some examples, device 1100 displays text prompt 1112 on registration interface 1104 throughout the face registration process.
As shown in fig. 11B, device 1100 detects movement of the user's face relative to its one or more cameras. The movement of the user's face is optionally a rotation and/or tilt relative to device 1100. In response, device 1100 continuously updates the user facial image 1106 (e.g., displays its movement) to reflect the change in orientation of the user's face. In some examples, the orientation guide 1108 tracks the movement of (e.g., moves with) the user facial image 1106 to visually emphasize tilting and rotational movement of the user's face in three dimensions. For example, the center (e.g., intersection) of the orientation guide 1108 is optionally positioned at, and moves with, a center point on the user facial image 1106. In some examples, device 1100 also adjusts the curvature of the lines that make up the orientation guide 1108 to give the appearance of three-dimensional rotation (e.g., about an axis perpendicular to the display 1102). In some examples, device 1100 visually emphasizes the orientation guide 1108 while the orientation guide is moving (e.g., while the orientation of the user's face is changing). For example, device 1100 optionally darkens the orientation guide 1108 while it is in motion and/or displays a fading trail that tracks the movement of the user's face. Conversely, device 1100 optionally reduces this emphasis of the orientation guide 1108 relative to the user facial image 1106 when the user's face is not moving.
As shown in fig. 11B, in response to detecting that the user's face is oriented toward progress meter portion 1114 (e.g., in accordance with a determination that the image data captured by biometric sensor 1103 includes an angular view of the user's face), device 1100 updates the display of the progress elements in meter portion 1114 to a "registering" state by changing the appearance of the progress elements in meter portion 1114. For example, device 1100 optionally elongates and/or changes the color of the progress elements in meter portion 1114 while the user's face is oriented toward meter portion 1114. In some examples, when updating the progress elements to the "registering" state, device 1100 elongates the progress ticks and changes their color from gray to blue. Changing the display of the progress elements to the "registering" state in this manner indicates that device 1100 is capturing (e.g., registering) facial image data for the angular view corresponding to the current orientation of the user's face. In the example of fig. 11B, device 1100 maintains the progress elements in meter portion 1116 in the unregistered state to indicate that the user has not yet oriented their face toward meter portion 1116. In some examples, the display of meter portion 1114 is updated in this manner only if the user's face is sufficiently rotated toward meter portion 1114 (e.g., only if the user's face is rotated by at least a threshold amount or angle).
In some examples, registration progress meter 1110 includes a set of progress meter portions, such as meter portion 1114 and meter portion 1116. In some examples, each progress meter portion contains a predetermined number of progress elements (e.g., 3, 5, or 8 progress elements).
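The division of the circular progress meter into portions, each owning a fixed number of progress elements, can be sketched as a mapping from the face's orientation to a portion index. The portion count, element count, and use of an azimuth angle are illustrative assumptions, not details from the disclosure.

```python
def meter_portion_for_angle(azimuth_deg, n_portions=8):
    """Return the index of the progress meter portion the face is
    oriented toward, dividing the ring into n_portions equal arcs."""
    return int(azimuth_deg % 360 // (360 / n_portions))

def elements_in_portion(portion_index, elements_per_portion=5):
    """Return the indices of the progress elements belonging to one
    meter portion (a predetermined number of elements per portion)."""
    start = portion_index * elements_per_portion
    return list(range(start, start + elements_per_portion))
```

Under this sketch, detecting that the face is oriented toward a given arc selects one portion, and all of that portion's elements transition together (e.g., to the "registering" state).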
In some examples, as shown in fig. 11C, device 1100 detects a slight rotation and/or tilt of the user's face and updates the digital viewfinder containing the user facial image 1106. For example, the user's face has begun to tilt downward and rotate to the right. In the example of fig. 11C, however, the user's face is still oriented toward progress meter portion 1114. Thus, device 1100 continues to display the progress elements of meter portion 1114 in the "registering" state even as the user begins to rotate and/or tilt their head down and to the right. In this case, device 1100 also maintains the display of the progress elements adjacent to meter portion 1114 in the unregistered state, as the user's head has not yet rotated sufficiently to trigger registration of the corresponding orientations.
As shown in fig. 11D, device 1100 detects that the user's face has rotated and/or tilted toward meter portion 1118. In the example of fig. 11D, the user's face continues the movement shown in fig. 11C, tilting downward and rotating to the right past its initial position in fig. 11A (e.g., the user's face moves such that it does not become oriented toward any other portion of progress meter 1110). In response to detecting the change in facial orientation, device 1100 moves orientation guide 1108 to track the movement of the user facial image 1106 in the digital viewfinder. In accordance with a determination that the user's face has become oriented toward meter portion 1118 (e.g., the image data captured by biometric sensor 1103 includes a second angular view of the user's face), device 1100 updates the progress elements in meter portion 1118 to the "registering" state described above. For example, device 1100 elongates the progress ticks within meter portion 1118 and changes their color. In some examples, device 1100 updates the display of meter portion 1118 only if the corresponding portion of the user's face has not been previously registered (e.g., only if the progress elements in meter portion 1118 are in the "unregistered" grayed-out state). In some examples, device 1100 updates the display of meter portion 1118 regardless of whether the corresponding portion of the user's face has been previously registered (e.g., to provide a further indication of the orientation of the user's face relative to biometric sensor 1103).
In the example of fig. 11D, device 1100 also detects that the user's face is no longer oriented toward progress meter portion 1114 (because the user's face is currently oriented toward meter portion 1118). In response, device 1100 again changes the appearance of the progress elements in meter portion 1114, this time to a "registered" state. In the example of fig. 11D, device 1100 updates the display of the progress ticks in portion 1114 from the elongated "registering" state by shortening the progress ticks and again changing their color. For example, a progress element in the "registered" state is the same length and/or size as a progress element in the "unregistered" state, but is displayed in green to indicate that the corresponding portion of the user's face (e.g., the angular view captured in fig. 11B) has been successfully registered, as described above in connection with fig. 11B.
In the example of fig. 11D, device 1100 maintains the progress elements in meter portion 1116 in the unregistered state to indicate that the user has not yet oriented their face toward meter portion 1116.
Figs. 11E-11H illustrate the face registration interface 1104 as the user rotates and/or tilts their face in a counterclockwise motion through a series of orientations associated with the right side of registration progress meter 1110. Beginning with progress meter portion 1118, device 1100 sequentially changes the progress elements along the rotational path to the "registering" state described above based on the orientation of the user's face (e.g., in response to detecting that the user's face is oriented toward the corresponding portion of progress meter 1110). Once the user's face has rotated past these progress elements (e.g., in response to detecting that the user's face is no longer oriented toward the corresponding portion of progress meter 1110), device 1100 updates the progress elements to the "registered" state to indicate successful registration of the corresponding portion of the user's face. This process is described in more detail below. In some examples, the visual features of the progress elements in the "registering" state are based on the rate at which the orientation of the user's face changes. For example, if the user's face is rotating at a first speed, device 1100 modifies the color of progress elements in the "registering" state in a first manner, and if the user's face is rotating at a second, slower or faster speed, modifies the color of those progress elements in a second manner.
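The speed-dependent appearance of "registering" elements described above can be sketched as a small selection function. The specific colors and speed thresholds are assumptions chosen purely for illustration; the disclosure only states that the color is modified in different manners at different rotation rates.

```python
def registering_color(rotation_speed_dps, slow=30.0, fast=90.0):
    """Pick a display color for progress elements in the "registering"
    state based on how fast the face orientation is changing (deg/s):
    one color at a nominal speed, modified when rotation is slower or
    faster than assumed thresholds."""
    if rotation_speed_dps < slow:
        return "light-blue"  # rotating slowly: first modification
    if rotation_speed_dps > fast:
        return "dark-blue"   # rotating quickly: second modification
    return "blue"            # nominal rotation speed
```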
As shown in fig. 11E, device 1100 detects that the user's face has rotated in a counterclockwise manner relative to biometric sensor 1103 (e.g., the user's face is rotated upward and/or tilted to the left relative to its position in fig. 11D). As described above, device 1100 continuously updates the user facial image 1106 to reflect the change in orientation and moves the orientation guide 1108 to track the movement of the user facial image 1106 in the digital viewfinder. As the user's face rotates upward, device 1100 updates the display of one or more progress elements (e.g., 1116a) in meter portion 1116 to the "registering" state (e.g., by elongating the one or more progress elements and/or changing their color as described above). As shown by the position of the user facial image 1106 in fig. 11E, the rotation moves the user's face past (e.g., out of) the orientation corresponding to one or more progress elements (e.g., 1118a) in meter portion 1118. In response to detecting that the user's face is no longer in this orientation, device 1100 updates the display of those one or more progress elements (including 1118a) to the "registered" state described above to indicate successful registration of these portions. In the example of fig. 11E, device 1100 maintains one or more elements (e.g., 1118b) of progress meter portion 1118 in the "registering" state because the user's face has not yet rotated out of the corresponding orientation. Likewise, device 1100 also continues to display one or more progress elements (e.g., 1116b) in meter portion 1116 in the initial "unregistered" state because the user's face has not yet been positioned in the corresponding orientation.
Fig. 11F shows the face registration interface 1104 as the user's face continues to rotate counterclockwise relative to its position in fig. 11E. Likewise, device 1100 continuously updates the user facial image 1106 to reflect the change in orientation and moves the orientation guide 1108 to track the movement of the user facial image 1106 in the digital viewfinder. As shown by the position of the user facial image 1106 in fig. 11F, the rotation moves the user's face into an orientation corresponding to progress meter portion 1116. In response to detecting the user's face in this orientation, device 1100 changes the display of one or more progress elements (e.g., 1116b) in meter portion 1116 from the "unregistered" state to the "registering" state (e.g., by elongating the one or more progress elements and/or changing their color as described above). As shown by the position of the user facial image 1106 in fig. 11F, the rotation also moves the user's face past (e.g., out of) the orientation corresponding to the remaining elements (e.g., 1118b) of progress meter portion 1118. In response to detecting that the user's face is no longer in this orientation, device 1100 updates the display of these progress elements (including 1118b) to the "registered" state described above, indicating successful registration of the angular view of the user's face corresponding to meter portion 1118. In the example of fig. 11F, device 1100 also continues to display the progress elements in meter portion 1120 in the initial "unregistered" state described above, because the user's face has not yet been positioned in the corresponding orientation.
Fig. 11G shows the face registration interface 1104 as the user's face continues to rotate counterclockwise relative to its position in fig. 11F. Likewise, device 1100 continuously updates the user facial image 1106 to reflect the change in orientation and moves the orientation guide 1108 to track the movement of the user facial image 1106 in the digital viewfinder. As shown by the position of the user facial image 1106 in fig. 11G, the rotation moves the user's face into an orientation corresponding to progress meter portion 1120. In response to detecting the user's face in this orientation, device 1100 changes the display of the progress elements in meter portion 1120 from the "unregistered" state to the "registering" state (e.g., by elongating the one or more progress elements and/or changing their color as described above). As shown by the position of the user facial image 1106 in fig. 11G, the rotation also moves the user's face past (e.g., out of) the orientation corresponding to progress meter portion 1116. In response to detecting that the user's face is no longer in this orientation, device 1100 updates the display of the progress elements in meter portion 1116 to the "registered" state, indicating successful registration of the angular view of the user's face corresponding to meter portion 1116. In the example of fig. 11G, device 1100 continues to display the progress elements in meter portion 1122 in the initial "unregistered" state because the user's face has not yet been positioned in the corresponding orientation.
Fig. 11H shows the face registration interface 1104 as the user's face continues to rotate counterclockwise relative to its position in fig. 11G. Likewise, device 1100 continuously updates the user facial image 1106 to reflect the change in orientation and moves the orientation guide 1108 to track the movement of the user facial image 1106 in the digital viewfinder. As shown by the position of the user facial image 1106 in fig. 11H, the rotation moves the user's face into an orientation corresponding to progress meter portion 1122. In response to detecting the user's face in this orientation, device 1100 changes the display of the progress elements in meter portion 1122 from the "unregistered" state to the "registering" state (e.g., by elongating the one or more progress elements and/or changing their color as described above). In some examples, this orientation causes device 1100 to change the display of one or more progress elements in meter portion 1114 from the "registered" state shown in fig. 11D-11G back to the "registering" state, even though the corresponding facial features have already been registered (e.g., to provide a further indication of the orientation of the user's face relative to biometric sensor 1103). In this case, in response to detecting that the user's face is no longer oriented in that direction, device 1100 restores the elements of progress meter portion 1114 back to the "registered" state. As shown by the position of the user facial image 1106 in fig. 11H, the rotation also moves the user's face past (e.g., out of) the orientation corresponding to progress meter portion 1120. In response to detecting that the user's face is no longer in this orientation, device 1100 updates the display of the progress elements in meter portion 1120 to the "registered" state, indicating successful registration of the angular view of the user's face corresponding to meter portion 1120. In the example of fig. 11H, device 1100 continues to display the remaining progress elements of registration progress meter 1110 (e.g., the progress elements not in meter portions 1114, 1116, 1118, 1120, or 1122) in the initial "unregistered" state because the user's face has not yet been positioned in the corresponding orientations.
Registration and/or scanning of the user's facial features continues in this manner until all elements of registration progress meter 1110 have transitioned to the registered state (e.g., until image data for all corresponding angular views of the user's face has been captured by biometric sensor 1103). For example, registration continues until the user's face returns, by rotating counterclockwise, to the orientation corresponding to meter portion 1118.
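The per-element state machine that fig. 11B-11H describe ("unregistered" → "registering" while the face is oriented toward an element's portion → "registered" once the face rotates away, with the scan completing when every element is registered) can be sketched as follows. The class and method names are assumptions for illustration; this models the variant in which previously registered elements are not re-entered.

```python
UNREGISTERED, REGISTERING, REGISTERED = "unregistered", "registering", "registered"

class ProgressMeter:
    """Minimal sketch of the registration progress meter's element states."""
    def __init__(self, n_elements=8):
        self.states = [UNREGISTERED] * n_elements

    def face_oriented_toward(self, i):
        """The face is now oriented toward element i: that element begins
        registering (unless already registered), and any element the face
        has rotated away from is locked in as registered."""
        for j, s in enumerate(self.states):
            if j == i and s != REGISTERED:
                self.states[j] = REGISTERING
            elif j != i and s == REGISTERING:
                self.states[j] = REGISTERED

    def scan_complete(self):
        """Finish the pass: the last portion the face left becomes
        registered; report whether every element is now registered."""
        self.states = [REGISTERED if s == REGISTERING else s for s in self.states]
        return all(s == REGISTERED for s in self.states)
```

Sweeping the orientation through every element's portion and then finishing the pass drives all elements to the registered state, at which point the first scan is complete.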
Displaying and updating the progress indicator during enrollment in this manner encourages the user to look at the display of device 1100 during enrollment to improve the ability to detect when gaze is directed at the display (and thus detect whether the user is focusing on the device). Encouraging the user to look at the display of the device 1100 enhances the operability of the device and makes the user-device interface more efficient (e.g., by ensuring that the user's gaze is directed at the display, thereby ensuring that the user's biometric features are properly registered), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
It should be understood that the examples of fig. 11D-11H are merely illustrative. In particular, registration of the user's facial features may begin at any portion of the progress meter 1110 (e.g., the meter portion 1114). Similarly, the angular views of the user's face corresponding to each meter section or progression element may be registered in any order (e.g., by rotating clockwise).
Fig. 11I shows the face registration interface 1104 after image data for all corresponding angular views of the user's face has been captured by biometric sensor 1103. In the example of fig. 11I, device 1100 has transitioned the display of all progress elements in registration progress meter 1110 to the "registered" state (e.g., during the registration process described above in connection with fig. 11B-11H). For example, device 1100 changes the color of the progress elements to green to indicate successful registration. In the example of fig. 11I, device 1100 displays a text prompt 1124 indicating that the first scan of the user's facial features is complete. In some examples, device 1100 issues an audio and/or tactile notification 1126 to provide an additional indication that the first scan is complete. In some examples, the audio and/or tactile output indicating successful registration of the user's facial features is the same as the audio and/or tactile output used to indicate successful facial authentication at device 1100. In the example of fig. 11I, device 1100 continues to display the user facial image 1106. In some examples, the user facial image 1106 remains part of the live preview of the digital viewfinder. In other examples, device 1100 displays a single (e.g., static) image of the user captured during the registration process. In the example of fig. 11I, device 1100 stops displaying the orientation guide 1108 once the scan is complete.
As shown in fig. 11J-11K, in some examples, device 1100 displays an animation that transitions the display of registration progress meter 1110 into the success status meter 1128 shown in fig. 11K. For example, device 1100 shortens each progress tick mark, as shown in fig. 11J, and merges the display of the previously discrete progress elements into a continuous circle. In the example of fig. 11K, after displaying the animation, device 1100 displays a scan completion interface 1130. The scan completion interface 1130 includes a user facial image 1132 and the success status meter 1128. In the example of fig. 11K, the user facial image 1132 is blurred, faded, darkened, or otherwise obscured to indicate that additional image data is no longer being collected as part of the face scan. In some examples, the success status meter 1128 is a solid, continuous green circle surrounding the user facial image 1132 that provides a visual indication that the first scan is complete. To provide a further visual notification, the scan completion interface 1130 also includes a text prompt 1134 (e.g., a completion message). The scan completion interface 1130 also includes a continuation affordance 1136. In some examples, while displaying the scan completion interface 1130, device 1100 detects activation (e.g., selection) of the continuation affordance 1136 (e.g., via user input 1137). In some examples in which the display is touch-sensitive, user input 1137 is a tap, swipe, or other gesture on the display surface substantially on the continuation affordance 1136. In other examples, activation of the continuation affordance 1136 is a keyboard input or activation of the affordance with a focus selector (e.g., a mouse cursor).
In some examples, a second iteration of face registration is performed after the registration process described above with respect to fig. 11B-11G is completed. As shown in fig. 11L, in response to detecting activation of continuation affordance 1136, device 1100 displays a second face registration interface 1138. In the example of fig. 11L, the second face registration interface 1138 includes a second user facial image 1140 and a second registration progress meter 1142. In some examples, the second user facial image 1140 is a representation of the field of view of biometric sensor 1103 with visual treatment similar to that of user facial image 1106 (e.g., the second user facial image 1140 is a live preview of image data captured by biometric sensor 1103, displayed as a digital viewfinder). In some examples, device 1100 displays a second orientation guide 1144 superimposed (e.g., overlaid) on the second user facial image 1140. In the example of fig. 11L, the second orientation guide 1144 has visual treatment similar to that of orientation guide 1108 (e.g., the second orientation guide 1144 includes a plurality of curves that appear to extend out of the plane of the display 1102 into a virtual z-dimension). In some examples, second registration progress meter 1142 contains a set of progress elements (e.g., 1142a, 1142b, 1142c) spaced around the second user facial image 1140. In some examples, portions of the second registration progress meter 1142 (e.g., meter portion 1146 and meter portion 1148) optionally correspond to particular orientations or portions of the user's face relative to biometric sensor 1103. In some examples, some or all of the meter portions optionally include a greater number of progress elements than the corresponding portions of registration progress meter 1110.
By way of example, each portion of the second progress meter 1142 corresponds to the same facial orientation or angular view of the user's face as the corresponding portion of progress meter 1110 (e.g., meter portion 1146 corresponds to the same facial orientation as meter portion 1114 in fig. 11B-11H). In some examples, while displaying second registration interface 1138, device 1100 sets the visual state of the progress elements in registration progress meter 1142 to the "unregistered" state described above (e.g., resets the registration progress from the first registration scan). In the example of fig. 11L, the second face registration interface also includes a text prompt 1150 that instructs the user to move (e.g., rotate and/or tilt) their head in a circular motion during the second registration process.
In some examples, when performing the second iteration of face registration, device 1100 updates the display of the second user facial image 1140, the second progress meter 1142, and the second orientation guide 1144 in response to changes in the orientation of the user's face relative to biometric sensor 1103. For example, the user repeats the same (or similar) movement of his or her face performed in the first iteration of registration, and device 1100 updates the display of these elements of second registration interface 1138 in the manner described above with respect to fig. 11B-11H (or a similar manner).
Fig. 11M shows the second face registration interface 1138 after the second iteration of registration is complete (e.g., after image data for several angular views of the user's face has been captured by biometric sensor 1103). In the example of fig. 11M, device 1100 has transitioned the display of all progress elements in second registration progress meter 1142 to the "registered" state described above. For example, the color of each progress element has changed to green to indicate successful registration. In the example of fig. 11M, device 1100 displays a text prompt 1152 indicating that the second scan of the user's facial features is complete. In some examples, device 1100 issues an audio and/or tactile notification 1154 to provide an additional indication that the second scan is complete. In some examples, the audio and/or tactile notification 1154 is the same as the audio and/or tactile notification 1126 issued to indicate completion of the first scan. In some examples, the audio and/or tactile output indicating a successful second scan of the user's facial features is the same as the audio and/or tactile output used to indicate successful facial authentication at the device. In the example of fig. 11M, device 1100 continues to display the second user facial image 1140. In some examples, the second user facial image 1140 is part of a live preview of the digital viewfinder. In other examples, device 1100 displays a single (e.g., static) image of the user captured during the registration process. In the example of fig. 11M, device 1100 stops displaying the second orientation guide 1144 once the scan is complete.
In the example of fig. 11N, after issuing the notification indicating completion of the second scan, the device 1100 displays a second scan completion interface 1156. The second scan completion interface 1156 includes a user face image 1158 and a second success status meter 1160. In the example of fig. 11N, the user face image 1158 is blurred, faded, darkened, or otherwise obscured to indicate that additional image data is no longer collected as part of the second face scan. In some examples, the second success status meter 1160 is a solid continuous green circle around the user face image 1158 that provides a visual indication that the second scan is complete (e.g., similar to the success status meter 1128). To provide further visual notification, the second scan completion interface 1156 also includes a text prompt 1162 (e.g., a second scan completion message). The second scan completion interface 1156 also includes a continuation affordance 1164. In some examples, while displaying the second scan completion interface 1156, the device 1100 detects activation (e.g., selection) of the continuation affordance 1164 (e.g., via the user input 1165). In some examples where the display 1102 is touch sensitive, the activation is a tap, swipe, or other gesture on the display surface substantially on the continuation affordance 1164. In other examples, the activation of the continuation affordance 1164 is a keyboard input or an activation of the affordance with a focus selector (e.g., a mouse cursor).
In the example of FIG. 11O, in response to detecting activation of the continuation affordance 1164, the device 1100 displays a registration completion interface 1166. As shown in fig. 11O, the registration completion interface 1166 includes a biometric authentication flag symbol 1168. For example, the biometric authentication landmark is optionally a line drawing of all or part of the face (e.g., a stylized facial graphic). In the example of fig. 11O, registration completion interface 1166 also includes text prompt 1170 that indicates that the registration process is complete and that facial authentication at the device is set and/or enabled. In some examples, the registration completion interface 1166 also includes a completion affordance 1172, activation of which causes the device 1100 to exit facial authentication settings. In some examples, registration completion interface 1166 does not include face image 1158.
Figs. 12A-12B are flow diagrams illustrating methods for registering biometric features of a user on an electronic device, according to some examples. The method 1200 is performed at a device (e.g., 100, 300, 500, 1100) having a display, one or more input devices (e.g., a touchscreen, a microphone, a camera), and a wireless communication radio (e.g., a Bluetooth connection, a WiFi connection, a mobile broadband connection such as a 4G LTE connection). In some examples, the display is a touch-sensitive display. In some examples, the display is not a touch-sensitive display. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the device includes one or more biometric sensors, which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the device further comprises a light emitting device, such as an IR floodlight, a structured light projector, or a combination thereof. The light emitting device is optionally used to illuminate the biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors. Some operations in method 1200 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 1200 provides an intuitive way for registering biometric features of a user on an electronic device. The method reduces the cognitive burden on the user when registering biometric features on the device, thereby creating a more efficient human-machine interface. For battery-driven computing devices, enabling users to more quickly and efficiently enroll biometric features conserves power and increases the interval between battery charges.
The device displays (1202) a biometric enrollment interface (e.g., 1104) on the display. Displaying the biometric enrollment interface includes displaying (1204) a representation (e.g., 1106) of the biometric characteristic. For example, the representation of the biometric feature is optionally a representation of a face, fingerprint, iris, handprint, or other physical biometric feature that may be used to distinguish one person from another in the field of view of one or more cameras of the device (e.g., a representation of the head of a user of the device). The representation of the biometric feature has an orientation determined based on an alignment of the biometric feature with one or more biometric sensors (e.g., 1103) of the device (based on camera data including a user's head located in a field of view of one or more of the cameras).
In some examples, the device displays (1206) a digital viewfinder that includes a representation of the field of view of the one or more cameras (e.g., 1103) (e.g., a live preview of image data that includes 1106). For example, in some examples, the device displays a real-time preview of image data captured by the one or more cameras. In some examples, the representation of the field of view of the one or more cameras has background content removed. The background is optionally determined based on depth information captured by the one or more cameras (e.g., removing background content optionally includes removing any background or just removing halos). In some examples, the device does not perform any background removal.
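The depth-based background removal described above can be sketched as follows. This is a minimal illustration only: the per-pixel depth map, the fixed depth threshold, and the use of `None` for removed pixels are assumptions for exposition, not the device's actual implementation.

```python
# Hypothetical sketch of depth-based background removal for a live preview.
# A pixel is kept when its depth (in meters, an assumed unit) is within a
# threshold; more distant pixels are treated as background and removed.
def remove_background(pixels, depth_map, max_depth=1.0):
    """Keep pixels closer than max_depth; mark the rest as removed (None)."""
    out = []
    for row_px, row_d in zip(pixels, depth_map):
        out.append([px if d <= max_depth else None  # None = removed background
                    for px, d in zip(row_px, row_d)])
    return out

frame = [[10, 20], [30, 40]]           # toy 2x2 grayscale frame
depth = [[0.5, 2.0], [0.8, 3.0]]       # matching per-pixel depths
print(remove_background(frame, depth))  # [[10, None], [30, None]]
```

In practice the depth information would come from the device's depth-capable cameras, but the thresholding idea is the same.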
Displaying the biometric registration interface further includes concurrently displaying (1208) a progress indicator (e.g., 1110) including a first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) at a first location on the display relative to the representation of the biometric feature (e.g., 1106) (e.g., a first set of objects spaced around the representation of the biometric feature, such as a first set of tick marks (e.g., 1110a, 1110b, and 1110c, or 1116a and 1116b, or 1118a and 1118b)) and a second progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) at a second location on the display relative to the representation of the biometric feature (e.g., 1106) (e.g., a second set of objects spaced around the representation of the biometric feature, such as a second set of tick marks (e.g., 1110a, 1110b, and 1110c, or 1116a and 1116b, or 1118a and 1118b)). A representation (e.g., 1106) of the biometric feature is displayed on the display between the first location and the second location. Displaying the progress indicator during enrollment in this manner encourages the user to look at the display of the electronic device during enrollment to improve the ability to detect when gaze is directed at the display (and thus detect whether the user is focusing on the device). Encouraging the user to look at the display of the electronic device enhances the operability of the device and makes the user-device interface more efficient (e.g., by ensuring that the user's gaze is directed at the display, thereby ensuring that the user's biometric features are properly registered), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the progress indicator includes (1210) a plurality of progress indicator portions (e.g., 1114, 1116, 1118, 1120, 1122), each progress indicator portion including one or more progress elements (e.g., 1110a, 1110b, 1110c, 1116a, 1116b, 1118a, 1118b). In some examples, the plurality of progress indicator portions includes a first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) and a second progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122), and the plurality of progress indicator portions surround at least a portion of the representation of the biometric feature (e.g., 1106). In some examples, one or more progress indicator portions of the plurality of progress indicator portions include (1212) a plurality of respective progress elements (e.g., 1110a, 1110b, 1110c, 1118a, 1118b). In some examples, the progress indicator optionally includes a set of one or more display elements (e.g., 1110a, 1110b, 1110c, 1116a, 1116b, 1118a, 1118b) arranged around the representation (e.g., 1106) of the biometric feature. For example, these display elements are optionally a circle of radially extending lines ("tick marks") around the user's face that indicate the progress of registration. The lines optionally indicate a direction in which a corresponding change in orientation of the biometric feature is sufficient to register the biometric feature (e.g., pointing up to move the line up, even if the bottom of the biometric feature is being scanned). In some examples, a first set of lines corresponds to the first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) and a second set of lines corresponds to the second progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122). For example, a predetermined number of tick marks (e.g., 8) is associated with each portion of the progress indicator.
In some examples, when a biometric feature (e.g., a user's face) turns toward a first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122), the first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) indicates (1214) an enrollment state of a first portion of the biometric feature detected (e.g., visible) by the one or more biometric sensors (e.g., 1103). For example, the appearance of the upper right portion (e.g., 1120) of the progress indicator changes when the user's face is turned toward the upper right portion of the device to register the lower left portion of the user's face. Also, in some examples, when a biometric feature (e.g., a user's face) turns toward a first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122), a second progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) indicates a registration status of a second portion of the biometric feature different from the first portion of the biometric feature detected (e.g., visible) by the one or more biometric sensors (e.g., 1103). For example, the appearance of the lower right portion (e.g., 1118) of the progress indicator changes when the user's face is turned toward the lower left portion of the device to register the upper right portion of the user's face.
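The relationship described above, in which turning the face toward one progress indicator portion exposes (and thus enrolls) the diametrically opposite portion of the face, can be sketched as follows. The portion names and the eight-way division are illustrative assumptions; the specification does not fix a particular number of portions.

```python
# Hypothetical sketch: turning the face toward a given progress indicator
# portion enrolls the opposite portion of the face (e.g., turning toward
# the upper-right portion enrolls the lower-left portion of the face).
PORTIONS = ["right", "upper-right", "top", "upper-left",
            "left", "lower-left", "bottom", "lower-right"]

def enrolled_portion(facing):
    """Return the face portion captured when the face turns toward `facing`."""
    i = PORTIONS.index(facing)
    # The opposite portion is half the circle away.
    return PORTIONS[(i + len(PORTIONS) // 2) % len(PORTIONS)]

print(enrolled_portion("upper-right"))  # lower-left
print(enrolled_portion("top"))          # bottom
```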
In some examples, displaying (1202) the biometric enrollment interface (e.g., 1104) further includes displaying a prompt (e.g., 1112) to move the biometric feature. In some examples, the displayed cue is optionally accompanied by a tactile and/or audible cue. In some examples, the type of response is provided based on a setting of the electronic device and/or is manually controlled by a user. Providing a prompt with instructions on how to properly move the biometric feature provides feedback to the user that allows them to quickly identify and perform the required movement, reducing the amount of time required to complete the enrollment process. Thus, providing improved visual cues as to the appropriate inputs required for biometric enrollment enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide the appropriate inputs and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the device displays a prompt (e.g., 1108) indicating the direction of the movement. In some examples, the prompt is an orientation guide (e.g., 1108) overlaid on the biometric enrollment interface (e.g., 1104). In some examples, the prompt is overlaid on the representation (e.g., 1106) of the biometric feature. In some examples, the device overlays a three-dimensional object (e.g., 1108) on the representation of the biometric feature (e.g., 1106). For example, the three-dimensional object is optionally an arc that extends into the virtual z-dimension and moves as the user's head rotates. In some examples, the three-dimensional object (e.g., 1108) includes a plurality of arcs extending into the virtual z-dimension (e.g., two arcs that intersect each other at a point in front of the user's face). In some examples, the three-dimensional object (e.g., 1108) is emphasized while the biometric feature is moving (e.g., the object darkens or displays a fading trajectory as it moves with the movement of the biometric feature) and is de-emphasized relative to the representation of the biometric feature when the biometric feature is not moving.
While simultaneously displaying the representation (e.g., 1106) of the biometric feature and the progress indicator (e.g., 1110), the device detects (1216) a change in orientation of the biometric feature relative to the one or more biometric sensors (e.g., 1103).
In some examples, in response to detecting a change in orientation of the biometric feature relative to the one or more biometric sensors (1218), the device rotates the prompt (e.g., 1108) in accordance with the change in orientation of the biometric feature relative to the one or more biometric sensors (e.g., 1103). In some examples, the rotation prompt includes rotating the three-dimensional object (e.g., 1108) at least partially into a virtual z-dimension of the display. Rotating the orientation overlaid on the representation of the biometric feature provides the user with feedback regarding the orientation of his or her biometric feature in three-dimensional space relative to the biometric sensor of the device, enabling the user to more quickly place his or her biometric feature during the enrollment process to move the biometric feature through a desired range of orientations. Thus, providing the user with improved visual feedback regarding the orientation of the biometric feature enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
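The rotation of the orientation guide with the detected change in face orientation can be sketched minimally as below. The representation of orientation as a (yaw, pitch) pair and the simple accumulation of deltas are assumptions for illustration; the device's actual tracking is not specified here.

```python
# Hypothetical sketch: the orientation guide's rotation tracks the detected
# change in face orientation, so the guide appears to rotate into the
# virtual z-dimension as the user's head turns.
def rotate_guide(guide_angles, face_delta):
    """Apply a detected (yaw, pitch) change of the face to the guide."""
    yaw, pitch = guide_angles
    d_yaw, d_pitch = face_delta
    return (yaw + d_yaw, pitch + d_pitch)

# Face turns 10 degrees right and 5 degrees down; the guide follows.
print(rotate_guide((0.0, 0.0), (10.0, -5.0)))  # (10.0, -5.0)
```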
In some examples, in response to detecting a change in orientation of the biometric feature relative to the one or more biometric sensors, the device updates (1220) the representation (e.g., 1106) of the biometric feature according to the change in orientation of the biometric feature relative to the one or more biometric sensors (e.g., 1103). For example, in some examples, the orientation of the representation of the biometric feature (e.g., 1106) is changed regardless of whether the registration criteria are satisfied. In some examples, the orientation of the representation of the biometric feature (e.g., 1106) is changed only if the registration criteria are satisfied. Updating the orientation of the displayed representation of the biometric feature provides the user with feedback regarding the orientation of his or her biometric feature relative to the biometric sensor of the device, enabling the user to more quickly move the biometric feature through a desired range of orientations during the enrollment process. Thus, providing the user with improved visual feedback regarding the orientation of the biometric feature enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In accordance with a determination that the change in orientation of the biometric feature satisfies registration criteria for a first portion of the biometric feature corresponding to a first progress indicator portion (e.g., 1114, 1116, 1118), the device updates (1222) one or more visual features of the first progress indicator portion. For example, determining a change in orientation of a biometric feature that satisfies enrollment criteria is optionally based on determining that the image data includes data corresponding to a user's face from a first angular view of a first perspective angle (e.g., a bottom perspective of the face, such as when the user's face is tilted upward). Updating the visual state of a portion of the progress meter corresponding to the current orientation of the biometric feature allows the user to identify that a portion of the biometric feature is properly oriented for enrollment. This in turn indicates to the user how to change the orientation of the biometric feature to register other portions corresponding to other respective portions of the progress meter, reducing the amount of time required to complete the registration process. Thus, providing the user with improved visual feedback regarding the registration status of the biometric feature enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the registration criteria for the first portion of the biometric feature corresponding to the first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) includes a requirement that the first portion of the biometric feature be oriented in a predetermined manner relative to the one or more biometric sensors (e.g., 1103) (e.g., the user's face is looking toward the first progress indicator portion).
In some examples, the registration criteria for the first portion of the biometric characteristic corresponding to the first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) includes a requirement that the first portion of the biometric characteristic has not been registered.
In some examples, the registration criteria for the first portion of the biometric feature corresponding to the first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) includes a requirement that the first portion of the biometric feature change (e.g., rotate) in orientation relative to the one or more biometric sensors (e.g., 1103) by at least a threshold amount (1224). In some examples, registration of the first portion of the biometric feature requires that the biometric feature move (rotate) sufficiently so that the first portion can be properly captured by the one or more biometric sensors (e.g., 1103).
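The enrollment criteria above (the portion is oriented appropriately, has not already been enrolled, and the orientation has changed by at least a threshold amount) can be combined into one check, sketched below. The 15-degree threshold is an assumed illustrative value; the specification only requires "at least a threshold amount."

```python
# Hypothetical sketch of the per-portion enrollment criteria: a portion
# enrolls only if it is not already enrolled and the face has rotated
# toward it by at least a threshold amount (threshold value assumed).
def meets_enrollment_criteria(portion, rotation_toward_portion_deg,
                              enrolled, threshold_deg=15.0):
    return (portion not in enrolled
            and rotation_toward_portion_deg >= threshold_deg)

enrolled = set()
print(meets_enrollment_criteria("upper-left", 20.0, enrolled))  # True
enrolled.add("upper-left")
print(meets_enrollment_criteria("upper-left", 20.0, enrolled))  # False: already enrolled
print(meets_enrollment_criteria("top", 5.0, enrolled))          # False: below threshold
```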
In some examples, updating the one or more visual features of the first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) includes updating the one or more visual features (e.g., color) of the first progress indicator portion in a first manner based on the registration status of the first portion of biometric features and updating the one or more visual features (e.g., size or length of progress element) of the first progress indicator portion in a second manner based on the alignment of the biometric features with the one or more biometric sensors (e.g., 1103) of the device. For example, when a portion of the biometric feature corresponding to the first progress indicator portion has been registered, the first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) changes from black to green, and when a portion of the biometric feature corresponding to the first progress indicator portion is facing the one or more biometric sensors (e.g., 1103), the corresponding line or lines (e.g., 1110a, 1110b, 1110c, 1116a, 1116b, 1118a, 1118b) in the first progress indicator portion are stretched. In some examples, updating the one or more visual features of the first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) in the second manner is based on a direction of change in the biometric feature relative to an orientation of the one or more biometric sensors (e.g., 1103). In some examples, the updating of the second manner is additionally or alternatively performed based on a rate of change of the biometric feature relative to an orientation of the one or more biometric sensors. Changing a portion of the progress meter corresponding to the current orientation of the biometric feature from the first visual state to the second visual state allows the user to quickly identify that a portion of the biometric feature is correctly oriented for enrollment. 
This may in turn indicate to the user how to change the orientation of the biometric feature to register other portions corresponding to other respective portions of the progress meter, which reduces the amount of time required to complete the registration process. Thus, providing the user with improved visual feedback regarding the registration status of the biometric feature enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
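The two independent visual updates described above, with color driven by the enrollment state and element length driven by how directly the face is oriented toward the portion, can be sketched as below. The specific colors and the length formula are assumptions; the specification only gives black-to-green as an example and says the lines stretch.

```python
# Hypothetical sketch: a portion's color reflects its enrollment state,
# while its tick length reflects alignment of the face with the portion
# (alignment normalized to [0, 1]; the length formula is assumed).
def portion_appearance(enrolled, alignment):
    color = "green" if enrolled else "black"
    length = 1.0 + alignment  # ticks stretch as the face turns toward them
    return color, length

print(portion_appearance(False, 1.0))  # ('black', 2.0): faced but not enrolled
print(portion_appearance(True, 0.0))   # ('green', 1.0): enrolled, face elsewhere
```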
In some examples, the device updates the one or more visual features of the first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) based on a rate of change of the biometric feature relative to the orientation of the one or more biometric sensors (e.g., 1103). In some examples, updating the one or more visual features in this manner includes modifying a color of the first progress indicator portion based on a rate of change of the orientation of the biometric feature.
In some examples, the first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) includes a plurality of display elements (e.g., 1110a, 1110b, 1110c, 1114a, 1116b, 1118a, 1118b) in a respective order. In accordance with a determination that the change in orientation of the biometric feature relative to the one or more biometric sensors (e.g., 1103) is a change in a first direction (e.g., clockwise rotation), the device optionally changes the appearance of the display elements beginning from a first end of the respective order (e.g., beginning at 1118a). For example, the device optionally begins to elongate the lines in the first progress indicator portion, starting from a first side of the respective order, moving to a second side of the respective order (to 1114a). In accordance with a determination that the change in orientation of the biometric feature relative to the one or more biometric sensors is a change in a second direction (e.g., a counterclockwise rotation), the device optionally changes the appearance of the display elements (e.g., 1110a, 1110b, 1110c, 1114a, 1116b, 1118a, 1118b) starting from a second end of the respective order (e.g., starting from 1114a) that is different from the first end of the respective order. For example, the device optionally elongates the lines in the first progress indicator portion, starting from the second side of the respective order, moving to the first side of the respective order (to 1118a). In some examples, a similar approach is taken when changing the appearance of the second progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) or other progress indicator portions.
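The direction-dependent update order above can be sketched as below: for a clockwise rotation the elements change appearance starting from one end of their order, and for a counterclockwise rotation from the other end. The element names reuse the reference numerals (1118a through 1114a) purely for illustration.

```python
# Hypothetical sketch: the order in which a portion's display elements
# change appearance depends on the direction of the face's rotation.
def update_order(elements, direction):
    """Return elements in the order their appearance should change."""
    if direction == "clockwise":
        return list(elements)            # start from the first end
    return list(reversed(elements))      # counterclockwise: other end first

ticks = ["1118a", "1118b", "1116a", "1116b", "1114a"]
print(update_order(ticks, "clockwise")[0])         # 1118a
print(update_order(ticks, "counterclockwise")[0])  # 1114a
```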
In some examples, the device updates the one or more visual features of the first progress indicator (e.g., 1114, 1116, 1118, 1120, or 1122) from a first state (e.g., "unregistered") to a second state (e.g., "registering") indicating that the first progress indicator partially satisfies the registration criteria. For example, the device enlarges, grows, or changes color of a display element in a portion of the progress indicator (e.g., 1114) toward which the biometric feature is currently oriented, such as a portion of the progress indicator (e.g., orientation of 1106 in fig. 11B) to which the user's face is pointing.
In some examples, after updating the one or more visual features of the first progress indicator portion, the device optionally detects a change in orientation of the biometric feature relative to the one or more biometric sensors such that the biometric feature no longer satisfies the registration criteria for the first portion of the biometric feature corresponding to the first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122). In response to detecting the change in the orientation of the biometric feature relative to the one or more biometric sensors, the device optionally updates (1226) the one or more visual features of the first progress indicator portion from the second state (e.g., "being enrolled") to a third state (e.g., "enrolled") indicating that the first portion of the biometric feature has been enrolled but no longer meets the enrollment criteria. For example, when the user's face turns away from the progress indicator portion, the device optionally changes its appearance (e.g., color or size) again, and when the user orients the biometric feature away from that portion of the progress indicator (e.g., the orientation of 1106 in fig. 11D), optionally transitions the first portion of the progress indicator (e.g., 1114a) from a "tilted toward it" appearance to a registered appearance. One visual attribute (e.g., color) of the progress indicator optionally indicates the registration status (e.g., blue indicates "tilted toward it," green indicates "registered," gray indicates "unregistered"), while another visual attribute (e.g., line length) of the progress indicator indicates the direction of the orientation of the biometric feature. The progress is optionally advanced around a progress indicator (e.g., 1110) based on the direction and speed of the change in tilt.
For example, the progress indicator line (e.g., 1110a, 1110b, 1110c, 1114a, 1116b, 1118a, 1118b) optionally bulges based on the direction and speed of movement of the biometric feature and/or changes color based on the direction and speed of movement of the biometric feature. Changing a portion of the progress meter corresponding to the current orientation of the biometric feature from the second visual state to the third visual state allows the user to quickly recognize that a portion of the biometric feature has been successfully registered. This also indicates to the user that they no longer need to move the biometric feature into that orientation during the enrollment process, thus directing the user's attention to enroll other portions of the biometric feature, reducing the amount of time required to complete the enrollment process. Thus, providing the user with improved visual feedback regarding the registration status of the biometric feature enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
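The per-portion transitions described above, from "unenrolled" to "being enrolled" when the face tilts toward the portion, then to "enrolled" when the face moves away after capture, can be sketched as a small state machine. The state names paraphrase the specification, and the transition logic is an illustrative assumption.

```python
# Hypothetical state machine for one progress indicator portion:
# "unenrolled" -> "enrolling" (face tilts toward it) -> "enrolled"
# (face moves away after the portion has been captured).
def next_state(state, face_oriented_toward_portion):
    if state == "unenrolled" and face_oriented_toward_portion:
        return "enrolling"  # capture begins for this portion
    if state == "enrolling" and not face_oriented_toward_portion:
        return "enrolled"   # capture done; portion stays enrolled
    return state            # otherwise, no change

s = "unenrolled"
s = next_state(s, True)   # face turns toward the portion
s = next_state(s, False)  # face turns away again
print(s)  # enrolled
```

Note that "enrolled" is absorbing: re-orienting toward an already-enrolled portion does not restart its enrollment, matching the "has not been registered" requirement above.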
In accordance with a determination that the change in orientation of the biometric feature satisfies the enrollment criteria for the second portion of the biometric feature corresponding to the second progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122), the device updates (1228) one or more visual features of the second progress indicator portion. For example, determining that the change in orientation of the biometric feature that satisfies the enrollment criteria is optionally based on determining that the image data includes data from the user's face that corresponds to the user's face from a second different perspective view of a second different perspective angle (e.g., a left perspective of the face, such as when the user's face is tilted to the right). In some examples, updating the visual features of the second progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) optionally includes some or all of the steps described above in connection with updating the visual features of the first progress indicator portion. Updating the visual state of the second portion of the progress meter corresponding to the current orientation of the biometric feature allows the user to identify that the second portion of the biometric feature is properly oriented for enrollment. This in turn indicates to the user how to change the orientation of the biometric feature to register other portions corresponding to other respective portions of the progress meter, reducing the amount of time required to complete the registration process. 
Thus, providing the user with improved visual feedback regarding the registration status of the biometric feature enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, in accordance with a determination that the enrollment completion criteria are met (e.g., all portions of the biometric feature have been enrolled, 1110 in fig. 11E), the device outputs an indication (e.g., 1124, 1126) that enrollment of the biometric feature is complete.
For example, the device optionally updates one or more visual features of the progress indicator (e.g., 1110) (e.g., merges multiple progress indicator display elements (e.g., 1110a, 1110b, 1110c, 1114a, 1116b, 1118a, 1118b) into a continuous shape such as a circle). In some examples, the first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) and the second progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) are visually discrete prior to detecting a change in orientation of the biometric feature relative to the one or more biometric sensors. In this case, updating the one or more visual features of the progress indicator includes visually merging the first progress indicator portion and the second progress indicator portion.
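The completion check above, where enrollment finishes once every progress indicator portion has reached the enrolled state and the discrete portions merge into one continuous shape, can be sketched as follows. The dictionary keys reuse the portion reference numerals for illustration only.

```python
# Hypothetical sketch of the enrollment completion criteria: enrollment is
# complete when every progress indicator portion is in the "enrolled" state.
def enrollment_complete(portion_states):
    return all(state == "enrolled" for state in portion_states.values())

states = {"1114": "enrolled", "1116": "enrolled", "1118": "enrolled",
          "1120": "enrolled", "1122": "enrolled"}
print(enrollment_complete(states))  # True: portions may merge into a circle

states["1116"] = "enrolling"
print(enrollment_complete(states))  # False: keep scanning
```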
In some examples, the device modifies the representation of the biometric characteristic (e.g., 1106, 1132). In some examples, the representation of the biometric feature is blurred, faded, darkened, and/or otherwise obscured to indicate that additional information about the biometric feature is no longer being collected as part of the enrollment process.
In some examples, the device displays a confirmation affordance (e.g., 1136, 1164) and selection of the confirmation affordance causes the electronic device to display a completion interface (e.g., 1166). In some examples, the device displays a simulation of the representation of the biometric characteristic (e.g., 1168). In some examples, the simulation of the representation of the biometric characteristic is two-dimensional. In some examples, the simulation of the representation of the biometric feature is three-dimensional.
In some examples, the device outputs an indication (e.g., 1126, 1154, 1122, 1162, 1170) that the registration procedure is complete (e.g., a haptic output). In some examples, the device outputs a tactile output (e.g., 1126, 1154) indicating successful enrollment of the biometric feature. In some examples, the tactile output indicating successful registration of the biometric feature is the same as the tactile output used to indicate successful authentication of the biometric feature.
In some examples, after outputting the indication that the enrollment of the biometric feature is complete, the device displays a second biometric enrollment interface (e.g., 1138). In some examples, a second iteration of the registration is performed after the first registration is completed. This second iteration of registration is optionally performed in response to selection of an affordance (e.g., 1136). Performing a second scan of the biometric characteristic of the user allows the device to capture additional biometric data corresponding to different orientations or positions of the biometric characteristic that may not have been recorded during the first iteration of enrollment. Thus, performing the second scan of the user's biometric features allows for more efficient and/or secure biometric authentication at the device, enhances the operability of the device, and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In a second biometric enrollment interface, the device displays a second representation of the biometric characteristic (1140). The second representation of the biometric feature optionally has an orientation determined based on alignment of the biometric feature with one or more biometric sensors (e.g., 1103) of the device. In some examples, the second representation of the biometric feature is a representation of a field of view of the one or more cameras with similar visual processing as the first representation of the biometric feature (e.g., 1106).
In a second biometric enrollment interface, the device simultaneously displays a second progress indicator (e.g., 1142) including a third progress indicator portion (e.g., a first set of objects such as 1146 spaced around the representation of the biometric feature) at a first location on the display relative to a second representation (e.g., 1140) of the biometric feature and a fourth progress indicator portion (e.g., a second set of objects such as 1148 spaced around the representation of the biometric feature) at a second location on the display relative to the second representation (e.g., 1140) of the biometric feature. The second representation of the biometric feature is displayed on the display between the first location and the second location. In some examples, the third progress indicator portion corresponds to the same portion of the biometric feature that corresponds to the first progress indicator portion (e.g., 1114). In some examples, the fourth progress indicator portion corresponds to the same portion of the biometric feature that corresponds to the second progress indicator portion (e.g., 1118).
In some examples, the registration state of the first progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122) does not correspond to the registration state of the third progress indicator portion (e.g., 1146 or 1148).
In some examples, the first progress indicator portion of the progress indicator includes a first number of progress elements (e.g., 1114a) and the third progress indicator portion of the second progress indicator includes a second number of progress elements (e.g., 1142a, 1142b, 1142c) that is different from (e.g., greater than) the first number. In some examples, multiple (or all) of the progress indicator portions (e.g., 1146, 1148) in the second progress indicator (e.g., 1142) include more progress elements than corresponding progress indicator portions (e.g., 1114, 1118) in the first progress indicator (e.g., 1110).
In some examples, while simultaneously displaying the second representation (e.g., 1140) of the biometric feature and the second progress indicator (e.g., 1142), the device detects a second change in orientation of the biometric feature relative to the one or more biometric sensors (e.g., 1103). In response to detecting the second change in orientation of the biometric feature relative to the one or more biometric sensors, and in accordance with a determination that the change in orientation of the biometric feature satisfies the enrollment criteria for the first portion of the biometric feature, the device updates one or more visual features of the third progress indicator portion (e.g., 1146 or 1148). In accordance with a determination that the change in orientation of the biometric feature satisfies the enrollment criteria for the second portion of the biometric feature, the device updates one or more visual features of the fourth progress indicator portion (e.g., 1146 or 1148). For example, to proceed through the second enrollment step of the enrollment process, the user repeats the changes in orientation of the biometric feature that were used to enroll the biometric feature in the first enrollment step of the enrollment process. Accordingly, updating the visual features of the third progress indicator portion and the fourth progress indicator portion optionally includes steps similar to those described above in connection with the first progress indicator portion and the second progress indicator portion (e.g., 1114, 1116, 1118, 1120, or 1122).
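The per-portion update logic described above might be sketched as follows. The yaw/pitch angles, portion names, and threshold are illustrative assumptions, not values from the patent: a detected orientation change is routed to whichever progress-indicator portion's enrollment criteria it satisfies.

```python
def portion_for_orientation(yaw: float, pitch: float, threshold: float = 15.0):
    """Return the progress-indicator portion (if any) whose enrollment
    criteria the current face orientation satisfies (angles in degrees)."""
    if pitch >= threshold:
        return "top"
    if pitch <= -threshold:
        return "bottom"
    if yaw >= threshold:
        return "right"
    if yaw <= -threshold:
        return "left"
    return None  # face roughly neutral: no portion is being enrolled

def update_progress(state: dict, yaw: float, pitch: float) -> dict:
    # Update the visual state of only the portion the orientation matches.
    portion = portion_for_orientation(yaw, pitch)
    if portion is not None:
        state[portion] = "enrolled"
    return state

state = {p: "unenrolled" for p in ("top", "bottom", "left", "right")}
update_progress(state, yaw=0.0, pitch=25.0)   # user tilts head up
update_progress(state, yaw=-30.0, pitch=0.0)  # user turns head left
print(state["top"], state["left"], state["right"])  # -> enrolled enrolled unenrolled
```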
In some examples, after detecting the second change in orientation of the biometric feature relative to the one or more biometric sensors (e.g., 1103), and in accordance with a determination that the second set of enrollment completion criteria is met (e.g., all portions of the biometric feature have been enrolled), the device outputs a second indication (e.g., 1162, 1163) that enrollment of the biometric feature is complete. In some examples, registration does not actually occur; instead, the process is visually simulated. In some examples, the second indication is a visual, audible, and/or tactile output (e.g., 1163) indicating that the enrollment of the biometric characteristic is complete. In some examples, the second indication is the same as the indication (e.g., 1126) provided in accordance with the determination that the first set of registration completion criteria are satisfied.
Note that the details of the processes described above with respect to method 1200 (e.g., fig. 12A-12B) may also be applied in a similar manner to the methods described herein. For example, method 1200 optionally includes one or more of the features of the various methods described herein with reference to methods 800, 1000, 1400, 1600, 1800, 2000, 2200, 2500, and 2700. For example, the face registration confirmation interface described in method 1000 may be applied with respect to the face registration user interface (e.g., 1104). As another example, a reminder as described in method 1400 can be applied with respect to the face registration user interface (e.g., 1104). As another example, a transformation as described in method 800 may be applied with respect to the registration progress meter (e.g., 1110). For the sake of brevity, these details are not repeated here.
The operations in the above-described information processing method are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general-purpose processor (e.g., as described with respect to fig. 1A, 3, 5A) or an application-specific chip. Further, the operations described above with reference to fig. 12A-12B are optionally implemented by the components depicted in fig. 1A-1B. For example, display operation 1202, display operation 1208, detect operation 1216, update operation 1222, and update operation 1224 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604 and event dispatcher module 174 delivers the event information to application 136-1. Respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as a selection of an object on the user interface. When a respective predefined event or sub-event is detected, the event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update the content displayed by the application. Similarly, one of ordinary skill in the art will clearly know how other processes may be implemented based on the components depicted in fig. 1A-1B.
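The dispatch path described above (event sorter delivers event information, a recognizer compares it against event definitions, and a matching recognizer activates its handler) can be sketched loosely as below. These Python classes are purely illustrative stand-ins; they are not the actual UIKit event sorter 170 / event recognizer 180 / event handler 190 API.

```python
class EventRecognizer:
    """Compares event information to a predefined event definition and,
    on a match, activates the associated handler."""
    def __init__(self, definition, handler):
        self.definition = definition  # predefined event/sub-event, e.g. "tap"
        self.handler = handler        # callable invoked on recognition

    def recognize(self, event) -> bool:
        if event["type"] == self.definition:
            self.handler(event)  # activate the handler for this event
            return True
        return False

class EventSorter:
    """Detects events on the touch-sensitive surface and delivers the
    event information to the application's recognizers."""
    def __init__(self):
        self.recognizers = []

    def dispatch(self, event) -> bool:
        return any(r.recognize(event) for r in self.recognizers)

log = []
sorter = EventSorter()
sorter.recognizers.append(EventRecognizer("tap", lambda e: log.append(e["target"])))
sorter.dispatch({"type": "tap", "target": "affordance-1382"})
print(log)  # -> ['affordance-1382']
```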
Fig. 13A-13R illustrate example user interfaces for registering biometric features on an electronic device (e.g., device 100, device 300, device 500, device 700, device 900, or device 1100) according to some examples. The user interfaces in these figures are used to illustrate the processes described below, including the process in FIG. 14.
Fig. 13A shows an electronic device 1300 (e.g., portable multifunction device 100, device 300, device 500, device 700, device 900, or device 1100). In the illustrative example shown in fig. 13A-13R, the electronic device 1300 is a smartphone. In other examples, the electronic device 1300 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 1300 has a display 1302, one or more input devices (e.g., a touch screen of display 1302, buttons, a microphone), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the electronic device includes one or more biometric sensors (e.g., biometric sensor 1303), which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the one or more biometric sensors 1303 are the one or more biometric sensors 703. In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
As shown in fig. 13A, device 1300 displays face registration user interface 1304 on display 1302. In some examples, the face registration user interface 1304 is displayed after the device 1300 detects successful alignment of the user's face with respect to its one or more cameras, as described above in connection with fig. 9A-9Y. In some examples, face registration interface 1304 has similar visual features to face registration interface 1104 described above in connection with fig. 11A. The face registration interface 1304 includes a user facial image 1306. In some examples, the user facial image 1306 is an image of the user captured by one or more cameras (e.g., biometric sensor 1303) on the device 1300. For example, the user facial image 1306 is optionally a live preview of image data captured by the one or more cameras (e.g., a digital viewfinder), which is continuously updated as the field of view and/or field of view content of the cameras changes. In some examples, the background content is removed such that only the user's face is visible in the facial image 1306. The face registration interface 1304 also includes orientation guide 1308 that is superimposed (e.g., overlaid) on the user facial image 1306. As described above in connection with fig. 7I-7K, the orientation guide 1308 is optionally a set of curves (e.g., crosshairs) that extend into the virtual z-dimension (e.g., along an axis perpendicular to the plane of the display) and intersect over the center of the user facial image 1306. Thus, the curves of the orientation guide 1308 optionally appear to bow outward relative to the plane of the display 1302 to give the user a sense of the position of their head in three-dimensional space.
Face registration user interface 1304 includes a registration progress meter 1310. Registration progress meter 1310 includes a set of display elements (e.g., progress elements 1310a, 1310b, and 1310c) arranged around user facial image 1306 and orientation guide 1308. In the example of fig. 13A, the progress elements are a set of lines extending radially outward from the user facial image 1306 and arranged in a circular pattern. In some examples, the progress elements 1310a, 1310b, 1310c, etc. indicate the orientation of the user's face required to register the corresponding facial feature. For example, as the user's head tilts upward, the progress elements in the upper portion of progress meter 1310 optionally move, fill in, elongate, and/or change color (e.g., in the manner of fig. 11B-11H), which allows the one or more cameras on device 1300 to capture image data of the underside of the user's face. In the example of fig. 13A, device 1300 displays the progress elements in registration progress meter 1310 in an unregistered state (e.g., the progress elements are grayed out).
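The radial arrangement of the progress elements described above can be sketched with simple trigonometry. This is an illustrative layout calculation only (the element count and radius are assumed values): ticks are spaced evenly on a circle around the centered face image.

```python
import math

def tick_positions(count: int, radius: float, center=(0.0, 0.0)):
    """Return (x, y, angle) for each progress element, spaced evenly
    around a circle of the given radius, centered on the face image."""
    ticks = []
    for i in range(count):
        angle = 2 * math.pi * i / count  # even angular spacing
        x = center[0] + radius * math.cos(angle)
        y = center[1] + radius * math.sin(angle)
        ticks.append((x, y, angle))
    return ticks

ticks = tick_positions(count=8, radius=100.0)
print(len(ticks))             # -> 8
print(round(ticks[2][2], 4))  # quarter turn -> 1.5708
```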
In the example of fig. 13A, face registration interface 1304 includes a text prompt 1312 that instructs the user to begin moving their face relative to the device to advance registration progress meter 1310 (e.g., register their facial features). In some examples, device 1300 displays text prompt 1312 before any portion of the user's face is registered.
While displaying the face registration interface 1304, the device 1300 detects that criteria for displaying a registration prompt (e.g., a reminder) are met. In some examples, the registration prompt criteria include a requirement that the user's face has moved less than a first threshold amount within a predetermined period of time (as determined by the biometric sensor 1303).
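The prompt criteria above amount to a trailing-window movement check. A minimal sketch, under assumed units and thresholds (the window length, threshold, and sample format are illustrative, not from the patent):

```python
def should_show_hint(samples, window: float, threshold: float) -> bool:
    """samples: list of (timestamp_seconds, movement_amount) readings from
    the biometric sensor. Returns True when the total movement observed
    inside the trailing window stays below the threshold."""
    if not samples:
        return False
    latest = samples[-1][0]
    recent = [m for (t, m) in samples if latest - t <= window]
    return sum(recent) < threshold

# Face essentially still for the last 5 seconds -> show the reminder.
still = [(0.0, 0.1), (2.0, 0.2), (5.0, 0.1)]
print(should_show_hint(still, window=5.0, threshold=1.0))   # -> True

# Face moving substantially -> no reminder needed.
moving = [(0.0, 0.1), (2.0, 3.0), (5.0, 2.5)]
print(should_show_hint(moving, window=5.0, threshold=1.0))  # -> False
```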
In some examples, in response to detecting that the registration prompt criteria are met, device 1300 displays audio reminder enablement interface 1314, as shown in fig. 13B and 13C. In the example of fig. 13B, reminder enablement interface 1314 includes a text prompt 1316 informing the user of the option to enable or disable audio reminders. Thus, in some examples, reminder enablement interface 1314 includes yes affordance 1318 and no affordance 1320. In some examples, in response to activation (e.g., selection) of the no affordance 1320, device 1300 again displays face registration interface 1304, allowing the user to proceed with registration of his or her facial features without the prompts and/or reminders described below. In the example of fig. 13C, however, device 1300 detects activation (e.g., selection) of the yes affordance 1318. In some examples, the activation is a user input (e.g., a tap or swipe gesture) at contact region 1322.
In response to detecting activation of the yes affordance 1318, the device 1300 displays reminder-enabled registration interface 1324, for example, as shown in the example of fig. 13D. In some examples, the reminder-enabled registration interface 1324 and/or the below-described prompts are displayed regardless of whether the user has enabled audio reminders (e.g., in response to detecting that the user's face has not moved sufficiently for a predetermined period of time). In the example of fig. 13D, the reminder-enabled registration interface 1324 includes a user facial image 1326 having similar or identical visual features to the user facial image 1306. For example, in some examples, the user facial image 1326 is an image of the user captured by one or more cameras (e.g., biometric sensor 1303) on the device 1300. For example, the user facial image 1326 is optionally a real-time preview of image data captured by the biometric sensor 1303 (e.g., a digital viewfinder) that is continuously updated as the field of view and/or field of view content of the camera changes. In some examples, the reminder-enabled registration interface 1324 includes a visual movement prompt 1328 that is optionally overlaid (e.g., superimposed) on the user facial image 1326. In the example of fig. 13D, visual movement prompt 1328 includes an arrow element that indicates a requested direction in which the user should move (e.g., rotate and/or tilt) his or her face in order to register a corresponding portion (e.g., an angular view) of the face. In some examples, the visual movement prompt 1328 is partially transparent such that the underlying user facial image 1326 is also visible. In the example of fig. 13D, the arrow element of visual movement prompt 1328 indicates that the user should move (e.g., rotate, tilt, or turn) their face to the right (e.g., toward the right portion of registration progress meter 1330) in order to register an angular view of the left side of the user's face.
In some examples, device 1300 displays text prompt 1332 providing written instructions to the user that match visual movement prompt 1328. In the example of fig. 13D, text prompt 1332 provides written instructions to the user for turning their head to the right (e.g., in the same direction indicated by the arrow element of visual prompt 1328). In some examples, device 1300 also issues audio output 1334 corresponding to visual movement prompt 1328 and/or text prompt 1332. For example, if screen reader functionality is enabled, audio output 1334 is a verbal description of the requested movement (e.g., an audible narration of text prompt 1332). In some examples, audio output 1334 is issued instead of or in addition to visual movement prompt 1328 and/or text prompt 1332. In some examples, device 1300 also issues tactile output 1336 (e.g., a vibration), e.g., instead of or in addition to audio output 1334. In some examples, audio output 1334 and/or tactile output 1336 are synchronized with movement (e.g., animation) of visual movement prompt 1328, as described in more detail below.
In some examples, device 1300 displays an animation of visual movement prompt 1328 to provide a further indication of the requested movement. In the example of fig. 13D-13E, device 1300 translates the arrow element of visual prompt 1328 in the requested direction of movement (e.g., to the right). In some examples, the visual prompt 1328 also includes one or more lines (e.g., arcs) that extend over a central portion of the user facial image 1326. In some examples, the lines appear to extend out of the plane of the display 1302 into a virtual z-dimension (e.g., perpendicular to the display). In the example of fig. 13D-13E, device 1300 rotates an arc in the direction of the requested movement (e.g., to the right) to provide a visual demonstration of the requested movement in three dimensions (which accompanies movement of the arrow element). In some examples, device 1300 continues to display text prompt 1332 while the animation of visual movement prompt 1328 is displayed. In some examples, while displaying the animation, device 1300 issues audio output 1334 and/or tactile output 1336 that correspond to the movement of the arrow and/or arc elements of visual prompt 1328.
In some examples, while displaying the visual movement prompt 1328 and/or the text prompt 1332, the device 1300 (e.g., again) detects that the orientation of the user's face relative to the biometric sensor 1303 has not changed for a predetermined amount of time. In response, device 1300 issues a tactile output (e.g., tactile output 1338 shown in fig. 13E). In some examples, the tactile output 1338 is an error output indicating that face registration has stalled (e.g., because the user has not moved his or her face for a predetermined amount of time).
In the example of fig. 13F, in response to detecting that the orientation of the user's face has not changed for a predetermined amount of time, device 1300 displays a second set of registration reminders that prompt the user to move his or her face in a different direction. In the example of fig. 13F, device 1300 displays the second visual movement cue 1340. The second visual movement cue 1340 has similar visual characteristics as the visual movement cue 1328, but corresponds to a second, different requested direction of movement (e.g., up instead of right) than the visual cue 1328. For example, the second visual movement cue 1340 includes a second arrow element that points in a different direction than the arrow element of the visual movement cue 1328 (e.g., top instead of right). Additionally, in some examples, second visual movement cue 1340 includes an arc element similar to the arc element of visual cue 1328, which is used to provide a visual demonstration of the requested movement in the second direction as described below with respect to fig. 13F and 13G.
In some examples, the second set of registration reminders includes a text prompt 1342 that provides written instructions to the user that match the visual movement prompt 1340. In the example of fig. 13F, the text prompt 1342 provides written instructions to the user for tilting their head upward (e.g., in the second direction indicated by the arrow element of the second visual prompt 1340). In the example of fig. 13F, the device 1300 also issues an audio output 1344 corresponding to the second visual movement prompt 1340 and/or the text prompt 1342. For example, if screen reader functionality is enabled, the audio output 1344 is a verbal description of the requested movement (e.g., an audible narration of the text prompt 1342). In some examples, device 1300 issues tactile output 1346 (e.g., a vibration), e.g., instead of or in addition to audio output 1344.
As shown in fig. 13F-13G, in some examples, device 1300 displays an animation of visual movement prompt 1340 to provide a further indication of movement in the second requested direction. In the example of fig. 13F-13G, device 1300 translates the arrow element of second visual prompt 1340 in the second requested direction of movement (e.g., upward). In the example of fig. 13F-13G, the animation also rotates the arc element of the second visual prompt 1340 in the second requested direction of movement (e.g., up into the plane of the display) to provide a visual demonstration of the requested movement in three dimensions (which accompanies the movement of the arrow element). In some examples, the device 1300 continues to display the text prompt 1342 while the animation of the visual movement prompt 1340 is displayed. In some examples, while displaying the animation, device 1300 issues audio output 1344 and/or tactile output 1346 that correspond to the movement of the arrow and/or arc elements of visual prompt 1340.
Turning now to fig. 13H, device 1300 detects a change in the orientation of the user's face relative to biometric sensor 1303 (e.g., the user is tilting or has tilted his or her face upward, i.e., in the second requested direction). In response to detecting the change in orientation, the device displays (e.g., again) the face registration interface 1304 described above with respect to fig. 13A. In the example of fig. 13H, device 1300 has updated user facial image 1306 (e.g., displayed its movement) to reflect the change in orientation of the user's face. In some examples, the orientation guide 1308 tracks (e.g., moves with) the movement of the user facial image 1306 to visually emphasize tilting and rotational movement of the user's face in three dimensions. For example, the center (e.g., intersection) of the orientation guide 1308 is optionally located at a center point on the user facial image 1306 and moves with it. In some examples, the device 1300 also adjusts the curvature of the lines comprising the orientation guide 1308 to give the appearance of three-dimensional rotation (e.g., up into the plane of the display). In some examples, the device 1300 emphasizes the orientation guide 1308 while the orientation guide is moving (e.g., while the orientation of the user's face is changing). For example, device 1300 optionally darkens the orientation guide 1308 while it is in motion and/or displays a fading trail that tracks the movement of the user's face. Conversely, the device 1300 optionally reduces the prominence of the orientation guide 1308 relative to the user facial image 1306 when the user's face is not moving.
As shown in the example of fig. 13G, in response to detecting that the user's face is oriented toward the progress meter portion 1348 (e.g., a set of one or more progress elements such as 1310a, 1310b, 1310c), the device 1300 updates the display of the progress elements in meter portion 1348 to the "registering" state by changing the appearance of those progress elements. For example, device 1300 optionally enlarges and/or changes the color of the progress elements in meter portion 1348 when the user's face is oriented toward meter portion 1348. In some examples, when the progress elements are updated to the "registering" state, device 1300 elongates the progress ticks and changes their color from gray to blue. In some examples, changing the display of the progress elements to the "registering" state in this manner indicates that device 1300 is capturing (e.g., registering) face imaging data for the angular view corresponding to the current orientation of the user's face. In the example of fig. 13G, device 1300 maintains the remaining progress elements in progress meter 1310 (e.g., progress elements that are not part of meter portion 1348) in an unregistered state (e.g., grayed out) to indicate that device 1300 has not yet detected the user's face in the orientations corresponding to those progress elements. In some examples, the display of meter portion 1348 is updated in this manner only if the user's face is sufficiently rotated toward meter portion 1348 (e.g., only if the user's face is rotated by at least a threshold amount or angle).
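The threshold-gated transition to the "registering" state can be sketched as below. The tick lengths, colors, and 20-degree threshold are illustrative assumptions: only the ticks in the portion the face is oriented toward change appearance, and only once the rotation exceeds the threshold.

```python
# Visual states for a progress tick (lengths/colors are illustrative).
UNREGISTERED = {"length": 1.0, "color": "gray"}
REGISTERING  = {"length": 1.5, "color": "blue"}  # elongated, gray -> blue

def update_meter(portion_states, facing_portion, rotation_deg, threshold=20.0):
    """Move the facing portion's ticks to the 'registering' state only if
    the face is rotated toward it by at least the threshold angle; all
    other portions keep their current (e.g., unregistered) appearance."""
    for name in portion_states:
        if name == facing_portion and rotation_deg >= threshold:
            portion_states[name] = dict(REGISTERING)
    return portion_states

meter = {"top": dict(UNREGISTERED), "right": dict(UNREGISTERED)}
update_meter(meter, facing_portion="top", rotation_deg=30.0)
print(meter["top"]["color"], meter["right"]["color"])  # -> blue gray
```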
Turning now to the example of fig. 13I, device 1300 detects that the user's face is no longer in an orientation corresponding to meter portion 1348 (e.g., the user has tilted their head back down to a neutral position). In response, device 1300 again changes the appearance of the progress elements in meter portion 1348, this time to the "registered" state. In the example of fig. 13I, device 1300 updates the display of the progress ticks in portion 1348 from the elongated "registering" state by shortening the ticks and changing their color again. For example, a progress element in the "registered" state is the same length and/or size as a progress element in the "unregistered" state, but is displayed in green to indicate that the corresponding portion of the user's face (e.g., the angular view captured in fig. 13H) has been successfully registered, as described above in connection with fig. 11J. In the example of fig. 13I, device 1300 maintains the other progress elements in registration progress meter 1310 in an unregistered state to indicate that the device has not yet detected the user's face in the orientations corresponding to those progress elements. In response to detecting the change in facial orientation, device 1300 also moves orientation guide 1308 so that it tracks the movement of user facial image 1306 in the digital viewfinder.
Turning now to the example of fig. 13J, after detecting the change in orientation depicted in fig. 13I, the device (e.g., again) detects that the orientation of the user's face relative to the biometric sensor 1303 has not changed for a predetermined period of time. In response, device 1300 displays reminder-enabled registration interface 1350. In some examples, the reminder-enabled registration interface 1350 is displayed automatically. In some examples, the reminder-enabled registration interface 1350 is displayed in response to detecting activation (e.g., selection) of an affordance (e.g., similar to the yes affordance 1318 on reminder enablement interface 1314). In some examples, the reminder-enabled registration interface 1350 and its components (e.g., the user face representation 1352, the registration progress meter 1354, the visual movement prompt 1356, and the text prompt 1358) have the same visual features as described above with respect to the reminder-enabled registration interface 1324 in fig. 13D. In the example of fig. 13J, however, the device 1300 displays the progress elements in meter portion 1360 of registration progress meter 1354 in the "registered" state, because the face orientation corresponding to the same portion of the progress meter has already been registered (e.g., in the manner of fig. 13H).
In the example of fig. 13J-13K, device 1300 displays an animation of visual movement prompt 1356, which prompts the user to move his or her face to an orientation that has not yet been registered. For example, the animation of visual prompt 1356 prompts the user to move his or her face in the first requested direction (e.g., to the right). The animation of visual movement prompt 1356 has similar or identical features to the animation of visual movement prompt 1328 described above with respect to fig. 13D-13E. For example, device 1300 translates the arrow element of visual prompt 1356 in the requested direction of movement (e.g., to the right) corresponding to the facial orientation that has not yet been registered. In the example of fig. 13J-13K, the animation also rotates the arc element of visual prompt 1356 in the direction of the requested movement (e.g., to the right) to provide a visual demonstration of the requested movement in three dimensions (which accompanies the movement of the arrow element). In some examples, upon displaying the animation of the visual movement prompt 1356, the device 1300 continues to display the text prompt 1358, which provides a written description of the requested movement. In some examples, while displaying the animation, the device 1300 issues an audio output 1362 and/or a haptic output 1364 that correspond to the movement of the arrow and/or arc elements of visual prompt 1356.
Turning to the example of fig. 13L, the device 1300 has (e.g., a third time) detected that the orientation of the user's face relative to the biometric sensor 1303 has not changed for a predetermined amount of time. In the example of fig. 13L, device 1300 displays accessibility registration interface 1368 in response to detecting this lack of movement. In some examples, the accessibility registration interface includes a user facial image 1370, optionally with similar or identical features to the user facial image 1306. In particular, the user facial image 1370 is optionally a real-time preview of the image data captured by the biometric sensor 1303. In the example of fig. 13L, accessibility registration interface 1368 includes a registration progress meter 1372, which is optionally displayed around user facial image 1370. In some examples, the display of meter portion 1374 indicates the orientations and/or portions of the user's face that have previously been registered (e.g., while device 1300 displayed registration interface 1304 or reminder-enabled registration interfaces 1324 and/or 1350 during a previous stage of registration). For example, device 1300 displays the progress elements in portion 1374 of progress meter 1372 (which corresponds to meter portion 1348 and/or meter portion 1360) in the "registered" state. In the example of fig. 13L, the accessibility registration interface 1368 also includes an accessibility options affordance 1378. In some examples, activation of the accessibility options affordance 1378 allows a user to set up biometric (e.g., facial) authentication with only a partial scan (e.g., after registering only a subset of the facial orientations or portions that would be registered during a full scan).
In the example of fig. 13M, device 1300 detects activation (e.g., selection) of accessibility option affordance 1378 (e.g., via user input 1380). In response to detecting activation of accessibility option affordance 1378, device 1300 displays a completion affordance 1382 on the accessibility registration interface, as shown in fig. 13N. In some examples, activation of the completion affordance allows the user to proceed using only a partial scan of his or her facial features.
In the example of fig. 13O, device 1300 detects activation (e.g., selection) of completion affordance 1382 via user input 1384. In response to detecting activation of the completion affordance, device 1300 displays a partial scan confirmation interface 1386, as shown in fig. 13P. The partial scan confirmation interface includes a user facial image 1387, optionally with some or all of the visual features of user facial image 1370. Because a portion of the user's face has been successfully registered, device 1300 also displays a registration success indicator 1388, e.g., adjacent to and/or surrounding user facial image 1387. In the example of fig. 13P, partial scan confirmation interface 1386 includes a text prompt 1389 providing a written indication that image data corresponding to at least a portion of the user's face has been successfully captured and registered. In the example of fig. 13P, device 1300 also displays a registration completion affordance 1390.
In the example of fig. 13Q, device 1300 detects activation (e.g., selection) of registration completion affordance 1390 via user input 1392. In some examples, in response to detecting activation of the registration completion affordance, device 1300 registers image data of one or more angular views (e.g., orientations) of the user's face captured during the registration process described above. Optionally, device 1300 issues a haptic output 1393 to confirm completion of the registration process. In some examples, haptic output 1393 is the same haptic output issued in response to successful biometric authorization at device 1300. In the example of fig. 13Q, device 1300 replaces the display of success indicator 1388 with a partial registration indicator 1391 adjacent to facial image 1387, which visually indicates the orientations of the user's face that have been successfully registered. In some examples, the size (e.g., arc length) and location of partial registration indicator 1391 correspond to the portions of a progress meter (e.g., 1310, 1354, 1372) that transitioned to the "registered" state during registration. In the example of fig. 13Q, device 1300 displays partial registration indicator 1391 in a location similar to that of meter portion 1374 to indicate that one or more facial orientations corresponding to meter portion 1374 were successfully registered.
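As a rough illustration of how the indicator geometry described above could be derived, the Python sketch below maps the progress-meter ticks that reached the "registered" state to arc segments of the kind partial registration indicator 1391 displays. The tick count, angles, and function name are assumptions for illustration, not details from this disclosure.

```python
def registered_arcs(enrolled_ticks, total_ticks=24):
    """Map enrolled progress-meter ticks (indices around a circle) to
    (start_angle, arc_length) segments in degrees for a partial
    registration indicator. Contiguous enrolled ticks merge into one arc."""
    per_tick = 360.0 / total_ticks
    segments = []
    start = None
    # Iterate one index past the end so a run that reaches the last tick
    # is still closed out and appended.
    for i in range(total_ticks + 1):
        if i < total_ticks and i in enrolled_ticks:
            if start is None:
                start = i  # a new run of enrolled ticks begins here
        elif start is not None:
            segments.append((start * per_tick, (i - start) * per_tick))
            start = None
    return segments
```

For example, with 24 ticks (15 degrees each), ticks {0, 1, 2} would yield a single 45-degree arc starting at 0 degrees.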
In the example of fig. 13R, in response to detecting activation (e.g., selection) of registration completion affordance 1390 (e.g., via user input 1392), device 1300 displays registration completion interface 1394. As shown in fig. 13R, registration completion interface 1394 includes a biometric authentication glyph 1395, which is optionally a line drawing of all or part of a face (e.g., a stylized facial graphic). In the example of fig. 13R, registration completion interface 1394 also includes a text prompt 1396 indicating that the registration process is complete and that facial authentication at the device is set up and/or enabled. In some examples, registration completion interface 1394 also includes a completion affordance 1397, activation of which causes device 1300 to exit the facial authentication setup. In some examples, registration completion interface 1394 does not include facial image 1387.
Figs. 14A-14B are a flow diagram illustrating a method for providing prompts for effectively registering biometric features on an electronic device, according to some examples. Method 1400 is performed at a device (e.g., 100, 300, 500, 1300) with a display, one or more input devices (e.g., a touchscreen, a microphone, a camera), and a wireless communication radio (e.g., a Bluetooth connection, a WiFi connection, a mobile broadband connection such as a 4G LTE connection). In some examples, the display is a touch-sensitive display. In some examples, the display is not a touch-sensitive display. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the device includes one or more biometric sensors, which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the device further includes a light-emitting device, such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate the biometric feature (e.g., the face) during capture of biometric data of the biometric feature by the one or more biometric sensors. Some operations in method 1400 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 1400 provides an intuitive way for providing reminders for effectively registering biometric features on an electronic device. The method reduces the cognitive burden on the user when registering biometric features on the device, thereby creating a more efficient human-machine interface. For battery-driven computing devices, enabling users to more quickly and efficiently enroll biometric features conserves power and increases the interval between battery charges.
The device displays (1402) on the display a biometric enrollment user interface (e.g., 1304, 1324) for enrolling biometric features (e.g., a user's face, fingerprint, iris, handprint, or other physical biometric features that can be used to distinguish one person from another). Displaying the biometric enrollment user interface includes displaying a representation of the biometric characteristic (e.g., 1306, 1326, a representation of a head of a user of the device). The appearance of the representation of the biometric feature changes (1404) as the orientation of the biometric feature relative to the one or more biometric sensors (e.g., 1303) changes. For example, the orientation of the biometric feature relative to the one or more biometric sensors is optionally based on the alignment of the user's face in the image data captured by the one or more cameras (e.g., including camera data of the user's head located in the field of view of one or more of the cameras). Displaying a preview of the image captured by the biometric sensor provides the user with feedback regarding the location and orientation of his or her biometric feature relative to the biometric sensor of the device, enabling the user to more quickly and efficiently properly align his or her biometric feature with the sensor in order to properly register the biometric feature. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
Upon displaying the biometric enrollment user interface, the device detects (1406) that enrollment prompt criteria have been met for one or more portions of the biometric characteristic.
In some examples, the enrollment prompting criteria include (1408) a requirement that the biometric feature move less than a first threshold amount for at least a first threshold period of time (as determined by the one or more biometric sensors). Automatically enabling the enrollment reminder after detecting that the user's biometric feature has moved little reduces the time required to complete the enrollment process because the user, who is trying to quickly and automatically perform the desired movement, receives instructions on how to proceed with the enrollment process. Performing an optimized set of operations when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
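The movement requirement of step 1408 (little movement of the feature for at least a threshold period, as measured by the biometric sensors) can be modeled as a small idle-detection check. The following Python sketch is illustrative only; the class name, thresholds, and units are assumptions rather than details from the disclosure.

```python
class IdleDetector:
    """Tracks whether a biometric feature has moved less than a threshold
    amount for at least a threshold period of time (cf. step 1408)."""

    def __init__(self, movement_threshold=5.0, idle_period=3.0):
        self.movement_threshold = movement_threshold  # e.g., degrees of head rotation per frame
        self.idle_period = idle_period                # seconds of near-stillness required
        self._idle_since = None                       # timestamp when stillness began

    def update(self, movement_amount, now):
        """Feed the latest per-frame movement estimate; return True when
        the enrollment prompt criteria are met."""
        if movement_amount >= self.movement_threshold:
            self._idle_since = None       # user is moving; reset the timer
        elif self._idle_since is None:
            self._idle_since = now        # movement just dropped below threshold
        return (self._idle_since is not None
                and now - self._idle_since >= self.idle_period)
```

A device loop would call `update` once per sensor frame and output the prompt on the first `True` result.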
In response to detecting that the registration prompt criteria have been met for one or more portions of the biometric feature, the device outputs (1410) a respective prompt (e.g., 1328, 1332, 1334, 1336, 1340, 1342, 1344, 1346, e.g., a visual, audible, and/or tactile prompt) for moving the biometric feature in a respective manner. A respective prompt is selected (1412) based on an enrollment status of one or more portions of the biometric feature (e.g., whether the first portion and/or the second portion of the biometric feature have been enrolled). In particular, in accordance with a determination that registration-prompting criteria have been met for a first portion of the biometric feature that is registerable by moving the biometric feature in a first manner, the device outputs (1424) a prompt (e.g., 1328, 1332, 1334, 1336) for moving the biometric feature in the first manner. In accordance with a determination that registration-prompting criteria have been met for a second portion of the biometric feature that is registerable by moving the biometric feature in a second manner that is different from the first manner, outputting the respective prompt includes outputting (1426) a prompt (e.g., 1340, 1342, 1344, 1346) for moving the biometric feature in the second manner. Providing visual and/or audible cues for moving the biometric feature in a particular direction allows the user to quickly and intuitively recognize how to position the biometric feature so that the corresponding portion may be registered. These prompts allow the user to more quickly and efficiently move the biometric feature through the range of orientations required for the enrollment process than they otherwise would.
Thus, providing improved visual and/or auditory feedback with instructions regarding the correct movement of the biometric feature enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
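The prompt selection of step 1412 amounts to choosing a prompt based on which portions of the feature remain unregistered. Below is a minimal Python sketch of that dispatch; the portion names, their order, and the prompt strings are purely illustrative assumptions, not values from the disclosure.

```python
def select_prompt(enrolled_portions):
    """Return the movement prompt for the first portion of the biometric
    feature that has not yet been enrolled (cf. steps 1412/1424/1426)."""
    # Each portion is registerable by moving the feature in a distinct manner.
    plan = [
        ("right_profile", "tilt your head to the right"),
        ("left_profile", "tilt your head to the left"),
        ("upper", "tilt your head up"),
        ("lower", "tilt your head down"),
    ]
    for portion, prompt in plan:
        if portion not in enrolled_portions:
            return prompt
    return None  # all portions enrolled; no further prompt is needed
```

So a fresh enrollment (no portions registered) yields the first-manner prompt, while an enrollment that already captured the right profile yields the second-manner prompt.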
In some examples, in accordance with a determination that the audible prompting criteria are met (e.g., determining whether screen reader functionality of the device is enabled), the device outputs an audible prompt (e.g., 1334) for moving the biometric feature in the first manner (e.g., in place of or in addition to one or more visual prompts). In some examples, in accordance with a determination that the audible prompting criteria are not satisfied, the device provides the user with an option to enable audible prompting for biometric enrollment. For example, the device displays an affordance (e.g., 1318) that, when selected by a user, causes an audible prompt to be enabled, or provides an audio prompt (e.g., 1334, 1344) describing steps for enabling audible prompts for biometric enrollment. Providing audible instructions for moving the biometric feature in a particular direction allows a user to quickly and intuitively recognize how to locate the biometric feature so that the corresponding portion may be registered. These prompts allow the user to more quickly and efficiently move the biometric feature through the series of orientations required for the enrollment process than they otherwise would. Thus, providing improved auditory feedback with instructions regarding the correct movement of the biometric feature enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
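The audible-prompting determination described above can be sketched as a simple modality selection: a visual prompt is always produced, and an audio prompt is added (or substituted) when the screen-reader criteria are met. The Python below is illustrative only; the labels and the always-visual assumption are mine, not the disclosure's.

```python
def prompt_outputs(direction, screen_reader_enabled):
    """Choose output modalities for a movement prompt. When the audible
    prompting criteria are met (e.g., a screen reader is enabled), an
    audio prompt is emitted in addition to the visual one."""
    outputs = ["visual:" + direction]           # e.g., animated arrow/arc 1328
    if screen_reader_enabled:                   # audible prompting criteria met
        outputs.append("audio:" + direction)    # e.g., spoken prompt 1334
    return outputs
```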
In some examples, the device outputs a respective prompt (e.g., 1328, 1332, 1334, 1336, 1340, 1342, 1344, 1346, e.g., a visual, audible, and/or tactile prompt) before any portion of the biometric feature has been registered. For example, the respective prompt optionally indicates (1422) that the user should begin tilting their head to begin the registration process.
In some examples, the device outputs a respective prompt (e.g., 1328, 1332, 1334, 1336, 1340, 1342, 1344, 1346, e.g., a visual, audible, and/or tactile prompt) after at least a portion of the biometric feature has been registered. For example, the prompt optionally indicates that the user should continue to tilt their head to continue the registration process. Automatically issuing a prompt to move the biometric feature in the second direction after the user has moved the biometric feature in the first direction allows the user to quickly and intuitively understand how to continue moving the biometric feature to continue the enrollment process. Assisting the user in understanding how to perform the required movement of the biometric feature in succession reduces the amount of time required to complete the enrollment of the biometric feature. Thus, performing an optimized set of operations when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the device outputs a haptic output (e.g., 1336, 1346). In some examples, the tactile output is accompanied by an audible output (e.g., 1334, 1344). In some examples, the haptic output and/or the audio output are generated to coincide with movement of the visual cue (e.g., 1328, 1340). For example, the haptic output optionally corresponds to movement of an arrow or arc (e.g., arrow elements and arc elements in 1328 and/or 1340) in a direction in which the user is prompted to move the biometric feature.
In some examples, the respective prompt includes a tactile output (e.g., 1338, 1366) that is the same as the tactile output used to indicate failed biometric authentication at the device. For example, the error tactile output generated when biometric enrollment has ceased, due to a failure to change the orientation of the biometric feature relative to the one or more biometric sensors, is the same as the tactile output used to indicate a failed biometric authentication.
In some examples, the device overlays a visual cue (e.g., 1328, 1340, 1356) on the representation of the biometric feature. For example, the visual cues are optionally arrows indicating respective manners (directions) of moving the biometric feature, such as up, down, left, right, or at an oblique angle between these directions. In some examples, the visual cue is partially transparent. Displaying a visual cue such as an arrow element in the requested direction of movement allows the user to quickly understand how to move the biometric feature so that a portion of the feature corresponding to the requested direction can be correctly registered. This allows the user to perform the requested movement more quickly and efficiently, reducing the amount of time required for the registration process. Thus, providing improved visual cues showing the correct movement of the biometric feature enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the device displays (1414) an animated prompt (e.g., the animation of 1328, 1340, or 1356 described with respect to fig. 13D-13E, 13F-13G, or 13J-13K) to move the biometric feature in a corresponding manner. For example, the device optionally displays an animation (e.g., the animation of 1328 shown in fig. 13D-13E) that prompts movement in a first manner relative to a first portion of the biometric characteristic, and displays an animation (e.g., the animation of 1340 in fig. 13F-13G) that prompts movement in a second manner relative to a second portion of the biometric characteristic. In some examples, displaying the animated prompt includes displaying (1416) an arrow element (e.g., an arrow element of 1328, 1340, or 1356) indicating a respective manner of moving the biometric feature. Displaying an animation that intuitively shows the requested direction of movement allows the user to quickly understand how to move the biometric feature so that a portion of the feature corresponding to the requested direction may be properly registered. This allows the user to perform the requested move more quickly and efficiently, reducing the amount of time required for the registration process. Thus, providing improved visual feedback with intuitive illustrations of the correct movement of the biometric feature enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the device outputs (1420) at least one of a tactile output (e.g., 1336, 1346, 1364) or an audible output (e.g., 1334, 1344, or 1362) corresponding to the animation. For example, the animation optionally scales the representation of the biometric feature. Alternatively and/or additionally, one or more elements of the registration user interface (e.g., 1324, 1350) optionally change state temporarily. In some examples, the haptic output is synchronized with the animation. Providing tactile and/or audio output along with a visual illustration of the requested movement allows the user to quickly understand how to move the biometric feature so that a portion of the feature corresponding to the requested direction may be correctly registered. This allows the user to perform the requested movement more quickly and efficiently, reducing the amount of time required for the registration process. Thus, providing improved tactile and/or auditory feedback with animation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the first manner of movement includes rotation about an axis parallel to the display (e.g., in the plane of the display 1302), and the second manner of movement includes rotation about an axis parallel to the display. In this case, the animated prompt (e.g., the animation of 1328, 1340, or 1356 described with respect to fig. 13D-13E, 13F-13G, or 13J-13K) includes (1418) a simulated rotation of the user interface element (e.g., the arc element of 1328, 1340, or 1356) about an axis parallel to the display. For example, if the user is being prompted to rotate the biometric feature clockwise about an axis parallel to the display, the animation optionally includes a clockwise movement of the user interface element about the axis parallel to the display. Likewise, if the user is being prompted to rotate the biometric feature counterclockwise about an axis parallel to the display, the animation optionally includes a counterclockwise movement of the user interface element about an axis parallel to the display. Displaying simulated rotations of the orientation element to illustrate the requested movement allows the user to quickly understand how to move the biometric feature so that a portion of the feature corresponding to the requested direction may be properly registered. This allows the user to perform the requested move more quickly and efficiently, reducing the amount of time required for the registration process. Thus, providing improved visual feedback with intuitive illustrations of the correct movement of the biometric feature enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the biometric enrollment user interface includes orientation guidance (e.g., 1308) that is overlaid on the representation (e.g., 1306) of the biometric feature and that tilts in different directions as the representation of the biometric feature tilts in the different directions (e.g., as described above with reference to method 1200). In this example, in accordance with a determination that the enrollment prompt criteria have been met for a first portion of the biometric feature that is registerable by moving the biometric feature in a first manner, the animated prompt (e.g., the animation of 1328 or 1356 described with respect to fig. 13D-13E or 13J-13K) includes a portion of the orientation guide (e.g., the vertical component of 1308) moving in a direction that the orientation guide will move when the biometric feature is moved in the first manner. Displaying and/or rotating the orientation overlay on the representation of the biometric feature provides the user with feedback regarding the orientation of his or her biometric feature in three-dimensional space relative to the biometric sensor of the device, enabling the user to more quickly move the biometric feature through a desired range of orientations during the enrollment process. Thus, providing the user with improved visual feedback regarding the orientation of the biometric feature enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
Also, in accordance with a determination that the enrollment prompting criteria have been met for a second portion of the biometric feature that is registerable by moving the biometric feature in a second manner, the animated prompt (e.g., the animation of 1340 described with respect to figs. 13F-13G) includes movement of a portion of the orientation guide in the direction in which the orientation guide would move when the biometric feature is moved in the second manner. In some examples, the orientation guide includes a first portion (e.g., the vertical component of 1308, e.g., a first arc) and a second portion (e.g., the horizontal component of 1308, e.g., a second arc that intersects the first arc), and the animated prompt (e.g., the animation of 1340 shown in figs. 13F-13G) includes moving the first portion of the orientation guide without moving the second portion, or moving the second portion without moving the first portion. In some examples, if the first portion of the orientation guide is moving, the device ceases to display the second portion. Similarly, if the second portion is moving, the device ceases to display the first portion. In some examples, if a portion of the biometric feature that is not visible when the feature is tilted up, down, left, or right needs to be registered, the animation moves in a diagonal direction to prompt the user to tilt the biometric feature in that diagonal direction.
In some examples, after outputting respective prompts (e.g., 1328, 1332, 1334, 1336, 1340, 1342, 1344, 1346) for moving the biometric feature in a respective manner, and in response to detecting movement of the biometric feature, the device registers a respective portion of the biometric feature. Optionally, the device updates the progress indicator (e.g., 1310, 1330) as described in method 1200. The device optionally ceases providing the prompt when registering the respective portion of the biometric characteristic. Updating the progress indicator during registration in this manner encourages the user to look at the display of the electronic device during registration to improve the ability to detect when a gaze is directed at the display (and thus detect whether the user is focusing on the device). Encouraging the user to look at the display of the electronic device enhances the operability of the device and makes the user-device interface more efficient (e.g., by ensuring that the user's gaze is directed at the display, thereby ensuring that the user's biometric features are properly registered), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, after registering the respective portion of the biometric feature, the device determines that the registration prompt criteria have again been met for one or more portions of the biometric feature. In response to determining that the registration prompt criteria have been met for the one or more portions of the biometric feature (e.g., the user stops responding during enrollment for a threshold period of time), the device outputs another respective prompt (e.g., 1356, 1358, 1362, 1364) for moving the biometric feature in a respective manner determined based on the one or more portions of the biometric feature that have met the registration prompt criteria. For example, the device begins to prompt the user to change the orientation of the biometric feature relative to the one or more biometric sensors in order to register portions of the biometric feature that have not yet been registered. In some examples, the prompt has features similar to those of the other prompts described above. In some examples, the prompt progresses in a manner similar to the prompts described above. In some examples, a first prompt (e.g., 1356, 1358, 1362, 1364) to move the biometric feature in a first direction is provided after a first period of time with little or no movement of the biometric feature relative to the one or more biometric sensors; a second prompt to move the biometric feature in a second direction is provided after a second period of time (longer than the first period) with little or no movement; and an option (e.g., 1382, 1390) to complete biometric enrollment without registering all portions of the biometric feature is provided after a third period of time (longer than the second period) with little or no movement of the biometric feature relative to the biometric sensors.
Automatically providing a prompt for moving the biometric feature in a different direction after detecting little or no movement of the biometric feature assists the user who is struggling or unable to perform a movement in the first direction by quickly and automatically providing instructions on how to proceed with the enrollment process. Performing an optimized set of operations when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
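The escalation described above (a first-direction prompt, then a second-direction prompt, then the option to finish with a partial scan) can be sketched as a simple threshold ladder in Python. The durations, function name, and return labels are illustrative assumptions; only the ordering constraint (third > second > first) comes from the disclosure.

```python
def escalation_action(idle_seconds, first=2.0, second=4.0, third=7.0):
    """Choose the device's response to continued inactivity of the
    biometric feature; thresholds must satisfy third > second > first."""
    if idle_seconds >= third:
        return "offer_accessibility_option"   # e.g., affordance 1378
    if idle_seconds >= second:
        return "prompt_second_direction"      # e.g., prompts 1340/1342/1344/1346
    if idle_seconds >= first:
        return "prompt_first_direction"       # e.g., prompts 1328/1332/1334/1336
    return "wait"                             # user is still moving, or too soon
```

Checking the thresholds from largest to smallest ensures each longer idle period maps to exactly one (the most escalated) response.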
In some examples, after outputting respective prompts (e.g., 1328, 1332, 1334, 1336, 1340, 1342, 1344, 1346, 1356, 1358, 1362, 1364) for moving the biometric feature in a respective manner, in accordance with a determination that the accessibility prompt criteria have been met, the device displays (1428) an option (e.g., 1378) for continuing enrollment without further change in the orientation of the biometric feature relative to the one or more biometric sensors (e.g., 1303). The accessibility prompt criteria include (1430) a requirement that the orientation of the biometric feature relative to the one or more biometric sensors has changed by less than a predetermined amount for a respective period of time. For example, after the second prompt (e.g., 1340, 1342, 1344, 1346) is provided, the user's biometric feature is not detected as moving for a threshold period of time. In this case, an accessibility affordance (e.g., 1378) is displayed, and the user optionally selects (e.g., via 1380) the accessibility affordance. In other words, the user can opt to allow biometric authentication using views of the biometric feature from a narrower range of orientations than the full range available. In some examples, the respective time period is longer than the second time period.
For example, after a delay in which there is little or no movement of the biometric feature relative to the one or more biometric sensors (e.g., 1303), the device first prompts (e.g., using 1328, 1332, 1334, 1336) for movement of the biometric feature in a first direction; then, after a delay in which there is little or no movement, the device prompts (e.g., using 1340, 1342, 1344, 1346) the biometric feature for movement in a second direction; then, after an additional delay with little or no movement, the device provides an option (e.g., 1378) to proceed with enrollment without additional movement of the biometric feature relative to the one or more biometric sensors. In some examples, the accessibility prompt (e.g., 1378) is displayed after enough biometric features have been captured to ensure secure authentication with at least a portion of the biometric features (e.g., once an angle of the face has been captured and registered, a user with limited mobility may select the accessibility option to register the biometric features using only the registered angle).
In some examples, the device detects (1432) selection of the option to proceed with enrollment without further change in the orientation of the biometric feature relative to the one or more biometric sensors. For example, in some examples, the device receives a user input (e.g., 1384) indicating selection of an affordance (e.g., 1382) of the accessibility interface (e.g., 1368) for confirming registration of the biometric data. In response to detecting selection (1434) of the option to proceed with enrollment without further change in the orientation of the biometric feature relative to the one or more biometric sensors, the device forgoes (1436) (e.g., skips) one or more steps of the biometric enrollment. For example, the device skips the display of a second biometric enrollment user interface (e.g., second enrollment interface 1138 in fig. 11H) that would otherwise be displayed during a standard enrollment process in which the biometric feature changes orientation as prompted by the device (e.g., if the user enrolls via the accessibility interface, there is no second enrollment flow, as described with respect to method 1200).
In some examples, in response to detecting selection (1434) of the option to proceed with enrollment without further change in the orientation of the biometric feature relative to the one or more biometric sensors, the device displays (1438) an indication of completion of enrollment of the biometric feature (e.g., 1391 and 1389) that includes information about which portions of the biometric feature have been enrolled. In some examples, the device displays an affordance (e.g., 1390) that, when selected, confirms the partial enrollment of the biometric feature.
In some examples, in response to detecting selection of the option to proceed with enrollment without further change in orientation of the biometric feature relative to the one or more biometric sensors, once the biometric feature has been enrolled, the device outputs a tactile output (e.g., 1393) indicating successful biometric authentication of the biometric feature. For example, the tactile output generated when biometric enrollment is complete is optionally the same tactile output used to indicate successful authentication of the biometric feature.
Note that details of the processes described above with respect to method 1400 (e.g., figs. 14A-14B) also apply in an analogous manner to the methods described herein. For example, method 1400 optionally includes one or more of the characteristics of the various methods described herein with reference to methods 800, 1000, 1200, 1600, 1800, 2000, 2200, 2500, and 2700. For example, the accessibility interface described in method 1000 can be applied with respect to the accessibility enrollment interface (e.g., 1368). As another example, the orientation guidance described in method 1200 can be applied with respect to the orientation guidance (e.g., 1308). For brevity, these details are not repeated below.
The operations in the information processing methods described above are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general-purpose processor (e.g., as described with respect to figs. 1A, 3, 5A) or an application-specific chip. Further, the operations described above with reference to figs. 14A-14B are optionally implemented by the components depicted in figs. 1A-1B. For example, display operation 1402, detect operation 1406, output operation 1408, output operation 1412, and output operation 1414 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as selection of an object on a user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it would be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in figs. 1A-1B.
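The dispatch path described above (an event monitor detects a contact, a dispatcher delivers the event information to the application, a recognizer compares it against event definitions, and a matching recognizer activates its handler) can be sketched as follows. The class and method names are simplified assumptions modeled on the description, not the actual framework API.

```python
# Simplified sketch of the event-dispatch path described above:
# an event sorter delivers event information to an application's
# recognizers, and a matching recognizer activates its associated handler.

class EventRecognizer:
    def __init__(self, event_definition, handler):
        self.event_definition = event_definition  # e.g., "tap_on_login_affordance"
        self.handler = handler                    # called when the event matches

    def recognize(self, event_info):
        # Compare incoming event information with the event definition.
        return event_info.get("type") == self.event_definition

class EventSorter:
    """Detects contacts and dispatches them to an application's recognizers."""
    def __init__(self):
        self.recognizers = []

    def register(self, recognizer):
        self.recognizers.append(recognizer)

    def dispatch(self, event_info):
        for recognizer in self.recognizers:
            if recognizer.recognize(event_info):
                recognizer.handler(event_info)  # activate the event handler
                return True
        return False  # no recognizer matched the event or sub-event
```

A registered handler would then play the role of the data updater or GUI updater in the description, updating application internal state or displayed content.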
Fig. 15A-15T illustrate example user interfaces for biometric authentication, according to some examples. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 16A-16E.
Fig. 15A shows an electronic device 1500 (e.g., portable multifunction device 100, device 300, or device 500). In the illustrative examples shown in fig. 15A-15T, the electronic device 1500 is a smartphone. In other examples, the electronic device 1500 can be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 1500 has a display 1502, one or more input devices (e.g., a touchscreen of the display 1502, a button 1504, and a microphone (not shown)), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the electronic device includes one or more biometric sensors (e.g., biometric sensor 1503), which optionally include a camera, such as an infrared camera, a thermal-imaging camera, or a combination thereof. In some examples, the one or more biometric sensors 1503 are the one or more biometric sensors 703. In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate the biometric feature (e.g., the face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
In fig. 15A, the electronic device 1500 displays, on the display 1502, an application interface 1506 that includes a login affordance 1508. In the example of fig. 15A, the application is a browser that is displaying a website (e.g., onlinescore.). In fig. 15B, while displaying the application interface 1506, the electronic device 1500 detects activation of the login affordance 1508. As shown, the activation is a tap gesture 1510 on the login affordance 1508.
In fig. 15C, the electronic device 1500 initiates biometric authentication in response to detecting activation of the login affordance 1508. In some examples, initiating biometric authentication includes obtaining (e.g., capturing, using the one or more biometric sensors) data corresponding to at least a portion of a biometric feature of the user (e.g., the user's face). In fig. 15C, initiating biometric authentication further includes displaying a biometric authentication interface 1512 having a biometric authentication glyph 1514. In the example of fig. 15C, the biometric authentication glyph 1514 is a simulated representation of a biometric feature (e.g., a face). As seen in fig. 15C, the biometric authentication interface 1512 is overlaid on at least a portion of the application interface 1506. In some examples, the biometric authentication interface is an operating-system-level interface (e.g., an interface generated by the operating system of the device), and the application interface 1506 is an application-level interface (e.g., a user interface generated by a third-party application that is separate from the operating system of the device).
While in some examples, the electronic device 1500 initiates biometric authentication in response to activation of the login affordance for the application, in other examples, the electronic device 1500 initiates (e.g., automatically starts) biometric authentication in response to loading the application and/or the application interface 1506. The application interface is displayed, for example, in response to loading the application (e.g., by selecting an icon associated with the application on the home screen of the electronic device 1500).
In some examples, including the example of fig. 15C, the biometric authentication interface is partially translucent. In some examples, the appearance (e.g., visual characteristics) of the biometric authentication interface 1512 is based on the application interface 1506. By way of example, one or more colors of the biometric authentication interface 1512 are based on one or more colors of the application interface 1506. Referring to fig. 15C, the electronic device 1500 displays the application interface 1506 having a first color scheme and displays the biometric authentication interface 1512 based on the first color scheme (e.g., using colors that contrast with the first color scheme). Referring to fig. 15D, the electronic device 1500 displays an application interface 1507 having a second color scheme that is different from the first color scheme, and displays the biometric authentication interface 1512 based on the second color scheme. Displaying the biometric authentication interface 1512 in this manner allows it to be easily recognized and viewed by a user when overlaid on an application interface.
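One way to derive contrasting overlay colors from an application interface's color scheme, as described above, is to compare relative luminance. The sketch below uses the standard sRGB relative-luminance formula; applying it here, and the 0.5 cutoff, are illustrative assumptions rather than the method of the disclosure.

```python
# Illustrative sketch: choose light or dark overlay chrome based on the
# relative luminance of the application interface's dominant color.
# The sRGB luminance weights are standard; the 0.5 cutoff is an assumption.

def relative_luminance(rgb):
    """rgb: (r, g, b) tuple with components in 0..255."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def overlay_color_for(app_dominant_rgb):
    """Return a contrasting overlay color for the biometric interface."""
    if relative_luminance(app_dominant_rgb) > 0.5:
        return (0, 0, 0)        # dark overlay chrome over a light interface
    return (255, 255, 255)      # light overlay chrome over a dark interface
```

A light application interface (fig. 15C's first color scheme, say) would thus get dark overlay chrome, and a dark interface (fig. 15D) light chrome, keeping the overlay legible in both cases.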
In response to initiating biometric authentication, the electronic device 1500 captures and processes (e.g., analyzes) the biometric data to determine, based on the biometric data, whether the biometric feature (or a portion thereof) satisfies biometric authentication criteria (e.g., determines whether the biometric data matches a biometric template within a threshold). In some examples, in response to obtaining the biometric data, the electronic device 1500 displays a biometric authentication animation that includes, for example, changing the size of the biometric authentication glyph. In some examples, while the electronic device processes the biometric data, the electronic device displays one or more biometric authentication glyphs and/or a biometric authentication animation (e.g., replacing display of the biometric authentication glyph 1514 therewith) to indicate that the biometric data is being processed.
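The determination that captured biometric data "matches a biometric template within a threshold" can be sketched as a distance comparison between feature vectors. The vector representation, the use of cosine similarity, and the 0.9 threshold below are all illustrative assumptions; the disclosure does not specify a matching algorithm.

```python
import math

# Illustrative sketch: treat the enrolled template and the captured data
# as feature vectors and accept when their cosine similarity meets a
# threshold. The vector form and the 0.9 threshold are assumptions.

MATCH_THRESHOLD = 0.9

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def satisfies_biometric_criteria(captured, template):
    """True when the captured feature data matches the enrolled template."""
    return cosine_similarity(captured, template) >= MATCH_THRESHOLD
```

On a match, the device would proceed as in fig. 15I (success glyph, tactile output, authentication information passed to the application); otherwise as in fig. 15L (failure animation).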
By way of example, in fig. 15E, in response to initiating biometric authentication, the electronic device displays the biometric authentication glyph 1514. Referring to figs. 15F-G, once the electronic device 1500 has obtained the biometric data (e.g., has obtained sufficient biometric data), the electronic device 1500 displays a biometric authentication animation that includes biometric authentication glyphs 1515 (fig. 15F) and 1516 (fig. 15G) as part of an animation in which the biometric authentication glyph 1514 is replaced with (e.g., transitions to) the biometric authentication glyph 1517 (fig. 15H). Referring to fig. 15H, the electronic device 1500 displays the biometric authentication glyph 1517 to indicate that the biometric data is being processed. In some examples, the biometric authentication glyph 1517 includes a plurality of rings that, for example, rotate spherically while displayed.
In fig. 15I, the electronic device 1500 determines that the biometric feature satisfies the biometric authentication criteria. In response, the electronic device displays a biometric authentication glyph 1518 in the biometric authentication interface 1512 (e.g., replacing display of the biometric authentication glyph 1517 therewith), indicating that the biometric authentication was successful. Additionally or alternatively, the electronic device outputs a tactile output 1520 indicating that the biometric authentication was successful. After indicating that the biometric authentication was successful, the electronic device 1500 provides the application with authentication information indicating that the biometric feature satisfies the biometric authentication criteria and, accordingly, that the biometric authentication was successful.
As shown in fig. 15J, in response to the electronic device 1500 providing the authentication information indicating that the biometric feature satisfies the biometric authentication criteria, the application displays a home interface 1522 (e.g., replacing display of the application interface 1506 therewith). Referring to fig. 15K, after a predetermined amount of time, the electronic device 1500 ceases to display the biometric authentication interface. Thereafter, the user optionally uses the application as if the user had authenticated directly with the application (e.g., using a username and password for an account associated with the application). In some examples, the electronic device 1500 ceases to display the biometric authentication interface 1512 a predetermined amount of time after the biometric authentication has completed. In other examples, the electronic device 1500 ceases to display the biometric authentication interface 1512 a predetermined amount of time after the application has performed an operation, such as displaying an interface (e.g., the home interface 1522).
Alternatively, in fig. 15L, the electronic device 1500 determines (e.g., after displaying the biometric authentication glyph 1517 of fig. 15H) that the biometric feature does not satisfy the biometric authentication criteria. In response, the electronic device displays a biometric authentication glyph, such as the biometric authentication glyph 1519, in the biometric authentication interface 1512 (e.g., replacing display of the biometric authentication glyph 1517 therewith) to indicate that the biometric authentication was unsuccessful (e.g., failed). In some examples, the biometric authentication glyph 1519 is associated with a biometric authentication failure animation. Referring to figs. 15L-M, in some examples, in response to an unsuccessful biometric authentication, the electronic device 1500 displays a biometric authentication failure animation in which the biometric authentication glyph 1519 moves (e.g., rotates) from side to side to simulate a "head shake" effect and indicate that the biometric authentication was unsuccessful. Optionally, the electronic device 1500 outputs a tactile output 1526 indicating that the biometric authentication was unsuccessful. In some examples, the tactile output 1526 is the same as the tactile output 1520. In some examples, the tactile output 1526 is different from the tactile output 1520. In some examples, the tactile output 1526 is synchronized with the biometric authentication failure animation.
Fig. 15N-15O illustrate an alternative biometric authentication failure animation in which, in response to an unsuccessful biometric authentication (e.g., as determined with respect to fig. 15E), the electronic device 1500 displays the biometric authentication glyph 1514 in the biometric authentication interface 1512 (e.g., replacing display of the biometric authentication glyph 1517 (fig. 15H) therewith). In some examples, during display of the biometric authentication failure animation, the electronic device moves the biometric authentication interface 1512 on the display 1502. In some examples, the electronic device 1500 moves the biometric authentication interface 1512 from side to side to simulate a "shake" effect and indicate that the biometric authentication was unsuccessful. In some examples, the electronic device moves only the biometric authentication glyph 1514 and does not move the biometric authentication interface 1512. In other examples, additional or alternative glyphs are used in the biometric authentication failure animation.
As shown in fig. 15P, in some examples, after displaying one or more biometric authentication failure animations, the electronic device displays the biometric authentication interface 1512 having the biometric authentication glyph 1514. In this manner, the electronic device again displays the initial biometric authentication glyph 1514, indicating that the electronic device 1500 is capable of performing additional biometric authentication. In some examples, the electronic device performs additional iterations of biometric authentication, as described with respect to at least figs. 15E-N.
Referring to fig. 15Q, in some examples, in response to an unsuccessful biometric authentication, the electronic device 1500 displays a failure interface 1540 (e.g., replacing display of the biometric authentication interface 1512 therewith). In some examples, displaying the failure interface 1540 includes maintaining display of the biometric authentication interface 1512. In some examples, the failure interface 1540 includes the biometric authentication glyph 1514, an alternative authentication affordance 1544, a retry affordance 1546, and a cancel affordance 1548. In some examples, activation of the cancel affordance 1548 causes the electronic device 1500 to cease displaying the failure interface 1540.
Referring to fig. 15R, in some examples, in response to activation of the retry affordance 1546 (such as by a tap gesture 1550), the electronic device 1500 performs another iteration of biometric authentication. In some examples, the electronic device 1500 displays one or more biometric authentication glyphs (e.g., 1515-1517) and/or a biometric authentication animation in the failure interface 1540 to indicate the progress and/or result of the biometric authentication. Referring to fig. 15S, in some examples, the electronic device performs biometric authentication only if a threshold number (e.g., 5) of failed biometric authentication attempts has not been reached. In some examples, if the threshold number of failed biometric authentication attempts has been reached, the electronic device 1500 displays an indication 1560 (e.g., replacing display of the biometric authentication glyph 1514 therewith) that the threshold number has been reached and, as a result, biometric authentication is unavailable.
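The attempt-limit behavior described above (retry biometric authentication only while fewer than a threshold number of failures have occurred, then show an indication that biometric authentication is unavailable) can be sketched as follows. The limit of 5 matches the example in the text; the class and return-value names are assumptions.

```python
# Illustrative sketch of the failed-attempt limit described above.
# The limit of 5 matches the example in the text; names are assumptions.

MAX_FAILED_ATTEMPTS = 5

class BiometricRetryPolicy:
    def __init__(self):
        self.failed_attempts = 0

    def biometric_available(self):
        return self.failed_attempts < MAX_FAILED_ATTEMPTS

    def attempt(self, match_succeeded):
        """Return the action the device should take after an attempt."""
        if not self.biometric_available():
            return "show_threshold_reached_indication"  # e.g., indication 1560
        if match_succeeded:
            self.failed_attempts = 0
            return "proceed_authenticated"
        self.failed_attempts += 1
        if self.biometric_available():
            return "show_failure_interface"  # retry / alternative / cancel
        return "show_threshold_reached_indication"
```

Once the threshold is reached, the device would fall back to the alternative authentication interface (e.g., 1562) rather than offering further biometric attempts.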
In some examples, in response to activation of the alternative authentication affordance 1544 (such as by a flick gesture 1552), the electronic device 1500 displays an alternative authentication interface 1562 (e.g., replacing display of the failure interface 1540 therewith), with which the user authenticates using an alternative form of authentication (e.g., fingerprint authentication or password authentication) that is different from the authentication associated with the biometric feature. As shown in fig. 15T, the user optionally authenticates by entering appropriate credentials in a username field 1564 and a password field 1566. In some examples, the failure interface 1540 is an operating-system-level interface that enables the user to authenticate with the operating system of the electronic device 1500, and the alternative authentication interface 1562 is an application-level interface that enables the user to authenticate with the application.
Fig. 16A-16E are flowcharts illustrating a method for managing authentication of biometric features using an electronic device, according to some examples. The method 1600 is performed at a device (e.g., 100, 300, 500, 1500) having a display, one or more input devices (e.g., a touchscreen, a microphone, a camera), and a wireless communication radio (e.g., a Bluetooth connection, a WiFi connection, or a mobile broadband connection such as a 4G LTE connection). In some examples, the display is a touch-sensitive display. In some examples, the display is not a touch-sensitive display. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the device includes one or more biometric sensors, which optionally include a camera, such as an infrared camera, a thermal-imaging camera, or a combination thereof. In some examples, the device further includes a light-emitting device, such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors. Some operations in method 1600 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 1600 provides an intuitive way to manage authentication of biometric features. The method reduces the cognitive burden on a user for managing authentication of biometric features, thereby creating a more efficient human-machine interface and a more intuitive user experience. For battery-operated computing devices, enabling a user to manage authentication of biometric features faster and more efficiently conserves power and increases the time between battery charges.
Prior to displaying the application interface (e.g., 1506) and the biometric authentication interface (e.g., 1512), the electronic device (e.g., 100, 300, 500, 1500) loads (1602) an application (e.g., a browser application as discussed with respect to fig. 15A). In some examples, the application interface (e.g., 1506) is an interface of a third party application that is not initially installed on the electronic device (e.g., 100, 300, 500, 1500) and/or is not provided by the manufacturer of the device or the manufacturer of the operating system of the electronic device (e.g., 100, 300, 500, 1500). In some examples, the biometric authentication interface (e.g., 1512) is an operating system-generated asset that is not subject to control by an application corresponding to (e.g., generating) the application interface (e.g., 1506).
An electronic device (e.g., 100, 300, 500, 1500) simultaneously displays (1604) an application interface (e.g., 1506) corresponding to an application program and a biometric authentication interface (e.g., 1512) controlled by an operating system of the electronic device (e.g., 100, 300, 500, 1500) on a display (e.g., 1502). The simultaneous display of the application interface and the biometric authentication interface allows a user to quickly identify that the biometric authentication being requested is associated with the application corresponding to the application interface and further provides the user with more control of the device by helping the user avoid inadvertently using the application to perform an operation while allowing the user to identify that authentication is needed before the operation will be performed. Providing additional control over the device in this manner without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the biometric authentication interface (e.g., 1512) is displayed (1606) over a portion of the application interface (e.g., 1506). In some examples, the biometric authentication interface (e.g., 1512) is not displayed over the entire application interface (e.g., 1506), and at least a portion of the application interface (e.g., 1506) remains displayed without being covered. In some examples, the biometric authentication interface (e.g., 1512) is at least partially translucent. In some examples, the biometric authentication interface (e.g., 1512) is at least partially translucent (or transparent) such that the application interface (e.g., 1506) is at least partially visible through the biometric authentication interface (e.g., 1512). In some examples, the biometric authentication interface (e.g., 1512) obscures underlying content such that an appearance of the biometric authentication interface (e.g., 1512) is based on the portion of the obscured content underlying the biometric authentication interface (e.g., 1512). In some examples, the biometric authentication interface (e.g., 1512) is displayed in response to the loading (1608) of the application. In some examples, the biometric authentication interface (e.g., 1512) is displayed in response to a user loading (e.g., initiating or continuing execution of) the application on the electronic device (e.g., 100, 300, 500, 1500). In some examples, the biometric authentication interface (e.g., 1512) is loaded after the application interface is displayed. In some examples, the biometric authentication interface (e.g., 1512) and the application interface (e.g., 1506) are displayed simultaneously. In some examples, the biometric authentication interface (e.g., 1512) is displayed in response to detecting a user interaction (1610) with the application interface (e.g., 1506) that corresponds to a request to access content requiring authentication.
In some examples, the request for authentication is a selection of an authentication affordance (e.g., 1508) or performance of a gesture. In some examples, the application interface (e.g., 1506) includes an authentication affordance (e.g., 1508) (e.g., a login affordance).
Upon displaying the biometric authentication interface (e.g., 1512), the electronic device (e.g., 100, 300, 500, 1500) prepares to use the one or more biometric sensors (e.g., 1503) prior to obtaining biometric data corresponding to at least a portion of the biometric feature. In some examples, in response to display of the login affordance (e.g., 1508), the electronic device (e.g., 100, 300, 500, 1500) prepares to use (e.g., readies) the one or more biometric sensors. In some examples, preparing to use the one or more biometric sensors (e.g., 1503) includes transitioning the sensors (e.g., 1503) from a low-power state (e.g., an unpowered state or a sleep state) to a low-latency state (e.g., a partial-power state, a full-power state, or a warmed-up state). As such, the electronic device (e.g., 100, 300, 500, 1500) optionally reduces the amount of time needed to perform biometric authentication while displaying the biometric authentication interface (e.g., 1512). In some examples, attempting biometric authentication using the one or more biometric sensors (e.g., 1503) takes a first amount of time when the one or more biometric sensors (e.g., 1503) are in the low-power state, and attempting biometric authentication using the one or more biometric sensors (e.g., 1503) takes a second amount of time, less than the first amount of time, when the one or more biometric sensors (e.g., 1503) are in the low-latency state. While displaying the biometric authentication interface (e.g., 1512), the electronic device (e.g., 100, 300, 500, 1500) obtains (1612) biometric data corresponding to at least a portion of the biometric feature from the one or more biometric sensors (e.g., 1503). In some examples, the biometric feature is a face, and the biometric data is data corresponding to a portion of the face.
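The sensor preparation described above (transitioning from a low-power state to a low-latency state so that a subsequent authentication attempt takes less time) can be sketched as follows. The state names and latency figures are assumptions chosen only to show the relationship between the two amounts of time.

```python
# Illustrative sketch of pre-warming a biometric sensor. The state names
# and latency values are assumptions; the point is that an attempt from
# the low-latency state takes less time than one from the low-power state.

STATE_LATENCY = {
    "low_power": 1.50,    # must power up before capture (first amount of time)
    "low_latency": 0.25,  # pre-warmed; capture can begin at once (second amount)
}

class BiometricSensor:
    def __init__(self):
        self.state = "low_power"

    def prepare(self):
        """Called when the biometric authentication interface is displayed."""
        self.state = "low_latency"

    def attempt_latency(self):
        """Seconds a biometric authentication attempt would take from here."""
        return STATE_LATENCY[self.state]
```

Calling `prepare()` when the interface (or the login affordance) appears means that by the time the user's biometric data is actually captured, the sensor is already in its low-latency state.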
In response to obtaining biometric data corresponding to at least a portion of the biometric feature from the one or more biometric sensors, the electronic device (e.g., 100, 300, 500, 1500) determines (1614) whether the at least a portion of the biometric feature satisfies biometric authentication criteria based on the biometric data. Determining whether the at least a portion of the biometric feature satisfies the biometric authentication criteria based on the obtained biometric data enables a fast and efficient authentication process that allows a user to easily provide and proceed with an authentication operation with minimal input. Reducing the number of inputs required to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable inputs and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the electronic device (e.g., 100, 300, 500, 1500) determines whether the user's face or fingerprint matches stored information about faces and/or fingerprints authorized for biometric authentication at the device (e.g., 100, 300, 500, 1500). In some examples, determining whether the at least a portion of the biometric feature satisfies the biometric authentication criteria based on the biometric data includes displaying (1616) a biometric authentication analysis animation. In some examples, the biometric authentication animation includes displaying a sequence of interface objects (e.g., 1514, 1515, 1516, 1517, 1518, 1519) (e.g., glyphs). A first interface object (e.g., 1514) indicates that biometric authentication has been initiated, a second interface object (e.g., 1517) indicates that the device (e.g., 100, 300, 500, 1500) is processing biometric data, and a third interface object (e.g., 1518, 1519) indicates whether the biometric authentication succeeded or failed. In some examples, the first interface object (e.g., 1514) is substantially square and the second interface object (e.g., 1517) is substantially circular. In some examples, displaying the biometric authentication analysis animation includes rotating one or more rings around an interface object (e.g., 1517) (e.g., a biometric authentication glyph) of the biometric authentication animation. In some examples, the one or more rings rotate while the device (e.g., 100, 300, 500, 1500) is processing the biometric data to determine whether the biometric data satisfies the biometric authentication criteria. The rotation of the rings optionally simulates rotation of the rings around a sphere. In some examples, once the device (e.g., 100, 300, 500, 1500) has finished processing the biometric data, the one or more rings are overlaid on one another to indicate that processing is complete.
In some examples, displaying the biometric authentication analysis animation includes changing an appearance of an animated object (e.g., 1514, 1515, 1516, 1517, 1518, 1519) on a surface of a disk (e.g., 1512), where the surface of the disk has an appearance based on underlying content (e.g., 1506, 1507, 1522). In some examples, as the appearance of the animated object changes, the appearance of the surface of the disk changes. In some examples, the surface of the disk darkens when the animated object darkens and brightens when the animated object brightens. In some examples, the appearance of the surface of the disk (e.g., 1512) changes as the appearance of the animated object (e.g., 1514, 1515, 1516, 1517, 1518, 1519) changes, even when the underlying content (e.g., 1506, 1507, 1522) on which the appearance of the surface of the disk is based does not change. In some examples, one or more colors of the biometric authentication analysis animation are based on one or more colors of the application interface (e.g., 1506). In some examples, the colors of the animation are selected based on one or more colors of an application interface (e.g., 1506, 1507, 1522) or another interface associated with the application. The colors are optionally derived, for example, from the colors of controls and/or icons of the application. In this manner, the animation is optionally visually coordinated with the application interface (e.g., 1506, 1507, 1522), providing a more cohesive user experience. In some examples, prior to displaying the biometric authentication analysis animation, the electronic device (e.g., 100, 300, 500, 1500) determines one or more colors of the animation based on an analysis of the color scheme of the application interface (e.g., 1506) or of data corresponding to the application interface (e.g., 1506).
In some examples, further in response to obtaining biometric data corresponding to at least a portion of the biometric feature from the one or more biometric sensors (e.g., 1503), the electronic device (e.g., 100, 300, 500, 1500) changes a size of an interface object (e.g., 1514) (e.g., a biometric authentication glyph) of the biometric authentication interface (e.g., 1512) from a first size to a second size and then changes the size of the interface object (e.g., 1514) from the second size back to the first size. In some examples, once the biometric data has been captured by the one or more biometric sensors (e.g., 1503), the interface object (e.g., 1514) (e.g., the biometric authentication glyph) is enlarged from an initial size and then returned to the initial size to create a "bounce" effect.
In accordance with a determination (1636) that the at least a portion of the biometric feature satisfies the biometric authentication criteria based on the biometric data, the electronic device (e.g., 100, 300, 500, 1500) provides (1620) authentication information to the application indicating that the biometric authentication criteria have been satisfied for the one or more portions of the biometric feature. Providing authentication information to the application in accordance with a determination that the at least a portion of the biometric feature satisfies the biometric authentication criteria enhances the security of the device and reduces the number of fraudulent transmissions that may occur. Enhancing device security and reducing the number of fraudulent transmissions enhances the operability of the device and makes the user-device interface more secure (e.g., by reducing fraud in operating/interacting with the device).
In some examples, the authentication information is provided by the operating system to an application program that generates the application interface (e.g., 1506). In some examples, further in accordance with a determination that the at least a portion of the biometric characteristic satisfies the biometric authentication criteria based on the biometric data, the electronic device (e.g., 100, 300, 500, 1500) maintains (1624) display of the biometric authentication interface (e.g., 1512) for a predetermined amount of time after providing the authentication information to the application. In some examples, further in accordance with a determination that the at least a portion of the biometric characteristic satisfies the biometric authentication criteria based on the biometric data, the electronic device (e.g., 100, 300, 500, 1500) displays (1622) a biometric authentication success animation that includes a simulated first representation of the biometric characteristic (e.g., 1518) indicating that the at least a portion of the biometric characteristic satisfies the biometric authentication criteria.
In some examples, in response to a successful biometric authentication, the device (e.g., 100, 300, 500, 1500) displays an animation that includes an interface object (e.g., 1518) indicating that the biometric authentication was successful. In some examples, further in accordance with a determination based on the biometric data that the at least a portion of the biometric feature satisfies the biometric authentication criteria, the electronic device (e.g., 100, 300, 500, 1500) provides a successful haptic output (e.g., 1520) indicating that the at least a portion of the biometric feature satisfies the biometric authentication criteria. Displaying an animation indicating that biometric authentication is successful provides visual feedback to the user of the operation being performed and enables the user to quickly recognize that the operation is successful. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user achieve the desired result (by providing feedback indicating an output that will cause the device to generate the desired result) and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
After maintaining display of the biometric authentication interface (e.g., 1512) for a predetermined amount of time, the electronic device (e.g., 100, 300, 500, 1500) stops (1626) displaying the biometric authentication interface (e.g., 1512). In some examples, the application receives an indication of authentication before the device (e.g., 100, 300, 500, 1500) stops displaying the biometric authentication interface (e.g., 1512); this allows the application to provide (e.g., display) an interface (e.g., 1522) of the application, such as a "home application" interface or a post-login interface, prior to the transition away from the biometric authentication interface (e.g., 1512). In some examples, the biometric authentication interface (e.g., 1512) ceases to be displayed a predetermined amount of time after authentication. In some examples, the biometric authentication interface (e.g., 1512) ceases to be displayed a predetermined amount of time after the application has performed an operation in accordance with the biometric authentication (e.g., displayed an unlocked user interface (e.g., 1522)).
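The ordering described here, in which the application is notified first, the biometric interface stays on screen for a fixed interval, and only then is dismissed, can be sketched as a simple event sequence. The callback names and the one-second interval are illustrative assumptions, not values from the patent:

```python
DISMISS_DELAY = 1.0  # predetermined display time after success (assumed value)

def on_biometric_result(matched, notify_app, schedule, dismiss_interface):
    """Handle a biometric comparison result.

    matched             -- True if the feature satisfied the criteria
    notify_app(ok)      -- deliver authentication info to the application
    schedule(delay, f)  -- run f after `delay` seconds
    dismiss_interface() -- stop displaying the biometric interface
    """
    if matched:
        # The application is told first, so it can swap in its post-login
        # interface before the system's biometric interface disappears.
        notify_app(True)
        schedule(DISMISS_DELAY, dismiss_interface)
    else:
        notify_app(False)

# Minimal driver recording the order of events (scheduler runs immediately):
events = []
on_biometric_result(
    True,
    notify_app=lambda ok: events.append(("notify", ok)),
    schedule=lambda d, f: (events.append(("wait", d)), f()),
    dismiss_interface=lambda: events.append(("dismiss",)),
)
print(events)  # → [('notify', True), ('wait', 1.0), ('dismiss',)]
```

Notifying before dismissal is the design point: the application's post-login interface is already rendered underneath when the overlay is removed, so the user never sees an intermediate blank state.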
In accordance with a determination (1628) that the at least a portion of the biometric characteristic does not satisfy the biometric authentication criteria based on the biometric data, the electronic device (e.g., 100, 300, 500, 1500) displays (1630) a biometric authentication failure animation that includes a simulated second representation (e.g., 1519) of the biometric characteristic that indicates that the at least a portion of the biometric characteristic does not satisfy the biometric authentication criteria. Displaying a biometric authentication failure animation in accordance with a determination that the at least a portion of the biometric feature does not satisfy the biometric authentication criteria provides visual feedback to the user of a failure or error in the operation being performed and enables the user to quickly identify that the operation was unsuccessful. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user achieve the desired result (by providing feedback indicating an output that will cause the device to generate the desired result) and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, in response to an unsuccessful biometric authentication, the device (e.g., 100, 300, 500, 1500) displays an animation that includes an interface object (e.g., 1519) indicating that the biometric authentication was unsuccessful. Displaying an animation including an interface object indicating that biometric authentication was unsuccessful in response to unsuccessful biometric authentication provides visual feedback to the user of a failure or error in an operation being performed and enables the user to quickly identify that the operation was unsuccessful. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user achieve the desired result (by providing feedback indicating an output that will cause the device to generate the desired result) and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, during the animation, the interface object (e.g., 1519) moves (e.g., tilts and/or shifts) in a predetermined manner (e.g., from side to side) to indicate a failure. In some examples, a device (e.g., 100, 300, 500, 1500) generates a haptic output (e.g., 1526) or a sequence of haptic outputs corresponding to the biometric authentication failure animation (e.g., haptic outputs generated as the simulated representation of the biometric feature moves from side to side). Generating a haptic output or a sequence of haptic outputs corresponding to the biometric authentication failure animation further alerts the user that authentication was unsuccessful and enables the user to quickly recognize that authentication is still needed to proceed. Providing improved tactile feedback to the user enhances the operability of the device and makes the user-device interface more efficient, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the simulated second representation (e.g., 1519) of the biometric feature is a three-dimensional object. Displaying the simulated second representation of the biometric feature as a three-dimensional object provides the user with easily identifiable visual feedback about the operational status (e.g., whether the operation was successful or unsuccessful) and further enables the user to more easily perceive the object because the object is three-dimensional. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device, and by enhancing the legibility of user interface elements to the user when the device is at a natural viewing angle), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the second representation (e.g., 1519) is a three-dimensional face that shakes (e.g., rotates from side to side). In some examples, displaying the biometric authentication failure animation includes alternating rotation of the second representation (e.g., 1519) between rotation in a first direction about an axis parallel to the display and rotation in a second direction about an axis parallel to the display (e.g., 1502). In some examples, displaying the biometric authentication failure animation includes highlighting a boundary of the biometric authentication interface (e.g., 1512) relative to the application interface (e.g., 1506). In some examples, the biometric authentication interface (e.g., 1512) or its boundaries are scaled down and/or retracted to create a visual "bounce" effect. In some examples, further in accordance with a determination, based on the biometric data, that the at least a portion of the biometric characteristic does not satisfy the biometric authentication criteria, the electronic device (e.g., 100, 300, 500, 1500) provides a failure haptic output (e.g., 1526) that is different from the success haptic output (e.g., 1520). In some examples, further in accordance with a determination, based on the biometric data, that the at least a portion of the biometric characteristic does not satisfy the biometric authentication criteria, the electronic device (e.g., 100, 300, 500, 1500) displays (1632) a failure interface (e.g., 1540). In some examples, the failure interface (e.g., 1540) includes a visual indication that biometric authentication has failed. In some examples, the application interface (e.g., 1506) does not change when the biometric authentication fails (e.g., the application remains on the login interface (e.g., 1506) or the authentication user interface). In some examples, when the biometric authentication fails, the application user interface (e.g., 1506) changes to indicate the failure of the biometric authentication.
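The alternating rotation about an axis parallel to the display, first in one direction and then the other, can be modeled as a decaying oscillation of a rotation angle. The following is a minimal sketch; the amplitude, frequency, and decay constants are illustrative assumptions, not values from the patent:

```python
import math

def shake_angle(t, amplitude=15.0, frequency=4.0, decay=3.0):
    """Rotation angle (degrees) of the failure representation at time t:
    a decaying sine, so the face rotates one way, then the other,
    settling back to 0 as the animation ends."""
    return amplitude * math.exp(-decay * t) * math.sin(2 * math.pi * frequency * t)

# The sign alternates between successive half-periods of the 4 Hz oscillation:
print(shake_angle(0.0))         # → 0.0
print(shake_angle(0.0625) > 0)  # → True  (rotation in the first direction)
print(shake_angle(0.1875) < 0)  # → True  (rotation in the second direction)
```

Driving the haptic output sequence from the same angle function would keep the tactile pulses synchronized with the visible direction changes, matching the coupling of animation and haptics described above.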
In some examples, the failure interface (e.g., 1540) includes a retry affordance (e.g., 1546) (1634). In some examples, the failure interface (e.g., 1540) includes a cancel affordance (e.g., 1548) (1636). In some examples, the failure interface (e.g., 1540) includes an alternative authentication affordance (e.g., 1544) (1638).
An electronic device (e.g., 100, 300, 500, 1500) receives (1640) an input (e.g., 1550) corresponding to a selection of a retry affordance (e.g., 1546). In response to receiving the input (e.g., 1550) corresponding to selection of the retry affordance (e.g., 1546), the electronic device (e.g., 100, 300, 500, 1500) obtains (1642) second biometric data corresponding to at least a portion of a second biometric feature from the one or more biometric sensors (e.g., 1503). In some examples, the second biometric feature (e.g., the face) is the same biometric feature from which the initial biometric data was obtained. In some examples where the second biometric feature is the same biometric feature, the portion of the second biometric feature is a different portion of the same biometric feature from which the initial biometric data was obtained. In some examples, the portion is the same portion of the same biometric feature. In some examples, the second biometric feature is a different biometric feature than the initial biometric feature.
After obtaining the second biometric data corresponding to at least a portion of the second biometric characteristic, in accordance with a determination, based on the second biometric data, that the at least a portion of the second biometric characteristic satisfies second biometric authentication criteria, the electronic device (e.g., 100, 300, 500, 1500) provides (1646) second authentication information to the application indicating that the second biometric authentication criteria have been satisfied for the one or more portions of the second biometric characteristic. In some examples, the second biometric authentication criteria are the same as the initial biometric authentication criteria. In some examples, the second biometric authentication criteria are different from the initial biometric authentication criteria. In some examples, the second authentication information is the same as the authentication information. In some examples, the second authentication information is different from the authentication information. In some examples, the authentication information is provided by the operating system to an application program that generates the application interface (e.g., 1506).
The electronic device (e.g., 100, 300, 500, 1500) receives (1646) an input corresponding to a selection of a cancel affordance. In response to receiving the input corresponding to the selection of the cancel affordance, the electronic device (e.g., 100, 300, 500, 1500) ceases (1648) displaying the biometric authentication interface (e.g., 1512). In some examples, selection of the cancel affordance dismisses the failure interface (e.g., 1540) while maintaining the application interface (e.g., 1506). In some examples, selection of the cancel affordance also causes the electronic device (e.g., 100, 300, 500, 1500) to provide information to the application indicating that the first and/or second biometric authentication criteria have not been met.
An electronic device (e.g., 100, 300, 500, 1500) receives (1650) an input (e.g., 1548) corresponding to selection of an alternative authentication affordance (e.g., 1544). Providing an alternative authentication affordance (e.g., to provide an alternative method for providing authentication, alternatively or in addition to biometric authentication) allows a user to easily provide authentication for operation using a different authentication method when the current authentication method is unsuccessful or continues to be unsuccessful. Providing additional control options in this manner (e.g., for providing authentication) without cluttering the UI with additional displayed controls enhances operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In response to receiving input (e.g., 1548) corresponding to selection of an alternative authentication affordance (e.g., 1544), the electronic device (e.g., 100, 300, 500, 1500) displays (1652) an alternative authentication interface (e.g., 1562). In some examples, the alternative authentication interface (e.g., 1562) is a non-biometric authentication interface. In some examples, the alternative authentication interface (e.g., 1562) allows a user to authenticate using a password and/or passcode. In some examples, the application determines which forms of authentication are accepted by the alternative authentication interface (e.g., 1562). In some examples, one or more preferences of the application determine which forms of authentication are accepted by the application. In some examples, the alternative authentication affordance (e.g., 1544) is included in the failure interface (e.g., 1540) in response to more than a predefined number of consecutive failures of biometric authentication (e.g., two failed authentication attempts, three failed authentication attempts, four failed authentication attempts, etc.). In some examples, the alternative authentication interface (e.g., 1562) is an application-level authentication interface (1654). In some examples, in response to receiving an input corresponding to selection of the alternative authentication affordance (e.g., 1544), the electronic device (e.g., 100, 300, 500, 1500) ceases (1656) display of the biometric authentication interface (e.g., 1512). In some examples, selection of the alternative authentication affordance (e.g., 1544) causes the device (e.g., 100, 300, 500, 1500) to cease displaying the biometric authentication interface (e.g., 1512) and transition to an alternative authentication interface (e.g., 1562) that operates at an application level.
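The three affordances on the failure interface, together with the rule that the alternative-authentication affordance appears only after a number of consecutive failures, can be sketched as follows. The threshold of two failures and the action names are assumptions for illustration only:

```python
ALTERNATIVE_AFTER_FAILURES = 2  # assumed threshold of consecutive failures

def failure_interface_options(consecutive_failures):
    """Affordances to show on the failure interface: retry and cancel are
    always offered; the alternative-authentication affordance is added only
    after repeated consecutive failures."""
    options = ["retry", "cancel"]
    if consecutive_failures >= ALTERNATIVE_AFTER_FAILURES:
        options.append("alternative_authentication")
    return options

def handle_selection(selection):
    """Map an affordance selection to the next action described in the text."""
    return {
        "retry": "capture_second_biometric_data",
        "cancel": "cease_displaying_biometric_interface",
        "alternative_authentication": "display_alternative_authentication_interface",
    }[selection]

print(failure_interface_options(1))  # → ['retry', 'cancel']
print(failure_interface_options(2))  # → ['retry', 'cancel', 'alternative_authentication']
print(handle_selection("cancel"))    # → cease_displaying_biometric_interface
```

Deferring the alternative affordance until repeated failures keeps the initial failure interface uncluttered while still guaranteeing the user a non-biometric path to authentication.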
Thus, the user optionally authenticates with the application using credentials associated with the application (e.g., the user optionally logs in using the username and password of the application). In some examples, an application-level alternative authentication interface (e.g., 1562) optionally includes an affordance for re-initiating biometric authentication. This, in turn, will cause the electronic device (e.g., 100, 300, 500, 1500) to redisplay the biometric authentication interface (e.g., 1512) and authenticate at the system or operating system level.
Note that the details of the processes described above with respect to method 1600 (e.g., figs. 16A-16E) may also be applied in a similar manner to the other methods described. For example, method 1600 optionally includes one or more of the features of the various methods described herein with reference to methods 800, 1000, 1200, 1400, 1800, 2000, 2200, 2500, and 2700. For example, the registered biometric data described in method 1200 may be used to perform biometric authentication, such as the biometric authentication described with reference to figs. 15E-15I. As another example, a biometric authentication interface as described in method 1800 may be used to implement a biometric authentication interface (e.g., 1512). For the sake of brevity, these details are not repeated in the following.
The operations in the above-described information processing method are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general-purpose processor (e.g., as described with respect to fig. 1A, 3, 5A) or an application-specific chip. Further, the operations described above with reference to fig. 16A-16E are optionally implemented by the components depicted in fig. 1A-1B. For example, providing operation 1620 and maintaining operation 1624 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604 and event dispatcher module 174 delivers the event information to application 136-1. Respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as a selection of an object on the user interface. When a respective predefined event or sub-event is detected, the event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update the content displayed by the application. Similarly, one of ordinary skill in the art will clearly know how other processes may be implemented based on the components depicted in fig. 1A-1B.
Figs. 17A-17AJ illustrate example user interfaces for biometric authentication, according to some examples. As described in more detail below, the example user interfaces shown in figs. 17A-17AJ are used to illustrate the processes described below, including the processes in figs. 18A-18D.
Fig. 17A shows an electronic device 1700 (e.g., portable multifunction device 100, device 300, or device 500). In the illustrative examples shown in figs. 17A-17AJ, the electronic device 1700 is a smartphone. In other examples, the electronic device 1700 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 1700 has a display 1702, one or more input devices (e.g., a touchscreen of the display 1702, buttons 1704, a microphone), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the electronic device includes one or more biometric sensors (e.g., biometric sensor 1703), which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the one or more biometric sensors 1703 are the one or more biometric sensors 703. In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
In FIG. 17A, the electronic device 1700 displays a landing page interface for an application that includes a login affordance 1706 on the display 1702. As seen in fig. 17A, the application is a browser or mobile application, and the interface corresponds to a website (onlinescore. Upon displaying the landing page interface, the electronic device 1700 detects activation of the login affordance 1706. As shown in FIG. 17A, the activation is a tap gesture 1708 on the login affordance 1706.
In fig. 17B, in response to detecting a tap gesture 1708 on the login affordance 1706, the electronic device 1700 displays an application interface of the application (e.g., replaces the display of the landing page interface therewith) that includes an unsecure data-populatable field 1710 (labeled "username"), a secure data-populatable field 1712 (labeled "password"), and a submit affordance 1714. The electronic device also displays a biometric authentication glyph (e.g., an icon) in the secure data-populatable field 1712. As will be described in greater detail, the biometric authentication glyph indicates that the secure data-populatable field 1712 is associated with secure data and/or that biometric authentication is required to automatically populate the secure data-populatable field 1712.
Upon displaying the application interface, the electronic device 1700 detects a request to automatically populate the unsecure data-populatable field 1710. For example, as shown in fig. 17B, the request to automatically populate the unsecure data-populatable field 1710 is a tap gesture 1718 indicating a selection of the unsecure data-populatable field 1710.
In fig. 17C, in response to detecting a request to automatically populate an unsecure data-populatable field 1710, the electronic device 1700 displays (e.g., overlays on the application interface) an input interface 1720 that includes a keyboard (such as a software keyboard) and/or a keypad and an autofill affordance 1722. While displaying input interface 1720, electronic device 1700 detects activation of autofill affordance 1722. For example, as shown in fig. 17C, the activation is a tap gesture 1724 on the autofill affordance 1722.
In fig. 17D, in response to detecting the tap gesture 1724, the electronic device displays a plurality of candidate input affordances 1725 for automatically populating the unsecure data-populatable field 1710 (e.g., replacing the autofill affordance 1722 and/or one or more other affordances of the input interface 1720). In the illustrated example, the populatable field 1710 is associated with a username. Thus, in some examples, each of the candidate input affordances 1725 serves as an index to a respective candidate username.
Upon displaying the candidate input affordances 1725 of the input interface 1720, the electronic device detects activation of a candidate input affordance 1725. For example, as shown in fig. 17D, the activation is a tap gesture 1726 on a candidate input affordance 1725. In fig. 17E, in response to detecting the tap gesture 1726, the electronic device 1700 automatically populates the unsecure data-populatable field with the candidate input 1728 corresponding to the activated candidate input affordance 1725.
As described, in response to detecting the tap gesture 1724, the electronic device provides (e.g., displays) candidate input affordances corresponding to respective candidate inputs. In some examples, in response to detecting the tap gesture 1724, the electronic device determines whether multiple candidate inputs are available. If so, the electronic device 1700 provides the candidate input affordances as described. Any number of candidate input affordances are optionally provided in this manner. If not (e.g., only a single candidate input is available), the electronic device optionally automatically populates the unsecure data-populatable field 1710 without providing the candidate input affordances.
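The branching just described, showing a candidate list when several inputs are available and filling directly when there is exactly one, can be sketched as follows (the function names are illustrative, not from the patent):

```python
def autofill_field(candidates, show_candidates, fill_field):
    """Autofill an unsecured field per the logic above.

    candidates      -- candidate input strings available for the field
    show_candidates -- called with the full list when the user must choose
    fill_field      -- called with a single value to fill directly
    """
    if len(candidates) > 1:
        show_candidates(candidates)   # multiple candidates: let the user pick
    elif len(candidates) == 1:
        fill_field(candidates[0])     # single candidate: fill without asking
    # With no candidates, neither path applies and the field is left empty.

shown, filled = [], []
autofill_field(["jj_appleseed", "jane_a"], shown.append, filled.append)
autofill_field(["jj_appleseed"], shown.append, filled.append)
print(shown)   # → [['jj_appleseed', 'jane_a']]
print(filled)  # → ['jj_appleseed']
```

Skipping the candidate list when only one input exists saves a tap in the common single-account case while preserving the choice when it is meaningful.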
Referring to fig. 17F, upon displaying the application interface, the electronic device 1700 detects a request to automatically populate the secure data-populatable field 1712. For example, the request to automatically populate the secure data-populatable field 1712 is a tap gesture 1730 indicating a selection of the secure data-populatable field 1712.
In fig. 17G, in response to detecting a request to automatically populate the secure data-populatable field 1712, the electronic device 1700 initiates biometric authentication. In some examples, initiating biometric authentication includes obtaining (e.g., capturing using the one or more biometric sensors) data corresponding to a biometric characteristic of the user. In some examples, initiating biometric authentication also includes displaying a biometric authentication interface 1732 with a biometric authentication glyph 1734. In some examples, the biometric authentication glyph 1734 is a simulated representation of a biometric feature. In some examples, biometric authentication interface 1732 is overlaid on at least a portion of the application interface.
Referring to fig. 17H, in response to obtaining the data, the electronic device processes the biometric data, for example, to determine whether the biometric feature satisfies biometric authentication criteria based on the biometric data (e.g., to determine whether the biometric data matches a biometric template within a threshold). While the electronic device processes the biometric data, the electronic device optionally displays a biometric authentication glyph 1738 in the biometric authentication interface 1732 indicating that the biometric data is being processed (e.g., replaces the display of the biometric authentication glyph 1734 with it).
In fig. 17I, the electronic device 1700 determines that the biometric characteristic satisfies the biometric authentication criteria. In response, the electronic device displays a biometric authentication glyph 1740 in the biometric authentication interface 1732 indicating that the biometric authentication was successful (e.g., replacing the display of the biometric authentication glyph 1738 with it). Additionally or alternatively, the electronic device outputs a tactile output 1742 indicating that the biometric authentication was successful. After indicating that the biometric authentication was successful, the electronic device automatically populates the secure data-populatable field with an appropriate password 1743, as shown in fig. 17J. In some examples, in response to a successful biometric authentication, the electronic device further automatically populates a second populatable field (e.g., with username 1728), such as the unsecure data-populatable field 1710. It should be appreciated that any number and/or type of populatable fields are optionally automatically populated in response to successful biometric authentication.
Upon displaying the application interface with the automatically populated populatable fields 1710, 1712, the electronic device detects activation of the submit affordance 1714. By way of example, as shown in fig. 17J, the activation is a tap gesture 1744 on the submit affordance 1714. In response, the user optionally authenticates with the application, and the electronic device optionally shows a home interface, such as home interface 1782 of fig. 17S, referenced further below.
Alternatively, in fig. 17K, the electronic device 1700 determines that the biometric characteristic does not satisfy the biometric authentication criteria. In response, the electronic device displays a biometric authentication glyph 1746 in the biometric authentication interface 1732 (e.g., replacing the display of the biometric authentication glyph 1738 with it) indicating that the biometric authentication was unsuccessful (e.g., failed). Optionally, the electronic device outputs a tactile output 1750 indicating that the biometric authentication was unsuccessful. In some examples, tactile output 1750 is the same as tactile output 1742. In some examples, tactile output 1750 is different from tactile output 1742. After indicating that the biometric authentication was unsuccessful, the electronic device stops the display of the biometric authentication interface, as shown in fig. 17L.
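Gating the secure field's autofill on the biometric result, in contrast to the unsecured field which fills without authentication, can be sketched as follows. This is a minimal illustration under stated assumptions: the haptic identifiers and the callback shape are hypothetical, and a real credential would come from a protected credential store:

```python
def fill_secure_field(biometric_matched, stored_password, fill, haptic):
    """Autofill a secured field only after successful biometric authentication.

    biometric_matched -- result of comparing captured data to the template
    stored_password   -- credential released only on success (assumed source)
    fill              -- called with the value to place in the field
    haptic            -- called with "success" or "failure" (assumed names)
    """
    if biometric_matched:
        haptic("success")
        fill(stored_password)
        return True
    haptic("failure")
    # On failure the field stays empty; the biometric interface is dismissed
    # or a failure interface is shown, as described in the text.
    return False

actions = []
ok = fill_secure_field(True, "hunter2", actions.append,
                       lambda kind: actions.append("haptic:" + kind))
print(ok, actions)  # → True ['haptic:success', 'hunter2']
```

The key property is that the stored credential is only ever passed to `fill` on the success branch, so an unsuccessful authentication can never leak it into the interface.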
In some examples, the biometric authentication interface 1732 includes an animation and/or one or more of the biometric authentication glyphs of the biometric authentication interface 1732 are animated. By way of example, the biometric authentication glyph 1738 includes a rotating ring (e.g., a ring rotating about a sphere), and/or the biometric authentication glyph 1746 moves from side to side to simulate a "shake" movement.
Referring to fig. 17M, in some examples, further in response to an unsuccessful biometric authentication, electronic device 1700 displays a failure interface, such as failure interface 1752. The failure interface includes a biometric authentication glyph 1754, an alternative authentication affordance 1756, a retry affordance 1758, and a cancel affordance 1760. In some examples, activation of the retry affordance 1758 causes the electronic device to re-initiate biometric authentication, as described above. In some examples, the electronic device re-initiates biometric authentication only if a threshold number of failed biometric authentication attempts has not been reached. In some examples, activation of the cancel affordance causes the electronic device 1700 to cease display of the failure interface 1752.
Referring to fig. 17N, in response to activation of an alternative authentication affordance 1756 (such as a tap gesture 1762), the electronic device 1700 displays an alternative authentication interface 1766 (fig. 17O) (e.g., replacing display of the failure interface 1752 therewith) with which a user authenticates using an alternative form of authentication (e.g., fingerprint authentication, password authentication, pattern authentication, wherein pattern authentication includes selecting a plurality of items in a predefined pattern or moving a contact or other input in a predefined pattern) that is different from the authentication associated with the biometric feature. As shown in fig. 17O, the user optionally touches the fingerprint sensor 1764 of the electronic device with a finger to authenticate.
FIG. 17P illustrates another example failure interface 1766 that includes an alternative authentication affordance 1770. Referring to fig. 17Q, upon display of the failure interface 1766, the electronic device 1700 detects activation of the alternative authentication affordance 1770. By way of example, the activation is a tap gesture 1776 on the alternative authentication affordance 1770. In response to detecting the tap gesture 1776, the electronic device 1700 displays an alternative authentication interface 1778. In some examples, the alternative authentication interface 1778 is a password interface through which a user may provide a password to authenticate.
In fig. 17R, in response to authentication (e.g., alternative authentication), the secure data-populatable field is automatically populated with a password 1743, and optionally the unsecure data-populatable field is automatically populated with a username 1728. In this way, the user may optionally utilize an auto-fill function despite unsuccessful biometric authentication. Upon displaying the application interface with the automatically populated populatable fields 1710, 1712, the electronic device detects activation of the submit affordance 1714. By way of example, the activation is a tap gesture 1780 on the submit affordance 1714. In response, the user optionally authenticates with the application, and the electronic device optionally shows a home interface, such as home interface 1782 of FIG. 17S.
In fig. 17T, the electronic device 1700 displays an application interface 1784 that includes secure data-populatable fields 1786 on the display 1702. In response to a request to auto-populate the secure data-populatable field 1786 (e.g., a selection of the secure data-populatable field 1786), the electronic device 1700 displays an input interface 1788 that includes an auto-populate affordance 1790, as shown.
Upon displaying the autofill affordance 1790 of the input interface 1788, the electronic device 1700 detects activation of the autofill affordance 1790. For example, as shown in fig. 17U, the activation is a tap gesture 1792 on the auto-fill affordance 1790.
With reference to fig. 17V-17X, in response to detecting the tap gesture 1792, the electronic device 1700 initiates biometric authentication to determine whether at least a portion of the biometric characteristic determined based on the biometric data corresponding to the biometric characteristic satisfies the biometric authentication criteria described with reference to at least fig. 17G-17I.
In fig. 17Z, in response to a successful biometric authentication, the electronic device 1700 displays a candidate selection interface 1794 that includes a plurality of candidate input affordances 1793 for automatically populating the secure data-populatable field 1786 (e.g., replacing display of the biometric authentication interface 1732 therewith). In some examples, the candidate selection interface 1794 is displayed without a keyboard. In the illustrated example, the populatable field 1786 is associated with a credit card (e.g., the populatable field 1786 is marked as associated with a financial transaction). Thus, in some examples, each of the candidate input affordances 1793 serves as an index to a respective credit card (e.g., a credit card number and/or one or more other respective candidate values associated with a credit card).
Upon displaying the candidate input affordances 1793, the electronic device 1700 detects activation of a candidate input affordance 1793. For example, as shown in fig. 17Z, the activation is a tap gesture 1795 on the candidate input affordance 1793. In fig. 17Z, in response to detecting the tap gesture 1795, the electronic device 1700 automatically populates the secure data-populatable field with a candidate input 1796 corresponding to the activated candidate input affordance 1793.
Upon displaying the application interface 1784 with the automatically populated populatable field 1786, the electronic device detects activation of the submit affordance 1798. By way of example, the activation is a tap gesture 1702A on the submit affordance 1798. In response, the automatically populated credit card is optionally submitted using an application, e.g., for authentication or payment purposes.
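The candidate input affordances above act as indices to stored credit cards: the interface shows one label per card, and selecting a label populates the field with the value that label refers to. A minimal sketch, with made-up labels and card numbers:

```python
# Illustrative store of credit cards keyed by display label. The numbers are
# well-known test numbers, not real cards; labels mirror the "CC 1" style index.
STORED_CARDS = {
    "CC 1": "4111 1111 1111 1111",
    "CC 2": "5500 0000 0000 0004",
}

def candidate_affordances(cards: dict) -> list:
    # Only the labels (indices) are displayed; the underlying numbers stay
    # hidden until a candidate is activated.
    return sorted(cards)

def autofill_from_candidate(label: str, cards: dict) -> str:
    """Populate the secure field with the value the selected index refers to."""
    return cards[label]
```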
While described herein with respect to performing biometric authentication prior to providing a candidate input affordance when automatically populating a secure data-fillable field, it should be appreciated that in some examples, a candidate input affordance is provided prior to biometric authentication. Referring to FIG. 17AA, for example, in response to a request to automatically populate a secured data populatable field 1786, the electronic device 1700 displays an input interface including a plurality of candidate input affordances 1704A. In some examples, each of the candidate inputs 1704A is an index (e.g., a representation) of a candidate input value.
As shown in FIG. 17AB, upon displaying an input interface including a plurality of candidate input affordances 1704A, the electronic device detects activation of the candidate input affordance 1704A. By way of example, the activation is a tap gesture 1706A on the candidate input affordance 1704A. Referring to fig. 17AC-AE, in response, the electronic device performs biometric authentication, as described. In fig. 17AF, the electronic device 1700 has determined that the biometric authentication was successful and automatically populates the secure data populatable field 1786 with the selected candidate input corresponding to the selected candidate input affordance 1704A.
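The two orderings just described, biometric authentication before the candidate list (figs. 17V-17Z) versus candidate selection first followed by authentication (figs. 17AA-17AH), differ only in where the biometric gate sits. A sketch under assumed names, with the authentication step and candidate picker passed in as callables:

```python
def autofill_secure_field(order: str, authenticate, choose_candidate):
    """Return the value to fill, or None when biometric authentication fails.

    order is "auth_first" or "candidate_first"; authenticate and
    choose_candidate are callables standing in for the device's biometric
    check and the user's candidate selection. Names are illustrative.
    """
    if order == "auth_first":
        # Candidates are only offered after a successful authentication.
        return choose_candidate() if authenticate() else None
    # candidate_first: the user picks a candidate, then the device authenticates.
    value = choose_candidate()
    return value if authenticate() else None
```

Either way, an unsuccessful authentication yields no auto-population, matching the failure behavior in fig. 17AG-17AH.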
In fig. 17AG, the electronic device instead determines that the biometric authentication is unsuccessful. In response, the electronic device 1700 stops the display of the biometric authentication interface, as shown in fig. 17 AH.
As described above, the exemplary user interfaces shown in fig. 17A to 17AH relate to the exemplary processes shown in fig. 18A to 18D described below. Accordingly, it should be appreciated that the processes described above with respect to the exemplary user interfaces shown in fig. 17A-17AH and the processes described below with respect to fig. 18A-18D are, for the most part, analogous processes that similarly involve performing biometric authentication using an electronic device (e.g., 100, 300, 500, 1700).
FIGS. 18A-18D are flow diagrams illustrating methods for performing biometric authentication using an electronic device, according to some examples. The method 1800 is performed at a device (e.g., 100, 300, 500, 1700) having a display, one or more input devices (e.g., a touchscreen, a microphone, a camera), and a wireless communication radio (e.g., a Bluetooth connection, a WiFi connection, a mobile broadband connection such as a 4G LTE connection). In some examples, the display is a touch-sensitive display. In some examples, the display is not a touch-sensitive display. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the device includes one or more biometric sensors, which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the device further comprises a light-emitting device, such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors. Some operations in method 1800 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 1800 provides an intuitive way to perform authentication of biometric features. The method reduces the cognitive burden of the user in performing authentication of the biometric features, thereby creating a more efficient human-machine interface and a more intuitive user experience. For battery-driven computing devices, enabling a user to more quickly and efficiently manage authentication of biometric features conserves power and increases the interval between battery charges.
In some examples, an electronic device (e.g., 100, 300, 500, 1700) detects (1802) selection of a fillable field (e.g., 1710, 1712, 1786). In some examples, in response to detecting selection of a fillable field (e.g., 1710, 1712, 1786), the electronic device (e.g., 100, 300, 500, 1700) displays (1804) an input interface (e.g., 1720, 1788) that includes a plurality of user interface objects (e.g., 1725, 1793, 1704A) corresponding to candidate inputs for the fillable field (e.g., 1710, 1712, 1786).
In some examples, prior to receiving a request (e.g., 1718, 1724, 1726, 1730, 1792, 1795, 1706A) to automatically populate the at least one fillable field (e.g., 1710, 1712, 1786), an electronic device (e.g., 100, 300, 500, 1700) receives a selection (e.g., 1718, 1730) of a fillable field (e.g., 1710, 1712, 1786). In some examples, the selection (e.g., 1718, 1730) of a fillable field (e.g., 1710, 1712, 1786) is a user selection of a fillable field (e.g., 1710, 1712, 1786) displayed in the application interface using an input device, such as a mouse or a button. In some examples, in response to selection of a fillable field (e.g., 1710, 1712, 1786), the electronic device (e.g., 100, 300, 500, 1700) displays (1806) an auto-fill affordance (e.g., 1722, 1790). In some examples, the autofill affordance (e.g., 1722, 1790) is displayed in conjunction with a keyboard (or keypad).
In some examples, an electronic device (e.g., 100, 300, 500, 1700) displays (1808) an application interface on a display that includes a fillable field (e.g., 1710, 1712, 1786). Displaying an application interface that includes a fillable field provides visual feedback to a user indicating that a particular region of the application interface can accept input. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, displaying an application interface on a display (e.g., 1702) that includes a fillable field (e.g., 1710, 1712, 1786) includes displaying (1810) the fillable field (e.g., 1712, 1786) with a first visual treatment in accordance with the fillable field (e.g., 1712, 1786) being associated with a second type of data. Displaying the fillable field with a particular visual treatment (e.g., a first visual treatment) in accordance with the fillable field being associated with a particular type (e.g., a second type) of data provides visual feedback that allows a user to quickly and easily identify that the fillable field is associated with a particular type of data. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the second type of data includes data for which authentication is required in order to be automatically populated, such as payment information, a password, and/or a username. In some examples, the first visual treatment is a visual effect, such as a particular color scheme, highlighting, or animation. In some examples, the first visual treatment includes a first color scheme, such as a pattern having one or more colors. In some examples, the first visual treatment includes a biometric authentication interface object (e.g., 1716) associated with (e.g., within or adjacent to) the fillable field (e.g., 1712, 1786).
In some examples, an electronic device (e.g., 100, 300, 500, 1700) displays a biometric authentication glyph (e.g., 1716) or icon in or near a field (e.g., 1712, 1786) associated with biometric authentication that is not displayed in or near a field (e.g., 1710) not associated with biometric authentication. Displaying a biometric authentication glyph or icon in or near fields associated with biometric authentication, and not in or near fields that are not, provides easily identifiable visual feedback regarding which fields require biometric authentication and which do not. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, displaying the application interface on the display that includes the fillable field includes displaying (1812) the fillable field (e.g., 1710) with a second visual treatment, different from the first visual treatment, in accordance with the fillable field (e.g., 1710) being associated with a first type of data. In some examples, the first type of data includes data for which authentication is not required in order to be automatically populated, such as contact information including a name, address, telephone number, zip code, and so forth. In some examples, the second visual treatment is an absence of the first visual treatment. In some examples, the electronic device (e.g., 100, 300, 500, 1700) highlights the fillable field (e.g., 1712, 1786) with a different color, a biometric authentication glyph (e.g., 1716), and/or text indicating that the fillable field (e.g., 1712, 1786) is optionally automatically filled in response to a successful biometric authentication. In some examples, the second visual treatment includes a second color scheme different from the first color scheme. Thus, in some examples, the electronic device (e.g., 100, 300, 500, 1700) displays a field (e.g., 1712, 1786) associated with biometric authentication using a different color than a field (e.g., 1710) not associated with biometric authentication.
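The branch between the first and second visual treatments reduces to a check on whether the field's data type requires authentication. The color names and glyph flag in this sketch are illustrative placeholders, not values from the patent:

```python
def visual_treatment(requires_authentication: bool) -> dict:
    """Pick a visual treatment for a fillable field: second-type (secure)
    fields get a distinguishing color scheme and a biometric authentication
    glyph; first-type fields get the default treatment and no glyph."""
    if requires_authentication:
        return {"color_scheme": "secure_highlight", "show_glyph": True}
    return {"color_scheme": "default", "show_glyph": False}
```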
In some examples, displaying an application interface on the display that includes the fillable fields (e.g., 1710, 1712, 1786) includes displaying (1814) a web page that includes the fillable fields (e.g., 1710, 1712, 1786). In some examples, the application interface also includes a submission affordance (e.g., 1714, 1798) associated with a fillable field (e.g., 1710, 1712, 1786).
In some examples, while displaying the application interface, the electronic device (e.g., 100, 300, 500, 1700) receives (1816) a request (e.g., 1718, 1724, 1726, 1730, 1792, 1795, 1706A) to automatically populate a fillable field (e.g., 1710, 1712, 1786) of the application interface. In some examples, the request is a selection (e.g., 1724, 1792) of an auto-fill affordance (e.g., 1722, 1790), a selection (e.g., 1718, 1730) of a field, a selection (e.g., 1726, 1795, 1706A) of a candidate text input, loading a web page, or any combination thereof. In some examples, receiving the request to automatically populate the at least one fillable field (e.g., 1710, 1712, 1786) of the application interface includes receiving (1818) a selection of an auto-fill affordance (e.g., 1722, 1790) displayed on a display (e.g., 1702) of an electronic device (e.g., 100, 300, 500, 1700). In some examples, in response to selection (e.g., 1718, 1730) of the field (e.g., 1710, 1712, 1786), the electronic device (e.g., 100, 300, 500, 1700) displays a keyboard (or keypad) that includes an affordance (e.g., 1722, 1790) for automatically populating the fillable field (e.g., 1710, 1712, 1786). In response to selection of the affordance, the electronic device (e.g., 100, 300, 500, 1700) initiates biometric authentication. In some examples, receiving the request to automatically populate the at least one fillable field (e.g., 1710, 1712, 1786) of the application interface includes receiving (1820) a selection (e.g., 1718, 1730) of the fillable field (e.g., 1710, 1712, 1786).
In some examples, in response to selection of a fillable field (e.g., 1710, 1712, 1786), the electronic device (e.g., 100, 300, 500, 1700) initiates biometric authentication without displaying an input interface (e.g., 1720, 1788). Initiating biometric authentication without displaying an input interface in response to selection of a populatable field enables a user to quickly and efficiently initiate biometric authentication with minimal input. Reducing the number of inputs required to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable inputs and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, an input interface (e.g., 1720, 1788) is displayed in response to selection of a first type of field (e.g., 1786) (e.g., a credit card field) and not displayed in response to selection of a second type of field (e.g., 1712) (e.g., a password field). In some examples, receiving the request to automatically populate the at least one fillable field (e.g., 1710, 1712, 1786) of the application interface includes receiving (1822) a selection (e.g., 1726, 1795, 1706A) of an index (e.g., 1725, 1793, 1704A) corresponding to a candidate input associated with the second type of data. In some examples, an electronic device (e.g., 100, 300, 500, 1700) provides one or more indices (e.g., 1725, 1793, 1704A) corresponding to one or more candidate inputs that are available to automatically populate (e.g., upon selection) a fillable field (e.g., 1710, 1712, 1786). In some examples, the index is, for example, a reference to a credit card (e.g., "CC 1") or a reference to a password (e.g., "Facebook password"). In some examples, the index is the candidate itself (e.g., an email address such as "test@test.com"). In some examples, selection (e.g., 1726, 1795, 1706A) of an index (e.g., 1725, 1793, 1704A) to a candidate input is selection of an affordance of a software keyboard. In some examples, the keyboard is a keypad. In some examples, receiving the request to automatically populate the at least one fillable field of the application interface includes a selection of a fillable field of the web page (1824). In some examples, receiving the request to automatically populate the populatable field of the application interface includes receiving (1826) a selection (e.g., 1726, 1795, 1706A) of a user interface object (e.g., 1725, 1793, 1704A) corresponding to a respective candidate input of the plurality of candidate inputs.
In some examples, in response to selection of a fillable field, the electronic device (e.g., 100, 300, 500, 1700) provides a candidate input (e.g., 1725, 1793, 1704A) for selection by a user. Thereafter, the electronic device (e.g., 100, 300, 500, 1700) proceeds with biometric authentication. In some examples, an electronic device (e.g., 100, 300, 500, 1700) identifies all of the fillable fields (e.g., 1710, 1712, 1786) and/or determines candidate inputs for one or more of the fields (e.g., 1710, 1712, 1786) when an application interface is loaded. In some examples, autofilling in this manner reduces the number of inputs required to autofill a fillable field (e.g., 1710, 1712, 1786). In some examples, the request to automatically populate the fillable fields (e.g., 1710, 1712, 1786) is based on a detection of loading a web page that includes the fillable fields (e.g., 1710, 1712, 1786).
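The several forms an auto-populate request can take in the passages above (tapping the auto-fill affordance, selecting the field itself, selecting a candidate index, or loading a web page that contains the fields) can be normalized into a single event check. The event names below are invented for illustration:

```python
# Invented event names covering the request forms enumerated above; the
# reference numerals in comments tie each form back to the description.
AUTOFILL_REQUEST_SOURCES = frozenset({
    "autofill_affordance_tap",    # e.g., 1724, 1792
    "field_selection",            # e.g., 1718, 1730
    "candidate_index_selection",  # e.g., 1726, 1795, 1706A
    "web_page_load",              # detection of loading a page with fields
})

def is_autofill_request(event: str) -> bool:
    """Any of the enumerated sources counts as a request to auto-populate."""
    return event in AUTOFILL_REQUEST_SOURCES
```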
In some examples, in response to receiving a request (1828) to automatically populate the populatable field (e.g., 1710, 1712, 1786) of the application interface, in accordance with a determination that the populatable field (e.g., 1710, 1712, 1786) of the application interface is associated with the first type of data, the electronic device (e.g., 100, 300, 500, 1700) automatically populates (1830) the populatable field (e.g., 1710, 1712, 1786) with the first type of data. Automatically populating the filable field with a particular type of data (e.g., a first type of data) in accordance with a determination that the filable field of the application interface is associated with the particular type of data (e.g., the first type of data) allows a user to bypass having to manually enter data in the filable field of the application interface. Reducing the number of inputs required to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable inputs and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the first type of data includes non-secure (e.g., non-biometrically-secured) data. In some examples, the non-secure data is a given name of the user, a nickname, a publicly available telephone number, or a preference associated with a particular field (e.g., a shoe size for a shoe-size field). In some examples, automatically populating a fillable field (e.g., 1710, 1712, 1786) includes populating the field with data stored by or accessible to an electronic device (e.g., 100, 300, 500, 1700) without requiring further authentication (e.g., further biometric authentication) in response to a request (e.g., 1718, 1724, 1726, 1730, 1792, 1795, 1706A).
In some examples, further in response to a request to automatically populate a fillable field (e.g., 1710, 1712, 1786) of the application interface, in accordance with a determination that the fillable field (e.g., 1710, 1712, 1786) of the application program is associated with a second type of data (1832), the electronic device (e.g., 100, 300, 500, 1700) displays (1834) a biometric authentication interface (e.g., 1732) when data corresponding to the biometric feature is obtained from the one or more biometric sensors (e.g., 1703) (e.g., during at least a portion of the obtaining process). Displaying the biometric authentication interface in accordance with a determination that the populatable field of the application is associated with a particular type of data (e.g., a second type of data) enhances device security by requiring security verification measures when the data is of the particular type (e.g., the second type). Improving the security measures of a device enhances the operability of the device by preventing unauthorized access to content and operations, and in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more efficiently.
In some examples, the second type of data is secure data (e.g., biometrically-secured data). In some examples, the secure data includes password information, credit card information, non-public user information such as an unlisted telephone number, or medical information. In some examples, an electronic device (e.g., 100, 300, 500, 1700) displays a biometric authentication interface (e.g., 1732) when performing biometric authentication. In some examples, a biometric authentication interface is displayed over at least a portion of the application interface. In some examples, displaying the biometric authentication interface includes displaying a biometric authentication animation. In some examples, the biometric authentication animation includes an initial animation (e.g., display of a first biometric authentication glyph (e.g., 1734)), a processing animation (e.g., rotation of a ring indicating that biometric data is being processed), and a success animation or a failure animation. In some examples, the failure animation is the same as the initial animation. This feature is described in more detail above with reference to fig. 15A to 15T. In some examples, the biometric authentication interface includes a simulated representation of the biometric feature (e.g., 1734, 1738, 1740, 1746) (1836). In some examples, the simulated representation of the biometric feature (e.g., 1734, 1738, 1740, 1746) indicates a state of the biometric authentication sequence. In some examples, the biometric feature is a face, and the representation (e.g., 1734, 1738, 1740, 1746) is a simulation of the face.
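The animation sequence described above (an initial animation, a processing animation, then success, with failure returning to the initial animation) can be expressed as a simple state list. The state names are illustrative:

```python
def authentication_animation(success: bool) -> list:
    """State sequence for the biometric authentication animation.

    Per the description, the failure animation is the same as the initial
    animation, so an unsuccessful attempt ends back at the "initial" state.
    """
    final_state = "success" if success else "initial"
    return ["initial", "processing", final_state]
```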
In some examples, further in response to the request to automatically populate the populatable field and in accordance with a determination that the populatable field of the application is associated with the second type of data, the electronic device (e.g., 100, 300, 500, 1700) determines whether a plurality of candidate inputs (e.g., associated with the second type of data) are stored on the electronic device (e.g., 100, 300, 500, 1700). Further, in some examples, in accordance with a determination that the electronic device (e.g., 100, 300, 500, 1700) has stored thereon a plurality of candidate inputs (e.g., 1793, 1704A) associated with the second type of data, the electronic device (e.g., 100, 300, 500, 1700) displays the plurality of candidates. Further, in some examples, the electronic device (e.g., 100, 300, 500, 1700) receives a selection of a candidate input of the displayed plurality of candidate inputs. Further, in some examples, in response to receiving a selection of a candidate input (e.g., 1704A), the electronic device (e.g., 100, 300, 500, 1700) obtains data corresponding to at least a portion of the biometric feature from the one or more biometric sensors (e.g., 1703). In some examples, automatically populating the filable fields (e.g., 1712, 1786) with the second type of data includes automatically populating the filable fields (e.g., 1712, 1786) with the selected candidate input (e.g., 1704A). In some examples, prior to performing biometric authentication, the electronic device (e.g., 100, 300, 500, 1700) determines whether a plurality of candidate inputs are stored on the electronic device (e.g., 100, 300, 500, 1700). In some examples, once the user has selected the candidate input (e.g., 1704A), the electronic device (e.g., 100, 300, 500, 1700) performs biometric authentication.
In some examples, the electronic device (e.g., 100, 300, 500, 1700) automatically populates (1840) the fillable field (e.g., 1710, 1712, 1786) with the second type of data further in response to a request to automatically populate the fillable field (e.g., 1710, 1712, 1786) and in accordance with a determination that the at least a portion of the biometric characteristic determined based on the data corresponding to the biometric characteristic obtained from the one or more biometric sensors satisfies the biometric authentication criteria (1838). Automatically populating the filable field with a particular type (e.g., a second type) of data in accordance with a determination that the at least a portion of the biometric characteristic satisfies the biometric authentication criteria allows a user to bypass having to manually enter data in the filable field. Reducing the number of inputs required to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable inputs and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, if the biometric authentication is successful, the electronic device (e.g., 100, 300, 500, 1700) automatically populates the fillable field (e.g., 1712, 1786) with the information in response to the request. Automatically populating the fillable field allows the user to bypass having to manually enter data in the fillable field. Reducing the number of inputs required to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable inputs and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, in response to receiving a request (e.g., 1718, 1724, 1726, 1730, 1795, 1706A) to automatically populate a fillable field (e.g., 1712, 1786) of an application interface, an electronic device (e.g., 100, 300, 500, 1700) obtains data corresponding to at least a portion of a biometric feature from the one or more biometric sensors. In some examples, the data obtained from the one or more biometric sensors is obtained prior to receiving a request to automatically populate a fillable field of an application interface. In some examples, the data obtained from the one or more biometric sensors is obtained in response to receiving a request to automatically populate a populatable field (e.g., 1712, 1786) of an application interface. In some examples, the data obtained from the one or more biometric sensors (e.g., 1703) is obtained in accordance with a determination that the populatable field (e.g., 1712, 1786) of the application is associated with the second type of data.
In some examples, an electronic device (e.g., 100, 300, 500, 1700) automatically populates a filable field (e.g., 1712, 1786) without displaying an input interface (e.g., 1720) (e.g., a keyboard or keypad) in response to a request to automatically populate the filable field. In some examples, the one or more biometric sensors (e.g., 1703) include a camera (e.g., an IR camera or a thermal imaging camera). In some examples, the data corresponding to biometric features obtained from the one or more biometric sensors (e.g., 1703) includes biometric data obtained using a camera. In some examples, the biometric feature is a face. In some examples, the data corresponding to biometric features obtained from the one or more biometric sensors (e.g., 1703) includes biometric data associated with a portion of a face, and the biometric authentication criteria includes a requirement that the biometric data associated with the face match biometric data associated with an authorized face in order to satisfy the biometric authentication criteria.
In some examples, in accordance with a determination that the electronic device (e.g., 100, 300, 500, 1700) has access to a single candidate value of the second type for populating the fillable field (e.g., 1712, 1786), the electronic device (e.g., 100, 300, 500, 1700) automatically populates the fillable field (e.g., 1712, 1786) with data of the second type. In some examples, in accordance with a determination that the electronic device (e.g., 100, 300, 500, 1700) has access to a plurality of candidate values of the second type for automatically populating the fillable field (e.g., 1712, 1786), the electronic device (e.g., 100, 300, 500, 1700) displays a representation of a plurality of the plurality of candidate values. In some examples, the candidate values are stored directly on the device and/or are otherwise accessible by the electronic device (e.g., 100, 300, 500, 1700) from another electronic device (e.g., 100, 300, 500, 1700) connected to the electronic device (e.g., 100, 300, 500, 1700). In some examples, upon displaying the representations (e.g., 1725, 1793, 1704A) of the plurality of candidate values, the electronic device (e.g., 100, 300, 500, 1700) receives a selection (e.g., 1726, 1795, 1706A) of the representations (e.g., 1725, 1793, 1704A) of the respective ones of the plurality of candidate values and, in some examples, automatically populates the populatable fields (e.g., 1712, 1786) with the respective candidate values. In some examples, the electronic device (e.g., 100, 300, 500, 1700) determines whether the electronic device (e.g., 100, 300, 500, 1700) has access rights to multiple instances of the second type of data. In some examples, in response to a successful biometric authentication, the electronic device (e.g., 100, 300, 500, 1700) determines whether a plurality of candidate inputs of data (e.g., candidate credit cards), such as biometric security, are stored on the device. 
If so, the electronic device (e.g., 100, 300, 500, 1700) presents (e.g., displays) each of the candidates (e.g., 1725, 1793, 1704A) to the user. In response to a user selection (e.g., 1726, 1795, 1706A) of one of the candidates (e.g., 1725, 1793, 1704A), the electronic device (e.g., 100, 300, 500, 1700) automatically populates the field (e.g., 1712, 1786) with the selected candidate.
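The single-candidate versus multiple-candidate branching described above can be sketched as follows (a minimal illustration in Python; the function names, the dictionary field representation, and the return shape are ours, not from the patent):

```python
def autofill_second_type_field(field, candidates):
    """Populate a fillable field with data of the second (secure) type.

    With exactly one accessible candidate value, the field is populated
    directly; with several, representations of the candidates are shown
    so the user can select one.
    """
    if len(candidates) == 1:
        field["value"] = candidates[0]           # single candidate: fill directly
        return {"filled": True, "shown": []}
    # plural candidates: display representations and await a selection
    return {"filled": False, "shown": list(candidates)}


def select_candidate(field, candidates, index):
    """Complete autofill once the user selects a displayed candidate."""
    field["value"] = candidates[index]
    return field
```

A usage pattern mirroring figs. 17G-17K would call `autofill_second_type_field` after successful biometric authentication and, only when several candidates were shown, follow up with `select_candidate` on the user's choice.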
In some examples, in accordance with a determination that the at least a portion of the biometric characteristic does not satisfy the biometric authentication criteria based on data obtained from the one or more biometric sensors, the electronic device (e.g., 100, 300, 500, 1700) foregoes (1842) automatically populating the fillable field (e.g., 1712, 1786) with the second type of data. Forgoing using the selected candidate to automatically populate the field in accordance with a determination that the at least a portion of the biometric characteristic does not satisfy the biometric authentication criteria provides visual feedback by allowing the user to identify that the authentication is unsuccessful, and further provides enhanced device security by forgoing automatically populating the fillable field without successful authentication. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device, by enhancing the legibility of user interface elements to the user when the device is at a natural viewing angle), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. Furthermore, improving the security measures of the device enhances the operability of the device by preventing unauthorized access to content and operations, and in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more efficiently.
In some examples, in accordance with a determination that the at least a portion of the biometric feature does not satisfy the biometric authentication criteria based on data obtained from the one or more biometric sensors, the electronic device (e.g., 100, 300, 500, 1700) displays (1844) an indication that the at least a portion of the biometric feature does not satisfy the biometric authentication criteria. Displaying an indication that the at least a portion of the biometric characteristic does not meet the biometric authentication criteria provides visual feedback by allowing the user to quickly identify that the authentication was unsuccessful. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device, by enhancing the legibility of user interface elements to the user when the device is at a natural viewing angle), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, in response to a failed biometric authentication, the electronic device (e.g., 100, 300, 500, 1700) provides an indication of the failure. In some examples, if a threshold number of biometric attempts has been reached, the electronic device (e.g., 100, 300, 500, 1700) displays a message indicating that the biometric feature is not recognized or that biometric authentication is inactive. In some examples, after the failure, the electronic device (e.g., 100, 300, 500, 1700) removes any biometric authentication interface displayed over the application interface and/or displays a biometric authentication retry affordance (e.g., 1758) (e.g., in a fillable field (e.g., 1712)) whose selection retries the biometric authentication.
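The success and failure branches described in the last few paragraphs can be collected into one small dispatcher (a Python sketch; the return dictionary, message strings, and the threshold of five attempts follow examples given in the text, but the exact structure is ours):

```python
def handle_biometric_result(matched, failed_attempts, max_attempts=5):
    """Branch on the outcome of a biometric authentication attempt.

    On success, the field may be autofilled. On failure, the fillable field
    is left unpopulated, an indication of the failure is shown, and a retry
    affordance is offered; once the attempt threshold is reached, the
    indication changes and no further automatic retries are offered.
    """
    if matched:
        return {"autofill": True, "indication": None, "retry_affordance": False}
    if failed_attempts >= max_attempts:
        return {"autofill": False,
                "indication": "biometric authentication is inactive",
                "retry_affordance": False}
    return {"autofill": False,
            "indication": "biometric feature is not recognized",
            "retry_affordance": True}
```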
In some examples, in response to determining that the at least a portion of the biometric characteristic does not satisfy the biometric authentication criteria, the device displays a keypad or keyboard for entering data (e.g., a username, a password, contact information, credit card information, etc.) into the fillable fields (e.g., 1712, 1786).
In some examples, the electronic device (e.g., 100, 300, 500, 1700) ceases to display the biometric authentication interface in accordance with a determination that the at least a portion of the biometric characteristic does not satisfy the biometric authentication criteria based on the data obtained from the one or more biometric sensors. In some examples, the electronic device (e.g., 100, 300, 500, 1700) stops displaying the biometric authentication interface after the failed biometric authentication. Accordingly, the electronic device (e.g., 100, 300, 500, 1700) resumes display of an application interface, such as a login interface (e.g., 1714) of an application program.
In some examples, the electronic device (e.g., 100, 300, 500, 1700) displays an input interface (e.g., 1720) in accordance with a determination that the at least a portion of the biometric characteristic does not satisfy the biometric authentication criteria based on data obtained from the one or more biometric sensors. In some examples, the input interface (e.g., 1720) includes a keypad or keyboard that includes character entry keys for entering a password or passcode.
In some examples, the electronic device (e.g., 100, 300, 500, 1700) prompts the user for an alternative form of authentication in accordance with a determination that biometric authentication is not available. Prompting the user for an alternative form of authentication in dependence upon determining that biometric authentication is not available allows the user to readily provide authentication for operation using different authentication methods. Providing additional control options in this manner (e.g., for providing authentication) without cluttering the UI with additional displayed controls enhances operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, biometric authentication fails because a threshold number of failed biometric authentication attempts have been reached since the last successful authentication with the device, or because the biometric sensor cannot be used due to heat, cold, lighting (e.g., there is not enough light or too much light for the device to detect the characteristics of the biometric feature), or other environmental conditions. In some examples, the electronic device (e.g., 100, 300, 500, 1700) receives an alternative form of authentication (e.g., a password, or a different form of biometric authentication such as a fingerprint) after prompting the user for the alternative form of authentication. In some examples, further after prompting the user for an alternative form of authentication, in response to receiving the alternative form of authentication, the electronic device (e.g., 100, 300, 500, 1700) automatically populates the fillable field (e.g., 1712, 1786) in accordance with a determination that the alternative form of authentication is consistent with authorized authentication information (e.g., a previously stored fingerprint, password, or passcode). In some examples, further after prompting the user for an alternative form of authentication, and further in response to receiving the alternative form of authentication, in accordance with a determination that the alternative form of authentication is inconsistent with authorized authentication information, the electronic device (e.g., 100, 300, 500, 1700) forgoes automatically populating the fillable field (e.g., 1712, 1786).
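The biometric-with-fallback decision just described can be sketched as a single predicate (Python; the function and parameter names are hypothetical, introduced only for illustration):

```python
def authenticate_for_autofill(biometric_available, biometric_matched,
                              alternative_credential=None,
                              authorized_credential=None):
    """Decide whether to automatically populate the fillable field.

    When biometric authentication is available, its result governs. When it
    is unavailable (e.g., too many failed attempts, or environmental
    conditions), the user is prompted for an alternative form of
    authentication, which must be consistent with the authorized
    authentication information for autofill to proceed.
    """
    if biometric_available:
        return bool(biometric_matched)
    # biometric unavailable: fall back to the alternative credential
    return (alternative_credential is not None
            and alternative_credential == authorized_credential)
```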
In some examples, after responding to a request to automatically populate a fillable field of an application interface, an electronic device (e.g., 100, 300, 500, 1700) receives a subsequent request to load a web page. In some examples, further in response to the subsequent request to load the web page after responding to the request to automatically populate the fillable fields (e.g., 1710, 1712, 1786) of the application interface, the electronic device (e.g., 100, 300, 500, 1700) attempts biometric authentication to automatically populate the fillable fields (e.g., 1710, 1712, 1786) in the application interface in accordance with a determination that the subsequent request to load the web page satisfies authentication retry criteria. In some examples, further in response to the request to automatically populate the fillable fields (e.g., 1710, 1712, 1786) of the application interface, and further in response to the subsequent request to load the web page, in accordance with a determination that the subsequent request to load the web page does not satisfy the authentication retry criteria, the electronic device (e.g., 100, 300, 500, 1700) forgoes attempting biometric authentication to automatically populate the fillable fields (e.g., 1710, 1712, 1786) in the application interface. In some examples, loading a web page conditionally triggers automatic population based on predetermined criteria. For example, a loaded web page is considered a request to automatically populate a fillable field in the web page when the web page is first loaded, but is not considered such a request when the web page is loaded a second time within a predetermined amount of time (e.g., within 5 minutes, 1 hour, or 1 day).
In some examples, the authentication retry criteria include at least one of a requirement that the web page not be loaded within a predetermined amount of time or a requirement that the web page not be loaded during the same session. In some examples, the requirement is that the load is a first instance of the load within a predetermined time and/or the load is a first instance of the load within a session.
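The retry criteria above can be modeled with a small policy object (a Python sketch; the class name and 5-minute default are illustrative, the cooldown and same-session rules follow the examples in the text):

```python
from datetime import datetime, timedelta

class AutofillRetryPolicy:
    """Decide whether loading a web page should re-trigger biometric
    authentication for autofill.

    A load qualifies only if the page has not been loaded within the
    cooldown window and has not already been loaded during the current
    session.
    """
    def __init__(self, cooldown=timedelta(minutes=5)):
        self.cooldown = cooldown
        self.last_load = {}         # url -> time of the last qualifying load
        self.session_loads = set()  # urls already loaded this session

    def should_retry(self, url, now):
        if url in self.session_loads:
            return False            # already loaded during this session
        last = self.last_load.get(url)
        if last is not None and now - last < self.cooldown:
            return False            # loaded again within the cooldown window
        self.last_load[url] = now
        self.session_loads.add(url)
        return True

    def new_session(self):
        """Start a new browsing session; the time-based cooldown persists."""
        self.session_loads.clear()
```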
In some examples, after automatically populating a fillable field (e.g., 1710, 1712, 1786) with a first type of data or a second type of data, an electronic device (e.g., 100, 300, 500, 1700) receives a selection of a submit affordance (e.g., 1714, 1798). In some examples, in response to receiving the selection of the submit affordance, the electronic device (e.g., 100, 300, 500, 1700) ceases to display the application interface. In some examples, further in response to receiving the selection of the submit affordance, the electronic device (e.g., 100, 300, 500, 1700) displays a second interface (e.g., 1782) generated by the application. In some examples, displaying the second interface includes replacing a login user interface of the application with a user interface (e.g., 1782) of the application that includes the protected information.
It is noted that details of the processes described above with respect to method 1800 (e.g., figs. 18A-18D) may also be applied to the methods described herein in a similar manner. For example, method 1800 optionally includes one or more of the features of the various methods described herein with reference to methods 800, 1000, 1200, 1400, 1600, 2000, 2200, 2500, and 2700. For example, the enrolled biometric data described in method 1200 may be used to perform biometric authentication as described with respect to figs. 17G-17K. As another example, in response to receiving an input prior to completion of the biometric authentication process, one or more interstitial interfaces as described in methods 2000 and 2700 are optionally displayed. For the sake of brevity, these details are not repeated here.
The operations in the above-described information processing method are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general-purpose processor (e.g., as described with respect to fig. 1A, 3, 5A) or an application-specific chip. Further, the operations described above with reference to fig. 18A-18D are optionally implemented by the components depicted in fig. 1A-1B. For example, display operation 1808, receive operation 1816, and autofill operation 1830 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604 and event dispatcher module 174 delivers the event information to application 136-1. Respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as a selection of an object on the user interface. When a respective predefined event or sub-event is detected, the event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update the content displayed by the application. Similarly, one of ordinary skill in the art will clearly know how other processes may be implemented based on the components depicted in fig. 1A-1B.
Figs. 19A-19AB illustrate exemplary user interfaces for biometric authentication, according to some examples. As described in more detail below, the exemplary user interfaces shown in figs. 19A-19AB are used to illustrate the processes described below, including the processes in figs. 20A-20F.
Fig. 19A shows an electronic device 1900 (e.g., portable multifunction device 100, device 300, or device 500). In the illustrative example shown in figs. 19A-19AB, electronic device 1900 is a smartphone. In other examples, electronic device 1900 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 1900 has a display 1902, one or more input devices (e.g., a touchscreen of the display 1902, buttons 1904, a microphone (not shown)), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the electronic device includes one or more biometric sensors (e.g., biometric sensor 1903), which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the one or more biometric sensors 1903 are the one or more biometric sensors 703. In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
In fig. 19A, the electronic device wakes up from a low power (e.g., display off) state. As shown, in some examples, the electronic device 1900 wakes up in response to a lift gesture 1906 performed by the user. Referring to figs. 19B-19D, in response to the lift gesture 1906, the electronic device 1900 transitions from a low power state to a medium power state (e.g., the display is dimmed). For example, in fig. 19B, the display 1902 of the electronic device 1900 is disabled, and in response to the lift gesture 1906, the electronic device 1900 gradually increases the brightness of the display 1902 over a predetermined period of time as shown in figs. 19C-19D. In some examples, the brightness of the display 1902 is increased according to a function, such as a linear function. In some examples, when biometric authentication (e.g., facial recognition authentication) is enabled, the device locks immediately upon pressing a hardware button (e.g., a sleep/wake button), and in some examples, the device locks each time it transitions to a sleep mode.
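The gradual brightness increase described above, using the linear function given as an example, can be sketched as:

```python
def display_brightness(elapsed, duration, low=0.0, high=1.0):
    """Linear brightness ramp used when transitioning between power states:
    brightness rises from `low` to `high` over `duration` seconds. The text
    only says the increase follows a function such as a linear function;
    the 0-to-1 scale here is illustrative.
    """
    if elapsed <= 0:
        return low
    if elapsed >= duration:
        return high
    return low + (high - low) * (elapsed / duration)
```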
Referring to figs. 19C-19D, in some examples, upon transitioning to a medium power state (e.g., a state in which the display is on but not at full operational brightness) and/or operating in the medium power state, the electronic device displays a lock interface 1910. The lock interface includes, for example, a lock status indicator 1912, and optionally one or more notifications 1914. As shown, notification 1914 is a message notification associated with the instant messaging application that indicates that the electronic device has received a new message from a contact ("John Appleseed") stored on the electronic device. In some examples, the medium power state is a locked state. Thus, when operating in the medium power state, the electronic device 1900 operates in a secure manner. By way of example, when operating in the medium power state, the electronic device does not display the content of the message associated with notification 1914. In some examples, the locked state also corresponds to a restriction on access to other data (including other applications) and/or a restriction on permissible inputs.
In some examples, the electronic device 1900 also displays a flashlight affordance 1907 and a camera affordance 1908. In some examples, activation of the flashlight affordance 1907 causes the electronic device to launch a flashlight application. In some examples, activation of the camera affordance 1908 causes the electronic device 1900 to load a camera application.
In some examples, after transitioning to (e.g., in response to) the medium power state, the electronic device 1900 initiates biometric authentication (e.g., facial recognition authentication). In some examples, initiating biometric authentication includes obtaining (e.g., capturing using the one or more biometric sensors) data corresponding to at least a portion of a biometric feature of the user. In some examples, when a face (of a user) is detected, biometric authentication confirms the user's attention and intent to unlock by detecting that the user's eyes are open and directed at the device.
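The attention check can be sketched as a small predicate (Python; parameter names are ours). The `attention_required` flag corresponds to the gaze setting described later with respect to fig. 19W:

```python
def biometric_unlock_allowed(face_matched, eyes_open, gazing_at_device,
                             attention_required=True):
    """Attention-aware unlock check: a matching face unlocks the device
    only if the user's eyes are open and directed at it, unless the
    attention requirement has been disabled in settings.
    """
    if not face_matched:
        return False
    if attention_required:
        return eyes_open and gazing_at_device
    return True
```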
Referring to fig. 19E-19G, if electronic device 1900 determines that the biometric authentication was successful, the electronic device transitions from the medium power state to the high power state (e.g., the display is not dimmed). For example, in fig. 19D, the display of the electronic device 1900 is in a medium power state, and in response to a successful biometric authentication, the electronic device 1900 gradually increases the brightness of the display 1902 over a predetermined period of time, as shown in fig. 19E-G. In some examples, the brightness of the display 1902 is increased according to a function, such as a linear function.
In some examples, upon transitioning from the medium power state to the high power state, electronic device 1900 displays an unlock interface 1920. In some examples, upon displaying the unlock interface 1920, the electronic device displays an animation indicating that the electronic device is transitioning to a high power state. As shown in figs. 19E-19G, upon transitioning, the electronic device displays an animation in which lock status indicator 1912 transitions to unlock status indicator 1922 (fig. 19G). In some examples, displaying the animation includes displacing the locked status indicator 1912 and/or increasing its size to display the unlocked status indicator 1913 (fig. 19E), and raising and rotating the latch of the unlocked status indicator to display the unlocked status indicator 1921 (fig. 19F) and the unlocked status indicator 1922 (fig. 19G), respectively. In some examples, a degree of blurring of one or more objects of the locked state interface 1910 and/or the unlocked state interface 1920 is changed during the animation. In some examples, upon transitioning to or in response to the high power state, electronic device 1900 also outputs tactile output 1926 (fig. 19G).
In some examples, the high power state is an unlocked state. Thus, when operating in a high power state, the electronic device 1900 operates in an unsecure manner (e.g., authenticating that a user has access to secure data). By way of example, as shown in fig. 19G, when operating in the high power state, the electronic device displays the content of the message associated with notification 1914.
In some examples, biometric authentication (e.g., facial recognition authentication) augments its stored mathematical representations over time in order to improve unlocking performance and keep up with natural changes in the user's face and appearance. In some examples, upon a successful unlock, biometric authentication optionally uses a newly computed mathematical representation (if it is of sufficient quality) for a limited number of additional unlocks before discarding the data. In some examples, if the biometric authentication fails to identify the user, but the matching quality is above a certain threshold and the user immediately (e.g., within a predefined threshold amount of time) follows up by entering an alternative authentication (e.g., password, pattern, fingerprint), the device makes another capture of the biometric data (e.g., via one or more cameras or other biometric sensors that capture facial recognition data) and augments its enrolled biometric authentication (e.g., facial recognition authentication) data with the newly computed mathematical representation. In some examples, the new biometric authentication (e.g., facial recognition authentication) data is optionally discarded after a limited number of unlocks or if the user stops matching against it. These incremental processes allow biometric authentication (e.g., facial recognition authentication) to keep up with significant changes in the user's facial hair or makeup usage while minimizing false acceptances.
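The adaptive-template behavior above can be sketched as follows (Python; the numeric thresholds and score model are purely illustrative assumptions, not values from the patent):

```python
def maybe_augment_templates(templates, new_rep, match_score,
                            match_threshold=0.9, augment_threshold=0.7,
                            passcode_entered=False):
    """Decide whether a newly computed mathematical representation should
    augment the enrolled biometric data.

    A capture that matches is optionally retained to improve future
    matching; a near miss (a score above the augment threshold but below
    the match threshold) is added only if the user immediately follows up
    with a valid alternative authentication (e.g., a passcode).
    """
    if match_score >= match_threshold:
        templates.append(new_rep)    # successful match: keep the representation
        return True
    if match_score >= augment_threshold and passcode_entered:
        templates.append(new_rep)    # near miss confirmed by passcode
        return True
    return False                     # poor match, or no passcode follow-up
```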
Referring to fig. 19H, if electronic device 1900 determines that the biometric authentication was unsuccessful, electronic device 1900 does not transition to the high power state and, in some examples, remains in the medium power state. In some examples, while electronic device 1900 remains in the medium power state, electronic device 1900 remains in a locked state. To indicate that biometric authentication failed, electronic device 1900 simulates shaking of lock status indicator 1912, for example, by alternating the position of lock status indicator 1912 between two positions on lock status interface 1910. In some examples, the electronic device 1900 also outputs a tactile output 1918 to indicate that the biometric authentication was unsuccessful.
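Taken together, the wake, unlock, and failure behaviors described above form a small state machine over the low, medium, and high power states (a Python sketch; the state names and method names are ours, not the patent's):

```python
class DevicePowerState:
    """Minimal model of the low / medium / high power states: waking raises
    low -> medium (locked); a successful biometric check raises
    medium -> high (unlocked); a failed check leaves the device in the
    medium, locked state.
    """
    def __init__(self):
        self.state = "low"

    def wake(self):
        """E.g., a lift gesture transitions the device out of low power."""
        if self.state == "low":
            self.state = "medium"

    def biometric_result(self, success):
        """Apply the outcome of a biometric authentication attempt."""
        if self.state == "medium" and success:
            self.state = "high"

    def lock(self):
        """E.g., tapping the unlock state indicator relocks the device."""
        self.state = "medium"

    @property
    def unlocked(self):
        return self.state == "high"
```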
As described, the electronic device 1900 is in a locked state when in the medium power state, and thus, the secure data on the electronic device cannot be accessed when the electronic device is in the medium power state. By way of example, in fig. 19I, the electronic device detects user input 1930 near an edge of the display 1902. As shown in fig. 19I-K, user input 1930 is a swipe gesture, which in some examples is a request to access the home screen interface of electronic device 1900. However, because the electronic device 1900 is in the medium power and locked state, in response to the swipe gesture, the electronic device 1900 slides the locked state interface 1910 in an upward direction to display (e.g., expose) an alternative authentication interface 1932 with which the user authenticates using an alternative form of authentication (e.g., password authentication) that is different from the authentication associated with the biometric feature. Alternative authentication interface 1932 includes a lock state indicator 1934 and a prompt 1936 indicating to the user that entry of a valid password causes electronic device 1900 to be unlocked (and optionally, transition to a high power state).
In some examples, an alternative form of authentication (e.g., a password, or pattern) is required in some cases to unlock the device. In some examples, an alternative form of authentication is required if it has just been turned on or restarted. In some examples, an alternative form of authentication is required if the device is not unlocked for more than a predetermined amount of time (e.g., 48 hours). In some examples, an alternative form of authentication is required if the device is not unlocked using the alternative form of authentication for a predetermined amount of time (e.g., 156 hours). In some examples, an alternative form of authentication is required if the device is unlocked without using the alternative form of authentication for a predetermined amount of time (e.g., six and a half days) and biometric authentication (e.g., facial recognition authentication) is not used to unlock the device for a past predetermined amount of time (e.g., a past 4 hours). In some examples, if the device has received a remote lock command, an alternative form of authentication is required. In some examples, an alternative form of authentication is required after five unsuccessful attempts to match a face on a device (via facial recognition authentication). In some examples, an alternative form of authentication is required after initiating a power off/emergency SOS on a device and then canceling the power off/emergency SOS.
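The example conditions listed above can be collected into a single predicate (Python; the parameter names are ours, and the numeric limits are the examples given in the text; note that the 156-hour rule and the six-and-a-half-day rule coincide numerically):

```python
def passcode_required(just_restarted=False, hours_since_unlock=0.0,
                      hours_since_passcode_unlock=0.0,
                      hours_since_biometric_unlock=0.0,
                      remote_lock_received=False,
                      failed_biometric_attempts=0,
                      power_off_cancelled=False):
    """Return True when the alternative form of authentication (e.g., a
    passcode) is required instead of biometric authentication.
    """
    return (just_restarted
            or hours_since_unlock > 48
            or hours_since_passcode_unlock > 156
            or (hours_since_passcode_unlock > 6.5 * 24     # 156 hours
                and hours_since_biometric_unlock > 4)
            or remote_lock_received
            or failed_biometric_attempts >= 5
            or power_off_cancelled)
```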
Referring to figs. 19L-19M, a valid password (or passcode) is received by the electronic device 1900 at least in part in response to a tap gesture 1938 (fig. 19L) and optionally one or more other inputs indicating additional alphanumeric characters of the valid password. As shown in fig. 19N, once a valid password has been received, the electronic device is unlocked and a home screen interface 1933 is displayed (e.g., replacing the display of the alternative authentication interface).
In fig. 19O-19R, the device operates in a high power (e.g., unlocked) state and the received input is a request to access secure data on the electronic device 1900. By way of example, as shown in FIG. 19O, electronic device 1900 is operating in a high power state, and a swipe gesture 1944 is received as shown in FIG. 19P that is a request to access the home screen interface of electronic device 1900. As further shown in fig. 19P-R, in response to the swipe gesture 1944, electronic device 1900 slides unlock state interface 1920 in an upward direction to display (e.g., expose) home screen interface 1946.
Figs. 19S-19U illustrate various ways in which an electronic device transitions from a high power (e.g., unlocked) state to a locked state (such as a medium power state or a low power state). In fig. 19S, upon displaying the unlock state interface 1920 (as described at least with respect to fig. 19G), the electronic device 1900 receives an activation of the unlock state indicator 1922. In some examples, the activation of the unlock state indicator 1922 is a tap gesture 1948. As shown in fig. 19V, in response to activation of the unlock state indicator 1922, the electronic device transitions to a medium power state and optionally displays the lock state indicator 1912 and/or provides a tactile output 1952. In some examples, upon transitioning to the medium power state, the electronic device displays an animation indicating that the electronic device 1900 is transitioning to the medium power state (or low power state).
In fig. 19T, electronic device 1900 receives an activation of button 1904 while the home screen interface 1946 is displayed and while in the high-power unlocked state. In some examples, the activation of button 1904 is a press and/or depression of button 1904. In response to activation of button 1904, the electronic device transitions to a low power state (as described at least with reference to FIG. 19B). In FIG. 19U, upon displaying the home screen interface 1946, the electronic device 1900 receives activation of the unlock screen indicator 1950 of the home screen interface 1946. In some examples, activation of unlock screen indicator 1922 is a tap gesture 1950. In response to activation of the unlock state indicator 1922, the electronic device transitions to the medium power state and optionally displays the lock state indicator 1910 (fig. 19V).
In fig. 19W, the electronic device 1900 displays a device settings interface 1954. The device settings interface includes a gaze setting 1955 that, when enabled, requires the user to be looking at the device for biometric authentication to succeed. When this setting is disabled, biometric authentication can succeed even if the authorized user is not looking at the device. Device settings interface 1954 also includes a biometric authentication enablement setting 1956 that, when enabled, enables biometric authentication on electronic device 1900. When the biometric authentication enablement setting 1956 is disabled, biometric authentication is not available on the electronic device 1900.
For example, in fig. 19W, electronic device 1900 receives an activation of biometric authentication enablement setting 1956. In some examples, activation of the biometric authentication enabled setting 1956 is a tap gesture 1958. Because the biometric authentication enable setting 1956 is enabled as shown in fig. 19W, the biometric authentication enable setting 1956 is disabled in response to a tap gesture 1958, as shown in fig. 19X. In some examples, therefore, any request for access to secure data on the electronic device 1900 requires the user to authenticate using an alternative form of authentication. For example, referring to fig. 19Y-Z, the electronic device 1900 detects a user input 1930 near an edge of the display 1902. As shown in fig. 19I-K, user input 1930 is a swipe gesture, which in some examples is a request to access the home screen interface of electronic device 1900. Referring to fig. 19AA, because the biometric authentication enable setting 1956 is disabled, in response to the swipe gesture 1930, the electronic device 1900 slides the lock state interface 1910 in an upward direction to display (e.g., expose) an alternative authentication interface 1932 with which a user can provide a password to unlock the electronic device 1900.
In some examples, one or more elements displayed by the electronic device 1900 are based on context. As shown in fig. 19AB, for example, the lock status indicator displayed by the electronic device is based on the location and/or type of the electronic device 1900 in some cases.
Figs. 20A-20F are flow diagrams illustrating methods for performing biometric authentication using an electronic device, according to some examples. The method 2000 is performed at a device (e.g., 100, 300, 500, 1900) having a display, one or more input devices (e.g., a touchscreen, a microphone, a camera), and a wireless communication radio (e.g., a Bluetooth connection, a WiFi connection, a mobile broadband connection such as a 4G LTE connection). In some examples, the display is a touch-sensitive display. In some examples, the display is not a touch-sensitive display. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the device includes one or more biometric sensors, which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the device further comprises a light-emitting device, such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors. Some operations in method 2000 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 2000 provides an intuitive way to perform authentication of biometric features. The method reduces the cognitive burden of the user in performing authentication of the biometric features, thereby creating a more efficient human-machine interface and a more intuitive user experience. For battery-driven computing devices, enabling a user to more quickly and efficiently manage authentication of biometric features conserves power and increases the interval between battery charges.
In some examples, the electronic device performs a biometric enrollment process before detecting that the device wake criteria have been met. In some examples, during biometric enrollment, the device requires that the face being enrolled include facial features indicating that the face is looking at the electronic device during enrollment in order to proceed with biometric enrollment of the face. In some examples, if the face is not looking at the electronic device during enrollment, the device outputs a tactile, audio, and/or visual alert.
In some examples, an electronic device (e.g., 100, 300, 500, 1900) detects (2002) that a device wake-up criterion has been met. In some examples, in response to detecting that the device wake criteria have been met, the electronic device transitions (2004) the electronic device from a first visual state (e.g., a low power state) to a second visual state (e.g., a medium power state). Transitioning from a first visual state (e.g., a low power state) to a second visual state (e.g., a medium power state) in response to detecting that device wake criteria have been met allows a user to bypass providing one or more inputs to transition a device from the first state to the second state by manually providing the one or more inputs. Performing an operation (automatically) when a set of conditions has been met without further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the wake criteria are met when the electronic device is lifted, when a button (e.g., 1904) is pressed, and/or when a notification (e.g., 1914) is displayed. In some examples, the first visual state is a display off state, or a state in which a display of the electronic device is 10% of a maximum brightness state. In some examples, the second visual state is a higher brightness state of the display than the first visual state (e.g., 10% if the display is off in the first state; 20% if the display is at 10% in the first state). In some examples, the second visual state includes a first introduction screen (e.g., 1910) displayed at a first brightness (2006). In some examples, while in the second visual state, the electronic device displays (2010) a fourth user interface object (e.g., 1912) that indicates a visual state of the electronic device. In some examples, while in the second visual state, the electronic device displays (2012) a fifth user interface object (e.g., 1912) that indicates a visual state of the electronic device. In some examples, while the electronic device is in the first visual state (2008) (e.g., while the device is in the locked state), one or more features of the electronic device (e.g., a display (e.g., 1902), the one or more biometric sensors (e.g., 1903), a microphone, access to sensitive data such as content of messages and applications, the ability to perform destructive actions such as deleting photos or communications, and the ability to perform communication operations such as sending new information and sharing content stored on the device) are disabled (e.g., powered off or operating under reduced functionality). In some examples, transitioning to the second visual state includes enabling the one or more disabled functions of the electronic device. 
In some examples, transitioning to the second visual state includes the device entering a state in which one or more previously disabled components of the electronic device are enabled. In some examples, enabling the one or more disabled functions includes enabling a display (e.g., 1902) of the electronic device, the one or more biometric sensors (e.g., 1903), and/or a microphone.
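The wake transition described in the preceding paragraphs can be sketched as a small state machine. This is an illustrative sketch only; the state names, brightness fractions, event names, and component names are assumptions and are not drawn from the disclosure.

```python
# Hypothetical sketch of the visual-state transitions described above.
# Brightness values, events, and component names are illustrative assumptions.

LOW, MEDIUM, HIGH = "low_power", "medium_power", "high_power"

# Example brightness levels (fraction of maximum) for each visual state.
BRIGHTNESS = {LOW: 0.0, MEDIUM: 0.1, HIGH: 1.0}

class Device:
    def __init__(self):
        self.visual_state = LOW
        self.enabled_components = set()  # disabled while in the first state

    def wake_criteria_met(self, event):
        # Wake criteria: device lifted, button pressed, or notification shown.
        return event in {"lift", "button_press", "notification"}

    def handle_event(self, event):
        if self.visual_state == LOW and self.wake_criteria_met(event):
            # Transitioning to the second visual state enables components
            # that were disabled in the first (low power) visual state.
            self.enabled_components |= {"display", "biometric_sensors",
                                        "microphone"}
            self.visual_state = MEDIUM

    def biometric_authenticated(self):
        if self.visual_state == MEDIUM:
            # Continuation of the same transition: brightness keeps rising.
            self.visual_state = HIGH

d = Device()
d.handle_event("lift")
assert d.visual_state == MEDIUM and "display" in d.enabled_components
d.biometric_authenticated()
assert d.visual_state == HIGH
```

The sketch captures only the ordering constraint the text describes: components are enabled before or during the transition to the second state, so the sensors are already available when capture begins.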
In some examples, after transitioning the device to the second visual state (2014), in accordance with a determination that the selectable option of the electronic device (e.g., 1955) is enabled, the electronic device uses the first set of criteria as biometric authentication criteria when determining whether the biometric authentication criteria have been met. When determining whether the biometric authentication criteria have been met, using the first set of criteria as the biometric authentication criteria in accordance with a determination that the selectable options (e.g., 1955) of the device are enabled allows the user to easily provide authentication information to the device with minimal input. Performing an operation (automatically) when a set of conditions has been met without further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the first set of criteria includes a requirement that the user's face be looking at a display of the electronic device (e.g., when determining whether to unlock the device and/or transition from the second visual state to the third visual state). In some examples, further after transitioning the device to the second visual state, in accordance with a determination that the selectable option of the electronic device is not enabled, the electronic device uses the second set of criteria as the biometric authentication criteria when determining whether the biometric authentication criteria have been met. In some examples, the second set of criteria does not include a requirement that the user's face be looking at the display of the electronic device (e.g., when determining whether to unlock the device and/or transition from the second visual state to the third visual state). In some cases, the user enables a gaze detection requirement (e.g., 1955), for example using an accessibility option, in which case the device requires the user to look at the device during biometric authentication in order for the user's face to be recognized by the device.
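The selection between the two criteria sets can be sketched as follows. This is illustrative only; the function and parameter names are assumptions.

```python
# Hypothetical sketch: choosing the biometric authentication criteria based
# on whether the gaze-detection (attention) option is enabled. Names assumed.

def biometric_criteria_met(face_matches, gaze_at_display, require_attention):
    """First criteria set (option enabled) additionally requires that the
    user's face be looking at the display; the second set does not."""
    if require_attention:
        return face_matches and gaze_at_display
    return face_matches

# With the option enabled, a matching face that is not looking at the
# display does not satisfy the criteria.
assert not biometric_criteria_met(True, False, require_attention=True)
# With the option disabled, a matching face is sufficient.
assert biometric_criteria_met(True, False, require_attention=False)
```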
In some examples, after transitioning to the second state, the electronic device determines (2016), via the one or more biometric sensors, whether biometric capture criteria are met. In some examples, the electronic device determines whether a biometric feature is present, for example, in a field of view of the one or more biometric sensors. In some examples, determining whether the biometric capture criteria are met includes determining (2018) whether the biometric capture criteria are met a first predetermined amount of time after the transition to the second visual state. In some examples, the electronic device detects the biometric feature immediately after transitioning to the second state. In some examples, the electronic device detects the biometric feature a period of time after transitioning to the second state. In some examples, in accordance with a determination that the biometric capture criteria are satisfied, the electronic device captures (2020), via the one or more biometric sensors, biometric data associated with the biometric feature. In some examples, once the electronic device has transitioned to the second visual state (recall that the one or more biometric sensors were enabled prior to or during the transition), the electronic device captures biometric data using the one or more biometric sensors that were enabled.
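The capture-timing behavior described above can be sketched as follows. This is illustrative only; the delay value and function names are assumptions.

```python
# Hypothetical sketch: after the transition to the second visual state, the
# device waits a predetermined time and then checks whether a biometric
# feature is in the sensors' field of view before capturing data. The delay
# value is an illustrative assumption.

def capture_flow(feature_in_view_at, first_delay=0.25):
    """Returns the time (seconds after the transition) at which biometric
    data is captured, or None if the capture criteria were not yet met."""
    check_time = first_delay                 # first predetermined delay
    if feature_in_view_at(check_time):
        return check_time                    # capture criteria satisfied
    return None                              # later checks may follow

# Feature present at the first check: data is captured at that time.
assert capture_flow(lambda t: True) == 0.25
# Feature absent at the first check: no capture yet.
assert capture_flow(lambda t: False) is None
```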
In some examples, the electronic device transitions (2022) the electronic device from the second visual state to a third visual state (e.g., a high power state) in accordance with a determination that a biometric authentication criterion has been met based on biometric data provided by the one or more biometric sensors (e.g., a biometric feature such as a face is authenticated by the device). Transitioning the device from the second visual state (e.g., a medium power state) to the third visual state (e.g., a high power state) in accordance with a determination that biometric authentication criteria have been met based on biometric data provided by the one or more biometric sensors allows a user to bypass providing one or more inputs to transition the device from the second state to the third state by manually providing the one or more inputs. Performing an operation (automatically) when a set of conditions has been met without further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, a display of the electronic device is turned on with a second, relatively high brightness when the electronic device is in the third visual state. In some examples, the transition from the second visual state to the third visual state is a continuation of the transition from the first visual state to the second visual state (2024). In some examples, during transitions from the first visual state to the second visual state and from the second visual state to the third visual state, the display continues to brighten from off to low brightness and finally to high brightness in response to authentication. In some examples, the transition to the second visual state transitions to a particular brightness, and the transition from the second visual state to the third visual state transitions from the particular brightness. In some examples, each increment is made according to the same function. In some examples, the transition to the second visual state includes enlarging at least a respective user interface element (e.g., 1912) in the first visual state, and the transition to the third visual state includes further enlarging the respective user interface element (e.g., 1912, 1913, 1921). In some examples, the second visual status indicates that the device is in a locked state and the third visual status indicates that the device is in an unlocked state.
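The continuity property described above (the unlock transition resumes where the wake transition stopped, with each increment made according to the same function) can be sketched as follows. The curve and the sample points are illustrative assumptions; the disclosure does not specify them.

```python
# Hypothetical sketch: both transitions brighten the display according to
# the same function, so the unlock transition continues from the brightness
# at which the wake transition stopped. Curve and values are assumptions.

def brightness(t):
    # One monotone curve shared by both transitions (t in [0, 1]).
    return t * t  # simple ease-in; the actual curve is not disclosed

# The wake transition samples the curve up to an intermediate point...
wake_end = brightness(0.3)
# ...and the unlock transition resumes from that same point.
unlock_start = brightness(0.3)
unlock_end = brightness(1.0)

assert wake_end == unlock_start    # no discontinuity between transitions
assert unlock_end > unlock_start   # brightness continues to increase
```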
In some examples, the electronic device displays (2026) an unlock animation that includes a fifth user interface object (e.g., 1912) further in accordance with a determination that the biometric authentication criteria has been met based on the biometric data provided by the one or more biometric sensors. Displaying an unlocking animation that includes a user interface object (e.g., fifth user interface object 1912) in accordance with a determination that biometric authentication criteria have been met based on biometric data provided by the one or more biometric sensors provides visual feedback by allowing a user to quickly identify that authentication is successful and that the device has therefore been unlocked. Providing the user with improved visual feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide appropriate input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the fifth user interface object is a lock. In some examples, the unlock animation is based on a context of the electronic device, such as location or type. In some examples, the fifth user interface object (e.g., 1912, 1922) has a first (e.g., locked) state when the electronic device is in the second visual state and a second (e.g., unlocked) state (2028) when the electronic device is in the third visual state. In some examples, the visual state element transitions from the first state to the second state (2030) during an unlock animation (including animations 1912, 1913, 1921, 1922). In some examples, to demonstrate that biometric authentication has been successful, the electronic device displays an animation (including animations 1912, 1913, 1921, 1922) in which the lock is unlocked.
In some examples, the third visual state includes a second introduction screen (2032) at a second brightness higher than the first brightness. In some examples, the first introduction screen (e.g., 1910) and the second introduction screen (e.g., 1920) are the same screen except for the brightness degree of each screen.
In some examples, transitioning from the second visual state to the third visual state includes adjusting (2034) (e.g., increasing) a size of a first user interface object (e.g., 1912) displayed on a display of the electronic device. In some examples, the electronic device adjusts the size of all displayed user interface objects. In some examples, the electronic device adjusts the size of fewer than all of the displayed user interface elements. In some examples, the first user interface object (e.g., 1912) is a lock icon, and adjusting the size of the first user interface object includes increasing (2036) the size of the first user interface object. In some examples, transitioning from the second visual state to the third visual state includes changing a degree of blur of a second user interface object displayed on a display of the electronic device. In some examples, one or more blur parameters, such as a blur radius and/or a blur magnitude, of one or more displayed user interface objects (e.g., wallpaper) are increased and/or decreased. In some examples, the blur parameters of all user interface objects are changed. In some examples, the blur parameters of fewer than all of the user interface objects are changed. In some examples, the first user interface object and the second user interface object are the same element. In some examples, transitioning from the second visual state to the third visual state includes translating a position of a third user interface object displayed on a display of the electronic device from a first position to a second position (e.g., a translation without rotation). In some examples, the lock icon is moved near an edge of a display of the electronic device before or during the unlock animation. In some examples, transitioning the device from the second state to the third visual state includes outputting a haptic output (e.g., 1926).
In some examples, the electronic device outputs a tactile output indicating that the biometric authentication criteria have been met when the unlock animation is displayed.
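The per-element effects of the unlock transition described above can be sketched as follows. This is illustrative only; the scale factor, blur values, offsets, and the direction of the blur change are assumptions.

```python
# Hypothetical sketch of the per-element effects applied during the unlock
# transition: scaling the lock icon, changing the wallpaper blur, and
# translating an element toward a display edge. All parameters are assumed.

def unlock_transition(elements):
    """Mutates a dict of UI elements per the transition described above."""
    elements["lock_icon"]["scale"] *= 1.5       # increase the icon's size
    elements["wallpaper"]["blur_radius"] = 0.0  # change the blur parameter
    elements["lock_icon"]["y"] -= 40            # translate toward the edge
    return elements

ui = {
    "lock_icon": {"scale": 1.0, "y": 100},
    "wallpaper": {"blur_radius": 8.0},
}
unlock_transition(ui)
assert ui["lock_icon"]["scale"] > 1.0
assert ui["wallpaper"]["blur_radius"] == 0.0
```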
In some examples, the third visual state corresponds to an unlocked state (2038). In some examples, while in the third visual state (e.g., while the device is unlocked), the electronic device receives (2040) a lock input (e.g., 1948, press on button 1904, 1952). In some examples, the lock input is a press of a button (e.g., 1904), such as a hardware button, or a selection of an affordance (e.g., 1922, 1950) indicating an intent to lock the electronic device. Further, while in the third visual state, the electronic device transitions (2042) from the third visual state to the locked state in response to receiving a lock input. In some examples, the device is locked in response to one or more particular inputs.
In some examples, while in the locked state, the device is prevented from performing one or more operations available in the unlocked state (e.g., displaying a home screen, displaying the content of a notification, launching an application, sending a communication). Preventing the device from performing one or more operations available in the unlocked state while in the locked state enhances device security by prohibiting certain functions or operations from being performed on the device while the device is in the locked state rather than in the unlocked state. Improving the security measures of a device enhances the operability of the device by preventing unauthorized access to content and operations, and in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more efficiently.
In some examples, the electronic device displays (2044) a locking animation that includes a sixth user interface object (e.g., 1912, 1922) that indicates a visual state of the electronic device. Displaying a lock animation that includes a particular user interface object (e.g., sixth user interface object, 1912, 1922) provides visual feedback by allowing a user to quickly recognize that the device is in a locked state. Providing the user with improved visual feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide appropriate input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the sixth user interface object is a lock. In some examples, the sixth user interface object has a first appearance (e.g., an open lock) when the electronic device is in the third visual state and a second appearance (e.g., a closed lock) when the electronic device is in the locked state (2046). In some examples, the sixth user interface object transitions from the first appearance to the second appearance during the locking animation (2048). In some examples, to demonstrate that the electronic device has been locked, the electronic device displays an animation in which the lock is locked. In some examples, transitioning the device from the third visual state to the locked state includes outputting (2050) a haptic output (e.g., 1952). In some examples, the haptic output includes a single tap. In some examples, the haptic output includes multiple taps. In some examples, the haptic output is timed to synchronize with an animation of the sixth user interface object moving back and forth (e.g., the lock shaking back and forth). In some examples, displaying the lock animation includes displaying a current time.
In some examples, the electronic device displays a time when transitioning to the locked state.
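The lock flow described above (explicit lock input, lock animation, haptic output, and display of the current time) can be sketched as follows. This is illustrative only; the state names, event labels, and time format are assumptions.

```python
import datetime

# Hypothetical sketch of the lock flow described above: an explicit lock
# input transitions the device to the locked state, plays a lock animation,
# emits a haptic output, and displays the current time. Names are assumed.

def handle_lock_input(state, now=None):
    events = []
    if state == "unlocked":
        events.append("lock_animation")    # open lock -> closed lock
        events.append("haptic_output")     # single tap or multiple taps
        now = now or datetime.datetime.now()
        events.append(f"show_time:{now:%H:%M}")
        state = "locked"
    return state, events

state, events = handle_lock_input("unlocked",
                                  datetime.datetime(2018, 1, 1, 9, 30))
assert state == "locked"
assert "haptic_output" in events and "show_time:09:30" in events
```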
In some examples, the biometric authentication criteria include a requirement that the user be looking at a display of the electronic device with a face that is consistent with the one or more authorized faces. Including in the biometric authentication criteria the requirement that the user be looking at the display of the device with a face that is consistent with one or more authorized faces enhances device security by allowing authentication to succeed only for (the face of) an authorized user of the device. Improving the security measures of a device enhances the operability of the device by preventing unauthorized access to content and operations, and in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more efficiently. In some examples, unlocking the electronic device requires the user to be looking at the electronic device.
In some examples, the electronic device maintains (2054) the electronic device in the second visual state in accordance with a determination that biometric authentication criteria (2052) have not been met based on biometric data provided by the one or more biometric sensors. Maintaining the device in the second visual state in accordance with a determination that biometric authentication criteria have not been met based on biometric data provided by the one or more biometric sensors enhances device security by inhibiting the device from transitioning to a state requiring authentication if appropriate authentication criteria are not met. Improving the security measures of a device enhances the operability of the device by preventing unauthorized access to content and operations, and in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more efficiently.
In some examples, if the biometric feature is not authenticated, the display of the device is not further illuminated as it would be in response to authentication of the biometric feature. In some examples, when the biometric authentication criteria have not been met before the device receives an explicit request (e.g., 1930) to unlock the device (e.g., a swipe gesture from a lower portion of the device, a press of a home button, or other input indicating that the user wants to view and/or interact with content that is not available when the device is in a locked state), the device displays an unlock interface (e.g., 1932) while attempting to authenticate the user via one or more forms of authentication, such as biometric authentication, password authentication, pattern authentication, and the like. Examples of authenticating a user in response to a request to unlock a device via different forms of authentication are described in more detail with reference to figs. 26A-26AS. In some examples, the electronic device alternates (2056) the location of the fourth user interface object (e.g., 1912) between the first location and the second location further in accordance with a determination that the biometric authentication criteria have not been met based on the biometric data provided by the one or more biometric sensors. In some examples, to demonstrate that biometric authentication has failed, the electronic device shakes a lock icon displayed in the introduction interface. In some examples, a tactile output (e.g., 1918) is provided in conjunction with the shaking of the lock icon. In some examples, no haptic output is provided.
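The failure path described above can be sketched as follows. This is illustrative only; the state labels, animation names, and interface names are assumptions.

```python
# Hypothetical sketch of the failure path described above: on failed
# biometric authentication the device remains in the second visual state and
# shakes the lock icon; an explicit unlock request while unauthenticated
# then surfaces an alternative authentication (e.g., password) interface.

def on_biometric_result(success):
    if success:
        return {"visual_state": "third", "animation": "unlock"}
    # Remain in the second visual state; shake the lock icon side to side.
    return {"visual_state": "second",
            "animation": "shake_lock_icon",
            "haptic": True}

def on_unlock_request(authenticated):
    # A swipe from the lower edge (or similar explicit request) while
    # unauthenticated displays the alternative authentication interface.
    return "home_screen" if authenticated else "password_interface"

assert on_biometric_result(False)["visual_state"] == "second"
assert on_unlock_request(authenticated=False) == "password_interface"
```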
In some examples, the electronic device detects that a lock condition has been met while the device is in the unlocked state. In some examples, in response to detecting that the lock condition has been met, in accordance with a determination that the lock condition is an explicit lock input (e.g., 1922, press of button 1904, 1952) (e.g., press of a power button, tap of a lock icon, etc.), the electronic device transitions the device from the unlocked state to the locked state and outputs a corresponding lock indication (e.g., 1912). In some examples, the respective lock indication includes a visual, audio, and/or tactile output indicating that the device has transitioned from the unlocked state to the locked state. In some examples, further in response to detecting that the lock condition has been met, in accordance with a determination that the lock condition is a conservative lock condition (e.g., covering the proximity sensor, receiving no input for a long period of time, etc.), the electronic device transitions the device from the unlocked state to the locked state without outputting a corresponding lock indication.
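The distinction drawn above between explicit and conservative lock conditions can be sketched as follows. This is illustrative only; the condition names are assumptions.

```python
# Hypothetical sketch distinguishing explicit lock inputs (which produce a
# visual/audio/haptic lock indication) from "conservative" lock conditions
# (which lock silently), as described above. Condition names are assumed.

EXPLICIT = {"power_button", "lock_icon_tap"}
CONSERVATIVE = {"proximity_covered", "inactivity_timeout"}

def lock_device(condition):
    # Both paths lock the device; only explicit inputs output an indication.
    output_indication = condition in EXPLICIT
    return {"state": "locked", "indication": output_indication}

assert lock_device("power_button")["indication"] is True
assert lock_device("inactivity_timeout")["indication"] is False
```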
In some examples, the electronic device detects a request to display a biometric authentication settings interface after detecting that the device wake criteria have been met. In some examples, the request to display the biometric authentication settings interface includes: a swipe from an edge of the display to display a control panel user interface comprising a plurality of controls including a control associated with enabling or disabling biometric authentication; a long press of one or more hardware buttons, the long press causing the device to display a settings user interface, the settings user interface including one or more controls, the one or more controls including a control associated with enabling or disabling biometric authentication; or navigating in one or more menus in a setup application to a set of controls associated with biometric authentication, the set of controls including one or more controls including a control associated with enabling or disabling biometric authentication. In some examples, the electronic device displays a biometric authentication settings interface (e.g., 1954) in response to a request to display the biometric authentication settings interface. In some examples, while displaying the biometric authentication settings interface, the electronic device receives a first user input corresponding to a request to disable biometric authentication (e.g., 1958). In some examples, the electronic device disables biometric authentication in response to receiving the first user input. In some examples, the electronic device receives a request to unlock the device while biometric authentication is disabled and while the device is in a locked state. In some examples, in response to receiving a request to unlock the device, the electronic device outputs a prompt to authenticate with a different form of authentication than biometric authentication (e.g., "enter password to unlock" as shown in fig. 19 AA). 
In some examples, the different forms of authentication are passwords, fingerprints, and the like.
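The settings flow described above (disabling biometric authentication, then prompting for a different form of authentication on an unlock request) can be sketched as follows. This is illustrative only; the class and function names are assumptions, and the prompt string merely echoes the example shown in fig. 19AA.

```python
# Hypothetical sketch of the settings flow above: disabling biometric
# authentication causes unlock requests to prompt for a different form of
# authentication (e.g., a password). Names are illustrative assumptions.

class AuthSettings:
    def __init__(self):
        self.biometric_enabled = True

    def toggle_biometric(self):
        self.biometric_enabled = not self.biometric_enabled

def unlock_prompt(settings):
    if settings.biometric_enabled:
        return "biometric"
    return "enter password to unlock"   # fallback prompt, per fig. 19AA

s = AuthSettings()
s.toggle_biometric()                    # e.g., tap gesture 1958 in settings
assert unlock_prompt(s) == "enter password to unlock"
```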
In some examples, in accordance with a determination that the biometric capture criteria are not satisfied within a first predetermined amount of time after the transition to the second visual state, the electronic device determines whether the biometric capture criteria are satisfied within a second predetermined amount of time after the first predetermined amount of time has elapsed. In some examples, the delay between attempts to detect biometric features becomes progressively longer. In some examples, biometric authentication is disabled once a biometric authentication attempt threshold has been reached.
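The retry schedule described above can be sketched as follows. This is illustrative only; the base delay, growth factor, and attempt threshold are assumptions.

```python
# Hypothetical sketch of the retry schedule described above: the delay
# between biometric capture attempts grows, and biometric authentication is
# disabled once an attempt threshold is reached. All numbers are assumed.

def retry_delays(base=1.0, factor=2.0, max_attempts=5):
    """Increasing delays (seconds) before each successive capture attempt."""
    return [base * factor ** i for i in range(max_attempts)]

def attempt_allowed(attempt_count, max_attempts=5):
    # Once the attempt threshold is reached, biometric auth is disabled.
    return attempt_count < max_attempts

delays = retry_delays()
assert all(b > a for a, b in zip(delays, delays[1:]))  # strictly increasing
assert not attempt_allowed(5)
```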
Note that the details of the processes described above with respect to method 2000 (e.g., figs. 20A-20F) may also be applied in a similar manner to the other methods described herein. For example, method 2000 optionally includes one or more of the features of the various methods described herein with reference to methods 800, 1000, 1200, 1400, 1600, 1800, 2200, 2500, and 2700. For example, the enrolled biometric data described in method 1200 may be used to perform biometric authentication as described with respect to figs. 19A-H. As another example, in response to receiving an input prior to completing the biometric authentication process, one or more of the interstitial interfaces described in method 2700 are optionally displayed. For the sake of brevity, these details are not repeated here.
The operations in the information processing methods described above are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general-purpose processor (e.g., as described with respect to figs. 1A, 3, 5A) or an application-specific chip. Further, the operations described above with reference to figs. 20A-20F are optionally implemented by the components depicted in figs. 1A-1B. For example, detecting operation 2002, transitioning operation 2004, and transitioning operation 2022 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604, and event dispatcher module 174 delivers the event information to application 136-1. Respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as a selection of an object on the user interface. When a respective predefined event or sub-event is detected, the event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update the content displayed by the application. Similarly, it would be clear to one of ordinary skill in the art how other processes can be implemented based on the components depicted in figs. 1A-1B.
Figs. 21A-21AQ illustrate example user interfaces for biometric authentication, according to some examples. As described in more detail below, the illustrative examples of user interfaces shown in figs. 21A-21AQ are used to illustrate the processes described below, including the processes in figs. 22A-22F.
Fig. 21A shows an electronic device 2100 (e.g., portable multifunction device 100, device 300, or device 500). In the illustrative example shown in figs. 21A-21AQ, the electronic device 2100 is a smartphone. In other examples, the electronic device 2100 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 2100 has a display 2102, one or more input devices (e.g., a touchscreen of the display 2102, buttons 2104, a microphone), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the electronic device includes one or more biometric sensors (e.g., biometric sensor 2103), which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the one or more biometric sensors 2103 are the one or more biometric sensors 703. In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
In fig. 21A-21C, the electronic device 2100 performs biometric authentication with a user of the electronic device 2100. Referring to fig. 21A, the electronic device 2100 is in a locked state. As shown, in some examples, the display 2102 of the electronic device 2100 is disabled while in the locked state. In other examples, while in the locked state, the display 2102 of the electronic device 2100 is enabled and the electronic device 2100 displays a locked state interface (e.g., the locked state interface 2110 of fig. 21C) indicating that the electronic device 2100 is in the locked state. While the device 2100 is in the locked state, the electronic device initiates biometric authentication. In fig. 21B, the electronic device 2100 initiates biometric authentication in response to detecting a wake condition (e.g., the user moving the device in a predetermined manner). It should be appreciated that the electronic device initiates biometric authentication in response to any number of wake conditions, including but not limited to: movement of the device (e.g., lifting), pressing of a device button, or touching of the display 2102.
In some examples, initiating biometric authentication includes obtaining (e.g., capturing using the one or more biometric sensors 2103) data corresponding to at least a portion of a biometric feature of the user. In response to initiating biometric authentication, the electronic device 2100 obtains (e.g., captures) and processes (e.g., analyzes) the biometric data, for example, to determine whether the biometric feature (or a portion thereof) satisfies a biometric authentication criterion based on the biometric data (e.g., determines whether the biometric data matches a biometric template within a threshold). In some examples, biometric authentication requires that the user be looking at the device during biometric authentication. Thus, as shown in fig. 21B, when the user lifts the device 2100, the user's gaze 2106 is directed at the electronic device.
In FIG. 21C, in response to the wake condition, the electronic device 2100 displays a lock status interface 2110 that includes a lock status indicator 2112. In some examples, upon displaying the locked state interface 2110, the electronic device 2100 also displays a flashlight affordance 2107 and a camera affordance 2108. In some examples, activation of the flashlight affordance 2107 causes the electronic device to launch a flashlight application. In some examples, activation of the camera affordance 2108 causes the electronic device 2100 to launch a camera application.
In fig. 21D, the electronic device 2100 determines that the biometric authentication is successful and, in response, displays an unlock state interface 2120. In some examples, the display of the unlock state interface 2120 includes a display of an unlock animation, as described with reference to fig. 19D-G. While displaying the unlocked state interface 2120, the electronic device 2100 also displays the flashlight affordance 2107 and the camera affordance 2108 (e.g., maintains their display). In some examples, the electronic device 2100 outputs a tactile output 2126 in response to determining that the biometric authentication is successful.
In fig. 21E, the electronic device 2100 determines that the biometric authentication was unsuccessful. In response, the electronic device 2100 maintains the display of the lock status interface 2110. In some examples, the electronic device displays a shake animation in which the lock status indicator 2112 is moved side to side to simulate a "shake" effect to indicate that biometric authentication was unsuccessful. The electronic device 2100 also outputs a tactile output 2118 to indicate that the biometric authentication was unsuccessful.
In some examples, one or more operations that are accessible during display of the locked state interface 2110 do not require authentication and thus may be performed while the electronic device is in the locked state. By way of example, no authentication is required to load the flashlight application in response to activation of the flashlight affordance 2107. As another example, referring to fig. 21F, in some examples, the electronic device 2100 detects activation of the camera affordance 2108 while in the locked state. As shown, the activation of the camera affordance 2108 is a tap gesture 2130 on the camera affordance 2108. In fig. 21G, in response to detecting activation of the camera affordance 2108, the electronic device 2100 displays a camera application interface 2132 associated with a camera application on the display 2102 (e.g., in place of the display of the lock status interface 2110).
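The gating just described can be modeled as a small whitelist check: a handful of operations are exempt from authentication, while everything else requires the device to be unlocked. The operation names below are placeholders, not identifiers from the document:

```python
# Illustrative gating logic: flashlight and camera can be launched from the
# locked state, while other operations require the device to be unlocked.

NO_AUTH_REQUIRED = {"flashlight", "camera"}

def can_perform(operation, unlocked):
    """Allow an operation if the device is unlocked or the operation is exempt."""
    return unlocked or operation in NO_AUTH_REQUIRED

assert can_perform("camera", unlocked=False)        # camera app opens while locked
assert not can_perform("messages", unlocked=False)  # secure content stays gated
assert can_perform("messages", unlocked=True)
```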
Referring to fig. 21H, in some examples, while displaying the unlocked state interface 2120, the electronic device 2100 displays a prompt 2124 indicating that the device is unlocked and/or that providing a predetermined type of input (e.g., a swipe gesture) will allow the user to access secure content such as a home screen interface (e.g., home screen interface 2129 of fig. 21I). For example, as shown, the electronic device 2100 detects user input 2128, e.g., near an edge of the display 2102. The user input 2128 is a swipe gesture, which in some examples is a request to access the home screen interface of the electronic device 2100, and in response to the swipe input 2128, the electronic device displays the home screen interface 2129 of fig. 21I (e.g., replaces the display of the unlocked state interface 2120 with it). In some examples, displaying the home screen interface 2129 includes sliding the unlocked state interface 2120 in an upward direction to display (e.g., expose) the home screen interface 2129, as similarly described with reference to figs. 19P-19R.
In fig. 21J, the electronic device is in a locked state (as described with reference to figs. 21A-21C and 21E), for example, in response to a failed biometric authentication, and displays a locked state interface 2110 when in the locked state. Upon displaying the lock status interface 2110, the electronic device 2100 displays a prompt 2133 indicating that the device is locked and/or that providing a predetermined type of input (e.g., a swipe gesture) will allow the user to authenticate with the electronic device 2100 (and unlock the electronic device). For example, as shown, the electronic device 2100 detects user input 2134, e.g., near an edge of the display 2102. The user input 2134 is a swipe gesture, which in some examples is a request to access the home screen interface of the electronic device 2100. Because the device is in a locked state (e.g., the user is not authenticated with the electronic device 2100), the electronic device displays an alternative authentication interface 2140 (e.g., replaces the display of the lock status interface 2110 with it) in response to the swipe input 2134, as shown in fig. 21K. In some examples, the alternative authentication interface 2140 includes a lock state indicator 2142 that indicates that the electronic device 2100 is in a locked state.
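The two swipe outcomes described in figs. 21H-21K reduce to a single branch on lock state: the same gesture leads to the home screen when the device is unlocked and to the alternative (passcode) authentication interface when it is locked. A minimal sketch, with illustrative interface names:

```python
# Sketch of the swipe-up routing: one gesture, two destinations depending on
# whether the user has already authenticated.

def handle_home_swipe(unlocked):
    """Route a home-screen swipe based on the device's lock state."""
    return "home_screen" if unlocked else "alternative_auth_interface"

assert handle_home_swipe(unlocked=True) == "home_screen"
assert handle_home_swipe(unlocked=False) == "alternative_auth_interface"
```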
In figs. 21K-21M, the electronic device 2100 performs biometric authentication while displaying the alternative authentication interface 2140. In particular, the electronic device 2100 detects and/or obtains biometric data of a face while the alternative authentication interface 2140 is displayed. The electronic device 2100 then processes the biometric data to determine whether the biometric data satisfies biometric authentication criteria. As shown in fig. 21L, the electronic device 2100 displays a biometric authentication processing indicator 2144 (e.g., replacing the display of the lock state indicator 2142 therewith) to indicate that the electronic device is processing biometric data. In fig. 21M, the electronic device 2100 determines that the biometric authentication performed during display of the alternative authentication interface 2140 was successful. Accordingly, the electronic device 2100 displays a biometric authentication success indicator 2146 (e.g., replacing the display of the biometric authentication processing indicator 2144 therewith) to indicate that the biometric authentication was successful. In some examples, the electronic device 2100 also fills the password progress indicator and optionally provides a tactile output 2141 to indicate the successful biometric authentication.
Alternatively, referring to figs. 21N-21P, the user enters a password to authenticate with the electronic device 2100 during display of the alternative authentication interface 2140. As shown in fig. 21N, the electronic device 2100 displays the alternative authentication interface 2140 and, as shown in fig. 21O, the password is received at least partially in response to a tap gesture 2148 and optionally one or more inputs indicating additional alphanumeric digits of the password. In fig. 21P, the electronic device 2100 determines that the password is valid and, in response, displays a notification 2150 indicating that the password is valid and that the user has authenticated with the electronic device 2100.
In some examples, the electronic device 2100 selectively stores and/or updates biometric data in response to entry of a valid password. For example, in response to entry of a valid password, the electronic device 2100 obtains biometric data (e.g., facial biometric data) and compares the biometric data to biometric data stored in the electronic device. In some examples, if the obtained biometric data is sufficiently similar to the stored biometric data, the electronic device stores the obtained biometric data and/or updates previously stored biometric data to improve biometric authentication. In fig. 21P, the electronic device determines that the biometric data obtained in response to entry of a valid password is sufficiently similar to the stored biometric data. In response, the electronic device stores the obtained biometric data and/or updates the stored biometric data, and displays an indication 2152 that the biometric data has been updated. In this way, the electronic device 2100 provides adaptive biometric authentication.
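The adaptive update above can be sketched as a conditional blend: after a valid passcode, a fresh capture that is sufficiently similar to the stored template is folded into it, so the template tracks gradual changes in the user's appearance. The similarity threshold and blending weight here are invented for illustration; the document does not give the actual update rule:

```python
# Hedged sketch of adaptive biometric authentication: update the enrolled
# template only after a valid passcode, and only when the new capture is
# already close to the stored template.

def maybe_update_template(template, capture, passcode_ok,
                          similarity_threshold=0.5, weight=0.1):
    """Return (possibly updated template, whether an update occurred)."""
    if not passcode_ok:
        return template, False          # no update without passcode entry
    distance = sum((c - t) ** 2 for c, t in zip(capture, template)) ** 0.5
    if distance > similarity_threshold:
        return template, False          # too dissimilar: do not adapt
    # Blend the capture into the template with a small weight.
    updated = [(1 - weight) * t + weight * c for t, c in zip(template, capture)]
    return updated, True

tpl = [0.10, 0.80]
new_tpl, updated = maybe_update_template(tpl, [0.14, 0.78], passcode_ok=True)
assert updated and new_tpl != tpl                        # similar capture: adapted
assert not maybe_update_template(tpl, [0.14, 0.78], passcode_ok=False)[1]
assert not maybe_update_template(tpl, [0.90, 0.10], passcode_ok=True)[1]
```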
As described with reference to figs. 21A-21C, in some examples, the electronic device 2100 performs biometric authentication in response to a wake condition. In some examples, the electronic device receives a request to access secure content (e.g., content that requires authentication to access) before biometric authentication has been completed, such as a swipe gesture requesting access to the home screen. Thus, referring to figs. 21Q-21S, in response to receiving a request to access secure content, the electronic device 2100 displays a plug-in interface to indicate that the electronic device has not completed biometric authentication. In fig. 21Q, the electronic device displays a plug-in interface 2154 that includes an alternative authentication affordance 2156 and a biometric authentication flag symbol 2160 indicating initiation of biometric authentication. Activation of the alternative authentication affordance 2156 causes the electronic device to display (e.g., replace the display of the plug-in interface 2154 with) an alternative authentication interface (e.g., alternative authentication interface 2140 of fig. 21K). In some examples, the biometric authentication flag symbol 2160 is a simulated representation of a biometric feature.
Once the electronic device 2100 has obtained the biometric data in response to initiating the biometric authentication, the electronic device processes the biometric data, as described. In some examples, as the electronic device processes the biometric data, the electronic device displays the biometric authentication flag symbol 2162 (e.g., replaces the display of the biometric authentication flag symbol 2160) to indicate that the biometric data is being processed. In some examples, the biometric authentication flag symbol 2162 includes multiple rings that, for example, spherically rotate when displayed.
In fig. 21S, the electronic device 2100 determines that the biometric data satisfies the biometric authentication criteria. In response, the electronic device 2100 displays the biometric authentication flag symbol 2163 in the plug-in interface 2154 (e.g., replacing the display of the biometric authentication flag symbol 2162 with it), indicating that the biometric authentication was successful. In some examples, the electronic device ceases display of the alternative authentication affordance 2156. Additionally or alternatively, the electronic device displays the unlock status indicator 2122 (e.g., replacing the display of the lock status indicator 2161 with it) and/or outputs a tactile output 2164 indicating that the biometric authentication was successful.
As described, in some cases, the electronic device receives a request to access secure content before biometric authentication has been completed. In some examples, the electronic device receives the request after the electronic device begins processing the biometric data but before the biometric authentication is completed. In such cases, the electronic device optionally displays the plug-in interface 2154 with the biometric authentication flag symbol 2162, and omits first displaying the biometric authentication flag symbol 2160.
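This glyph selection can be modeled as a lookup keyed on the phase the authentication is in when the request arrives: if processing has already begun, the initial glyph is simply never shown. The phase names and glyph labels below are illustrative stand-ins for the figures' reference numerals:

```python
# Sketch of glyph selection for the plug-in interface: show the glyph for
# whichever authentication phase is current at the moment of the request.

GLYPH_FOR_PHASE = {
    "starting": "glyph_2160",    # biometric authentication just initiated
    "processing": "glyph_2162",  # rotating-rings processing indicator
    "success": "glyph_2163",     # success indicator
}

def glyph_on_request(phase):
    """Pick the glyph shown when the secure-content request arrives."""
    return GLYPH_FOR_PHASE[phase]

assert glyph_on_request("starting") == "glyph_2160"
# Request received mid-processing: the starting glyph is skipped entirely.
assert glyph_on_request("processing") == "glyph_2162"
```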
In some examples, one or more functions of the electronic device are selectively enabled based on whether the user is looking at the electronic device 2100. Referring to figs. 21T-21Y, in some examples, some functions are disabled when the user is not looking at the electronic device 2100 and are enabled when the user is looking at the electronic device 2100. In fig. 21T, the user's gaze 2165 is not directed at the electronic device 2100. In response to determining that the gaze 2165 is not directed at the electronic device 2100, the electronic device 2100 disables the respective functions associated with the flashlight affordance 2107 and the camera affordance 2108, as shown in fig. 21U. When the functionality associated with the flashlight affordance 2107 and the camera affordance 2108 is disabled (e.g., when the user is not looking at the device 2100), the electronic device receives an activation of the camera affordance 2108. As shown, the activation is a tap gesture 2166 on the camera affordance 2108. Because the functionality associated with the affordance is disabled, the electronic device forgoes responding to the tap gesture 2166 (e.g., forgoes launching the camera application).
In fig. 21V, the user's gaze 2168 is directed at the electronic device. In response to determining that gaze 2168 is directed at the electronic device 2100, the electronic device 2100 enables the respective functionality associated with the flashlight affordance 2107 and the camera affordance 2108, as shown in fig. 21W. In FIG. 21X, the electronic device 2100 detects activation of the camera affordance 2108. As shown, the activation is a tap gesture 2170 on the camera affordance 2108. In response to activation of the camera affordance 2108, the electronic device displays a camera application interface 2132 associated with the camera application on the display 2102 (e.g., replacing the display of the lock status interface 2110) (fig. 21Y).
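The gaze gating of figs. 21T-21Y amounts to discarding a tap unless the user's gaze is directed at the device. A minimal sketch, with assumed function and action names:

```python
# Sketch of gaze-gated affordances: honor a tap on the flashlight or camera
# affordance only while the user is looking at the device; otherwise ignore it.

def handle_affordance_tap(affordance, gaze_on_device):
    """Return the action to perform, or None when the tap is discarded."""
    if not gaze_on_device:
        return None                      # forgo responding to the tap
    actions = {"flashlight": "enable_flashlight", "camera": "open_camera_app"}
    return actions.get(affordance)

assert handle_affordance_tap("camera", gaze_on_device=False) is None
assert handle_affordance_tap("camera", gaze_on_device=True) == "open_camera_app"
assert handle_affordance_tap("flashlight", gaze_on_device=True) == "enable_flashlight"
```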
Referring to FIG. 21Z, in some examples, the electronic device 2100 displays the lock state interface 2110 including the notification affordance 2174 while in the lock state. As described, in response to a wake condition, the electronic device initiates biometric authentication. Upon displaying the lock status interface 2110 and prior to completing biometric authentication, the electronic device 2100 receives a request to access secure content. By way of example, in FIG. 21AA, electronic device 2100 detects activation of notification affordance 2174. As shown, the activation of notification affordance 2174 is a tap gesture 2176.
Referring to fig. 21AB, in response to activation of the notification affordance 2174, the electronic device 2100 displays an inserted biometric authentication interface 2178 with a biometric authentication progress indicator 2182 and an alternative authentication affordance 2180 (e.g., replacing display of the lock state interface 2110 therewith). In some examples, the biometric authentication progress indicator 2182 includes a biometric authentication flag symbol, such as the biometric authentication flag symbol 2183, that indicates the progress of biometric authentication. In some examples, the biometric authentication progress indicator further identifies secure content associated with the request for access to the secure content (e.g., the "message"). Activation of the alternative authentication affordance 2180 causes the electronic device to display an alternative authentication interface, an example of which is described further below.
While the plug-in biometric authentication interface 2178 is displayed, the electronic device 2100 continues to perform the biometric authentication initiated in response to the wake condition. In some examples, initiating biometric authentication includes obtaining (e.g., capturing using the one or more biometric sensors) data corresponding to a biometric characteristic of the user. Referring to fig. 21AC, in response to obtaining the data, the electronic device processes the biometric data, e.g., to determine whether the biometric feature satisfies a biometric authentication criterion based on the biometric data (e.g., to determine whether the biometric data matches the biometric template within a threshold). As the electronic device 2100 processes the biometric data, the electronic device 2100 optionally displays the biometric authentication marker 2184 in the plug-in biometric authentication interface 2178 (e.g., replacing the display of the biometric authentication marker 2183 with it), indicating that the biometric data is being processed.
In fig. 21AD, the electronic device 2100 determines that the biometric characteristic satisfies the biometric authentication criterion. In response, the electronic device displays (e.g., replaces the display of) the biometric authentication marker 2185 in the plug-in biometric authentication interface 2178 indicating that the biometric authentication is successful. Additionally or alternatively, the electronic device displays the unlock status indicator 2122 (e.g., replacing the display of the lock status indicator 2112 with it) and/or outputs a tactile output 2164 indicating that the biometric authentication was successful.
As shown in figs. 21AE-21AF, in response to determining that biometric authentication is successful, the electronic device 2100 displays an instant messaging application interface 2194 (e.g., replacing the display of the plug-in biometric authentication interface 2178 therewith). In some examples, displaying the instant messaging application interface 2194 includes sliding the biometric authentication interface 2178 in an upward direction to display (e.g., expose) the instant messaging application interface 2194, as similarly described with reference to figs. 19P-19R.
Figs. 21AG-21AI illustrate an alternative manner in which the progress of biometric authentication is displayed. As described with reference to fig. 21AA (and as shown in fig. 21AG), prior to completing biometric authentication, the electronic device 2100 receives a request to access secure content while displaying the lock state interface 2110. In some examples, the request is an activation 2176 of the notification affordance 2174. In response to activation 2176 of the notification affordance 2174, the electronic device maintains display of the lock state interface 2110. In addition, as shown in fig. 21AH, the electronic device 2100 displays a biometric authentication flag symbol 2184 (e.g., replacing the display of the lock status indicator 2112 therewith) to indicate that biometric data is being processed. In fig. 21AI, the electronic device determines that the biometric authentication was successful and, in response, displays the unlock status indicator 2122 (e.g., replacing the display of the biometric authentication flag symbol 2184 therewith). Optionally, the electronic device 2100 also outputs a tactile output 2193 indicating that the biometric authentication was successful. In some examples, because the electronic device transitioned to the unlocked state in response to determining that the biometric authentication was successful, the electronic device 2100 displays a notification affordance 2175 (e.g., replaces the display of the notification affordance 2174 with it). In some examples, the notification affordance 2175 identifies secure content (e.g., "John applied … meeting where …").
In fig. 21AJ, in response to processing the biometric data (as described with reference to fig. 21AC), the electronic device determines that the biometric authentication was unsuccessful. In response, the electronic device 2100 displays a biometric authentication flag symbol 2189 in the biometric authentication interface 2178 indicating that the biometric authentication was unsuccessful (e.g., replacing the display of the biometric authentication flag symbol 2184 therewith). Additionally or alternatively, the electronic device alternates the position of the lock status indicator 2112 to simulate a "shake" effect to indicate that the biometric authentication was unsuccessful and/or outputs a tactile output 2193 indicating that the biometric authentication was unsuccessful.
Upon display of the plug-in biometric authentication interface 2178, the electronic device detects activation of the alternative authentication affordance 2180. In some examples, the activation of the alternative authentication affordance 2180 is a tap gesture 2192. Referring to fig. 21AK, in response to activation of the alternative authentication affordance 2180, the electronic device displays an alternative authentication interface 2198. In some examples, the alternative authentication interface 2198 includes an indicator 2199 that identifies secure content associated with the request to access the secure content (e.g., a "message").
Referring to figs. 21AL-21AM, a valid password (or passcode) is received by the electronic device 2100 at least partially in response to a tap gesture 2102A (fig. 21AL) and optionally one or more other inputs indicating additional alphanumeric digits of the valid password. As shown in figs. 21AN-21AO, once a valid password has been received, the electronic device is unlocked and displays an instant messaging application interface 2194 (e.g., replacing the display of the alternative authentication interface 2198 therewith). In some examples, displaying the instant messaging application interface 2194 includes sliding the alternative authentication interface 2198 in an upward direction to display (e.g., expose) the instant messaging application interface 2194, as similarly described with reference to figs. 19P-19R.
In some examples, in response to determining that the biometric authentication was unsuccessful, the electronic device also determines that a threshold number of biometric authentication attempts has been reached. Thus, as shown in fig. 21AP, the electronic device 2100 indicates that the threshold has been reached using the biometric authentication progress indicator 2182 ("facial authentication temporarily disabled"). As shown in fig. 21AK, upon display of the plug-in biometric authentication interface 2178, the electronic device detects activation of the alternative authentication affordance 2180 and, in response to activation of the alternative authentication affordance 2180, displays the alternative authentication interface 2198. As shown in fig. 21AQ, if the electronic device determines that a threshold number of biometric authentication attempts has been reached, the indicator 2199 indicates that biometric authentication will be re-enabled in response to entry of a valid password ("enter password to re-enable facial authentication").
Figs. 22A-22F are flow diagrams illustrating methods for performing biometric authentication using an electronic device, according to some examples. The method 2200 is performed at a device (e.g., 100, 300, 500, 2100) having a display, one or more input devices (e.g., a touchscreen, a microphone, a camera), and a wireless communication radio (e.g., a bluetooth connection, a WiFi connection, a mobile broadband connection such as a 4G LTE connection). In some examples, the display is a touch-sensitive display. In some examples, the display is not a touch-sensitive display. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the device includes one or more biometric sensors, which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the device further comprises a light emitting device, such as an IR floodlight, a structured light projector, or a combination thereof. The light emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors. Some operations in method 2200 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
Method 2200 provides an intuitive way to perform authentication of biometric features, as described below. The method reduces the cognitive burden of the user in performing authentication of the biometric features, thereby creating a more efficient human-machine interface and a more intuitive user experience. For battery-driven computing devices, enabling a user to more quickly and efficiently manage authentication of biometric features conserves power and increases the interval between battery charges.
In some examples, an electronic device (e.g., 2100) receives (2202) a request to perform an operation that does not require authentication while the device is in a locked state. In some examples, the electronic device performs the operation without waiting for authentication in response to a request to perform an operation that does not require authentication. Performing an operation that does not require authentication without waiting for authentication allows the user to more quickly access the operation without having to provide additional input (e.g., input that instructs the device to proceed). Performing an operation (automatically) when a set of conditions has been met without further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the request to perform the operation that does not require authentication includes a request (e.g., 2130) to enable a camera of the electronic device and/or access a camera function of the device, such as displaying a camera user interface (e.g., 2132) for capturing images and/or video with the device. In some examples, the operation that does not require authentication includes displaying an application user interface that includes one or more constrained features that are constrained without successful authentication (e.g., sharing captured photos or videos, viewing photos or videos captured during previous use of the camera application when the device is unlocked), and the device attempts biometric authentication while displaying the application user interface. Displaying an application user interface that includes one or more constrained features that are constrained without successful authentication provides visual feedback by allowing a user to quickly see which features of the application are currently constrained without proper authentication. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. The restricted feature is enabled if the biometric authentication is successful when the application user interface is displayed, and remains disabled if the biometric authentication is unsuccessful (e.g., the user is prevented from sharing captured photos or videos, viewing photos or videos captured during previous use of the camera application when the device is unlocked, and optionally prompted to provide authentication in response to attempting to use any of the restricted features).
In some examples, while the electronic device is in the locked state, the electronic device displays one or more affordances (e.g., 2107, 2108) on a touch-sensitive display (e.g., 2102) for performing operations that do not require authentication (e.g., a flashlight affordance for enabling a flashlight mode of operation in which a light on the device is turned on and/or a camera affordance for accessing camera functionality of the device, such as displaying a camera user interface for capturing images and/or video with the device).
In some examples, while displaying the one or more affordances for performing operations that do not require authentication, the electronic device detects activation (e.g., 2130) of a respective affordance (e.g., 2107, 2108) of the one or more affordances for performing operations that do not require authentication. In some examples, in response to detecting activation of a respective affordance of the one or more affordances for performing operations that do not require authentication, in accordance with a determination that a face is looking at a display of the electronic device when the activation of the respective affordance is detected (e.g., a determination that a face having facial features indicating that the face is looking at the electronic device is in a field of view of one or more cameras or one or more biometric sensors of the device), the electronic device performs an operation associated with the respective affordance. Performing the operation associated with the respective affordance in accordance with a determination that the face (e.g., of the user) is looking at the display of the device reduces power usage and extends battery life of the device (e.g., perform the operation when the device detects that the user is looking at the device, and not perform the operation if the user is not looking at the device (which optionally indicates that the affordance was inadvertently selected)).
In some examples, if the flashlight affordance (e.g., 2107) is activated when the face is looking at the display of the electronic device, the electronic device enables a flashlight mode of operation in which a light on the device is turned on, and/or if the camera affordance (e.g., 2108) is activated when the face is looking at the display of the electronic device, the electronic device accesses camera functionality of the device, such as displaying a camera user interface for capturing images and/or video with the device. In some examples, in accordance with a determination that a face is not looking at a display of the electronic device when activation of the respective affordance is detected (e.g., a determination that a face is not detected or detected in a field of view of one or more cameras or one or more biometric sensors of the device but has facial features indicating that the face is looking away from the electronic device), the electronic device forgoes performing an operation associated with the respective affordance. In some examples, if the flashlight affordance is activated when the face is not looking at the display of the electronic device, the electronic device forgoes enabling a flashlight mode of operation in which a light on the device is turned on, and/or if the camera affordance is activated when the face is not looking at the display of the electronic device, the electronic device forgoes accessing camera functionality of the device, such as displaying a camera user interface for capturing images and/or video with the device.
In some examples, while the electronic device is in the locked state, the electronic device detects a condition associated with performing a biometric authentication check using a biometric sensor without explicit input from the user requesting biometric authentication (e.g., the user lifts the device 2100 to the position shown in fig. 21B). Conditions associated with using a biometric sensor to perform a biometric authentication check without explicit input from the user requesting biometric authentication include raising the device and/or pressing a display wake-up button (e.g., 2104).
In some examples, the one or more biometric sensors include a contactless biometric sensor (e.g., 2103) (e.g., a facial recognition sensor) configured to capture biometric data (2204) associated with biometric features located within a predetermined distance range from the contactless biometric sensor (e.g., 2103). In some examples, the biometric sensor includes a camera. In some examples, the biometric sensor includes a light projector (e.g., an IR flood or structured light projector).
In some examples, without successful authentication, the device is constrained to not perform more than a predefined number of biometric authentication checks (2206). Restricting the device from performing more than a predefined number of biometric authentication checks without successful authentication enhances device security by limiting fraudulent authentication attempts on the device. Improving the security measures of a device enhances the operability of the device by preventing unauthorized access to content and operations, and in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more efficiently.
In some examples, successful authentication includes successful authentication by biometric authentication or any other form of authentication, such as with a password or pattern. In some examples, in response to detecting the condition, the device performs less than a predefined number of biometric authentication checks, such that at least one biometric authentication check is retained for use in responding to detecting the request to perform the respective operation. In some examples, the electronic device tracks a number of failed authentication attempts, such as a number of consecutive failed attempts during which there is no successful authentication (e.g., biometric authentication or other authentication, such as password authentication). In some such examples, if the maximum number of failed attempts has been reached, the device does not perform biometric authentication until a successful non-biometric authentication is received. In some examples, a request to perform an operation requiring authentication after the maximum number of failed biometric authentication checks has been reached triggers the display of an alternative authentication user interface (e.g., a password, pattern, or other authentication interface).
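The failed-attempt tracking described above can be sketched as a small counter with a lockout. This is a hypothetical illustration; the class name, the limit of 5, and the method names are assumptions, not details from the patent.

```python
class BiometricAttemptLimiter:
    """Tracks consecutive failed biometric checks. After MAX_ATTEMPTS
    failures with no successful authentication, biometric checks are
    disabled until a successful non-biometric (e.g., passcode)
    authentication resets the counter."""
    MAX_ATTEMPTS = 5  # illustrative limit only

    def __init__(self):
        self.consecutive_failures = 0

    def biometric_allowed(self):
        # Biometric checks are permitted only below the failure ceiling.
        return self.consecutive_failures < self.MAX_ATTEMPTS

    def record_biometric_failure(self):
        self.consecutive_failures += 1

    def record_successful_authentication(self):
        # Any successful authentication (biometric or passcode) resets the count.
        self.consecutive_failures = 0
```

Reserving at least one check for the explicit request (as the text notes) would correspond to stopping implicit checks at `MAX_ATTEMPTS - 1`.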
In some examples, the electronic device detects (2208) a display wake condition while a display (e.g., 2102) of the electronic device is disabled. In some examples, the display wake condition includes movement of the device in a predefined manner, such as movement of the device beyond a threshold amount, movement of the device into an orientation associated with waking the device, activation of a display wake button, or a gesture, such as a tap, on the touch-sensitive surface.
In some examples, in response to detecting the condition, the electronic device performs (2210) a first biometric authentication check. Performing a biometric authentication check in response to detecting a condition (e.g., a wake condition) allows a user to provide authentication information to a device with minimal input and respond more quickly and efficiently to detecting a wake condition. Performing an operation (automatically) when a set of conditions has been met without further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, performing the first biometric authentication check includes capturing (2212) first biometric data using the one or more biometric sensors. In some examples, the electronic device initiates a first biometric authentication procedure that includes capturing first biometric data using the one or more biometric sensors. In some examples, performing the first biometric authentication check includes transitioning (2216) the device from the locked state to the unlocked state after capturing the first biometric data (2214) (e.g., in response to capturing the first biometric data or in response to a request to unlock the device) in accordance with a determination that the first biometric data satisfies biometric authentication criteria. Transitioning the device from the locked state to the unlocked state in accordance with a determination that the first biometric data satisfies the biometric authentication criteria enhances device security by unlocking the device when the authentication process is successful (but, in some examples, if the authentication is unsuccessful, the device is prohibited from being unlocked). Improving the security measures of a device enhances the operability of the device by preventing unauthorized access to content and operations, and in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more efficiently.
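The first biometric authentication check and its state transition (steps 2212–2216 above) reduce to a conditional state change. A minimal sketch, assuming biometric matching is abstracted to a direct comparison of captured data against an enrolled template:

```python
def first_biometric_check(captured_data, enrolled_template, state):
    """On a wake condition, evaluate captured biometric data and
    transition the device from 'locked' to 'unlocked' only if the data
    satisfies the biometric authentication criteria; otherwise maintain
    the locked state."""
    if state == "locked" and captured_data == enrolled_template:
        return "unlocked"
    return state  # failed check: device remains in its current state
```

In a real system the equality test would be a similarity match against a stored biometric template, not an exact comparison.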
In some examples, the electronic device determines whether the first biometric data meets biometric authentication criteria. In some examples, the biometric authentication criteria include a criterion that is met when the first biometric data matches biometric data (e.g., facial feature data, fingerprint data, iris data) corresponding to an authorized user. In some examples, while in the unlocked state and prior to detecting the request to perform the respective operation, the electronic device outputs (2218) a prompt (e.g., a visual, audio, or tactile output) in accordance with the prompt criteria, the prompt corresponding to the instruction to provide the request to perform the respective operation. In some examples, the device is in an unlocked state after detecting the face of the authorized user. In some examples, the electronic device displays an instruction (e.g., 2124) to "swipe up" to access the home screen (e.g., 2129). In some examples, the prompt criteria include a requirement that a user's gaze (e.g., 2168) be directed at the electronic device (2220). In some examples, the prompting criteria include a requirement that the device detect facial features (2222) that indicate a face looking at the electronic device (e.g., detecting that a user's gaze is directed at the electronic device) for at least a predetermined amount of time.
In some examples, in accordance with a determination that the first biometric data does not satisfy the biometric authentication criteria, the electronic device maintains (2224) the device in a locked state. Maintaining the device in the locked state in accordance with a determination that the first biometric data does not satisfy the biometric authentication criteria enhances device security by preventing fraudulent and/or unauthorized access to the device. Improving the security measures of a device enhances the operability of the device by preventing unauthorized access to content and operations, and in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more efficiently. In some examples, the device remains locked in response to a failed authentication if the biometric data corresponding to the biometric feature does not match the biometric authentication template.
In some examples, after performing the first biometric authentication check, the electronic device detects (2226), via the device, the request (e.g., 2134, 2176) to perform the respective operation without receiving further authentication information from the user. In some examples, detecting, via the device, the request to perform the respective operation without receiving further authentication information from the user includes detecting (2228) a request to display content that is unavailable for display when the electronic device is in the locked state. In some examples, the user input is an input that requires access to secure data (such as a home screen or an application presenting secure data). In some examples, the request to perform the respective operation includes a swipe gesture on the device, a swipe gesture from an edge of the device (e.g., 2134), or a press of a home button. In some examples, the request to perform the respective operation includes at least one of: a selection of a notification (e.g., 2176) (e.g., the request to perform the respective operation is a request to display additional information associated with the notification (such as an expanded version of the notification or an application corresponding to the notification)); a swipe gesture (e.g., the request to perform the respective operation is a swipe up on a display of the electronic device; in some examples, the request to perform the respective operation is a swipe up starting from an edge of the display of the electronic device; in some examples, the swipe is specifically a request to display a home screen of the electronic device); movement of the electronic device in a predetermined manner (e.g., the request to perform the corresponding operation includes lifting the electronic device); and a selection of an affordance (2230) (e.g., the request to perform the corresponding operation includes a selection of an affordance displayed by the electronic device (including a "lock" affordance displayed when the electronic device is in a locked state)).
In some examples, while performing the first biometric authentication check, the electronic device receives (2232) a second request (e.g., 2134) to perform a second operation without receiving further authentication information from the user. In some examples, the second request is a swipe gesture, a selection of a notification, and the like. In some examples, in response to receiving the second request (2234) to perform the second operation, in accordance with a determination that the second request to perform the second operation was received after determining that the first biometric data did not satisfy the biometric authentication criteria, the electronic device displays (2236) a second alternative authentication interface (e.g., 2140). In some examples, the second alternative authentication interface is a password, pattern, or fingerprint authentication user interface and is displayed without performing the second operation when the biometric authentication has failed at least once. In some examples, in accordance with a determination that a second request to perform a second operation is received prior to evaluating the first biometric data (e.g., prior to determining whether the first biometric data satisfies the biometric authentication criteria), the electronic device displays (2238) a biometric authentication indicator (e.g., 2156, 2162) without displaying a second alternative authentication interface, the biometric authentication indicator including an indication that biometric authentication is being attempted. In some examples, the second alternative authentication interface is a password, pattern, or fingerprint authentication user interface and is not displayed, and the second operation is not performed, when the device has not yet completed the first biometric authentication attempt.
In some examples, if the user swipes upward while the electronic device is performing the first iteration of biometric authentication, the electronic device displays an interstitial interface (e.g., 2154) in which the processing status of biometric authentication is indicated. In some examples, in response to receiving the second request to perform the second operation in accordance with the determination that the second request to perform the second operation was received after the first biometric data was determined to satisfy the biometric authentication criteria, the electronic device performs the second operation without displaying an alternative authentication interface (e.g., a password, pattern, or fingerprint authentication user interface, which would be displayed if the biometric authentication had failed at least once). In some examples, the biometric authentication indicator displayed in response to receiving the second request to perform the second operation in accordance with the determination that the second request to perform the second operation was received prior to evaluating the first biometric data includes an indication of an application associated with the notification (e.g., 2182). In some examples, if the user selects the notification while the device is performing the first biometric authentication check, the device indicates the application associated with the notification. By way of example, if the user selects a messaging notification, the device displays an indication of the instant messaging application, such as "perform biometric authentication to view message" or "Face ID to view message".
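The branching described above for a second request received during the first check can be summarized in one dispatch on the status of the first check. A minimal sketch, with illustrative return tokens (none of these names come from the patent):

```python
def handle_second_request(first_check_result):
    """Dispatch a second request based on the first biometric check:
    None  -> check still in progress: show only a progress indicator,
    False -> check already failed: show the alternative (passcode) UI,
    True  -> check already succeeded: perform the operation directly."""
    if first_check_result is None:
        return "show_biometric_progress_indicator"
    if first_check_result is False:
        return "show_alternative_authentication_interface"
    return "perform_operation"
```

The design choice here is that the alternative authentication interface is surfaced only after a known failure, never while the outcome is still pending.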
In some examples, in response to detecting the request (2240) to perform the respective operation, in accordance with a determination that the respective operation does not require authentication, the electronic device performs the respective operation (2242). Performing the respective operation without successful authentication in accordance with a determination that the respective operation does not require authentication allows the user to more quickly access the operation without having to provide additional input (e.g., input that instructs the device to proceed). Performing an operation (automatically) when a set of conditions has been met without further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, if the operation does not require authentication, the electronic device performs the operation regardless of whether the device is in a locked state or an unlocked state. In some examples, the device does not check the authentication if the corresponding operation (such as using a camera or making an emergency call) does not require authentication.
In some examples, in accordance with a determination that the respective operation requires authentication and the device is in an unlocked state, the electronic device performs the respective operation (2244). Performing the respective operation when the device is in the unlocked state in accordance with a determination that the respective operation requires authentication allows the user to more quickly access the operation without having to provide additional input (e.g., input that instructs the device to proceed). Performing an operation (automatically) when a set of conditions has been met without further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, in accordance with a determination that the respective operation requires authentication and the device is in a locked state (2246), the electronic device captures second biometric data using the one or more biometric sensors without explicit input from the user requesting a second biometric authentication check (2248). Capturing the second biometric data without explicit input from the user requesting a second biometric authentication check in accordance with a determination that the respective operation requires authentication and the device is in a locked state enhances device security by requiring successful authentication while the device is in the locked state and thus preventing fraudulent and/or unauthorized access to the device. Improving the security measures of a device enhances the operability of the device by preventing unauthorized access to content and operations, and in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more efficiently.
In some examples, after capturing the second biometric data, the electronic device performs (2250) a second biometric authentication check. In some examples, the first biometric data and the second biometric data are compared to the same set of biometric criteria. In some examples, the first biometric data and the second biometric data are compared to a respective set of biometric criteria. In some examples, performing the second biometric authentication check includes performing (2252) a respective operation in accordance with the determination that the second biometric data satisfies the biometric authentication criteria. In some examples, the electronic device optionally further transitions the device from the locked state to the unlocked state. In some examples, performing the second biometric authentication check includes forgoing (2254) performing the respective operation (and optionally maintaining the device in the locked state) in accordance with a determination that the second biometric data does not satisfy the biometric authentication criteria. In some examples, forgoing performance of the respective operation includes maintaining (2256) the device in the locked state. In some examples, forgoing performance of the respective operation includes displaying (2258) an alternative authentication user interface (e.g., 2140, 2198). In some examples, the alternative authentication interface is a password, pattern, or fingerprint authentication user interface. In some examples, while displaying the alternative authentication user interface, the electronic device detects an alternative authentication attempt (e.g., a password input including tap gesture 2102A) corresponding to the alternative authentication user interface. In some examples, an alternative authentication attempt corresponding to an alternative authentication user interface includes entering a password, entering a pattern, or a fingerprint detected on a fingerprint sensor.
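Steps 2240–2256 above form a single decision flow for any request to perform an operation. A minimal sketch under assumed names, again abstracting biometric matching to an equality test against a template:

```python
def handle_request(requires_auth, state, capture_biometric, template):
    """Flow for a request to perform an operation:
    - no authentication needed -> perform immediately (e.g., camera,
      emergency call),
    - device already unlocked  -> perform,
    - device locked            -> implicitly capture second biometric
      data (no explicit user request) and run a second check."""
    if not requires_auth:
        return "perform"
    if state == "unlocked":
        return "perform"
    # Locked: capture second biometric data without explicit user input.
    second_data = capture_biometric()
    if second_data == template:
        return "perform"  # optionally also transition to unlocked
    # Forgo the operation: stay locked and fall back to alternative auth.
    return "show_alternative_authentication_interface"
```

`capture_biometric` is passed as a callable to emphasize that the second capture happens implicitly, as part of handling the request rather than in response to a separate authentication request.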
In some examples, in response to detecting an alternative authentication attempt corresponding to an alternative authentication user interface, the electronic device performs a respective operation in accordance with a determination that the authentication attempt was successful (e.g., the provided authentication information is consistent with stored authentication information (such as a stored password, a stored pattern, or stored fingerprint information)) and that biometric data corresponding to the alternative authentication attempt (e.g., the second biometric data or biometric data captured while the authentication attempt was in progress, such as when the last character of the password was entered or the password submission button was selected) satisfies a first similarity criterion with respect to the stored biometric data corresponding to an authorized user of the device. In some examples, the electronic device stores additional information based on biometric data corresponding to alternative authentication attempts as biometric data that can be used in future biometric authentication attempts to identify an authorized user of the device. In some examples, the electronic device learns changes in the user's face for authenticating the user in future authentication attempts. In some examples, further in response to detecting an alternative authentication attempt corresponding to an alternative authentication user interface, in accordance with a determination that the authentication attempt is successful and that biometric data corresponding to the alternative authentication attempt satisfies a first similarity criterion with respect to stored biometric data corresponding to an authorized user of the device, the electronic device outputs an output (e.g., a visual, audio, and/or tactile output) indicating that information (e.g., a biometric template) used in future biometric authentication attempts to identify the authorized user of the device has been modified.
In some examples, the electronic device displays an indication that the biometric data has been updated to better identify the user's face.
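The template-learning rule just described (update the stored biometric data only when a successful passcode is accompanied by biometric data that is similar enough to the enrolled data) can be sketched as follows. The threshold value and all names are illustrative assumptions, not values from the patent:

```python
def maybe_update_template(passcode_ok, similarity, threshold=0.8):
    """After an alternative (passcode) authentication attempt:
    - success + biometric data meets the first similarity criterion ->
      perform the operation AND fold the new data into the template,
    - success alone -> perform the operation, leave the template as-is,
    - failure -> forgo the operation and store nothing."""
    if passcode_ok and similarity >= threshold:
        return ("perform_operation", "template_updated")
    if passcode_ok:
        return ("perform_operation", "template_unchanged")
    return ("forgo_operation", "template_unchanged")
```

The similarity gate matters for security: a correct passcode alone is not treated as evidence that the face in front of the device belongs to the authorized user.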
In some examples, the electronic device performs a respective operation in accordance with a determination that the authentication attempt was successful (e.g., the provided authentication information is consistent with stored authentication information, such as a stored password, a stored pattern, or stored fingerprint information) and that biometric data corresponding to an alternative authentication attempt (e.g., the second biometric data or biometric data captured while the authentication attempt was in progress, such as when the last character of the password was entered or the password submission button was selected, or shortly thereafter) does not satisfy a first similarity criterion with respect to the stored biometric data corresponding to an authorized user of the device. Performing the respective operations in accordance with a determination that the authentication attempt was successful and that the biometric data corresponding to the alternative authentication attempt does not satisfy the first similarity criteria with respect to the stored biometric data provides the user with an alternative method for accessing an operation of the device (e.g., a locked operation) that requires successful authentication when the biometric data does not correspond to the stored biometric data. Providing additional control options with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the electronic device does not store additional information based on biometric data corresponding to alternative authentication attempts as biometric data that may be used in future biometric authentication attempts to identify an authorized user of the device. In some examples, in response to detecting an alternative authentication attempt corresponding to an alternative authentication user interface, in accordance with a determination that the authentication attempt was unsuccessful (e.g., the provided authentication information is inconsistent with stored authentication information, such as a stored password, a stored pattern, or stored fingerprint information), the electronic device forgoes performing the respective operation and does not store additional information based on biometric data corresponding to the alternative authentication attempt as biometric data that may be used in future biometric authentication attempts to identify an authorized user of the device.
In some examples, in response to detecting a request to perform a respective operation and in accordance with a determination that the respective operation requires authentication and the device is in a locked state, the electronic device displays an alternative authentication interface. Providing an alternative authentication interface (e.g., to provide an alternative method for providing authentication, alternatively or in addition to biometric authentication) allows a user to easily provide authentication for operation using a different authentication method when the current authentication method is unsuccessful or continues to be unsuccessful. Providing additional control options in this manner (e.g., for providing authentication) without cluttering the UI with additional displayed controls enhances operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, in response to a user requesting access to secure data after a failure of a first iteration of biometric authentication, the electronic device displays an alternative authentication interface (e.g., 2140, 2198), such as a password or passcode interface. Displaying an authentication interface (such as a password or passcode interface) in response to a user requesting access to secure data after a first iteration of biometric authentication fails provides the user with a quick alternative method for accessing an operation of the device (e.g., a locked operation) that requires successful authentication when biometric authentication is unsuccessful. Providing additional control options with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, an alternative authentication interface is displayed after an additional time delay and/or after another biometric authentication attempt has failed (e.g., as described in more detail with reference to figs. 26A-26AS). In some examples, the biometric authentication criteria include a requirement that authentication using the alternative authentication interface has not begun in order for the biometric authentication criteria to be met (2260). In some examples, in response to biometric authentication being at least partially attempted while an alternative authentication user interface is displayed: in accordance with a determination that the biometric authentication is successful and authentication using the alternative authentication interface has not begun, the electronic device performs the corresponding operation; and in accordance with a determination that authentication using the alternative authentication interface has begun (e.g., a determination that at least a portion of a credential, such as a partial password, pattern, or passcode, has been received using the alternative authentication interface), the electronic device forgoes performing the respective operation based on the biometric authentication. In some examples, the electronic device waits to perform the second biometric authentication until the user has finished providing the password. In some examples, the device delays displaying the alternative authentication user interface until after the second biometric authentication check has failed (e.g., as described in more detail with reference to figs. 26A-26AS), and the device performs a third biometric authentication check after the alternative authentication user interface has been displayed.
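The rule above, that a background biometric success is honored only if the user has not yet begun entering a credential in the alternative interface, can be sketched in a few lines. Names and return tokens are illustrative assumptions:

```python
def biometric_result_during_passcode(biometric_ok, credential_started):
    """While the alternative authentication UI is displayed, a background
    biometric success completes the operation only if the user has not
    begun entering any portion of a credential; otherwise the device
    defers to the in-progress manual entry."""
    if biometric_ok and not credential_started:
        return "perform_operation"
    return "wait_for_passcode"
```

This avoids the jarring experience of the device unlocking mid-keystroke, and keeps the user's explicit credential entry authoritative once it has begun.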
In some examples, the second biometric authentication check is performed while an alternative authentication interface (e.g., 2140) is displayed (2262). Performing the second biometric authentication check while displaying the alternative authentication interface enhances the operability of the device by, in some examples, completing the second biometric authentication check before the user has completed providing manual alternative authentication input, thereby making the user-device interface more efficient.
In some examples, the alternative authentication interface is a password, pattern, or fingerprint authentication user interface. In some examples, performing at least a portion of the second biometric authentication check includes performing at least a portion of the second biometric authentication check while displaying the alternate authentication interface. In some examples, biometric authentication is performed during password entry. In some examples, a biometric authentication UI is displayed on the password entry interface (e.g., biometric progress indicators 2142, 2144, and 2146).
In some examples, the electronic device determines that the biometric authentication criteria have been met when an alternative authentication user interface (e.g., a password, pattern, or fingerprint authentication user interface) is displayed. In some examples, in response to determining that the biometric authentication criteria have been met, the electronic device performs a corresponding operation. Performing the respective operations in response to determining that the biometric authentication criteria have been met when displaying the alternative authentication user interface enhances operability of the device by completing a second biometric authentication check before the user completes providing manual alternative authentication input in some examples, thereby making the user-device interface more efficient. In some examples, the biometric authentication criteria include a requirement that the user has not used an alternative authentication interface to enter at least a portion of the credential. In some examples, if the user has begun to enter an alternative form of authentication (such as a password, pattern, or fingerprint), the corresponding operation is not performed even if there is a successful biometric authentication.
In some examples, in response to detecting a request to perform a respective operation and in accordance with a determination that the respective operation requires authentication and the device is in the locked state, the electronic device displays an authentication indication (e.g., a progress indicator or another indication that biometric authentication is being attempted) for biometric authentication without displaying an option to proceed with an alternative form of authentication. In some examples, the electronic device displays an authentication indication for biometric authentication without displaying an alternative authentication interface and/or without displaying selectable options for displaying an alternative authentication interface. In some examples, when attempting biometric authentication in response to a request to perform a corresponding operation, the device foregoes providing an option for an alternative form of authentication in order to indicate to the user that the biometric authentication has not failed (e.g., as described in more detail with reference to figs. 26A-26AS).
In some examples, the electronic device receives a request to store additional information for use in biometric authentication while the device is in the unlocked state (e.g., in a biometric enrollment user interface in a device settings user interface or a system preferences user interface). In some examples, the electronic device captures third biometric data in response to a request to store additional information for use in biometric authentication. In some examples, the electronic device stores additional information based on the third biometric data that is usable in future biometric authentication attempts to identify the authorized user of the device in accordance with a determination that the third biometric data satisfies a second similarity criterion with respect to the stored biometric data corresponding to the authorized user of the device further in response to the request to store the additional information for biometric authentication. Storing additional information based on third biometric data that can be used in future biometric authentication attempts to identify an authorized user of the device while the device is in the unlocked state (e.g., and disabling such operation while the device is in the locked state) enhances device security and thus future access to the device by preventing fraudulent and/or unauthorized attempts to the stored biometric authentication data on the device. Improving the security measures of a device enhances the operability of the device by preventing unauthorized access to content and operations, and in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more efficiently.
In some examples, the electronic device learns changes in the user's face for authenticating the user in future authentication attempts. In some examples, the similarity between the third biometric data and the stored biometric data required by the second similarity criterion is less than the similarity required by the first similarity criterion. In some examples, while the device is in the unlocked state and in the biometric data enrollment user interface, the device is configured to accept additional biometric data corresponding to a biometric feature that is more different from a currently enrolled biometric feature than when the device is learning a biometric feature detected when an alternative authentication is successfully provided after the biometric authentication has failed. In some examples, in accordance with a determination that the third biometric data does not satisfy the second similarity criteria with respect to the stored biometric data corresponding to the authorized user of the device, the electronic device forgoes storing additional information based on the third biometric data that may be used in future biometric authentication attempts to identify the authorized user of the device.
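The relationship between the two similarity criteria can be made concrete with illustrative thresholds: explicit enrollment (the second criterion) tolerates more variation from the stored data than implicit learning after a passcode fallback (the first criterion). Both numeric values below are hypothetical, chosen only to satisfy the stated ordering:

```python
# Implicit learning after a passcode fallback uses a stricter threshold
# than explicit enrollment in the settings UI (patent: the similarity
# required by the second criterion is LESS than that of the first).
FIRST_SIMILARITY_THRESHOLD = 0.8   # implicit template learning
SECOND_SIMILARITY_THRESHOLD = 0.6  # explicit enrollment

def implicit_learning_accepts(similarity):
    """First criterion: learning from an alternative-auth fallback."""
    return similarity >= FIRST_SIMILARITY_THRESHOLD

def enrollment_accepts(similarity):
    """Second criterion: explicit enrollment while unlocked."""
    return similarity >= SECOND_SIMILARITY_THRESHOLD
```

The ordering reflects a security trade-off: looser acceptance is safe only in the explicit, user-initiated enrollment flow performed while the device is already unlocked.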
It is noted that details of the processes described above with respect to method 2200 (e.g., fig. 22A-22F) may also be applied in a similar manner to the methods described herein. For example, method 2200 optionally includes one or more of the features of the various methods described herein with reference to methods 800, 1000, 1200, 1400, 1600, 1800, 2000, 2500, and 2700. For example, the enrolled biometric data described in method 1200 may be used to perform biometric authentication as described with respect to fig. 21C-21E. As another example, in response to receiving an input prior to completion of the biometric authentication process, one or more plug-in interfaces as described in methods 2000 and 2700 are optionally displayed. For the sake of brevity, these details are not repeated here.
The operations in the above-described information processing method are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general-purpose processor (e.g., as described with respect to fig. 1A, 3, 5A) or an application-specific chip. Further, the operations described above with reference to fig. 22A-22F are optionally implemented by the components depicted in fig. 1A-1B. For example, detection operation 2202, execute operation 2210, capture operation 2212, convert operation 2216, hold operation 2224, detect operation 2226, execute operation 2250, and abort operation 2254 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604 and event dispatcher module 174 delivers the event information to application 136-1. Respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as a selection of an object on the user interface. When a respective predefined event or sub-event is detected, the event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update the content displayed by the application. Similarly, one of ordinary skill in the art will clearly know how other processes may be implemented based on the components depicted in fig. 1A-1B.
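The dispatch path described above — the event sorter delivers an event to the application, a recognizer compares it against its event definition, and a matching recognizer activates its handler — can be sketched as follows. The class names mirror the components of fig. 1A-1B, but the logic is a simplified illustration, not the actual implementation:

```python
class EventRecognizer:
    """Compares incoming events against a predefined event definition."""
    def __init__(self, definition, handler):
        self.definition = definition   # e.g., ("tap", "send-button")
        self.handler = handler         # activated when the definition matches

    def recognize(self, event):
        if event == self.definition:
            return self.handler(event)  # handler updates application state
        return None

class EventSorter:
    """Delivers each event to the application's recognizers in turn."""
    def __init__(self, recognizers):
        self.recognizers = recognizers

    def dispatch(self, event):
        for recognizer in self.recognizers:
            result = recognizer.recognize(event)
            if result is not None:
                return result   # a recognizer handled the event
        return None             # no recognizer matched
```

In this sketch a handler signals completion by returning a truthy value; the real components instead update application internal state via data, object, and GUI updaters.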
Fig. 23A-23Q illustrate exemplary user interfaces for managing biometric authentication, according to some examples. As described in more detail below, the exemplary user interfaces shown in fig. 23A-23Q relate to the exemplary user interfaces shown in fig. 24A-24BC, which in turn are used to illustrate the processes described below, including the processes in fig. 25A-25C.
Fig. 23A shows an electronic device 2300 (e.g., portable multifunction device 100, device 300, or device 500). In the illustrative example shown in fig. 23A-23Q, the electronic device 2300 is a smartphone. In other examples, the electronic device 2300 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 2300 has a display 2302, one or more input devices (e.g., a touch screen of the display 2302, button 2304, a microphone (not shown)), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the electronic device includes one or more biometric sensors (e.g., biometric sensor 2303), which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the one or more biometric sensors 2303 are the one or more biometric sensors 703. In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
In fig. 23A, electronic device 2300 displays a photo library user interface 2310 on display 2302. In some examples, the photo library user interface 2310 slides into the display from an edge of the display (e.g., slides up from a bottom edge of the display) to replace the display of previous interfaces, applications, and/or virtual keyboards. In some examples, the photo library user interface 2310 slides up in response to a request to open the photo library application. In some examples, the photo library user interface 2310 slides up in response to a request to transfer photos to participants of a conversation in an instant messaging application.
In some examples, as shown in fig. 23A, the photo library user interface 2310 includes a plurality of selectable preview images corresponding to photos stored on the electronic device 2300 (or accessible to the device via a remote server). In some examples, as also shown in fig. 23A, the plurality of selectable preview images are organized based on time (e.g., the date the photograph was taken) and/or based on location (e.g., the location where the photograph was taken). For example, the plurality of selectable preview images 2312A through 2312F shown under the header 2312 correspond to photographs taken in Cupertino, CA on April 30, and the plurality of selectable preview images 2314A through 2314C shown under the header 2314 correspond to photographs taken in San Francisco, CA yesterday.
In some examples, upon launching the photo library application, electronic device 2300 displays a selectable preview image of photo library user interface 2310 that may be selected (to be transmitted). For example, the plurality of selectable preview images includes selectable preview images 2314A through 2314C. As shown in fig. 23A, a plurality of selectable preview images 2314A-2314C are selectable by a user to be transmitted to participants via one or more applications, such as an instant messaging application or an email application.
In fig. 23B, while displaying the photo library user interface 2310 with selectable preview images 2314A-2314C (corresponding to photos selected to be transferred), the electronic device 2300 detects user activation 2301 of a transmission affordance 2316 (e.g., a button) for initiating transmission of the photos corresponding to the selectable preview images 2314A-2314C. For example, the user activation 2301 is a tap gesture on the transmission affordance 2316.
In fig. 23C, in response to detecting activation of the transmission affordance 2316, the electronic device 2300 provides a prompt 2318. As shown in fig. 23C, in some examples, the prompt 2318 instructs the user to provide one or more activations of button 2304, such as two presses of button 2304. In some examples, the prompt 2318 is highlighted relative to one or more other displayed objects. Highlighting the prompt in this manner includes, for example, darkening, blurring, and/or otherwise obfuscating one or more portions of the photo library user interface 2310.
As also shown in fig. 23C, further in response to detecting activation of the transmission affordance 2316, the electronic device 2300 displays an application selection interface 2320 that includes a plurality of application affordances 2320A-2320H. In some examples, each of the application affordances 2320A-2320H corresponds to an application that may be used to transfer an image (to a different device outside of the electronic device 2300), such as an image corresponding to the selectable preview images 2314A-2314C.
In some examples, the button 2304 has a fixed position relative to the display 2302 and/or one or more other components of the electronic device 2300. In some examples, the prompt 2318 is also displayed in a fixed position relative to the display 2302 and/or one or more other components of the electronic device. Thus, the prompt 2318 is displayed at a predetermined position relative to the button 2304.
In fig. 23D, electronic device 2300 detects user activation 2306 of button 2304 while prompt 2318 (which is optionally overlaid over the photo library user interface) is displayed. In some examples, as shown in fig. 23D, the user activation is a double press of button 2304. In some examples, the two presses of the button 2304 include a first press of the button and a second press of the button that occurs within a predetermined amount of time (e.g., 1 second).
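Detecting a double press within a predetermined interval reduces, in essence, to comparing the timestamps of consecutive presses. A minimal sketch, assuming the 1-second window given as an example above (the function name and press-history representation are illustrative):

```python
def is_double_press(timestamps, window=1.0):
    """True when the last two button presses fall within `window` seconds.

    `timestamps` is the press history in seconds. The 1-second window
    follows the predetermined interval mentioned in the example above;
    it is an assumption for illustration, not a value from any device.
    """
    if len(timestamps) < 2:
        return False
    return timestamps[-1] - timestamps[-2] <= window
```

A single press, or two presses spaced farther apart than the window, would not trigger the biometric authentication flow.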
In response to detecting the one or more activations of button 2304, electronic device 2300 initiates biometric authentication (e.g., facial recognition authentication) for a biometric feature (e.g., face) of the user. As shown in fig. 23E, in some examples, upon initiating biometric authentication, a biometric authentication interface 2322 is provided (e.g., displayed on display 2302). In some examples, during biometric authentication, the biometric authentication interface is overlaid over an application interface, such as the photo library user interface 2310. In some examples, the biometric authentication interface includes a simulation of a representation of a biometric feature, such as the glyph 2324. Further in response to the one or more activations of the button 2304, one or more biometric sensors 2303 of the electronic device 2300, such as one or more cameras or facial recognition sensors (e.g., included in the one or more biometric sensors 2303), are activated.
In some examples, once the one or more biometric sensors 2303 are activated, the electronic device 2300 obtains (e.g., captures) biometric data corresponding to a biometric feature associated with the user. In some examples, the biometric data of the biometric feature is captured using the one or more biometric sensors 2303 of the electronic device (and/or one or more cameras). Optionally, a light-emitting device such as an IR floodlight or a structured light projector is used to help illuminate the biometric feature. In other examples, the electronic device receives the biometric data from another device.
In some examples, once the electronic device 2300 has obtained the biometric data, the electronic device processes (e.g., analyzes) the biometric data to determine whether the biometric authentication was successful. In some examples, the determination includes determining whether the biometric data matches a biometric template associated with the user. The biometric template is optionally stored on the electronic device 2300.
In some examples, as shown in fig. 23F, while processing the biometric data, the biometric authentication interface 2322 indicates that the biometric data is being processed by the electronic device, for example, by displaying one or more rotating rings 2326, as described with reference to fig. 17A-17AI. In some examples, the one or more rotating rings 2326 replace the glyph 2324 within the biometric authentication interface.
If the electronic device 2300 determines that the biometric authentication is successful (e.g., the biometric data matches a biometric template associated with the user), the electronic device transitions from a state in which a function (e.g., image transmission) is disabled to a state in which the function is enabled. By way of example, successful biometric authentication enables the electronic device to transmit (e.g., share) images, such as images corresponding to the selectable preview images 2314A-2314C. In some examples, the electronic device also indicates that the biometric authentication was successful, e.g., by displaying a simulation of the representation of the biometric feature in the biometric authentication interface. As shown in fig. 23G, in some examples, the biometric authentication interface 2322 includes a glyph 2328 that indicates (to the user) that the biometric authentication was successful. In some examples, the glyph 2328 replaces the one or more rotating rings 2326 within the biometric authentication interface 2322.
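The success/failure branching described in this and the surrounding paragraphs amounts to a small state machine: matching biometric data enables the disabled function, while non-matching data leaves the device in its current state. A toy Python sketch, using exact equality as a stand-in for real template matching (all names are illustrative):

```python
class Device:
    """Minimal sketch: a function stays disabled until authentication succeeds."""
    def __init__(self, template):
        self.template = template
        self.transfer_enabled = False   # function-disabled state

    def authenticate(self, biometric_data):
        if biometric_data == self.template:
            self.transfer_enabled = True   # transition to function-enabled state
            return "success"
        return "failure"                   # remain in the same state
```

Real template matching is probabilistic rather than an equality test; the sketch only captures the state transition.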
Referring to fig. 23H, after image transmission has been enabled on the electronic device 2300 in response to a successful biometric authentication, and while the application selection interface 2320 is displayed, the electronic device detects user activation 2305 of an application affordance (to launch the corresponding application). For example, the activated affordance is application affordance 2320A. Activation of the application affordance 2320A launches an application 2330 (e.g., an instant messaging application) corresponding to the application affordance 2320A and/or causes the electronic device to transmit images corresponding to the selectable preview images 2314A-2314C using the application 2330 (e.g., concurrently with launching the application), as shown in fig. 23I.
If the electronic device 2300 determines that the biometric authentication is not successful (e.g., the biometric data does not match the biometric template associated with the user), the electronic device does not transition between states (e.g., from a state in which a function (such as authorized image transfer) is disabled to a state in which the function is enabled) but remains in the same state. In some examples, the electronic device also indicates (to the user) that the biometric authentication was unsuccessful, e.g., by displaying a simulation of the representation of the biometric feature in the biometric authentication interface. As shown in fig. 23J, in some examples, the biometric authentication interface 2322 includes a glyph 2332 indicating that the biometric authentication was unsuccessful. The glyph 2332 indicates, for example, that the biometric feature was not recognized by the electronic device.
In fig. 23K, after image transmission has not been enabled on the electronic device 2300 in response to an unsuccessful biometric authentication, and while the application selection interface 2320 is displayed, the electronic device detects a user activation 2307 of the application affordance 2320A. In some examples, as shown in fig. 23L, in response to detecting activation of the application affordance 2320A of the application-selection user interface 2320, the electronic device 2300 displays an alternative authentication affordance 2334 (e.g., a password affordance). In some examples, upon display of the alternative authentication affordance 2334, the electronic device detects user activation 2309 of the alternative authentication affordance 2334. Activation of the alternative authentication affordance causes an alternative authentication interface 2336 (e.g., password interface, passcode interface) to be displayed, as shown in fig. 23M.
In some examples, electronic device 2300 performs biometric authentication during display of the alternative authentication interface 2336. In some examples, the electronic device obtains and processes biometric data to determine whether the obtained biometric data matches a biometric template associated with the user. As such, in some examples, the alternative authentication interface 2336 includes a simulation of a representation of a biometric feature, such as a glyph 2338 (e.g., corresponding to glyph 2324), as shown in fig. 23M. In some examples, the alternative authentication interface 2336 indicates that biometric data is being processed by the electronic device, e.g., by displaying one or more rotating rings 2340 (e.g., corresponding to the one or more rotating rings 2326), as shown in fig. 23N and described with reference to fig. 17A-17AI. In some examples, the one or more rotating rings 2340 replace the glyph 2338 within the alternative authentication interface.
If the electronic device 2300 determines that the biometric authentication is unsuccessful (e.g., the biometric data does not match a biometric template associated with the user), the electronic device remains in the state in which the function (e.g., image transmission) is disabled. In some examples, the electronic device also indicates that the biometric authentication was unsuccessful, e.g., by displaying a simulation of the representation of the biometric feature in the alternative authentication interface 2336. As shown in fig. 23O, in some examples, the alternative authentication interface 2336 includes a glyph 2342 (e.g., corresponding to glyph 2332) that indicates (to the user) that the biometric authentication is unsuccessful. In some examples, the glyph 2342 replaces the one or more rotating rings 2340 within the alternative authentication interface.
In some examples, electronic device 2300 performs password authentication during display of alternative authentication interface 2336 in addition to or instead of biometric authentication. Accordingly, the electronic device receives and processes the password data to determine whether the received password data matches a login password associated with the user. Thus, in some examples, alternative authentication interface 2336 includes an indication of a received password entry, such as password indication 2344, as shown in fig. 23P.
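The fallback path described above, where either a matching biometric sample or a matching passcode authorizes the disabled function, can be sketched as follows. The function name and the exact-match rule are illustrative assumptions:

```python
def authorize(biometric_sample, passcode_entry, template, stored_passcode):
    """Either a matching biometric sample or a matching passcode authorizes
    the disabled function (exact equality stands in for real matching)."""
    if biometric_sample is not None and biometric_sample == template:
        return True
    if passcode_entry is not None and passcode_entry == stored_passcode:
        return True
    return False
```

This mirrors the behavior shown in fig. 23M-23Q: biometric authentication runs during display of the alternative authentication interface, and password entry succeeds independently of it.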
As described above, if the electronic device 2300 determines that the biometric authentication and/or the password authentication is successful, the electronic device transitions from a state in which a function (e.g., image transmission) is disabled to a state in which the function is enabled. For example, as shown in fig. 23Q, successful biometric authentication and/or password authentication enables the electronic device to transmit (e.g., share) images, such as images corresponding to the selectable preview images 2314A-2314C, via an application 2330 (e.g., an instant messaging application).
As described above, the exemplary user interfaces shown in fig. 23A-23Q relate to the exemplary user interfaces shown in fig. 24A-24BC, described below. Accordingly, it should be appreciated that the processes described above with respect to the exemplary user interfaces shown in fig. 23A-23Q and the processes described below with respect to the exemplary user interfaces shown in fig. 24A-24BC are, for the most part, similar processes that similarly involve initiating and managing biometric authentication using an electronic device (e.g., 100, 300, 500, 2300, or 2400).
Fig. 24A-24BC illustrate exemplary user interfaces for managing biometric authentication, according to some examples. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 25A-25C.
Fig. 24A shows an electronic device 2400 (e.g., portable multifunction device 100, device 300, or device 500). In the illustrative example shown in fig. 24A-24BC, the electronic device 2400 is a smartphone. In other examples, the electronic device 2400 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 2400 has a display 2402, one or more input devices (e.g., a touch screen of the display 2402, button 2404, a microphone (not shown)), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the electronic device includes one or more biometric sensors (e.g., biometric sensor 2403), which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
In fig. 24A, the electronic device 2400 displays a tutorial user interface 2410 on the display 2402. In some examples, the tutorial user interface 2410 slides into the display from the edge of the display (e.g., slides up from the bottom edge of the display) to replace the display of the previous interface, application, and/or virtual keyboard. In some examples, the tutorial user interface 2410 slides up in response to a request to proceed with the payment transaction (e.g., with a different device, such as a transaction terminal).
In some examples, as shown in fig. 24A, the tutorial user interface 2410 includes a text indication 2410A indicating to the user that an input (e.g., two presses of the button 2404) can be performed to proceed with the payment transaction. In some examples, as also shown in fig. 24A, the tutorial user interface 2410 includes a graphical indication 2410B corresponding to the textual indication 2410A indicating (to the user) that an input (e.g., two presses of the button 2404) may be performed to proceed with the payment transaction.
In some examples, in response to detecting activation of an affordance (e.g., a "continue" affordance) on the tutorial user interface 2410, the electronic device 2400 displays, on the display 2402, a payment user interface 2412 that includes a representation of a payment account 2414 currently selected for the payment transaction and that is overlaid with a prompt 2416 instructing the user to provide one or more activations of the button 2404 (e.g., two presses of the button 2404), as shown in fig. 24B. In some examples, the prompt 2416 is highlighted (on the payment user interface 2412) relative to one or more other displayed objects. Highlighting the prompt in this manner includes, for example, darkening, blurring, and/or otherwise obfuscating one or more portions of the payment user interface 2412.
In some examples, the location of the button 2404 whose activation the prompt 2416 requests (e.g., to proceed with the payment transaction) is highlighted by a dynamic indication 2418. For example, as illustrated by the transition from fig. 24B to fig. 24C, the dynamic indication 2418 highlights the location of the button 2404 on the device by continuously changing size on the display adjacent to the location of the button 2404 (e.g., continuously alternating between becoming wider and becoming narrower, or otherwise continuously changing size), thereby allowing the user to more easily locate the button corresponding to the request of the prompt 2416.
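One way to realize such a continuously size-changing indication is a periodic triangle wave driving the indicator's width. A sketch with purely illustrative pixel values and period (none of these numbers come from the figures):

```python
def indicator_width(t, period=1.0, min_w=20, max_w=60):
    """Triangle-wave width for a dynamic indication: it grows and shrinks
    continuously so the user can locate the adjacent hardware button.
    `t` is time in seconds; widths are illustrative pixel values."""
    phase = (t % period) / period    # 0 -> 1 over each cycle
    tri = 1 - abs(2 * phase - 1)     # 0 -> 1 -> 0 triangle wave
    return min_w + (max_w - min_w) * tri
```

Sampling this function each display frame yields the alternating wider/narrower motion described above.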
In fig. 24D, upon displaying the prompt 2416, the electronic device 2400 detects an activation 2401 of the button 2404. In some examples, as shown in fig. 24D, the activation is a double press of button 2404. In some examples, the two presses of button 2404 include a first press of the button and a second press of the button that occurs within a predetermined amount of time (e.g., 1 second).
In response to the one or more activations of the button 2404, the electronic device 2400 removes the prompt 2416 (and any corresponding highlighting of the prompt) and the display of the dynamic indication 2418 overlaid on the payment user interface 2412, as shown in fig. 24E, and initiates biometric authentication (e.g., facial recognition) for a biometric feature (e.g., face) of the user, as shown in fig. 24F. In some examples, the biometric feature is at least a portion of a face (e.g., a user's face), and the biometric authentication involves facial recognition of at least a portion of the (user's) face.
As shown in fig. 24F, in some examples, a biometric authentication interface 2420 is provided when biometric authentication is initiated. In some examples, during biometric authentication, the biometric authentication interface is overlaid on the payment user interface 2412. In some examples, the biometric authentication interface includes a simulation of a representation of a biometric feature, such as a glyph 2422. Further in response to the one or more activations of button 2404, one or more biometric sensors of electronic device 2400, such as one or more cameras or facial recognition sensors (e.g., included in the one or more biometric sensors 2403), are activated. In some examples, the electronic device displays the biometric authentication interface 2420 in a center region of the display and displays the representation of the payment account 2414 at the top of the display (e.g., by replacing it or moving it up).
In some examples, once the one or more biometric sensors are activated, the electronic device 2400 obtains (e.g., captures) biometric data corresponding to a biometric feature associated with the user. In some examples, the biometric data of the biometric feature is captured using the one or more biometric sensors 2403 of the electronic device (and/or one or more cameras). Optionally, a light-emitting device such as an IR floodlight or a structured light projector is used to help illuminate the biometric feature. In other examples, the electronic device receives the biometric data from another device.
In some examples, once the electronic device 2400 has obtained the biometric data, the electronic device processes (e.g., analyzes) the biometric data to determine whether the biometric authentication was successful. In some examples, the determination includes determining whether the biometric data matches a biometric template associated with the user. The biometric template is optionally stored on the electronic device 2400.
In some examples, as shown in fig. 24G, the biometric authentication interface 2420 indicates that the biometric data is being processed by the electronic device, e.g., by displaying one or more rotating rings 2424. In some examples, the one or more rotating rings 2424 replace the glyph 2422 within the biometric authentication interface.
If the electronic device 2400 determines that the biometric authentication is successful (e.g., the biometric data matches a biometric template associated with the user), the electronic device transitions from a first state in which a function (e.g., authorization to transmit payment credentials) is disabled to a second state in which the function is enabled. In some examples, the first state is a state in which a secure element of the device is disabled from releasing secure data (e.g., payment credentials of a payment account provisioned on the device), and the second state is a state in which the secure element is enabled to release the secure data.
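The two secure-element states described above can be sketched as a gate on credential release. The class below is a conceptual illustration only; a real secure element is a separate tamper-resistant hardware component, not application code, and the names are assumptions:

```python
class SecureElement:
    """Sketch of the two states: credential release is disabled until
    biometric authorization transitions the element to the enabled state."""
    def __init__(self, credentials):
        self._credentials = credentials
        self._release_enabled = False   # first state: release disabled

    def authorize(self):
        self._release_enabled = True    # second state: release enabled

    def release_credentials(self):
        if not self._release_enabled:
            raise PermissionError("biometric authorization required")
        return self._credentials
```

Until `authorize()` is called (standing in for a successful biometric authentication), any attempt to read the payment credentials fails.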
In some examples, successful biometric authentication authorizes the electronic device to transmit account credentials related to the payment transaction. In some examples, the electronic device also indicates (to the user) that the biometric authentication was successful, e.g., by displaying a simulation of the representation of the biometric feature in the biometric authentication interface. As shown in fig. 24H, in some examples, the biometric authentication interface 2420 includes a glyph 2426 that indicates (to the user) that the biometric authentication was successful. In some examples, the glyph 2426 replaces the one or more rotating rings 2424 within the biometric authentication interface.
In fig. 24I, after the electronic device 2400 determines that the biometric authentication was successful, the electronic device indicates (to the user) that authorization has been provided to proceed with the payment transaction using the currently selected payment account (e.g., payment account 2414), and thus the payment transaction may be initiated. In some examples, the electronic device 2400 displays a textual indication 2428A and/or a graphical indication 2428B to indicate that a payment transaction may be initiated. In some examples, the textual indication 2428A and/or the graphical indication 2428B replace the biometric authentication interface 2420 on the payment user interface 2412, as shown in fig. 24I. In some examples, the graphical indication 2428B replaces the glyph 2426 within the payment user interface 2412.
In fig. 24J, upon display of the payment user interface 2412 (with the payment account 2414 selected and authorized for the payment transaction), the electronic device 2400 detects (e.g., via the wireless communication radio of the device) a second device 2430 (e.g., a transaction terminal) that is different from the electronic device. In response to detecting the second device (e.g., the transaction terminal), the electronic device 2400 transmits (e.g., via the wireless communication radio of the device) payment credentials associated with the payment account 2414 to the second device to complete the payment transaction.
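The detect-then-transmit sequence described above can be sketched as a handler that releases credentials to a detected terminal only while authorization is in effect. The dictionary layout, names, and the list standing in for the wireless transmission are assumptions for illustration:

```python
def handle_terminal_detect(device_state, terminal):
    """On detecting a transaction terminal, transmit the selected account's
    payment credentials only if authorization is currently in effect."""
    if not device_state.get("authorized"):
        return None   # no authorization: transmit nothing
    account = device_state["selected_account"]
    payload = {
        "account": account,
        "credential": device_state["credentials"][account],
    }
    terminal.append(payload)   # stand-in for the wireless transmission
    return payload
```

In the flow of fig. 24B-24K, authorization comes from the earlier successful biometric authentication; without it, bringing the device near a terminal releases nothing.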
In some examples, after successful transmission of the payment credentials to the second device 2430, the electronic device 2400 updates the textual indication 2428A (e.g., to read "payment complete") and/or the graphical indication 2428B (e.g., to a checkmark) within the payment user interface 2412 to indicate (to the user) that the payment transaction has been successfully completed, as shown in fig. 24K.
In some examples, a different payment account may be selected for the transaction before proceeding with the payment transaction with the second device (e.g., transaction terminal) using payment account 2414. In some examples, as shown in fig. 24L, the electronic device 2400 displays within the payment user interface 2412 (e.g., at a bottom area of the interface) one or more representations of payment accounts (e.g., payment accounts 2432A-2432C) that are different from the currently selected payment account 2414. In some examples, as shown in fig. 24L, the electronic device 2400 receives a user selection 2406 (e.g., a tap gesture) of a different payment account (e.g., payment account 2432A) of the one or more payment accounts different from the payment account 2414.
In some examples, if the second device is a transaction terminal at a store, to authorize an in-store payment using biometric authentication (e.g., facial recognition authentication), the user must first confirm the intent to pay by activating a hardware button (e.g., by double-clicking button 2404, a sleep/wake button). In some examples, the user then authenticates using biometric authentication (e.g., facial recognition authentication) before placing the device in proximity to the second device (e.g., the transaction terminal). In some examples, if the user wants to select a different payment method after biometric authentication (e.g., facial recognition authentication), the device prompts the user to re-authenticate with biometric authentication, but does not require the user to activate the hardware button (e.g., button 2404) again (e.g., by double-clicking the sleep/wake button).
In some examples, upon user selection 2406 of payment account 2432A, the representation of payment account 2432A is slid up within payment user interface 2412 and the representation of payment account 2414 is slid down within payment user interface 2412, as shown in fig. 24M. In some examples, the representation of payment account 2432A is slid up within payment user interface 2412 to a position previously occupied by the representation of payment account 2414 (thereby indicating to the user that payment account 2432A is now selected for the payment transaction) and the representation of payment account 2414 is slid down within payment user interface 2412 to join the one or more representations of payment accounts other than the currently selected payment account, as shown in fig. 24N. Once the currently selected payment account has been switched from payment account 2414 to payment account 2432A, the device may proceed with the payment transaction using payment account 2432A (e.g., as described with reference to fig. 24J) to complete the transaction.
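The account swap described above, where the chosen account slides into the selected slot and the previously selected account rejoins the list of alternatives, can be expressed as a small pure function (names are illustrative):

```python
def select_account(alternatives, selected, choice):
    """Swap roles: `choice` becomes the selected account, and the previously
    selected account joins the remaining alternatives."""
    if choice not in alternatives:
        raise ValueError("unknown account")
    new_alternatives = [a for a in alternatives if a != choice] + [selected]
    return choice, new_alternatives
```

This mirrors fig. 24L-24N: selecting payment account 2432A moves it into the position held by payment account 2414, which drops into the alternatives list.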
In some examples, the techniques described with reference to fig. 24B-24N may be initiated (e.g., by activation 2401 according to prompt 2416) when the electronic device 2400 is displaying an application 2434 (e.g., an instant messaging application) on the display 2402 that is different from the tutorial user interface 2410, as shown in fig. 24O. For example, fig. 24O shows electronic device 2400 displaying an application 2434 (e.g., an instant messaging application) on display 2402. While displaying application 2434, electronic device 2400 receives a user activation of button 2404 (e.g., two presses 2405). In response to receiving the user activation, the electronic device proceeds with the techniques described with reference to fig. 24B-24N to obtain biometric authentication for proceeding with the payment transaction (e.g., with a second device different from the electronic device).
In some examples, the techniques described with reference to fig. 24B-24N may be initiated when the display 2402 of the electronic device 2400 is in an off state (e.g., by activation 2401 according to prompt 2416 or by activation 2405 when displaying the application 2434), as shown in fig. 24P. While display 2402 is in the off state, electronic device 2400 receives a user activation of button 2404 (e.g., two presses 2407). In response to receiving the user activation, the electronic device proceeds with the techniques described with reference to fig. 24B-24N to obtain biometric authentication for proceeding with the payment transaction (e.g., with a second device different from the electronic device).
Fig. 24Q illustrates the electronic device 2400 with the display 2402 in an off state. While the display 2402 is in the off state, the electronic device detects (e.g., via a wireless communication radio of the device) a second device 2430 (e.g., a transaction terminal). In some examples, in response to detecting the second device 2430 while the display 2402 is in the off state, the electronic device 2400 displays a payment user interface 2412 and a prompt 2416 (e.g., as shown in fig. 24B) on the display 2402 for continuing the payment transaction.
If the electronic device 2400 determines that the biometric authentication is not successful (e.g., the biometric data does not match the biometric template associated with the user), the device does not transition from the first state to the second state, and in some examples, the electronic device remains in the first state (e.g., authorization to proceed with the payment transaction remains disabled). In some examples, the electronic device also indicates that the biometric authentication was unsuccessful, e.g., by displaying a simulation of the representation of the biometric feature in the biometric authentication interface 2420. As shown in fig. 24R, in some examples, the biometric authentication interface 2420 includes a glyph 2436 indicating that the biometric authentication was unsuccessful. The glyph 2436 indicates, for example, that the biometric feature was not recognized by the electronic device. In some examples, in addition to displaying the glyph 2436 within the biometric authentication interface 2420, the electronic device 2400 generates a tactile output 2438 (e.g., tactile feedback) that further indicates (to the user) that the biometric authentication was unsuccessful.
In some examples, the glyph 2436 is further moved (e.g., horizontally or vertically) within the area of the biometric authentication interface 2420 to further indicate (to the user) that the biometric authentication was unsuccessful. For example, as shown by the transition from fig. 24R to fig. 24S to fig. 24T, the glyph 2436 slides back and forth in the horizontal direction (e.g., repeating a continuous sliding movement from left to right) for a predetermined period of time (e.g., 3 seconds). In some examples, the device continues to generate the tactile output 2438 (e.g., tactile feedback) for the duration of the movement of the glyph 2436 within the biometric authentication interface 2420. In some examples, the tactile output 2438 is synchronized with the movement of the glyph 2436.
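A hedged sketch of the failure indication described above: the glyph's horizontal shake and the synchronized tactile output can be modeled as a per-frame timeline. All numeric values below (frame rate, shake frequency, amplitude) are invented for the example; only the 3-second example duration comes from the text.

```python
import math

def shake_timeline(duration_s=3.0, rate_hz=60, shake_hz=4, amplitude_px=12):
    """Illustrative timeline for the failure 'shake' animation.

    For each display frame, emit (horizontal glyph offset in px,
    whether a haptic tick fires). Here a tick fires whenever the
    glyph's side of center changes, keeping tactile output
    synchronized with the back-and-forth motion.
    """
    frames = []
    prev_sign = 0
    for i in range(int(duration_s * rate_hz)):
        t = i / rate_hz
        offset = amplitude_px * math.sin(2 * math.pi * shake_hz * t)
        sign = 1 if offset >= 0 else -1
        haptic = sign != prev_sign  # fire haptic at each side change
        prev_sign = sign
        frames.append((round(offset, 2), haptic))
    return frames
```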
In some examples, as shown in fig. 24U, while the glyph 2436 indicating that the biometric authentication was unsuccessful is displayed within the biometric authentication interface 2420, the electronic device 2400 displays an alternative authentication affordance 2440 within the payment user interface 2412 (e.g., below the biometric authentication interface 2420) for providing an alternative (e.g., password, passcode) authentication (e.g., in addition to or in lieu of biometric authentication) to proceed with the payment transaction. In some examples, while displaying the alternative authentication affordance 2440, the electronic device detects a user selection 2411 (e.g., a tap gesture) of the affordance, as shown in fig. 24V.
As shown in fig. 24W, in response to detecting the user selection of affordance 2440, electronic device 2400 displays an alternative authentication interface 2442 on display 2402. In some examples, electronic device 2400 performs biometric authentication during display of the alternative authentication interface 2442. In some examples, the electronic device obtains and processes biometric data to determine whether the obtained biometric data matches a biometric template associated with the user. As such, in some examples, the alternative authentication interface 2442 includes a simulation of a representation of a biometric feature, such as glyph 2444 (e.g., corresponding to glyph 2422), as shown in fig. 24W. In some examples, the alternative authentication interface 2442 indicates that biometric data is being processed by the electronic device, e.g., by displaying one or more rotating rings 2446 (e.g., corresponding to the one or more rotating rings 2424), as shown in fig. 24X. In some examples, the one or more rotating rings 2446 replace the glyph 2444 within the alternative authentication interface.
If the electronic device 2400 determines that the biometric authentication is successful (e.g., the biometric data matches a biometric template associated with the user), the device transitions from a first state in which a function (e.g., authorization to transmit payment credentials) is disabled to a second state in which the function is enabled. In some examples, successful biometric authentication enables the electronic device to transmit payment credentials (e.g., associated with payment account 2414) to, for example, a transaction terminal. In some examples, the electronic device also indicates that the biometric authentication was successful, e.g., by displaying a simulation of the representation of the biometric feature in the alternative authentication interface 2442. As shown in fig. 24Y, in some examples, the alternative authentication interface 2442 includes a glyph 2448 (e.g., corresponding to glyph 2426) indicating (to the user) that the biometric authentication was successful. In some examples, the glyph 2448 replaces the one or more rotating rings 2446 within the alternative authentication interface.
In some examples, in addition to or instead of biometric authentication, electronic device 2400 performs password authentication during display of an alternative authentication interface 2442. Accordingly, the electronic device receives and processes the password data to determine whether the received password data matches a login password associated with the user. Thus, in some examples, alternative authentication interface 2442 includes an indication of a received password entry, such as password indication 2450, as shown in fig. 24Y.
In fig. 24Z, after the electronic device 2400 determines that the biometric authentication (and/or an alternative authentication, such as password authentication) is successful, the electronic device indicates (to the user) that authorization has been provided to proceed with the payment transaction using the currently selected payment account (e.g., payment account 2414), and thus the payment transaction may be initiated (e.g., with the transaction terminal). In some examples, the electronic device 2400 displays a textual indication 2452A (e.g., corresponding to the textual indication 2428A) and/or a graphical indication 2452B (e.g., corresponding to the graphical indication 2428B) to indicate that a payment transaction may be initiated.
Fig. 24AA shows the electronic device 2400 displaying an alternative authentication interface 2442 on the display, similar to the alternative authentication interface of fig. 24W. As in fig. 24W, the electronic device 2400 performs biometric authentication during display of the alternative authentication interface 2442. Thus, the electronic device obtains and processes the biometric data to determine whether the obtained biometric data matches a biometric template associated with the user. As such, the alternative authentication interface 2442 includes a glyph 2444 that indicates to the user that biometric data has been (or is being) obtained. In fig. 24AB, as in fig. 24X, the alternative authentication interface 2442 indicates that biometric data is being processed by the electronic device, for example, by displaying one or more rotating rings 2446.
If the electronic device 2400 determines that the biometric authentication is not successful (e.g., the biometric data does not match the biometric template associated with the user), the electronic device forgoes transitioning from a first state in which a function (e.g., authorization to transmit payment credentials) is disabled to a second state in which the function is enabled (thereby not allowing the device to proceed with the payment transaction). In some examples, the electronic device 2400 also indicates (to the user) that the biometric authentication was unsuccessful, e.g., by displaying a simulation of the representation of the biometric feature in the alternative authentication interface 2442. As shown in fig. 24AC, in some examples, the alternative authentication interface 2442 includes a glyph 2454 (e.g., corresponding to glyph 2436) indicating (to the user) that the biometric authentication was unsuccessful.
Fig. 24AD illustrates electronic device 2400 displaying an alternative authentication interface 2442 on display 2402 after an unsuccessful biometric authentication. The electronic device displays an indication (e.g., via the glyph 2454) that the biometric authentication was unsuccessful. In some examples, as shown in fig. 24AE, following an unsuccessful biometric authentication, electronic device 2400 receives one or more activations of button 2404 (e.g., two presses of button 2404) while displaying the alternative authentication interface 2442 to restart the biometric authentication process (e.g., as described with reference to fig. 24B-24J). Thus, when the biometric authentication and/or alternative authentication process fails, the user can reattempt the process via one or more activations of button 2404. In some examples, if the biometric authentication process fails for a predetermined number of consecutive attempts (or fails for a predetermined number of attempts within a predetermined time period), one or more activations of the button 2404 (e.g., two presses of the button 2404) can no longer restart the biometric authentication process (e.g., as described with reference to fig. 24B-24J).
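The retry limit described above can be sketched as a simple counter-based policy. The specific limit is not given in the text, so the value used below is an assumption for illustration only; the class and method names are likewise invented.

```python
class BiometricRetryPolicy:
    """Illustrative lockout policy: after `max_failures` consecutive
    unsuccessful biometric attempts, double-clicking the button can no
    longer restart biometric authentication (an alternative, such as a
    passcode, would then be required). The default limit of 5 is an
    assumption, not a value from the document; a time-windowed variant
    would additionally timestamp failures and count only those inside
    a sliding window.
    """

    def __init__(self, max_failures=5):
        self.max_failures = max_failures
        self.consecutive_failures = 0

    def record_attempt(self, success: bool):
        if success:
            self.consecutive_failures = 0  # a success resets the count
        else:
            self.consecutive_failures += 1

    def restart_allowed(self) -> bool:
        # Button activations restart biometric auth only below the limit.
        return self.consecutive_failures < self.max_failures
```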
In fig. 24AF, the electronic device 2400 detects (e.g., via a wireless communication radio of the device) a second device 2430 (e.g., a transaction terminal) while displaying the alternative authentication interface 2442. In some examples, in response to detecting the second device 2430 while the electronic device is displaying the alternative authentication interface, the electronic device 2400 displays a payment user interface and a prompt (e.g., corresponding to prompt 2416 as shown in fig. 24B) for proceeding with the payment transaction.
Fig. 24AG illustrates that the electronic device 2400 displays a payment user interface 2412 on the display 2402 and is authorized (e.g., after successful biometric and/or alternative (such as password) authentication has been provided) to initiate a transaction using a currently selected payment account (e.g., payment account 2414). In some examples, the payment user interface 2412 includes a menu tab 2456 (e.g., at a bottom area of the interface, adjacent to a bottom edge of the display), as shown in fig. 24AG. In some examples, the electronic device detects a swipe gesture 2415 (e.g., in an upward direction) on the menu tab 2456. For example, the swipe gesture 2415 corresponds to a touch on the menu tab 2456 followed by a swipe in an upward direction.
In some examples, a swipe gesture 2415 on the menu tab 2456 (e.g., over the payment user interface 2412) expands the menu tab, as shown in fig. 24AH. Once expanded, menu tab 2456 includes one or more application affordances (e.g., application affordances 2456A-2456D) corresponding to applications installed on the device and accessible from the menu tab. For example, menu tab 2456 includes a first application affordance 2456A corresponding to an instant messaging application, a second application affordance 2456B corresponding to a voice call application, a third application affordance 2456C corresponding to an email application, and a fourth application affordance 2456D corresponding to a browsing application. In some examples, only first-party applications (controlled by the operating system of the device) may be included within the menu tab 2456.
Fig. 24AI shows the electronic device 2400 detecting a swipe gesture 2415 in a downward direction on the display (thereby collapsing the expanded menu tab). As a result of the downward swipe gesture 2415 on the menu tab 2456, the menu tab retracts (or folds back) to its original size and position (e.g., at the bottom of the payment user interface 2412), as shown in fig. 24AJ. Once the menu tab has been fully collapsed, the payment user interface is again fully visible on the display.
Fig. 24AK illustrates the electronic device 2400 displaying a web page 2458 of a browsing application on the display 2402. For example, web page 2458 is a checkout page for an item 2460 that the user wishes to purchase and includes a purchase affordance 2462 for proceeding to purchase the item. In some examples, as shown in fig. 24AK, the electronic device detects a user activation 2417 of the purchase affordance 2462.
In some examples, upon detecting user activation of the purchase affordance 2462, the electronic device 2400 displays a payment listing interface 2464 on the display 2402, as shown in fig. 24AL. In some examples, the payment listing interface 2464 (partially) overlays the browsing application and includes a biometric authentication interface 2420. In some examples, as also shown in fig. 24AL, in addition to the payment listing interface, the device displays (to the user) a prompt 2466 (e.g., corresponding to prompt 2416) indicating that one or more activations (e.g., two presses of button 2404) are to be provided to proceed with providing authorization for the purchase.
In some examples, the payment listing interface 2464 includes one or more details (e.g., payment account, shipping method, billing address, shipping address, contact information) related to the proposed transaction, as shown in fig. 24AL. In some examples, the one or more details include the selected payment account. In some examples, the user may change the selected payment account to a different payment account by selecting the details area 2464A (or an area or icon within it) corresponding to the selected payment account. In some examples, the one or more details include a selected shipping method. In some examples, the user may change the selected shipping method to a different shipping method by selecting the details area 2464B (or an area or icon within it) corresponding to the selected shipping method. In some examples, the one or more details include a selected address (e.g., billing address, shipping address). In some examples, the user may change the selected address to a different address by selecting the details area 2464C (or an area or icon within it) corresponding to the selected address. In some examples, the one or more details include the selected contact information (e.g., email, phone number). In some examples, the user may change the selected contact information to different contact information by selecting the details area 2464D (or an area or icon within it) corresponding to the selected contact information.
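One way to model the editable detail areas of the payment listing interface is as a record whose fields can be swapped before authorization. The class and field names below are invented for the example; they simply mirror the four detail areas (2464A-2464D) described above.

```python
from dataclasses import dataclass

@dataclass
class PaymentListingDetails:
    """Illustrative model of the editable detail areas on the payment
    listing interface. Selecting a detail area replaces the
    corresponding value before the user authorizes the transaction."""
    payment_account: str   # detail area 2464A
    shipping_method: str   # detail area 2464B
    address: str           # detail area 2464C
    contact: str           # detail area 2464D

    def select_detail(self, field: str, new_value: str):
        # Each detail area lets the user swap in a different value.
        if field not in self.__dataclass_fields__:
            raise ValueError(f"unknown detail area: {field}")
        setattr(self, field, new_value)
```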
In fig. 24AM, subsequent to displaying the payment listing interface 2464 and the prompt 2466 over the web page 2458 of the browsing application, the electronic device 2400 detects an input (e.g., two presses of the button 2404) corresponding to the request of the prompt 2466. In some examples, upon receiving the input corresponding to the request of the prompt 2466 (e.g., two presses of the button 2404), a glyph 2468 (e.g., corresponding to glyph 2422) is provided within the biometric authentication interface 2420 displayed within the payment listing interface 2464. Further in response to the input, one or more biometric sensors of the electronic device 2400, such as the one or more cameras or facial recognition sensors (e.g., included in the one or more biometric sensors 2403), are activated.
In some examples, once the one or more biometric sensors are activated, the electronic device 2400 obtains (e.g., captures) biometric data corresponding to a biometric feature associated with the user. In some examples, the biometric data is captured using the one or more biometric sensors 2403 of the electronic device (and/or one or more cameras). Optionally, a light-emitting device such as an IR floodlight or a structured light projector is used to help illuminate the biometric feature. In other examples, the electronic device receives the biometric data from another device.
In some examples, once the electronic device 2400 has obtained the biometric data, the electronic device processes (e.g., analyzes) the biometric data to determine whether the biometric authentication was successful. In some examples, the determination includes determining whether the biometric data matches a biometric template associated with the user. The biometric template is optionally stored on the electronic device 2400.
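The obtain-then-match step can be sketched as follows. Real facial recognition compares a mathematical representation derived from the sensor data against the enrolled, on-device template; the cosine-similarity comparison and the threshold below are stand-ins invented purely for illustration.

```python
def biometric_match(captured, template, threshold=0.8):
    """Illustrative matcher: both arguments are assumed to be
    unit-length feature vectors (stand-ins for the representations a
    real system derives from sensor data and from the enrolled
    template). The match score is their dot product (cosine
    similarity); the 0.8 threshold is an invented example value."""
    score = sum(a * b for a, b in zip(captured, template))
    return score >= threshold
```

On a result of `True` the device would transition to the second (enabled) state; on `False` it would remain in the first state, as described above.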
In some examples, as shown in fig. 24AO, the biometric authentication interface 2420 within the payment listing interface 2464 indicates that biometric data is being processed by the electronic device, e.g., by displaying one or more rotating rings 2470 (e.g., corresponding to the one or more rotating rings 2424) within the biometric authentication interface. In some examples, the one or more rotating rings 2470 replace the glyph 2468 within the biometric authentication interface.
If the electronic device 2400 determines that the biometric authentication is successful (e.g., the biometric data matches a biometric template associated with the user), the electronic device transitions from a first state in which a function (e.g., authorizing transmission of payment credentials for a payment transaction) is disabled to a second state in which the function is enabled. As such, if the biometric authentication is successful, the device is in a state in which payment credentials (e.g., associated with the payment account 2472) are authorized to be transmitted (e.g., to the transaction terminal, to an external server) for use in a payment transaction (e.g., to purchase the item 2460). In some examples, the electronic device also indicates that the biometric authentication was successful, for example, by displaying (to the user) a glyph 2474 (e.g., corresponding to glyph 2426) indicating that the biometric authentication was successful, as shown in fig. 24AP. In some examples, the glyph 2474 replaces the one or more rotating rings 2470 within the biometric authentication interface.
In some examples, in response to determining that the biometric authentication was successful, electronic device 2400 processes the payment transaction (e.g., transmits the payment credentials to an external device, such as an external server, and receives a response from the external device indicating that the credentials were successfully received). In some examples, as shown in fig. 24AQ, the electronic device 2400 also displays (to the user) a processing indication 2476 within the payment listing interface 2464 indicating that the payment transaction is being processed (e.g., having a pattern similar or identical to the one or more rings 2470). In some examples, upon completion of processing of the transaction, electronic device 2400 replaces the processing indication 2476 with a completed indication 2467 (e.g., which includes a check mark indicating completion), as shown in fig. 24AR, indicating (to the user) that the payment transaction (and the purchase of item 2460) has been successfully completed.
In some examples, to make a payment within an application or on the web (e.g., web page 2458), the electronic device requires the user to confirm the intent to pay by activating a hardware button (e.g., button 2404) (e.g., by double-clicking the sleep/wake button), and then to authenticate using biometric authentication (e.g., facial recognition authentication) to authorize the payment. In some examples, if the payment transaction is not completed within a predetermined time threshold (e.g., 30 seconds) of activating the hardware button (e.g., button 2404) (e.g., double-clicking the sleep/wake button), the device requires the user to confirm the intent to pay by again activating the hardware button (e.g., button 2404) (e.g., double-clicking the sleep/wake button).
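The confirmation-validity window described above can be sketched as a timestamp check. The class and method names are invented for the example; only the 30-second example threshold comes from the text.

```python
import time

class PaymentIntentConfirmation:
    """Illustrative validity window for the intent-to-pay
    confirmation: if the transaction is not completed within
    `timeout_s` of the button double-click, the confirmation expires
    and must be provided again. The clock is injectable for testing."""

    def __init__(self, timeout_s=30.0, clock=time.monotonic):
        self.timeout_s = timeout_s
        self.clock = clock
        self.confirmed_at = None

    def double_click(self):
        # Record the moment intent to pay was confirmed.
        self.confirmed_at = self.clock()

    def confirmation_valid(self) -> bool:
        return (self.confirmed_at is not None
                and self.clock() - self.confirmed_at <= self.timeout_s)
```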
Fig. 24AS shows electronic device 2480 (e.g., a laptop computer) displaying a web page 2484 (e.g., similar to web page 2458) of a browsing application on display 2482. For example, web page 2484 is a checkout page for an item 2486 that the user wishes to purchase. In some examples, the web page 2484 of the browsing application includes a purchase affordance 2488 for proceeding with the purchase of the item using a device different from electronic device 2480 (e.g., using the electronic device 2400).
In some examples, user activation of the purchase affordance 2488 on the electronic device 2480 causes the electronic device 2400 to display a remote payment user interface 2490, as shown in fig. 24AT. In some examples, the remote payment user interface 2490 includes a (graphical) indication 2492 of the device (e.g., electronic device 2480) that is requesting remote authentication for proceeding with a payment transaction, an indication 2494 of a payment account currently selected for the payment transaction, and a prompt 2496 (e.g., corresponding to prompt 2416) requesting that the user provide one or more activations (e.g., two presses) of the button 2404 to proceed with providing authentication (e.g., biometric authentication) for the payment transaction. In some examples, as also shown in fig. 24AT, the prompt 2496 is highlighted relative to one or more other displayed objects (e.g., relative to the indication 2492 of the electronic device 2480). Highlighting the prompt in this manner includes, for example, darkening, blurring, and/or otherwise obscuring one or more portions of the remote payment user interface 2490.
In some examples, indication 2494 of the payment account currently selected for the payment transaction includes affordance 2494A. In some examples, as shown in fig. 24AU, the electronic device 2400 detects a user activation 2421 (e.g., a tap gesture) of the affordance 2494A. In some examples, in response to detecting user selection of the affordance 2494A, the electronic device 2400 displays, within the remote payment user interface 2490, representations of one or more payment accounts (e.g., payment account 2494, payment account 2498) provided on the device (and thus available for payment transactions), as shown in fig. 24 AV. In some examples, as also shown in fig. 24AV, the representation of the currently selected payment account also includes an indication 2494B (e.g., a check mark) indicating that it corresponds to the currently selected account.
In fig. 24AW, when a representation of a payment account (e.g., payment account 2494, payment account 2498) provided on the electronic device is displayed within the remote payment user interface 2490, the electronic device 2400 detects a user selection 2423 of a payment account (e.g., payment account 2498) that is different from the currently selected payment account (e.g., payment account 2494). In response to detecting user selection 2423 of payment account 2498, electronic device 2400 removes the representation of available payment accounts from remote payment user interface 2490 and displays payment account 2498 (instead of payment account 2494) as the payment account currently selected for the payment transaction, as shown in fig. 24 AX.
As also shown in fig. 24AX, after replacing payment account 2494 with payment account 2498, electronic device 2400 detects one or more activations 2425 (e.g., two presses) of button 2404 corresponding to the request of the prompt 2496. In fig. 24AY, in response to detecting the one or more activations 2425 of the button 2404, the electronic device 2400 displays a biometric authentication interface 2420 within the remote payment user interface 2490. In some examples, the electronic device also displays a glyph 2499 (e.g., corresponding to glyph 2422) within the biometric authentication interface 2420 displayed within the remote payment user interface 2490. Further in response to the input, one or more biometric sensors of the electronic device 2400, such as the one or more cameras or facial recognition sensors (e.g., included in the one or more biometric sensors 2403), are activated.
In some examples, once the one or more biometric sensors are activated, the electronic device 2400 obtains (e.g., captures) biometric data corresponding to a biometric feature associated with the user. In some examples, the biometric data is captured using the one or more biometric sensors 2403 of the electronic device (and/or one or more cameras). Optionally, a light-emitting device such as an IR floodlight or a structured light projector is used to help illuminate the biometric feature. In other examples, the electronic device receives the biometric data from another device.
In some examples, once the electronic device 2400 has obtained the biometric data, the electronic device processes (e.g., analyzes) the biometric data to determine whether the biometric authentication was successful. In some examples, the determination includes determining whether the biometric data matches a biometric template associated with the user. The biometric template is optionally stored on the electronic device 2400.
In some examples, as shown in fig. 24AZ, the biometric authentication interface 2420 within the remote payment user interface 2490 indicates that biometric data is being processed by the electronic device, e.g., by displaying one or more rotating rings 2497 (e.g., corresponding to the one or more rotating rings 2424) within the biometric authentication interface. In some examples, the one or more rotating rings 2497 replace the glyph 2499 within the biometric authentication interface.
If the electronic device 2400 determines that the biometric authentication is successful (e.g., the biometric data matches a biometric template associated with the user), the electronic device transitions from a first state in which a function (e.g., authorizing transmission of payment credentials for a payment transaction) is disabled to a second state in which the function is enabled. As such, if the biometric authentication is successful, the device is in a state in which payment credentials (e.g., associated with the payment account 2498) are authorized to be transmitted (e.g., to the transaction terminal, to an external server) for use in a payment transaction (e.g., to purchase the item 2486). In some examples, the electronic device also indicates that the biometric authentication was successful, for example, by displaying (to the user) a glyph 2495 (e.g., corresponding to glyph 2426) indicating that the biometric authentication was successful, as shown in fig. 24BA. In some examples, the glyph 2495 replaces the one or more rotating rings 2497 within the biometric authentication interface 2420 of the remote payment user interface 2490.
In some examples, in response to determining that the biometric authentication was successful, electronic device 2400 processes the payment transaction (e.g., transmits the payment credentials to an external device, such as an external server, and receives a response from the external device indicating that the credentials were successfully received). In some examples, as shown in fig. 24BB, the electronic device 2400 also displays (to the user) a processing indication 2493 within the remote payment user interface 2490 indicating that the payment transaction is being processed (e.g., having a pattern similar or identical to the one or more rings 2497). In some examples, upon completion of processing of the transaction, electronic device 2400 replaces the processing indication 2493 with a completed indication 2491 (e.g., which includes a check mark indicating completion), as shown in fig. 24BC, indicating (to the user) that the payment transaction (and the purchase of item 2486) has been successfully completed.
Fig. 25A-25C are flowcharts illustrating methods for performing biometric authentication using an electronic device, according to some examples. The method 2500 is performed at a device (e.g., 100, 300, 500, 1900) having a display, one or more input devices (e.g., a touchscreen, a microphone, a camera), and a wireless communication radio (e.g., a Bluetooth connection, a WiFi connection, a mobile broadband connection such as a 4G LTE connection). In some examples, the display is a touch-sensitive display. In some examples, the display is not a touch-sensitive display. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the device includes one or more biometric sensors, which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the device further comprises a light-emitting device, such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors. Some operations in method 2500 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 2500 provides an intuitive way to manage biometric authentication. The method reduces the cognitive burden on the user in managing the biometric authentication, thereby creating a more efficient human-machine interface. For battery-driven computing devices, enabling users to manage biometric authentication more quickly and efficiently conserves power and increases the interval between battery charges.
In some examples, an electronic device (e.g., 2300, 2400) receives (2502) a request from a second device to proceed with an action, wherein the request includes information associated with one or more options selected at the second device. In some examples, the electronic device receives a request to proceed with a transaction and also receives information from the second device regarding details of the transaction. In some examples, the action relates to a transaction. In some examples, prior to receiving user input corresponding to a request to participate in a transaction, the electronic device receives input corresponding to the transaction from the second device (where the input includes one or more details of the transaction), and displays one or more transaction details associated with the transaction and a request for authorization to proceed with the transaction.
The electronic device (e.g., 2300, 2400) detects (2508) one or more activations of a button (e.g., 2304, 2404) (e.g., two presses of a hardware or mechanical button, such as two presses of the button 2304 or the button 2404) while the electronic device is in a first state in which respective functions of the device are disabled. In some examples, the respective function is associated with a financial transaction (such as payment for goods or services). In some examples, while the function is disabled, the device cannot engage in a transaction.
In some examples, the respective function of the electronic device (e.g., 2300, 2400) is to participate in a transaction (2510). In some examples, participating in the transaction includes transmitting the security data from the electronic device. In some examples, the secure data is financial data. In some examples, the transaction additionally or alternatively includes transmission of unsecure data. In some examples, information that enables the device to participate in the transaction is securely stored in a secure element (e.g., a physically and/or logically separate memory that stores credentials in a manner that prevents them from being maliciously accessed). In some examples, when in the first state, the electronic device is disabled from engaging in a transaction (e.g., a financial transaction, such as a payment for a good or service). In some examples, when the device is in the first state, information that enables the device to participate in the transaction is not accessible at the device outside of the secure element (e.g., the payment credentials are not available for the wireless payment transaction).
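The state gating described above can be sketched as follows. This is an illustrative model only: the class and credential names are hypothetical, and an actual secure element is physically and/or logically separate hardware rather than an in-process object.

```python
from enum import Enum, auto

class DeviceState(Enum):
    FIRST = auto()   # respective function disabled; credentials sealed
    SECOND = auto()  # respective function enabled; credentials available

class SecureElement:
    """Hypothetical stand-in for the separate credential store."""
    def __init__(self, credential):
        self._credential = credential  # never released in the first state

    def read_credential(self, state):
        # Payment credentials leave the secure element only while the
        # device is in the second (enabled) state.
        return self._credential if state is DeviceState.SECOND else None

se = SecureElement("tokenized-card-credential")
assert se.read_credential(DeviceState.FIRST) is None
assert se.read_credential(DeviceState.SECOND) == "tokenized-card-credential"
```

In this sketch, the credential is simply withheld while the device remains in the first state; the disclosure describes the same effect as the credential not being accessible outside the secure element.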
In some examples, the respective function of the electronic device (e.g., 2300, 2400) is a function (2512) of providing information associated with the action via a short-range communication radio of the electronic device. In some examples, the electronic device is in a first state when in an inactive state in which a display (e.g., 2302, 2402), biometric sensor (e.g., 2303, 2403), and/or one or more other components of the device are inactive. In some examples, the electronic device is not configured to engage in a transaction when in the first state because the respective function of the device is a function of providing information (e.g., payment information) associated with an action (e.g., a transaction) via a short-range communication radio (e.g., an NFC transmitter) of the device. For example, the device does not respond to the request for payment information with payment information until the user provides authorization to provide the payment information, such as receiving a double click of a hardware button (e.g., 2304, 2404) in conjunction with biometric or passcode/password authentication.
In some examples, the one or more activations of the button (e.g., 2304, 2404) occur at least in part when a display of the electronic device (e.g., 2300, 2400) is off or displaying a user interface that is not related to payment (e.g., a lock screen user interface, a front cover user interface that includes a plurality of recent notifications, a home screen user interface that includes application icons for a plurality of different applications and optionally one or more desktop widgets, or an application user interface for an application that is not a payment application, such as an email application, a phone application, a messaging application, or a camera application) (2514).
In some examples, when a display of an electronic device (e.g., 2300, 2400) is off or displaying a user interface unrelated to payment, the electronic device is not configured to transmit payment information to a payment terminal (e.g., 2430) in response to a request for payment information (e.g., to protect the payment information from being inadvertently provided in situations in which the user does not intend to provide the payment information). In some examples, in at least some instances, when a display of the device is off or displaying a user interface unrelated to payment, the electronic device listens for requests for payment information and responds to at least some requests for payment information by displaying a payment user interface (e.g., a virtual wallet) that informs the user that payment information has been requested and prompts the user to provide authorization for providing payment information.
In some examples, the one or more activations of the (hardware) button include two presses of the button (2516) (e.g., a first press and a second press of the button within a predetermined time period). In some examples, the hardware button (e.g., 2304, 2404) is located on one side of the electronic device (e.g., 2300, 2400). In some examples, the hardware button is a mechanical button. In some examples, a single activation of the button, without a second activation within the predetermined time period, performs a different function (e.g., turning the display of the device on or off). In some examples, different activations of the button (e.g., long presses of the button for different periods of time) cause different functions to occur (e.g., entering a user interface for turning off the device or invoking a virtual assistant).
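A minimal sketch of detecting "two presses within a predetermined time period" from a stream of press timestamps; the 0.5-second window is an assumed value, not one given in the disclosure.

```python
DOUBLE_PRESS_WINDOW = 0.5  # assumed predetermined period, in seconds

def is_double_press(press_times, window=DOUBLE_PRESS_WINDOW):
    """Return True if any two consecutive presses fall within the window."""
    return any(later - earlier <= window
               for earlier, later in zip(press_times, press_times[1:]))

assert is_double_press([10.0, 10.3])       # second press 0.3 s after the first
assert not is_double_press([10.0, 11.0])   # too far apart: two single presses
assert not is_double_press([10.0])         # a single press alone
```

A press that is never followed by a second press within the window would instead be routed to the button's other function (e.g., toggling the display), as described above.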
In some examples, the one or more activations of the (hardware) button (e.g., 2304, 2404) are detected while the first application is active on the electronic device (2518). In some examples, detecting the one or more activations of the button occurs while a tutorial interface (e.g., 2410) is displayed (2520). In some examples, the electronic device (e.g., 2300, 2400) does not display the tutorial interface, but displays the biometric authentication interface (e.g., 2322, 2420) and performs the biometric authentication. Performing biometric authentication (e.g., rather than different types of authentication, such as password authentication) allows a user to more quickly and easily (e.g., without input and within a shorter period of time) provide authentication for performing a particular operation (e.g., a transaction) using a device. Reducing the number of inputs required to perform an operation enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable inputs and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the button (e.g., 2304, 2404) has a fixed position relative to a display of the electronic device (e.g., 2300, 2400) (e.g., the button is not entirely a software button). In some examples, the button is a hardware button, such as a mechanical button or a solid state button. In some examples, the button is a switch or any other type of toggle element. In some examples, the buttons have fixed positions relative to the electronic device and more particularly relative to a display of the electronic device such that the electronic device can display the prompt based on the position of the button.
In some examples, the button (e.g., 2304, 2404) is a mechanical button (e.g., a hardware button, such as a push button). In some examples, the button is not a software button, such as a button on a touch screen of an electronic device (e.g., 2300, 2400). In some examples, the button is a solid state button. In some examples, the button is a solid state button that operates according to capacitive and/or resistive touch and/or that responds to changes in intensity of an input; rather than having a mechanical switch that is pressed to activate the button, the device monitors whether the intensity of the input is above an intensity threshold corresponding to activation of the solid state button.
In some examples, prior to detecting (2508) the one or more activations of a button (e.g., 2304, 2404), the electronic device (e.g., 2300, 2400) outputs (2504) (e.g., by displaying on a display) a prompt (e.g., 2318, 2416) requesting that the one or more activations of the button be provided. In some examples, the electronic device prompts the user by displaying a prompt such as "double click to pay." In some examples, the prompt is displayed on a display (e.g., 2302, 2402) of the electronic device. In some examples, the prompt is displayed adjacent to the button. In some examples, the prompt is an audible and/or tactile prompt. In some examples, the prompt is displayed when the device is displaying the transaction user interface but does not receive any indication that the transaction terminal is nearby and requesting the transaction credentials (e.g., the prompt to provide the one or more activations of the button is displayed before the device has been placed in the NFC field of the NFC reader that is requesting payment information). In some examples, prior to outputting the prompt, the electronic device displays a tutorial interface that includes the affordance.
In some examples, outputting the prompt (e.g., 2318, 2416) occurs in response to selection of the affordance. In some examples, the tutorial interface (e.g., 2410) is displayed when the user first attempts to implement the respective function without providing the one or more activations of the button. In some examples, the tutorial interface includes an animation at a location based on the location of the button (e.g., 2304, 2404) on the device (e.g., the animation includes movement of the user interface object in a direction that the button may be pushed on the device at a location adjacent or proximate to the button).
In some examples, outputting the prompt (e.g., 2318, 2416) to the user for providing the one or more activations of the button includes highlighting the prompt (2506) relative to one or more elements displayed on a display of the electronic device (e.g., 2300, 2400). In some examples, highlighting the prompt includes blurring, dimming, and/or ceasing to display at least a portion of the display of the electronic device. In some examples, highlighting the prompt includes brightening the prompt, flashing the prompt, or otherwise drawing attention to the prompt. In some examples, highlighting the prompt relative to the one or more elements displayed on the display of the electronic device includes blurring the one or more elements. In some examples, all elements (except the prompt) displayed on the display (e.g., 2302, 2402) are blurred. In some examples, only elements adjacent to the prompt are blurred. In some examples, highlighting the prompt relative to the one or more elements displayed on the display of the electronic device includes dimming the display of the one or more elements. In some examples, all elements displayed on the display (except the prompt) are dimmed. In some examples, only elements adjacent to the prompt are dimmed. Dimming in this manner optionally includes reducing the brightness and/or darkening the displayed colors.
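Dimming by "reducing the brightness and/or darkening the displayed colors" can be illustrated with simple per-channel scaling; the scale factor of 0.4 is an arbitrary assumption, not a value from the disclosure.

```python
def dim(rgb, factor=0.4):
    """Darken an (R, G, B) color by scaling each channel toward black."""
    return tuple(round(channel * factor) for channel in rgb)

assert dim((255, 255, 255)) == (102, 102, 102)  # white dims to gray
assert dim((0, 0, 0)) == (0, 0, 0)              # black is unchanged
assert dim((200, 100, 50)) == (80, 40, 20)
```

Applying such a transform to every element except the prompt would leave the prompt visually emphasized, matching the highlighting behavior described above.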
In some examples, outputting the prompt (e.g., 2318, 2416) occurs in response to detecting an external signal of a predetermined type. In some examples, an electronic device (e.g., 2300, 2400) detects a signal, such as an NFC field from an NFC reader, such as a payment terminal (e.g., 2430), and prompts a user to provide input to initiate a biometric authentication process to authorize the device to make payment credentials available for transmission to the NFC reader.
In some examples, outputting the prompt (e.g., 2318, 2416) includes displaying a tutorial interface including a prompt element adjacent to the button. In some examples, the electronic device (e.g., 2300, 2400) prompts the user to provide the one or more activations of the button (e.g., 2304, 2404) through an interface in which the prompt is displayed near, and/or indicates the location of, the button. In some examples, the tutorial interface is displayed in an instance in which the user has attempted to implement the respective function of the electronic device but has not provided the one or more activations required to initiate biometric authentication to enable the electronic device to implement the respective function.
In some examples, an electronic device (e.g., 2300, 2400) displays, on a display, a prompt to provide the one or more activations of a button (e.g., 2304, 2404) at a first location in a biometric authentication interface (e.g., 2322, 2420). Outputting a prompt requesting the one or more activations of the button provides feedback to the user about the current state of the device and provides visual feedback indicating which steps the user must take in order to use the device to proceed with a particular function. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, an electronic device (e.g., 2300, 2400) detects activation (e.g., selection) of an affordance of a first application (e.g., activation of an affordance of an application requesting a transaction for a good or service). In some examples, the first application is a communication application. In some examples, the device displays details of the transaction and one or more of an affordance for initiating the transaction and a prompt for triggering biometric authentication. In some examples, the details of the transaction are optionally modified before activating the affordance or before detecting the biometric feature (after the biometric authentication has been triggered by user input (e.g., two presses)).
In some examples, in response to detecting (2522) activation of the affordance for the first application, the electronic device (e.g., 2300, 2400) provides (e.g., transmits) information associated with an action from the first application to the second application. In some examples, the action from the first application to the second application involves a transaction. In some examples, prior to receiving user input corresponding to a request to engage in a transaction, the electronic device detects activation of an affordance of a first application, provides information about the transaction from the first application to a second application in response to detecting activation of the affordance of the first application, and continues the transaction using the second application.
In some examples, the electronic device also simultaneously displays on the display (e.g., 2302, 2402) at least a portion of the information associated with the action at the first location (e.g., in a payment user interface area that is separate from the first application and includes transaction information not shared with the first application, such as a credit card number, a billing address) and a second prompt at the second location to provide the one or more activations of the button, wherein the second location is closer to the button than the first location.
In some examples, prior to receiving activation of a button (e.g., 2304, 2404), details of the transaction are limited to a particular portion of the display such that a prompt for providing activation of the button may be displayed adjacent to the button. Limiting the details of the transaction to a particular portion of the display so that a prompt can be displayed adjacent to the relevant button provides the user with visual feedback that allows the user to more quickly and easily follow the request of the prompt. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the details are constrained to a particular height of the display. In some examples, if there is more information than can be displayed in the available area under the second prompt, information associated with the action is displayed in a scrollable area that scrolls in response to user input to display additional information that was hidden (e.g., off-screen) prior to the scrollable area scrolling. In some examples, providing information about a transaction from a first application to a second application includes providing information that may be displayed in the available area and additional information that is hidden.
In some examples, after displaying the prompt (e.g., 2318, 2416), the electronic device (e.g., 2300, 2400) moves the representation of the user credentials from the second location on the display (e.g., 2302, 2402) to the first location on the display. In some examples, the representation of the user credential is moved such that the user credential covers a prompt for pressing (e.g., double-clicking) a button (e.g., 2304, 2404) and/or exposes a biometric authentication flag symbol (e.g., 2324, 2422). In some examples, moving the representation of the user credential from the second location to the first location includes displaying a biometric authentication flag symbol at a portion of the display occupied by the user credential when the user credential is displayed at the second location.
In response to detecting (2522) the one or more activations of the button, the electronic device (e.g., 2300, 2400) captures (2524) biometric data with the one or more biometric sensors (e.g., 2303, 2403) separate from the button (e.g., 2304, 2404). In some examples, the device receives biometric data, such as data for a user's face, in response to two presses of a hardware button. In some examples, the one or more biometric sensors include a facial recognition sensor, and the biometric data corresponds to at least a portion of a face.
In some examples, capturing biometric data includes capturing biometric data using a camera. In some examples, biometric data is captured using a camera and/or facial recognition sensor (e.g., 2303, 2403). In some examples, a camera is used to ensure that the user is looking at the device and a facial recognition sensor is used to authenticate the user's face.
In some examples, capturing biometric data with the one or more biometric sensors includes activating the one or more biometric sensors (e.g., 2303, 2403) for a second predetermined period of time. For example, in response to pressing a button (e.g., 2304, 2404), the electronic device (e.g., 2300, 2400) activates one or more biometric sensors (e.g., 2303, 2403) (e.g., transitions the biometric sensors from an inactive state to an active state), such as a facial recognition sensor or a camera, and captures biometric data using the activated one or more biometric sensors. In some examples, the activated one or more biometric sensors are activated for a certain period of time, and if biometric data is not captured during the period of time, the biometric authentication process fails. In some examples, the second predetermined time period begins when the one or more activations of the button are detected. In some examples, the time period is initiated when a button is pressed. In some examples, capturing biometric data in this manner includes illuminating the biometric feature and capturing data corresponding to the illumination. In some examples, the biometric feature is illuminated using an IR floodlight or a structured light projector.
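The activation-window behavior (the sensor is active for a second predetermined period, and the attempt fails if no biometric data is captured in time) can be sketched with an injected clock. The function, the stub sensor, and the window value are illustrative assumptions, not details from the disclosure.

```python
def capture_with_timeout(read_sensor, window, clock):
    """Poll a (stubbed) biometric sensor until data arrives or the window lapses."""
    start = clock()
    while clock() - start < window:
        data = read_sensor()
        if data is not None:
            return data          # biometric data captured in time
    return None                  # sensor deactivates; this attempt fails

# Fake clock and sensor: data arrives on the third poll, within the window.
ticks = iter(range(100))
reads = iter([None, None, "face-depth-data"])
assert capture_with_timeout(lambda: next(reads), 10, lambda: next(ticks)) == "face-depth-data"

# With a shorter window the data never arrives and the attempt fails.
ticks = iter(range(100))
reads = iter([None] * 100)
assert capture_with_timeout(lambda: next(reads), 3, lambda: next(ticks)) is None
```

Starting the clock at the button press corresponds to the example in which the second predetermined period begins when the one or more activations of the button are detected.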
In some examples, in response to detecting (2522) the one or more activations of the button, the electronic device (e.g., 2300, 2400) also displays (2526) a biometric authentication interface (e.g., 2322, 2420) that includes a representation of user credentials that are restricted from use without a correct biometric authentication. In some examples, the biometric authentication interface includes an authentication token (e.g., 2324, 2422) and/or one or more representations of user credentials (e.g., images representing data to be used by a function of the electronic device, such as an image of a credit card, an image of a bank account, an image of a business card). Providing an authentication token provides the user with easily visible and recognizable visual feedback regarding the current state or progress of the authentication process. Providing the user with improved visual feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide appropriate input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the function uses the represented data to perform a transaction. In some examples, biometric authentication is performed for a particular credential and the credential is highlighted relative to other credentials. In some examples, the biometric authentication interface (e.g., 2322, 2420) includes an animation (e.g., a card carousel).
In accordance with a determination that the biometric data satisfies biometric authentication criteria (e.g., the biometric data matches a biometric template stored on the device), the electronic device (e.g., 2300, 2400) transitions (2528) to a second state in which respective functions of the device are enabled. In some examples, the device may participate in the transaction in an instance in which the device is enabled. In some examples, the electronic device determines whether the biometric data meets biometric authentication criteria. Limiting the ability of a device to participate in a transaction to situations in which the device is enabled (to participate in the transaction) provides the user with more control over the device by helping the user avoid inadvertently performing the transaction and, at the same time, providing enhanced device security. Providing additional control over the device without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, when in the second state, the electronic device (e.g., 2300, 2400) is enabled to participate in the transaction. In some examples, when the device is in the second state, the secure element temporarily enables access to information that enables the device to participate in the transaction at a device external to the secure element (e.g., payment credentials are available for wireless payment transactions).
In some examples, after transitioning to the second state, the electronic device (e.g., 2300, 2400) holds (2530) the device in the second state for a first predetermined period of time (e.g., 60 seconds). In some examples, the functionality of the electronic device is enabled even when the user credentials change (e.g., due to a card switch). In some examples, the first predetermined time period begins when the one or more activations of the button are detected (e.g., when the button is pressed). In some examples, after transitioning to the second state, the electronic device transitions (2532) from the second state to the first state after a first predetermined period of time has elapsed. In some examples, after transitioning to the second state, the electronic device performs (2534) an information-based action (e.g., transmitting credentials to a remote server for processing the transaction) using a respective function of the electronic device. In some examples, the electronic device proceeds with the transaction using the credential provided in the request. In some examples, credentials are provided to a remote server for processing a transaction. In some examples, the electronic device causes an indication to be displayed at the other device indicating whether the authentication was successful. In some examples, the electronic device causes an indication to be displayed at another device (e.g., the other device or a second device) indicating whether the transaction was successful. In some examples, the prompt is output when the details of the transaction are displayed.
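The hold behavior (remain in the second state for a first predetermined period, e.g., 60 seconds, then revert to the first state) can be sketched with an injected clock; the class and method names are hypothetical.

```python
class ArmingState:
    """Sketch of the enabled (second) state with a timed hold period."""
    HOLD_SECONDS = 60.0  # the first predetermined period from the text

    def __init__(self, clock):
        self._clock = clock
        self._armed_at = None

    def arm(self):
        # Called when biometric data satisfies the authentication criteria.
        self._armed_at = self._clock()

    def is_enabled(self):
        if self._armed_at is None:
            return False
        if self._clock() - self._armed_at > self.HOLD_SECONDS:
            self._armed_at = None  # hold elapsed: revert to the first state
            return False
        return True

now = [0.0]
state = ArmingState(clock=lambda: now[0])
assert not state.is_enabled()   # first state by default
state.arm()
now[0] = 30.0
assert state.is_enabled()       # still within the 60-second hold
now[0] = 61.0
assert not state.is_enabled()   # hold elapsed: back to the first state
```

Starting the hold at `arm()` time corresponds to the example in which the first predetermined period begins when the one or more activations of the button are detected.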
In some examples, while the device is in the second state, the electronic device (e.g., 2300, 2400) detects (2536) a user input corresponding to a request to exit the first application. In some examples, in response to detecting a user input corresponding to a request to exit the first application, the electronic device exits (2538) the first application and transitions to the first state. In some examples, when the device is enabled to participate in the transaction, exiting the application in which the received input caused the device to be enabled to participate in the transaction causes the device to cease being enabled to participate in the transaction. In some examples, when the device has been authorized to provide payment credentials for a payment transaction in the wallet application and the device switches from the wallet application to a different application, the device disables the ability to provide payment credentials until the provision of payment credentials is re-authorized by the user (e.g., with biometric authentication). Thus, accidental transmission of payment information is avoided by disabling transmission of payment information when the device is not displaying a user interface indicating that the device is configured to provide payment credentials.
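The behavior of revoking enablement when the user exits the authorizing application can be sketched as follows; the class and application names are placeholders for illustration.

```python
class ForegroundGuard:
    """Sketch: leaving the app that authorized the transaction revokes it."""
    def __init__(self, authorized_app):
        self.foreground = authorized_app
        self.authorized_app = authorized_app
        self.enabled = False

    def enable(self):
        # Called after successful biometric (or alternative) authentication.
        self.enabled = True

    def switch_to(self, app):
        self.foreground = app
        if app != self.authorized_app:
            self.enabled = False  # re-authorization required before payment

guard = ForegroundGuard("wallet")
guard.enable()
assert guard.enabled
guard.switch_to("mail")
assert not guard.enabled   # credentials withheld after exiting the app
```

This mirrors the example above: once the device leaves the wallet application, payment credentials are unavailable until the user re-authorizes their provision.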
In accordance with a determination that the biometric data does not satisfy the biometric authentication criteria (e.g., the biometric data does not match a biometric template stored on the device), the electronic device (e.g., 2300, 2400) maintains (2540) the first state and displays (2540) on the display an indication that the biometric authentication has failed. Maintaining the first state (e.g., a state in which the device is not authorized to proceed with the transaction) when authentication has failed provides enhanced device control and device security for the user. Providing additional control over the device and enhanced device security without cluttering the UI with additional displayed controls enhances operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the electronic device (e.g., 2300, 2400) also displays (2542) an alternative authentication affordance (e.g., 2334, 2440). In some examples, the alternative authentication affordance is an affordance that, when selected, causes the electronic device to display an interface (e.g., 2336, 2442) in which a user can provide an alternative form of authentication (e.g., a non-biometric form of authentication), such as a password or pattern entry. In some examples, successful authentication via the alternative authentication causes the electronic device to transition to the second state. In some examples, a first failure results in the display of a "try again" affordance, and a second failure results in the display of an alternative authentication affordance (e.g., 2334, 2440), such as a "passcode" affordance. In some examples, the electronic device receives a user input, such as two presses of the button. In some examples, if a threshold number of biometric authentication attempts has been reached, the electronic device displays an affordance for entering a passcode (or password), and optionally an indication that biometric authentication is not available and/or that the threshold number has been reached (e.g., "passcode required to enable Face ID").
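The escalation described here (a first failure shows a "try again" affordance; a second failure shows an alternative-authentication affordance) can be sketched as a simple mapping; the string labels are placeholders, not interface identifiers from the disclosure.

```python
def affordance_after_failures(failed_attempts):
    """Which affordance to display after consecutive biometric failures."""
    if failed_attempts <= 0:
        return None                         # no failure yet: no extra affordance
    if failed_attempts == 1:
        return "try-again"
    return "alternative-authentication"     # e.g., a passcode affordance

assert affordance_after_failures(0) is None
assert affordance_after_failures(1) == "try-again"
assert affordance_after_failures(2) == "alternative-authentication"
```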
In some examples, after determining that the biometric data does not satisfy the biometric authentication criteria (e.g., upon displaying an alternative authentication interface or alternative authentication affordance), the electronic device (e.g., 2300, 2400) detects (2544) selection of the alternative authentication affordance (e.g., 2334, 2440). In some examples, in response to detecting selection of the alternative authentication affordance, the electronic device (e.g., 2300, 2400) displays (2546) an alternative authentication interface (e.g., 2336, 2442), such as a password or passcode interface, on the display.
In some examples, the electronic device (e.g., 2300, 2400) also captures (2548) second biometric data with the one or more biometric sensors (e.g., 2303, 2403). In some examples, in accordance with a determination that the second biometric data satisfies the biometric authentication criteria, the electronic device transitions (2550) to the second state. In some examples, the electronic device performs a second iteration of biometric authentication in response to selection of the affordance.
In some examples, biometric authentication is performed during display or transition to an alternative authentication interface (e.g., 2336, 2442). In some examples, an alternative authentication interface includes display of a sequence of biometric authentication flag symbols such that the user is informed that biometric authentication is in progress. In some examples, successful biometric authentication bypasses the need for alternative authentication. Thus, the electronic device (e.g., 2300, 2400) ceases display of the alternative authentication interface (e.g., 2336, 2442) and continues as if the user had successfully authenticated at the first attempt.
In some examples, in accordance with a determination that the second biometric data does not satisfy the biometric authentication criteria, the electronic device (e.g., 2300, 2400) maintains (2552) the first state and displays an alternative authentication interface (e.g., 2336, 2442) on the display (e.g., 2302, 2402). In some examples, upon failure, display of an alternative authentication interface is maintained such that the user optionally provides alternative authentication. Maintaining the display of an alternative authentication interface (upon failure) allows a user to provide alternative authentication by providing the user with a number of different ways to provide authentication for a particular operation to be performed by the device. Providing additional control over the device without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, an electronic device (e.g., 2300, 2400) detects (2554) a respective user input corresponding to a request to retry biometric authentication. In some examples, the electronic device detects a user input, such as a button press or movement of the electronic device (e.g., raising and/or lowering of the device) or selection of an alternative authentication affordance (e.g., 2334, 2440). In some examples, the user input corresponding to the request to retry biometric authentication includes one or more activations of a button. In some examples, the user input includes the one or more activations of the button to initiate the first iteration of biometric authentication. In some examples, the user input corresponding to the request to retry biometric authentication includes a movement of the electronic device. In some examples, the predetermined type of user input is an input other than an activation of a button. In some examples, the predetermined type of user input is a raising and/or lowering of the electronic device (e.g., the electronic device is lowered into proximity with another electronic device, such as an NFC-compliant device, and raised back to the eye level of the user).
In some examples, in response to detecting a user input corresponding to a request to retry biometric authentication, the electronic device (e.g., 2300, 2400) captures (2556) third biometric data with the one or more biometric sensors. In some examples, the device performs additional iterations of biometric authentication in response to user input. In some examples, in accordance with a determination that the third biometric data satisfies the biometric authentication criteria, the electronic device transitions (2558) to a second state in which the respective function of the device is enabled. In some examples, in accordance with a determination that the third biometric data does not satisfy the biometric authentication criteria, the electronic device remains (2560) in the first state (and optionally displays an indication on the display that the biometric authentication has failed).
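The retry behavior described above (capture on request, transition to the second state on success, remain in the first state on failure) can be sketched as a small state machine. This is an illustrative sketch only; the names `DeviceState`, `retry_biometric_auth`, and the string samples are invented for the example and do not appear in the specification.

```python
from enum import Enum

class DeviceState(Enum):
    LOCKED = "first state"      # respective function of the device disabled
    UNLOCKED = "second state"   # respective function of the device enabled

def retry_biometric_auth(capture, satisfies_criteria, state):
    """On a request to retry, capture fresh biometric data and either
    transition to the second state or remain in the first state."""
    if state is DeviceState.UNLOCKED:
        return state            # nothing to do; device already unlocked
    data = capture()            # e.g., the third biometric data
    if satisfies_criteria(data):
        return DeviceState.UNLOCKED
    return DeviceState.LOCKED   # optionally display a failure indication

# Illustrative usage: the first capture fails, the second succeeds.
samples = iter(["stranger", "enrolled-face"])
is_enrolled = lambda d: d == "enrolled-face"
state = DeviceState.LOCKED
state = retry_biometric_auth(lambda: next(samples), is_enrolled, state)  # remains locked
state = retry_biometric_auth(lambda: next(samples), is_enrolled, state)  # unlocks
```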
In some examples, the electronic device (e.g., 2300, 2400) detects one or more additional activations of the button (e.g., 2304, 2404). In some examples, in accordance with a determination that the biometric capture criteria are met, the electronic device captures second biometric data with the one or more biometric sensors (e.g., 2303, 2403) separate from the buttons (e.g., 2304, 2404). In some examples, in accordance with a determination that the biometric capture criteria are not satisfied, the electronic device forgoes capturing the second biometric data. In some examples, the number of biometric authentication attempts is limited to a predetermined number (e.g., 5). In some examples, the number is reset in response to a successful authentication. In some examples, the number is reset after a set amount of time.
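The attempt-limiting behavior above (a predetermined cap of, e.g., 5 failed attempts, reset in response to success or after a set amount of time) could be tracked with a counter along these lines. The class name, the 300-second reset window, and the injectable clock are assumptions made for illustration.

```python
import time

MAX_ATTEMPTS = 5  # predetermined limit from the description above

class AttemptCounter:
    """Tracks failed biometric attempts. Capture is forgone once the
    limit is reached; the count resets on success or after a set time."""

    def __init__(self, reset_after_seconds=300.0, now=time.monotonic):
        self.failed = 0
        self.reset_after = reset_after_seconds
        self.now = now                # injectable clock, for testing
        self.last_failure = None

    def capture_allowed(self):
        if self.failed >= MAX_ATTEMPTS:
            elapsed = self.now() - self.last_failure
            if elapsed >= self.reset_after:
                self.failed = 0       # reset after the set amount of time
            else:
                return False          # forgo capturing biometric data
        return True

    def record(self, success):
        if success:
            self.failed = 0           # reset in response to successful auth
        else:
            self.failed += 1
            self.last_failure = self.now()

# Illustrative usage with a frozen clock: five failures exhaust the limit.
counter = AttemptCounter(now=lambda: 0.0)
for _ in range(MAX_ATTEMPTS):
    counter.record(success=False)
blocked = not counter.capture_allowed()
```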
Note that the details of the processes described above with respect to method 2500 (e.g., fig. 25A-25C) may also be applied in a similar manner to the methods described herein. For example, method 2500 optionally includes one or more of the features of the various methods described herein with reference to methods 800, 1000, 1200, 1400, 1600, 1800, 2000, 2200, and 2700. For example, the enrolled biometric data described in method 1200 may be used to perform biometric authentication as described with respect to fig. 24F-G. As another example, in response to receiving an input prior to completion of the biometric authentication process, one or more plug-in interfaces as described in methods 2000 and 2700 are optionally displayed. For the sake of brevity, these details are not repeated here.
The operations in the above-described information processing method are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general-purpose processor (e.g., as described with respect to figs. 1A, 3, 5A) or an application-specific chip. Further, the operations described above with reference to fig. 25A-25C are optionally implemented by the components depicted in fig. 1A-1B. For example, detection operation 2508, transition operation 2528, and maintain operation 2540 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as a selection of an object on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update the content displayed by the application. Similarly, it would be clear to one of ordinary skill in the art how other processes can be implemented based on the components depicted in fig. 1A-1B.
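The dispatch chain sketched in the paragraph above (event sorter 170 → event recognizer 180 → event handler 190) can be caricatured as follows. The Python class and method names are simplified stand-ins chosen for this sketch, not the actual component interfaces.

```python
class EventRecognizer:
    """Stand-in for event recognizer 180: compares event information to a
    predefined event definition and activates a handler on a match."""

    def __init__(self, definition, handler):
        self.definition = definition  # stand-in for event definitions 186
        self.handler = handler        # stand-in for event handler 190

    def recognize(self, event):
        if event == self.definition:
            self.handler(event)       # activate the associated handler
            return True
        return False

class Application:
    """Stand-in for application 136-1 with internal state 192."""

    def __init__(self):
        self.internal_state = []
        self.recognizers = [
            EventRecognizer("tap-on-object",
                            lambda e: self.internal_state.append(e)),
        ]

    def handle(self, event):
        for recognizer in self.recognizers:
            if recognizer.recognize(event):
                break

class EventSorter:
    """Stand-in for event sorter 170: delivers event information to the
    application (the role of event dispatcher module 174)."""

    def __init__(self, application):
        self.application = application

    def dispatch(self, event):
        self.application.handle(event)

app = Application()
sorter = EventSorter(app)
sorter.dispatch("tap-on-object")    # matches a definition; state updated
sorter.dispatch("unrelated-swipe")  # no matching definition; ignored
```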
Fig. 26A-26AS illustrate example user interfaces for biometric authentication, according to some examples. As described in more detail below, the example user interfaces shown in fig. 26A-26AS are used to illustrate the processes described below, including the processes in fig. 27A-27E.
Fig. 26A shows an electronic device 2600 (e.g., portable multifunction device 100, device 300, or device 500). In the illustrative example shown in fig. 26A-26AS, electronic device 2600 is a smartphone. In other examples, the electronic device 2600 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 2600 has a display 2602, one or more input devices (e.g., a touch screen of the display 2602, button 2604, a microphone (not shown)), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the electronic device includes one or more biometric sensors (e.g., biometric sensor 2603), which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the one or more biometric sensors 2603 are the one or more biometric sensors 703. In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
In fig. 26A, the electronic device displays an unlock interface 2606 when in the unlocked state. The unlock interface 2606 includes a notification affordance 2608 and an unlock status indicator 2610. In some examples, because electronic device 2600 is in an unlocked state, notification affordance 2608 includes an indication of secure content associated with notification affordance 2608. For example, as shown, the notification affordance is associated with an instant messaging application and includes at least a portion of a message received by the electronic device.
Referring to fig. 26B-26D, while the unlock interface 2606 is displayed, the electronic device 2600 detects a user input 2612 (fig. 26B), for example, near an edge of the display 2602. User input 2612 is a swipe gesture, which in some examples is a request to access the home screen interface of electronic device 2600, and in response to swipe input 2612, the electronic device displays home screen interface 2614 of fig. 26D (e.g., replaces display of unlock interface 2606 therewith). In some examples, displaying the home screen interface 2614 includes sliding the unlock interface 2606 in an upward direction to display (e.g., expose) the home screen interface 2614, as similarly described with reference to fig. 19P-R.
In fig. 26E, the electronic device displays an unlock interface 2606 when in the unlocked state. The unlock interface 2606 includes a notification affordance 2608 and an unlock status indicator 2610. In some examples, because electronic device 2600 is in an unlocked state, notification affordance 2608 includes an indication of secure content associated with notification affordance 2608. For example, as shown, the notification affordance is associated with an instant messaging application and includes at least a portion of a message received by the electronic device.
While the unlock interface 2606 is displayed, the electronic device detects activation of the notification affordance 2608. In some examples, activation of notification affordance 2608 is a tap gesture 2615. In response to activation of the notification affordance 2608, the electronic device displays the instant messaging application interface 2616 of fig. 26G (e.g., replaces display of the unlock interface 2606 therewith). Referring to fig. 26F-G, in some examples, displaying the instant messaging application interface 2616 includes sliding the unlock interface 2606 in an upward direction to display (e.g., expose) the instant messaging application interface 2616, as similarly described with reference to fig. 19P-R.
In fig. 26H, the electronic device displays a lock interface 2620 when in the locked state. Lock interface 2620 includes a notification affordance 2622 and a lock status indicator 2624. In some examples, because electronic device 2600 is in a locked state, notification affordance 2622 does not include an indication of secure content associated with notification affordance 2622.
Referring to fig. 26I-K, while displaying lock interface 2620, electronic device 2600 detects user input 2628 (fig. 26I), e.g., near an edge of display 2602. User input 2628 is a swipe gesture, which in some examples is a request to access the home screen interface of electronic device 2600. In some examples, the electronic device 2600 receives the user input 2628 prior to completing the initial biometric authentication (e.g., a biometric authentication performed in response to a wake condition, as described with reference to fig. 21A-C). Thus, in response to the swipe input 2628, the electronic device displays the plug-in interface 2630 of fig. 26K (e.g., replaces the display of the lock interface 2620 therewith) to indicate that the electronic device has not completed biometric authentication. In some examples, displaying the plug-in interface 2630 includes sliding the lock interface 2620 in an upward direction to display (e.g., expose) the plug-in interface 2630, as similarly described with reference to fig. 19P-R. In some examples, the plug-in interface 2630 includes a lock status indicator 2624.
Alternatively, in some examples, the electronic device determines that a threshold number of biometric authentication attempts has been reached (e.g., 5). Thereafter, in response to user input 2628, electronic device 2600 displays plug-in interface 2632. The plug-in interface 2632 includes a biometric authentication indicator indicating that biometric authentication is disabled (e.g., because the threshold number of attempts has been reached). The plug-in interface 2632 also includes an alternative authentication affordance 2636 and an alternative authentication affordance 2638. Activation of alternative authentication affordance 2636 causes the electronic device to display a first alternative authentication interface, such as a fingerprint authentication interface, and activation of alternative authentication affordance 2638 causes the electronic device to display a second alternative authentication interface, such as a password authentication interface.
In some examples, while the plug-in interface 2630 is displayed, the electronic device detects biometric data (e.g., facial biometric data) and, in response, performs biometric authentication. Referring to fig. 26M, the electronic device 2600 displays a biometric progress indicator 2625 to indicate that biometric data is being processed.
In fig. 26N, the electronic device 2600 determines that the biometric authentication is successful. In response, the electronic device 2600 displays an unlock state indicator 2626 and, optionally, outputs a tactile output 2640. After indicating that the biometric authentication is successful (e.g., after a predetermined amount of time), the electronic device displays the home screen interface 2614 of fig. 26P (e.g., replacing the display of the plug-in interface 2630 therewith). Referring to fig. 26O-P, in some examples, displaying the home screen interface 2614 includes sliding the plug-in interface 2630 in an upward direction to display (e.g., expose) the home screen interface 2614, as similarly described with reference to fig. 19P-R.
Alternatively, in fig. 26Q, electronic device 2600 determines that the biometric authentication was unsuccessful. In response, the electronic device 2600 alternates the position of the status indicator 2627 to simulate a "shake" effect. The electronic device 2600 also outputs a tactile output 2644 to indicate that the biometric authentication was unsuccessful. In some examples, tactile output 2644 is the same as tactile output 2640. In some examples, tactile output 2644 is different from tactile output 2640. In some examples, in response to determining that the biometric authentication was unsuccessful, the electronic device displays an alternative authentication affordance 2642.
Referring to fig. 26R, the electronic device receives an activation of lock status indicator 2624. In some examples, the activation of the lock status indicator is a tap gesture 2650 on lock status indicator 2624. In response, as shown in fig. 26S, the electronic device 2600 initiates biometric authentication. In some examples, initiating biometric authentication includes obtaining (e.g., capturing with the one or more biometric sensors 2603) data corresponding to at least a portion of the biometric feature and processing the biometric data to determine whether the biometric feature (or a portion thereof) meets biometric authentication criteria (e.g., determines whether the biometric data matches a biometric template within a threshold). In processing the biometric data, the electronic device displays a biometric progress indicator 2625 (e.g., replaces the display of the lock status indicator 2624 with it) indicating that the electronic device 2600 is processing the biometric data. If the electronic device 2600 determines that the biometric authentication is successful, the electronic device indicates success, as described with respect to fig. 26N-P.
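The criterion above, "determines whether the biometric data matches a biometric template within a threshold," can be illustrated with a toy feature-vector comparison. The Euclidean metric, the threshold value, and the three-component vectors are arbitrary choices for the example, not the device's actual matching algorithm.

```python
import math

THRESHOLD = 0.5  # illustrative bound only; not a real tuning value

def matches_template(sample, template, threshold=THRESHOLD):
    """Return True when the distance between a captured feature vector
    and the enrolled biometric template is within the threshold."""
    return math.dist(sample, template) <= threshold

enrolled = (0.12, 0.80, 0.33)      # enrolled biometric template
good_capture = (0.15, 0.78, 0.30)  # same user, slight variation
bad_capture = (0.90, 0.10, 0.70)   # different user

ok = matches_template(good_capture, enrolled)
fail = matches_template(bad_capture, enrolled)
```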
In fig. 26T, the electronic device 2600 determines that the biometric authentication (e.g., as described with reference to fig. 26S) is unsuccessful and, in response, alternates the position of the status indicator to simulate a "shake" effect, outputs a tactile output 2652, and displays an alternative authentication affordance 2642.
In fig. 26U, the electronic device detects activation of an alternative authentication affordance 2642. In some examples, the activation of the alternative authentication affordance is a tap gesture 2654 on the alternative authentication affordance 2642. Referring to fig. 26V, in response to activation of the alternative authentication affordance 2642, the electronic device displays an alternative authentication interface 2656 (e.g., that replaces the display of the plug-in interface 2630) with which the user authenticates with the electronic device upon entering a valid password (or passcode).
Referring to fig. 26W-26Y, in some examples, if the electronic device fails to detect the biometric feature for a predetermined amount of time, the electronic device displays one or more interfaces and/or enters a low-power state. In fig. 26W, the electronic device displays the plug-in interface 2630 (recall that the electronic device displays the plug-in interface 2630 in response to a request for secure content received prior to completion of biometric authentication). If the electronic device 2600 does not detect the biometric feature for the predetermined amount of time, the electronic device displays an alternative authentication interface 2657 (e.g., replaces the display of the plug-in interface 2630 therewith). In some examples, alternative authentication interface 2657 includes an indicator instructing the user to provide alternative authentication, such as a password. In other examples, as shown in fig. 26X, alternative authentication interface 2657 does not include such an indicator.
If the biometric feature is not detected for the predetermined amount of time during the display of the alternative authentication interface 2657 and no alternative authentication is provided, the electronic device 2600 transitions to a low-power state (e.g., a display disabled state), as shown in fig. 26Y.
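The fallback sequence of fig. 26W-26Y (plug-in interface → alternative authentication interface → low-power state, each step taken only after the biometric feature goes undetected for a predetermined time with no alternative authentication provided) might be modeled as follows; the state names and the 10-second timeout are illustrative assumptions, since the text specifies only "a predetermined amount of time."

```python
FEATURE_TIMEOUT = 10.0  # illustrative; the text says only "a predetermined amount of time"

def next_display_state(current, feature_detected, alt_auth_provided, idle_seconds):
    """Advance one step of the fallback sequence: plug-in interface ->
    alternative authentication interface -> low-power (display disabled)."""
    if feature_detected or alt_auth_provided:
        return current               # authentication proceeds; no fallback
    if idle_seconds < FEATURE_TIMEOUT:
        return current               # timeout not yet reached
    if current == "plug-in":
        return "alternative-auth"    # e.g., alternative authentication interface 2657
    if current == "alternative-auth":
        return "low-power"           # display disabled state
    return current

state = "plug-in"
state = next_display_state(state, False, False, idle_seconds=12.0)  # -> alternative-auth
state = next_display_state(state, False, False, idle_seconds=12.0)  # -> low-power
```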
If a biometric feature is detected during display of alternative authentication interface 2657, electronic device 2600 performs biometric authentication, as described. As shown in fig. 26Z, the electronic device displays a biometric progress indicator 2625 (e.g., replaces the display of the lock status indicator 2624 with it) to indicate that the electronic device is processing biometric data. In fig. 26AA, the electronic device 2600 determines that the biometric authentication is successful. In response, the electronic device displays the unlock state indicator 2610 (replacing the display of the biometric progress indicator 2625 therewith), and optionally, outputs a tactile output 2658 to indicate that the biometric authentication was successful. In some examples, the electronic device 2600 then displays the home screen interface 2614, as shown in fig. 26 AB.
Referring to fig. 26AC-AE, if biometric authentication fails and at least a portion of the alternative authentication is provided during display of the alternative authentication interface 2657, the electronic device 2600 indicates that the biometric authentication was unsuccessful without providing a tactile output. As shown in fig. 26AC, the electronic device receives a portion of an alternative authentication (e.g., a password) via a user input (e.g., a tap gesture) 2660 while performing biometric authentication (as indicated by a biometric progress indicator 2625). In fig. 26AD, the electronic device determines that the biometric authentication was not successful and, in response, displays the lock status indicator 2627 and also alternates the position of the lock status indicator to simulate a "shake" effect. In some examples, electronic device 2600 does not output tactile output and also maintains a display of an alternative authentication interface 2657, as shown in fig. 26 AE.
In fig. 26AF, the electronic device 2600 displays a lock interface 2620 when in the locked state. As depicted, lock interface 2620 includes notification affordances 2622 and lock status indicators 2624. In some examples, the electronic device receives a request for secure content (e.g., a message associated with notification affordance 2622) on the electronic device. Electronic device 2600 detects, for example, activation of notification affordance 2622. In some examples, the activation of notification affordance 2622 is a tap gesture 2662.
In some examples, activation of the notification affordance 2622 is received before biometric authentication is completed. Thus, as shown in fig. 26AG, in response to activation of the notification affordance 2622, the electronic device 2600 displays a plug-in interface 2629 that includes a biometric indicator 2666. In some examples, the biometric indicator 2666 identifies secure content associated with the received request for secure content.
As shown in fig. 26AH, if the biometric feature is not detected by the electronic device 2600 while the plug-in interface 2629 is displayed, the electronic device displays an alternative authentication affordance 2668. In some examples, activation of alternative authentication affordance 2668 causes the electronic device to display an alternative authentication interface (e.g., alternative authentication interface 2657 of fig. 26X).
If the biometric feature is not detected for the predetermined amount of time during the display of the alternative authentication interface and the alternative authentication is not provided, then electronic device 2600 transitions to a low-power state (e.g., a display disabled state), as shown in fig. 26 AI.
As described with respect to fig. 26AH, if no biometric feature is detected, the electronic device displays an alternative authentication affordance 2668. In some examples, the biometric feature is detected after display of the alternative authentication affordance 2668 and, in response, the electronic device performs biometric authentication, as described above. As shown in fig. 26AJ, to indicate that biometric data is being processed, the electronic device 2600 displays a biometric progress indicator 2625. In fig. 26AK, the electronic device 2600 determines that the biometric authentication is successful. In response, the electronic device 2600 displays an unlock status indicator 2610 and, optionally, provides a tactile output 2670 to indicate that the biometric authentication was successful. In some examples, the electronic device 2600 then displays the instant messaging application interface 2616, as shown in fig. 26AM. Referring to fig. 26AL-AM, in some examples, displaying the instant messaging application interface 2616 includes sliding the plug-in interface 2629 in an upward direction to display (e.g., expose) the instant messaging application interface 2616, as similarly described with reference to fig. 19P-R.
In fig. 26AN, electronic device 2600 displays the plug-in interface 2629 with an alternative authentication affordance 2668. While the plug-in interface 2629 is displayed, the electronic device detects activation of the alternative authentication affordance 2668. In some examples, the activation of alternative authentication affordance 2668 is a tap gesture 2674 on alternative authentication affordance 2668.
Referring to fig. 26AO, in response to activation of alternative authentication affordance 2668, electronic device 2600 displays alternative authentication interface 2631. In some examples, alternative authentication interface 2631 identifies the requested secure content ("enter password to view message").
Referring to fig. 26AP-26AQ, a valid password (or passcode) is received by electronic device 2600 in response, at least in part, to tap gesture 2676 (fig. 26AP) and, optionally, one or more other inputs indicating additional characters of the valid password. As shown in fig. 26AR-AS, once a valid password has been received, the electronic device is unlocked and displays the instant messaging application interface 2616 (e.g., replacing the display of the alternative authentication interface 2631 therewith). In some examples, displaying the instant messaging application interface 2616 includes sliding the alternative authentication interface 2631 in an upward direction to display (e.g., expose) the instant messaging application interface 2616, as similarly described with reference to fig. 19P-R.
Fig. 27A-27E are flow diagrams illustrating methods for performing biometric authentication using an electronic device, according to some examples. The method 2700 is performed at a device (e.g., 100, 300, 500, 2600) having a display, one or more input devices (e.g., a touchscreen, a microphone, a camera), and a wireless communication radio (e.g., a Bluetooth connection, a WiFi connection, a mobile broadband connection such as a 4G LTE connection). In some examples, the display is a touch-sensitive display. In some examples, the display is not a touch-sensitive display. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In some examples, the device includes one or more biometric sensors, which optionally include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, the device further comprises a light-emitting device, such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors. Some operations in method 2700 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 2700 provides an intuitive way for performing biometric authentication. The method reduces the cognitive burden of the user in authenticating the biometric features, thereby creating a more effective human-computer interface and a more intuitive user experience. For battery-driven computing devices, enabling a user to more quickly and efficiently perform authentication of biometric features conserves power and increases the interval between battery charges.
In some examples, when an electronic device (e.g., 2600) is in a locked state in which the device is not authorized to perform respective operations, the electronic device displays a first graphical indication (e.g., 2624) (e.g., a closed lock icon) indicating that the device is in the locked state. Displaying a first graphical indication indicating that the device is in the locked state provides a user with an easily available indication of the status of the device. The user is thus informed as to which functions of the device are enabled and/or available, making the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, when the device is in an unlocked state in which the device is authorized to perform the respective operation, a second graphical indication (e.g., 2610) (e.g., an open lock icon) indicating that the device is in the unlocked state is displayed in place of the first graphical indication. Displaying a second graphical indication indicating that the device is in the unlocked state provides the user with an easily available indication of the device's status. The user is thus informed as to which functions of the device are enabled and/or available, making the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the first and second graphical indications are displayed at a respective location in the user interface (e.g., 2606, 2620) (e.g., substantially near a top center of the display 2602).
In some examples, the electronic device detects (2702) a request to perform a respective operation requiring authentication. In some examples, the request to perform the respective operation requiring authentication is a request (e.g., 2612) to display a home screen (e.g., 2614) having a plurality of application open icons that open the corresponding application when selected, or a request to display an application user interface corresponding to the selected notification. In some examples, the request to perform the respective operation includes a home input (e.g., 2612). In some examples, the home input is a selection of a home button or a home gesture, such as an upward swipe from a respective edge of the display (such as the bottom of the display). In some examples, the request to perform the respective operation includes a selection (e.g., 2615) of a notification (e.g., 2608). In some examples, the selection of the notification is a tap, a long press, a hard press, or a swipe on the notification user interface object. In some examples, the respective operations include displaying a home screen including a plurality of application icons for opening different applications. In some examples, the plurality of application icons used to open different applications are application icons that when selected cause the corresponding applications to be opened. In some examples, the home screen also includes one or more desktop applets, system status indicators, device controls, and the like. In some examples, the respective operations include displaying an application user interface (e.g., 2616) of an application corresponding to the notification. In some examples, the application interface includes information that is specifically related to the notification (e.g., the notification is a notification of an electronic communication, and the application user interface includes a representation of the electronic communication or the notification).
In response to detecting a request (2704) to perform a corresponding operation requiring authentication, the electronic device performs (2706) the corresponding operation in accordance with a determination that the device is unlocked. Further, in accordance with a determination that the device is locked and the first form of authentication is available (2708), the electronic device displays (2712) an authentication indicator (e.g., 2625) for the first form of authentication on a display (e.g., 2602) without displaying one or more affordances (e.g., 2636, 2638) for using the second form of authentication. Displaying the authentication indicator without displaying the affordance for using the second form of authentication provides the user with an intuitive interface in which the device forgoes providing additional options when performing biometric authentication. Providing an intuitive interface in this manner enhances the operability of the device (e.g., avoids a user attempting alternative authentication before biometric authentication is completed) and makes the user-device interface more efficient (e.g., by helping a user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling a user to use the device more quickly and efficiently.
In some examples, the authentication indicator is a visual indication that the device is attempting the first form of authentication, such as a text or graphical element describing the first form of authentication (e.g., a password, fingerprint, or other form of authentication). In some examples, the first form of authentication is a form of biometric authentication (2710) based on data obtained by the one or more biometric sensors (e.g., a contactless form of biometric authentication such as facial recognition or iris recognition). In some examples, the authentication indicator includes information indicating that the device is attempting to use the first form of authentication (2714). In some examples, the authentication indicator includes a graphical or textual description (such as "FaceID" or "Face ID to open a message") indicating that facial biometric authentication is available. In some examples, an authentication indicator is displayed along with an option to cancel authentication. In some examples, the authentication indicator is displayed without unlocking the device along with an option for displaying emergency information (e.g., an emergency call user interface and/or emergency medical information). In some examples, the authentication indicator includes information of the progress of the attempt of the first form of authentication (2716), such as the progress indicator described in more detail with respect to fig. 11A-11O.
In some examples, the electronic device processes (2718) the respective data from the one or more biometric sensors (e.g., 2603) while displaying the authentication indicator for the first form of authentication and not displaying the affordance for using the second form of authentication. In some examples, at least a portion of the respective data from the one or more biometric sensors that is processed while displaying the biometric authentication indicator for the first form of biometric authentication and not displaying the one or more affordances for using the second form of authentication is obtained by the one or more biometric sensors prior to displaying the authentication indicator for the first form of authentication (2720). In some examples, at least a portion of the respective data from the one or more biometric sensors that is processed while displaying the biometric authentication indicator for the first form of biometric authentication and not displaying the one or more affordances for using the second form of authentication is obtained by the one or more biometric sensors after displaying the authentication indicator for the first form of authentication (2722).
In some examples, after processing the respective data from the one or more biometric sensors (2724), the electronic device performs (2726) the respective operation in accordance with a determination that the respective data from the one or more biometric sensors is consistent with biometric information authorized to perform the respective operation (e.g., the device detects an authorized face in the respective biometric data). Performing the respective operations in response to determining that the respective data from the one or more biometric sensors is consistent with biometric information authorized to perform the respective operations enhances operability of the device by, in some examples, allowing the user to authenticate with the device without having to manually authenticate, thereby making the user-device interface more efficient.
In some examples, further after processing the respective data from the one or more biometric sensors, in accordance with a determination that the respective data is inconsistent with biometric information authorized to perform the respective operation (2728) (e.g., the device detects no face or detects a face inconsistent with an authorized face), the electronic device displays (2730) one or more affordances (e.g., 2636, 2638) for using a second form of authentication that were not displayed prior to processing the respective data from the one or more biometric sensors. Displaying the one or more affordances for using the second form of authentication, which were not displayed prior to processing the respective data from the one or more biometric sensors, provides the user with a quick alternative method of authentication for an operation that requires successful authentication (e.g., unlocking the device) when biometric authentication is unsuccessful. Providing additional control options with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the one or more affordances include displaying a "use password" button, or a keypad/keyboard for entering a password/passcode. In some examples, the one or more affordances for authentication using the second form are displayed after a respective delay during which the authentication indicator for the first form of authentication is displayed without displaying the one or more affordances for authentication using the second form.
In some examples, displaying the one or more affordances includes: in accordance with a determination that a biometric feature corresponding to the first form of authentication is detected by the one or more biometric sensors, displaying the one or more affordances for authentication using the second form after a first period of time has elapsed (e.g., since the authentication indicator was displayed); and in accordance with a determination that the one or more biometric sensors do not detect a biometric feature corresponding to the first form of authentication, displaying the one or more affordances for authentication using the second form after a second period of time has elapsed (e.g., since the authentication indicator was displayed). In some examples, the second time period is different (e.g., longer or shorter) than the first time period.
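The timing branch above can be sketched as a small helper. This is an illustrative sketch only; the function name and the specific delay values are assumptions, since the description only requires that the two time periods differ:

```python
# Illustrative sketch of the timing branch described above. The delay
# values and names are assumptions; the description only requires that
# the two periods differ.

FEATURE_DETECTED_DELAY_S = 2.0  # first period: a feature was seen but not matched
NO_FEATURE_DELAY_S = 5.0        # second period: no feature was seen at all

def fallback_affordance_delay(feature_detected: bool) -> float:
    """How long to show only the biometric indicator before revealing
    the affordances for the second form of authentication."""
    return FEATURE_DETECTED_DELAY_S if feature_detected else NO_FEATURE_DELAY_S
```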
In some examples, displaying the one or more affordances includes: in accordance with a determination that a biometric feature usable for the first form of authentication has been detected that is inconsistent with the authorized biometric feature, displaying a user interface (e.g., 2656) for the second form of authentication (e.g., displaying a plurality of character input keys (e.g., a keyboard or keypad) for entering a sequence of characters (e.g., a password or passcode) for authentication) while displaying corresponding instructions for providing one or more inputs to authenticate with the second form of authentication (e.g., displaying an indication to enter a sequence of characters for authentication using the one or more character input keys, such as a password keypad displayed with an "enter password to unlock" instruction); and in accordance with a determination that the one or more biometric sensors do not detect a biometric feature corresponding to the first form of authentication, displaying a user interface (e.g., 2657) for the second form of authentication (e.g., displaying a plurality of character input keys (e.g., a keyboard or keypad) for entering a sequence of characters (e.g., a password or passcode) for authentication) without displaying corresponding instructions for providing one or more inputs to authenticate with the second form of authentication. In some examples, the user interface for the second form of authentication is displayed without displaying a corresponding instruction for entering a sequence of characters for authentication using the one or more character input keys (e.g., the password keypad is displayed without an "enter password to unlock" instruction).
In some examples, the plurality of character input keys are initially displayed without corresponding instructions for entering a sequence of characters for authentication using the one or more character input keys (e.g., when the device attempts to use the first form of authentication), and then when the authentication using the first form fails, the device displays explicit instructions for entering the sequence of characters for authentication using the one or more character input keys.
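The branch above can be summarized in a short sketch: the character input keys are always shown, but the explicit instruction appears only when a biometric feature was detected and rejected. The function and field names here are illustrative assumptions:

```python
# Illustrative sketch of the branch above: the passcode keypad is always
# shown, but the explicit "enter password to unlock" instruction appears
# only when a biometric feature was detected and rejected. Names are
# assumptions, not from this description.

def second_form_ui(failure_reason: str) -> dict:
    return {
        "character_keys": True,
        "instructions": failure_reason == "mismatched_feature",
    }
```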
In some examples, displaying the one or more affordances includes: in accordance with a determination that the request to perform the respective operation includes a home input, displaying a plurality of character input keys (e.g., a keyboard or keypad) for entering a sequence of characters (e.g., a password or passcode) for authentication; and in accordance with a determination that the request to perform the respective operation includes selection of a notification, displaying a password affordance that, when activated, causes display of a plurality of character input keys (e.g., a keyboard or keypad) for entering a sequence of characters (e.g., a password or passcode) for authentication. In some examples, the password affordance is constrained to not be activated in response to a tap input, and is responsive to one or more other types of input that include additional input requirements in addition to touch input. In some examples, the one or more additional input requirements include a requirement that the input be a hard press input (e.g., a requirement that the input reach a characteristic intensity above a respective intensity threshold), a requirement that the input be a long press input (e.g., a requirement that the input include a contact that is held on the touch-sensitive surface for more than a predetermined amount of time without moving more than a predetermined distance), and/or a requirement that the input be a swipe input (e.g., a requirement that the input include a movement of the contact in a respective direction by more than a threshold amount of movement).
Constraining activation in response to a tap input in this manner, to avoid false (e.g., accidental and/or unintentional) activation of the password affordance, provides improved control over, and usability of, the electronic device, thereby reducing user error in operating/interacting with the device, which in turn reduces power usage and extends battery life of the device by enabling a user to use the device more quickly and efficiently.
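The input gating described above can be sketched as a predicate over the incoming touch event: a plain tap never activates the password affordance, while a hard press, long press, or swipe must additionally meet its own threshold. All thresholds, event-field names, and the function name are illustrative assumptions:

```python
# Illustrative sketch of the input gating described above. A plain tap
# never activates the password affordance; each of the other input types
# must satisfy its own requirement. All thresholds and field names are
# assumptions.

INTENSITY_THRESHOLD = 0.8   # characteristic intensity for a hard press
HOLD_TIME_S = 0.5           # minimum duration for a long press
MAX_DRIFT_PT = 10.0         # maximum movement allowed during a long press
SWIPE_DISTANCE_PT = 40.0    # minimum movement for a swipe

def activates_password_affordance(event: dict) -> bool:
    kind = event.get("type")
    if kind == "hard_press":
        return event.get("intensity", 0.0) > INTENSITY_THRESHOLD
    if kind == "long_press":
        return (event.get("duration", 0.0) > HOLD_TIME_S
                and event.get("movement", 0.0) <= MAX_DRIFT_PT)
    if kind == "swipe":
        return event.get("movement", 0.0) > SWIPE_DISTANCE_PT
    return False  # taps and anything else are ignored
```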
In some examples, after displaying one or more affordances for authentication using a second form that were not displayed prior to processing respective data from the one or more biometric sensors: in accordance with a determination that the request to perform the respective operation includes a home input, the electronic device waits for additional input for a first delay period of time before ceasing (automatically, without further user input) to display the one or more affordances for using the second form of authentication (e.g., turning off the display); and in accordance with a determination that the request to perform the respective operation includes a selection of the notification, the electronic device waits for additional input for a second delay period before ceasing to display (automatically, without further user input) the one or more affordances for using the second form of authentication (e.g., turning off the display). In some examples, the second delay period is different from (e.g., shorter or longer than) the first delay period.
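A minimal sketch of the dismissal timing above, where the wait before hiding the second-form affordances depends on how the operation was requested. The specific values and names are assumptions; the description only requires that the two delay periods differ:

```python
# Illustrative sketch: how long the device waits for additional input
# before hiding the second-form affordances depends on the request
# source. Values are assumptions; only their difference matters.

DELAY_AFTER_HOME_INPUT_S = 8.0
DELAY_AFTER_NOTIFICATION_S = 4.0

def dismissal_delay(request_source: str) -> float:
    if request_source == "home_input":
        return DELAY_AFTER_HOME_INPUT_S
    return DELAY_AFTER_NOTIFICATION_S
```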
In some examples, the electronic device attempts (2732) biometric authentication using the first form of authentication when the device is locked and the first form of authentication is available. In some examples, the device is locked and/or the first form of authentication is available in response to a request to perform a corresponding operation, in response to an attempt to use the second form of authentication, or in response to an input requesting authentication, such as lifting the device, pressing a button (e.g., 2604) on the device, tapping a lock icon on the device, or tapping a touch-sensitive display of the device. In some examples, when attempting biometric authentication using the first form of authentication, the electronic device displays (2734) a progress indicator (e.g., as depicted by progress indicators 2624 and/or 2625) that changes appearance to indicate progress toward biometric authentication using the first form of authentication. In some examples, the progress indicator is a progress bar or icon that changes from a "face detection" icon or animation to a "face analysis" icon or animation. In some examples, the device replaces the first graphical indication with the progress indicator when attempting biometric authentication using the first form of authentication. In some examples, after completing the attempt of the first form of authentication, in accordance with a determination that the authentication was unsuccessful, the electronic device replaces the progress indicator with a first graphical indication (e.g., a closed lock icon); and in accordance with a determination that the authentication is successful, the electronic device replaces the progress indicator with a second graphical indication (e.g., an open lock icon).
In some examples, after attempting biometric authentication using the first form of authentication (2736), in accordance with a determination that the biometric authentication with the first form of authentication was successful, the electronic device updates (2738) the progress indicator (e.g., displays a check mark or an open lock icon) in a first manner to indicate successful authentication with the first form of authentication (and optionally performs the respective operation); and/or generates a second tactile output (e.g., different from the first tactile output indicating authentication failure) indicating authentication success (e.g., a single tap).
In some examples, after attempting biometric authentication using the first form of authentication, in accordance with a determination that the biometric authentication of the first form of authentication detected biometric features that are usable for the first form of authentication but are inconsistent with the authorized biometric features, the electronic device updates (2740) the progress indicator in a second manner (e.g., shaking the lock icon or face icon side to side to indicate authentication failure) to indicate unsuccessful authentication. In some examples, the second manner of updating the progress indicator is different from the first manner of updating the progress indicator (2742). In some examples, the electronic device generates a first tactile output (e.g., three taps) indicating authentication failure.
In some examples, after attempting biometric authentication using the first form of authentication, in accordance with a determination that the biometric authentication of the first form of authentication detected no biometric feature that is usable for the first form of authentication, the electronic device updates (2744) the progress indicator in a third manner different from the first manner and the second manner (e.g., displaying a closed lock icon without shaking it side to side to indicate authentication failure).
In some examples, after attempting biometric authentication using the first form of authentication, in accordance with a determination that biometric features available for the first form of authentication were not detected with the biometric authentication of the first form of authentication, the electronic device displays a graphical indication that successful authentication has not occurred without generating a tactile output. In some examples, the device does not generate the first tactile output or another tactile output indicating authentication failure because the device does not identify any biometric features that may be used for the first form of authentication.
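The three outcomes above can be summarized as a mapping from the authentication result to an indicator update and tactile output. Note the asymmetry the description emphasizes: a mismatched face produces a failure haptic, while "no face seen" produces none. The names used here are illustrative assumptions:

```python
# Illustrative mapping of the three outcomes above to (indicator update,
# tactile output). A mismatched feature triggers a failure haptic; when
# no feature is seen at all, no haptic is generated. Names are assumptions.

def auth_feedback(outcome: str) -> tuple:
    if outcome == "success":
        return ("open_lock", "single_tap")
    if outcome == "mismatched_feature":
        return ("shake_lock_icon", "triple_tap")
    # no usable biometric feature detected
    return ("closed_lock", None)
```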
In some examples, after a first attempt to use biometric authentication of the first form of authentication: the electronic device displays a user interface (e.g., 2654) for the second form of authentication that includes a plurality of character input keys (e.g., a keyboard or keypad) for entering a sequence of characters (e.g., a password or passcode) for authentication. Further, the electronic device attempts biometric authentication using the first form of authentication a second time while displaying the user interface for the second form of authentication. Further, in accordance with a determination that biometric authentication with the first form of authentication is successful, the electronic device performs the respective operation (and optionally updates the progress indicator in the first manner (e.g., displays a check mark or an open lock icon (e.g., 2610)) to indicate successful authentication with the first form of authentication). Further, in accordance with a determination that a biometric feature is detected that is usable for the first form of authentication but is inconsistent with the authorized biometric feature before input of at least a threshold number of characters (e.g., 1, 2, 3, 4, or 5 characters) is received via one or more of the plurality of character input keys, the electronic device forgoes performing the respective operation and generates a first tactile output (e.g., three taps) indicating authentication failure (and optionally displays a graphical indication that successful authentication has not occurred).
Further, in accordance with a determination that biometric authentication with the first form of authentication detects a biometric feature that is usable for the first form of authentication but is inconsistent with the authorized biometric feature after input of at least a threshold number of characters (e.g., 1, 2, 3, 4, or 5 characters) is received via one or more of the plurality of character input keys, the electronic device forgoes performing the respective operation and displays a graphical indication that successful authentication has not occurred (e.g., a closed lock icon) without generating a tactile output.
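The second-attempt branch above can be sketched as follows: once the user has committed to typing the passcode (at least a threshold number of characters entered), a failed background biometric attempt no longer interrupts with a haptic. The threshold value and names are illustrative assumptions:

```python
# Illustrative sketch of the second-attempt branch above. A failed
# biometric attempt interrupts with a haptic only while the user has
# typed fewer than a threshold number of characters. The threshold and
# names are assumptions.

CHAR_THRESHOLD = 3

def second_attempt_feedback(auth_ok: bool, chars_typed: int) -> dict:
    if auth_ok:
        return {"perform_operation": True, "haptic": None}
    if chars_typed < CHAR_THRESHOLD:
        # user barely started typing: interrupt with the failure haptic
        return {"perform_operation": False, "haptic": "triple_tap"}
    # user is committed to the passcode: fail silently, closed lock icon
    return {"perform_operation": False, "haptic": None,
            "indicator": "closed_lock"}
```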
In some examples, in response to detecting a request to perform a respective operation requiring authentication: in accordance with a determination (e.g., when the request to perform the respective operation is received) that the device is locked and the first form of biometric authentication is not available, the electronic device displays (2746) one or more affordances for using the second form of authentication (e.g., a password or passcode entry user interface, or a prompt to use a second form of biometric authentication, such as fingerprint authentication). In some examples, the first form of authentication is not available because it has been disabled (2748). In some examples, the first form of authentication is not available due to exceeding a threshold number of failed biometric authentication attempts with the first form of biometric authentication, due to a reboot of the device, or due to a user request to disable the first form of biometric authentication. In some examples, the first form of authentication is disabled in response to the user entering the emergency options user interface without selecting an option corresponding to a request to access additional information at the device (e.g., the user triggers display of the emergency options user interface by simultaneously pressing two or more buttons for more than a threshold amount of time, and then selects an option to turn off the device or cancel display of the emergency options user interface, rather than selecting an option to display medical information or display an emergency dialing interface). In some examples, the first form of authentication is disabled in response to a user selection of an option to disable the first form of biometric authentication (e.g., via a biometric authentication setting in a settings user interface).
In some examples, the first form of authentication is not available because operation of the one or more biometric sensors is limited by current environmental and/or device conditions that reduce the ability of the one or more biometric sensors to operate within predefined parameters (2750). In some examples, the device is too hot, the device is too cold, there is too much light in the environment of the device, there is too little light in the environment of the device, and/or the battery of the device is not sufficiently charged to run the one or more biometric sensors.
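The availability conditions in the two paragraphs above can be consolidated into one check: the first form is unavailable if explicitly disabled, after too many failed attempts or a reboot, or when environment/device conditions keep the sensors from operating within their predefined parameters. All names and limits here are illustrative assumptions:

```python
# Illustrative consolidation of the availability conditions described
# above. All field names and limits are assumptions.

MAX_FAILED_ATTEMPTS = 5
TEMP_RANGE_C = (0.0, 45.0)   # sensors out of spec when too hot/cold
MIN_BATTERY_PCT = 5.0        # not enough charge to run the sensors

def first_form_available(state: dict) -> bool:
    if state.get("disabled_by_user") or state.get("rebooted"):
        return False
    if state.get("failed_attempts", 0) >= MAX_FAILED_ATTEMPTS:
        return False
    lo, hi = TEMP_RANGE_C
    if not (lo <= state.get("device_temp_c", 25.0) <= hi):
        return False
    if state.get("battery_pct", 100.0) < MIN_BATTERY_PCT:
        return False
    return True
```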
In some examples, the electronic device detects a first input (e.g., 2650) (e.g., a tap input) at a location corresponding to a respective location in the user interface. In some examples, in response to detecting the first input at a location corresponding to the respective location in the user interface, in accordance with a determination that the device is in a locked state (e.g., a tap input is detected on a closed lock icon), the electronic device attempts a first form of authentication. Attempting the first form of authentication in response to detecting the first input at a location corresponding to a respective location in the user interface and in accordance with a determination that the device is in the locked state provides an intuitive and convenient feature in which the first form of authentication is initiated, thereby making the user-device interface more efficient, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the electronic device detects a second input (e.g., a tap input) at a location corresponding to a respective location in the user interface. In some examples, in response to detecting the second input at a location corresponding to the respective location in the user interface, in accordance with a determination that the device is in the unlocked state (e.g., a tap input is detected on the open lock icon), the electronic device transitions the device from the unlocked state to the locked state. In some examples, the respective location is on a cover user interface displayed when the device screen is initially opened, and the second graphical indication (e.g., an open lock icon) is displayed when the cover user interface is displayed on the device while the device is still in the unlocked state, and the first graphical indication (e.g., a lock icon) is displayed when the cover user interface is displayed on the device while the device is in the locked state.
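The two paragraphs above describe one location with state-dependent behavior: tapping it while locked retries the first form of authentication, while tapping it while unlocked locks the device. A minimal sketch, with names as illustrative assumptions:

```python
# Illustrative sketch of the state-dependent lock-icon behavior above.
# Names are assumptions.

def handle_lock_icon_input(device_state: str) -> str:
    if device_state == "locked":
        return "attempt_first_form_authentication"
    return "transition_to_locked"
```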
Note that the details of the processes described above with respect to method 2700 (e.g., fig. 27A-27E) may also be applied in a similar manner to the methods described herein. For example, method 2700 optionally includes one or more of the features of the various methods described herein with reference to methods 800, 1000, 1200, 1400, 1600, 1800, 2000, 2200, and 2500. For example, the registered biometric data described in method 1200 may be used to perform biometric authentication as described with respect to fig. 26L-N. As another example, in response to receiving an input prior to completion of the biometric authentication process, one or more interstitial interfaces as described in methods 2000 and 2700 are optionally displayed. For the sake of brevity, these details are not repeated here.
The operations in the above-described information processing method are optionally implemented by running one or more functional modules in an information processing apparatus, such as a general-purpose processor (e.g., as described with respect to fig. 1A, 3, 5A) or an application-specific chip. Further, the operations described above with reference to fig. 27A-27E are optionally implemented by the components depicted in fig. 1A-1B. For example, detection operation 2702, execution operation 2706, display operation 2712, and display operation 2746 are optionally implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive surface 604 and event dispatcher module 174 delivers the event information to application 136-1. Respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as a selection of an object on the user interface. When a respective predefined event or sub-event is detected, the event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update the content displayed by the application. Similarly, one of ordinary skill in the art will clearly know how other processes may be implemented based on the components depicted in fig. 1A-1B.
Fig. 28A-28AA illustrate exemplary user interfaces for avoiding retrying biometric authentication, according to some examples. As described in more detail below, the user interfaces shown in fig. 28A-28AA are used to illustrate the processes described below, including the processes in fig. 29A-29B.
Fig. 28A shows an electronic device 2800 (e.g., the portable multifunction device 100, the device 300, the device 500, or the device 1700). In the illustrative example shown in fig. 28A to 28AA, the electronic device 2800 is a smartphone. In other examples, electronic device 2800 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). Electronic device 2800 includes a display 2802, one or more input devices (e.g., a touch screen and a microphone of display 2802), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In fig. 28A, the electronic device includes a biometric sensor 2803. In some examples, the biometric sensor is one or more biometric sensors that may include a camera (such as an infrared camera, a thermal imaging camera, or a combination thereof). In some examples, biometric sensor 2803 is biometric sensor 703. In some examples, the one or more biometric sensors include one or more fingerprint sensors (e.g., fingerprint sensors integrated into buttons). In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
In fig. 28A, an electronic device 2800 displays a user interface 2804 for an application on a display 2802. The application is a mobile browser application, and user interface 2804 corresponds to a website (onlinescore.). In FIG. 28B, electronic device 2800 detects a tap gesture 2806 on login affordance 2808. Electronic device 2800 recognizes tap gesture 2806 as a request to load login user interface 2810 (shown in fig. 28C). Electronic device 2800 also recognizes tap gesture 2806 as a request to automatically populate fillable fields (username field 2812 and password field 2814) in login user interface 2810 with credential information (e.g., a username and password that enable a user to successfully log in to an account). The request for autofill requires biometric authentication in order to proceed with autofilling the fillable fields. In some examples, the request also includes a request to automatically log in the user, such that the user need not tap the submit affordance (e.g., 2860 in fig. 28Z) in order to submit the credentials and log in.
In fig. 28C, in response to tap gesture 2806 (e.g., a request to automatically populate the fillable fields), the electronic device 2800 uses the biometric sensor 2803 to determine whether certain biometric authentication criteria have been met. The electronic device 2800 captures and processes (e.g., analyzes) biometric data from the biometric sensor 2803 to determine, based on the biometric data, whether the biometric feature (or a portion thereof) meets the biometric authentication criteria (e.g., determines whether the biometric data matches a biometric template within a threshold). Biometric sensor 2803 is contactless, such that the sensor is configured to perform biometric authentication without physical input from the user (e.g., without any additional gestures after tap gesture 2806). Thus, the electronic device 2800 initiates biometric authentication using the biometric sensor 2803 without receiving an explicit request from the user to initiate biometric authentication.
Performing biometric authentication includes displaying a biometric authentication interface 2816 having a biometric authentication glyph 2818. The biometric authentication glyph 2818 is a simulated representation of a biometric feature (e.g., a face). As shown in fig. 28C, biometric authentication interface 2816 is overlaid on at least a portion of the login user interface 2810. Biometric authentication interface 2816 is optionally an operating system level interface (e.g., an interface generated by the device's operating system), and login user interface 2810 is an application level interface (e.g., a user interface generated by a third party application that is separate from the device's operating system). In some examples, the displayed biometric authentication interface is approximately centered along a horizontal and/or vertical axis, such as in fig. 28B-28E. In some examples, electronic device 2800 displays the biometric authentication interface at the top, bottom, sides, or corners of display 2802. For example, electronic device 2800 displays the biometric authentication interface near the top of display 2802, such as in fig. 30AL. In some examples, the electronic device 2800 does not display the biometric authentication interface while biometric authentication is being performed.
In fig. 28D-28E, the electronic device 2800 displays a biometric authentication animation that includes biometric authentication glyph 2820 in fig. 28D and biometric authentication glyph 2822 in fig. 28E, which serve as part of the animation displayed while the biometric sensor 2803 attempts to obtain biometric data.
In fig. 28F, it is determined that the biometric authentication has failed (e.g., the biometric authentication criteria have not been met). Thus, electronic device 2800 forgoes auto-populating username field 2812 and password field 2814. In addition, the electronic device 2800 does not display to the user an indication that the user should reattempt biometric authentication. In some examples, the biometric authentication is instead determined to be successful (e.g., the biometric authentication criteria have been met). In those examples, electronic device 2800 automatically populates username field 2812 and password field 2814.
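The two outcomes above can be summarized in a minimal sketch: on success both fields are filled, and on failure the fields stay empty with no retry prompt shown. The function name and credential values are illustrative placeholders:

```python
# Illustrative sketch of the autofill outcome above: fill both fields on
# successful authentication; on failure, leave them empty and show no
# retry indication. Names and values are placeholders.

def autofill_fields(auth_ok: bool, credentials: dict) -> dict:
    if auth_ok:
        return {"username": credentials["username"],
                "password": credentials["password"]}
    return {"username": "", "password": ""}  # and no retry prompt is shown
```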
In fig. 28G, after determining that biometric authentication has failed, electronic device 2800 detects a tap gesture 2824 on reload affordance 2826. Electronic device 2800 recognizes tap gesture 2824 as a request to reload login user interface 2810. Electronic device 2800 also recognizes tap gesture 2824 as a request to automatically populate one or more fillable fields (e.g., username field 2812 and password field 2814) in login user interface 2810. As previously mentioned, the request for autofill requires biometric authentication in order to proceed with autofilling the fillable fields.
In response to the request to automatically populate the fillable fields, it is determined that the failure of the biometric authentication in fig. 28F occurred because no face was detected within a predetermined amount of time. Accordingly, the electronic device 2800 re-performs biometric authentication using the biometric sensor 2803, as shown in fig. 28H. The electronic device 2800 automatically re-performs biometric authentication without requiring the user to provide input to initiate authentication.
In fig. 28H-28I, the electronic device 2800 performs biometric authentication, which includes displaying a biometric authentication interface and a biometric authentication glyph, as described with respect to fig. 28C-28D. Once the electronic device 2800 has obtained biometric data (e.g., obtained enough biometric data), the electronic device transitions to displaying biometric authentication glyph 2828. The electronic device 2800 displays biometric authentication glyph 2828 to indicate that the biometric data is being processed. In some examples, biometric authentication glyph 2828 includes multiple rings that rotate spherically, for example, when displayed.
In fig. 28K, it is determined that the biometric authentication has failed again. Accordingly, the electronic device 2800 displays a failed biometric authentication interface 2830 with a failed biometric authentication glyph 2832, and forgoes auto-populating username field 2812 and password field 2814. In some examples, the biometric authentication is instead determined to be successful; in those examples, electronic device 2800 automatically populates username field 2812 and password field 2814.
In fig. 28L, after determining that biometric authentication has failed for a second time, electronic device 2800 detects a tap gesture 2824 on reload affordance 2826. Electronic device 2800 recognizes tap gesture 2824 as a request to reload login user interface 2810. Electronic device 2800 also recognizes tap gesture 2824 as a request to automatically populate one or more fillable fields (e.g., username field 2812 and password field 2814) in login user interface 2810. As previously mentioned, the request for autofill requires biometric authentication in order to proceed with autofilling the fillable fields.
In response to the request to automatically populate the fillable fields, it is determined that the failure of the biometric authentication in fig. 28K was due to the detection of a face that is inconsistent with an authorized face (e.g., the biometric data does not match a biometric template within a threshold). Therefore, the electronic device 2800 forgoes performing biometric authentication, as shown in fig. 28M.
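The retry policy illustrated across figs. 28G-28M can be summarized in one predicate: the device silently retries biometric authentication only when the prior attempt failed because no face was seen in time (fig. 28F-28H), not when a face was seen and rejected (fig. 28K-28M). The names here are illustrative assumptions:

```python
# Illustrative sketch of the retry policy in figs. 28G-28M: retry
# automatically only after a "no face detected" failure, not after a
# mismatched-face failure. Names are assumptions.

def should_retry_biometric(prior_failure: str) -> bool:
    return prior_failure == "no_face_detected"
```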
In fig. 28N, after forgoing performing biometric authentication in response to tap gesture 2824, electronic device 2800 detects a tap gesture 2834 on username field 2812. Accordingly, electronic device 2800 displays cursor 2836 in username field 2812 and also displays virtual keyboard 2838 and password affordance 2840, as shown in fig. 28O. In fig. 28P, electronic device 2800 detects a tap gesture 2842 on password affordance 2840. Accordingly, electronic device 2800 displays a list of candidate input affordances (e.g., 2844, 2846, and 2848), as shown in fig. 28Q. In some examples, in response to detecting tap gesture 2834 on username field 2812, electronic device 2800 displays an affordance labeled "username" instead of password affordance 2840.
In fig. 28R, electronic device 2800 detects a tap gesture 2850 on a candidate input affordance 2848 (labeled "jj_applieded@email…"). Electronic device 2800 recognizes tap gesture 2850 as a request to automatically populate username field 2812 and password field 2814 with credential information corresponding to candidate input affordance 2848. This request to auto-populate the fillable fields is a different type of request than those generated by a request to load or reload login user interface 2810. A request for autofill made via a request to load the login user interface is an implicit request, because the autofill request is carried out as part of the request to load the login user interface. In contrast, the request for autofill in fig. 28R is an explicit request by the user to automatically populate username field 2812 and password field 2814 with the credential information corresponding to candidate input affordance 2848. In response to the explicit request to automatically populate the fillable fields in fig. 28R, the electronic device 2800 initiates biometric authentication, as shown in fig. 28S.
In fig. 28S-28U, the electronic device 2800 performs biometric authentication, which includes displaying a biometric authentication interface and a biometric authentication glyph, as described with respect to fig. 28H-28J.
In fig. 28V, it is determined that the biometric authentication is successful. Accordingly, the electronic device 2800 displays a successful biometric authentication glyph 2852, indicating that the biometric authentication succeeded.
In fig. 28W, because the biometric authentication was successful, electronic device 2800 automatically populates username field 2812 and password field 2814 with credential information corresponding to candidate input affordance 2848. In some examples, it is instead determined that the biometric authentication has failed; in those examples, electronic device 2800 forgoes automatically populating username field 2812 and password field 2814 with the credential information. In some examples, upon failure of biometric authentication, the electronic device 2800 displays failure interface 2854 in fig. 28X, as described with respect to fig. 17M. Alternatively, electronic device 2800 can display failure interface 2856 in fig. 28Y, as described with respect to fig. 15S. Failure interface 2854 may be displayed when the user has not yet reached a maximum number of failed biometric authentication attempts (e.g., a maximum number of failed attempts with no intervening successful authentication attempt). Failure interface 2856 may instead be displayed when the maximum number of failed biometric authentication attempts has been reached.
After electronic device 2800 automatically populates username field 2812 and password field 2814 in fig. 28W, the electronic device detects a tap gesture 2858 on submit affordance 2860, as shown in fig. 28Z. Electronic device 2800 recognizes tap gesture 2858 as a request to submit the credential information in username field 2812 and password field 2814 for user authentication. Upon successful user authentication, the electronic device 2800 provides access to restricted content (e.g., content that is visible only once the user has logged in) in user interface 2862 of fig. 28AA.
Fig. 29A-29B are flow diagrams illustrating methods for re-performing biometric authentication after an initial unsuccessful biometric authentication attempt using an electronic device, according to some examples. Method 2900 is performed at a device (e.g., 100, 300, 500, 1700, 2800) having one or more biometric sensors (e.g., 2803) (e.g., a fingerprint sensor, a contactless biometric sensor (e.g., a biometric sensor that does not require physical contact, such as a thermal or optical facial recognition sensor), or an iris scanner). In some examples, the one or more biometric sensors (e.g., 2803) include one or more cameras. The electronic device (e.g., 100, 300, 500, 1700, 2800) optionally includes a display (e.g., 2802). In some examples, the display (e.g., 2802) is a touch-sensitive display. In some examples, the display (e.g., 2802) is not a touch-sensitive display.
Some operations in method 2900 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted. As described below, method 2900 provides an intuitive way for re-performing biometric authentication after an initial unsuccessful biometric authentication attempt. The method reduces the cognitive burden on the user when authenticating on the electronic device, thereby creating a more efficient human-machine interface. For battery-driven computing devices, enabling users to authenticate faster and more efficiently conserves power and increases the interval between battery charges.
An electronic device (e.g., 100, 300, 500, 1700, 2800) receives (2902) a first request (e.g., 2806) to perform a respective operation requiring authentication (e.g., auto-filling fields, unlocking the device, or making a payment). In some examples, the first request (e.g., 2806) is associated with performance of a respective operation. In some examples, the first request (e.g., 2806) is also a request to perform a second operation different from the respective operation (e.g., a request to display a web page (e.g., 2810) or load content that requires authentication). In some examples, the second operation does not require authentication.
According to some examples, the first request (e.g., 2806) is also a request to perform an operation that does not require biometric authentication. In response to receiving the first request (e.g., 2806), the electronic device (e.g., 100, 300, 500, 1700, 2800) performs an operation that does not require biometric authentication.
According to some examples, the first request (e.g., 2806) is a request to open a web page (e.g., 2810).
In response (2904) to receiving the first request (e.g., 2806) to perform the corresponding operation, the electronic device (e.g., 100, 300, 500, 1700, 2800) proceeds to blocks 2906-2910.
An electronic device (e.g., 100, 300, 500, 1700, 2800) uses (2906) the one or more biometric sensors (e.g., 2803) to determine whether biometric authentication criteria are met, where the biometric authentication criteria include a requirement that a respective type of biometric feature (e.g., a face or fingerprint) authorized to perform a respective operation be detected by the biometric sensor (e.g., 2803). In some examples, the biometric authentication criteria include a requirement that authorized biometric features be detected by the one or more biometric sensors (e.g., 2803).
In accordance with a determination that the biometric authentication criteria are satisfied, the electronic device (e.g., 100, 300, 500, 1700, 2800) performs (2908) a corresponding operation.
In accordance with (2910) a determination that the biometric authentication criteria are not satisfied, the electronic device (e.g., 100, 300, 500, 1700, 2800) foregoes (2912) performing the respective operation. Abandoning (or performing) the respective operation based on not satisfying the biometric authentication criteria provides security and may prevent unauthorized users from initiating sensitive operations. Providing improved security enhances the operability of the device and makes the user-device interface more efficient (e.g., by restricting unauthorized access), which in turn reduces power usage and extends the battery life of the device by limiting the performance of restricted operations.
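The gating behavior of blocks 2906 through 2912 can be summarized in a short sketch. This is an illustrative model only, not the claimed implementation; `BiometricResult`, `handle_request`, and the `capture` callable are invented names standing in for the one or more biometric sensors and the respective operation.

```python
from enum import Enum, auto

class BiometricResult(Enum):
    """Possible outcomes of a biometric capture (illustrative)."""
    MATCH = auto()       # detected feature matches an authorized feature
    MISMATCH = auto()    # a feature was detected but does not correspond
    NO_FEATURE = auto()  # no feature of the respective type was detected

def handle_request(capture):
    """Perform the respective operation only when the biometric
    authentication criteria are met (cf. blocks 2906-2912).

    `capture` is a zero-argument callable standing in for the one or
    more biometric sensors.
    """
    result = capture()
    if result is BiometricResult.MATCH:
        return "performed"  # block 2908: perform the respective operation
    return "forgone"        # block 2912: forgo the respective operation
```

In this reading, both failure modes (mismatch and no feature detected) lead to forgoing the operation; they differ only in how a later retry is handled, as described below.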
According to some examples, further in response (2904) to receiving the first request (e.g., 2806) to perform the respective operation and determining from (2910) that the biometric authentication criteria are not satisfied, the electronic device (e.g., 100, 300, 500, 1700, 2800) forgoes (2914) displaying on a display (e.g., 2802) an indication to reattempt authentication using the one or more biometric sensors (e.g., 2803) (e.g., a visually presented instruction prompting the user to reattempt biometric authentication). In some examples, the electronic device (e.g., 100, 300, 500, 1700, 2800) also forgoes displaying an indication to re-request the respective operation.
According to some examples, determining whether the biometric authentication criteria are met includes determining whether at least a portion of the biometric feature, determined based on data obtained from the one or more biometric sensors (e.g., 2803) that corresponds to the biometric feature, meets the biometric authentication criteria. In some examples, when the request (e.g., 2806) is also a request to perform a second operation that does not require authentication and is different from the respective operation, the second operation is performed even in accordance with a determination that the biometric authentication criteria are not satisfied. For example, the first request (e.g., 2806) (e.g., entry of a URL address) is a request to perform a respective operation requiring authentication (e.g., automatically populating a username and/or password of a web page (e.g., 2810) associated with the URL address) and is also a request to perform a second operation not requiring authentication (e.g., display of the web page (e.g., 2810) associated with the URL address). Performing the second operation that does not require authentication, even when the biometric authentication criteria are not satisfied, provides feedback to the user on the request even if the operation secured by biometric authentication is not performed. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
Subsequent to determining, in response to receiving the first request (e.g., 2806), that the biometric authentication criteria are not satisfied (e.g., no face was detected, or a detected face is inconsistent with an authorized face), the electronic device (e.g., 100, 300, 500, 1700, 2800) receives (2916) a second request (e.g., 2824) to perform the respective operation. In some examples, a non-user-initiated request to reload the web page (e.g., 2810) is not treated as a request associated with retrying biometric authentication.
In response (2918) to receiving a second request (e.g., 2824) to perform a corresponding operation, the electronic device (e.g., 100, 300, 500, 1700, 2800) proceeds to blocks 2920 through 2922.
In accordance with a determination in response to a first request (e.g., 2806) that biometric authentication criteria are not satisfied because the one or more biometric sensors (e.g., 2803) did not detect the presence of a respective type of biometric feature, the one or more biometric sensors (e.g., 2803) are used (2920) in response to a second request (e.g., 2824) to determine whether the biometric authentication criteria are satisfied. Re-performing biometric authentication when a previous authentication failed due to the absence of detection of the presence of the biometric feature provides the user with the ability to re-attempt authentication without requiring additional input and without cluttering the user interface with additional displayed controls. Providing the ability to re-attempt authentication without additional input and without cluttering the UI with additional displayed controls enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the biometric feature is a face, and the data from the biometric sensor (e.g., 2803) does not include data indicating that a face was detected. In some examples, determining that the biometric authentication criteria are not satisfied in response to the first request (e.g., 2806) occurs when the one or more biometric sensors (e.g., 2803) do not detect the presence of a respective type of biometric feature within a predetermined amount of time.
According to some examples, determining, in response to the first request (e.g., 2806), that the biometric authentication criteria are not satisfied because the one or more biometric sensors (e.g., 2803) did not detect the presence of the respective type of biometric feature includes determining that the one or more biometric sensors (e.g., 2803) did not detect the presence of the respective type of biometric feature for at least a predetermined time (e.g., within a predetermined time after biometric authentication is triggered by the first request (e.g., 2806) to perform the respective operation).
In accordance with a determination that biometric authentication criteria are not satisfied because the one or more biometric sensors (e.g., 2803) detected a respective type of biometric feature that does not correspond to an authorized biometric feature (e.g., the detected face is not consistent with the authorized face) in response to the first request (e.g., 2806), the electronic device (e.g., 100, 300, 500, 1700, 2800) foregoes (2922) using the one or more biometric sensors (e.g., 2803) to determine whether the biometric authentication criteria are satisfied in response to the second request (e.g., 2824) (e.g., the device does not automatically retry biometric authentication in response to reloading the web page (e.g., 2810)). In some examples, forgoing to re-perform biometric authentication further comprises forgoing performing operations that are performed when the biometric authentication criteria are satisfied. Abandoning the reattempt of biometric authentication when the previous authentication failed due to the detection of an unauthorized biometric feature enhances security and reduces the instances where a potentially unauthorized user makes multiple resource-intensive reattempts. Providing improved security enhances the operability of the device and makes the user-device interface more efficient (e.g., by restricting unauthorized access), which in turn reduces power usage and extends the battery life of the device by limiting the performance of restricted operations.
According to some examples, subsequent to determining that the biometric authentication criteria are not satisfied in response to receiving the first request (e.g., 2806), the electronic device (e.g., 100, 300, 500, 1700, 2800) receives a third request (e.g., 2850) to perform a corresponding operation (e.g., tap on the secure password field and select a password to automatically populate, tap on the unsecure username field and select a username to automatically populate), wherein the third request is a different type of request than the first request (e.g., 2806) and the second request (e.g., 2824) (e.g., the third request is made using a selection of an affordance different from the affordances used to make the first request and the second request, the third type of request is not also a request to perform a second operation (e.g., loading of a web page), and the first request and the second request are also requests to perform the second operation). In response to receiving a third request (e.g., 2850) to perform a respective operation, the electronic device (e.g., 100, 300, 500, 1700, 2800) uses the one or more biometric sensors (e.g., 2803) to determine whether the biometric authentication criteria are satisfied in response to the third request (e.g., 2850) (e.g., uses the one or more biometric sensors to determine whether the biometric authentication criteria are satisfied regardless of a reason the biometric authentication criteria are not satisfied in response to the first request (e.g., 2806) (e.g., regardless of whether the biometric authentication criteria are not satisfied because the one or more biometric sensors detected a respective type of biometric feature that does not correspond to an authorized biometric feature or because the one or more biometric sensors did not detect the presence of the respective type of biometric feature)). 
Re-performing biometric authentication after receiving a different type of request (e.g., an explicit request) regardless of the reason for the previous authentication failure provides the user with the ability to explicitly request re-authentication and additional control options. Providing the user with additional control options enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
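The retry policy of blocks 2920 and 2922, together with the explicit-request case just described, reduces to a small decision function. The sketch below is a hypothetical summary; `FailureReason` and `should_retry` are invented names, and "explicit" corresponds to the third type of request (e.g., 2850), such as selecting a candidate credential, rather than an implicit repeat such as reloading the page.

```python
from enum import Enum, auto

class FailureReason(Enum):
    """Why a previous biometric attempt failed (illustrative)."""
    NO_FEATURE = auto()  # sensors never detected the respective feature type
    MISMATCH = auto()    # a feature was detected but did not correspond

def should_retry(previous_failure, request_is_explicit):
    """Decide whether a subsequent request re-runs biometric authentication.

    An implicit repeat (e.g., reloading the page) retries only when the
    earlier failure was due to no feature being detected (block 2920);
    after a mismatch, the device forgoes retrying (block 2922). An
    explicit request retries regardless of the earlier failure reason.
    """
    if request_is_explicit:
        return True
    return previous_failure is FailureReason.NO_FEATURE
```

The asymmetry is deliberate: a missing face is likely an accident of timing, while a mismatching face suggests an unauthorized user, so automatic retries are withheld in the latter case.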
According to some examples, further in response to receiving a second request (e.g., 2824) to perform a respective operation and in accordance with a determination that biometric authentication is not available (e.g., a maximum number of failed biometric authentication attempts has been reached, or the number of attempts since the last successful authentication has exceeded a predefined number of allowed attempts), the electronic device (e.g., 100, 300, 500, 1700, 2800) prompts (e.g., 2854, 2856) for an alternative form of authentication (e.g., a non-biometric form of authentication, such as a password or passcode). Providing a prompt (e.g., displaying a notification) for alternative authentication when biometric authentication is no longer available provides feedback to the user about the current state of the device and indicates what is needed for authentication. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
According to some examples, the electronic device (e.g., 100, 300, 500, 1700, 2800) imposes a respective limit on the number of unsuccessful biometric authentication attempts that are allowed before an alternative form of authentication is required. After a predetermined number of requests (within a threshold period of time) to perform the respective operation have resulted in failed biometric authentication attempts, where the predetermined number of requests is less than the respective limit, the electronic device (e.g., 100, 300, 500, 1700, 2800) ceases to use the one or more biometric sensors (e.g., 2803) to determine whether the biometric authentication criteria are satisfied in response to requests to perform the respective operation.
In some examples, in response to detecting a respective request to perform a respective operation, a device (e.g., 100, 300, 500, 1700, 2800) determines whether a predetermined number of requests to perform the respective operation have resulted in a failed biometric authentication attempt. In accordance with a determination that a predetermined number of requests to perform respective operations have resulted in a failed biometric authentication attempt, the electronic device (e.g., 100, 300, 500, 1700, 2800) foregoes attempting biometric authentication. In accordance with a determination that a predetermined number of requests to perform respective operations have not resulted in a failed biometric authentication attempt, the electronic device (e.g., 100, 300, 500, 1700, 2800) proceeds with additional biometric authentication attempts.
In some examples, the number of biometric authentication attempts or reattempts that can be made (e.g., that can be made without success) is limited to a predetermined number of unsuccessful attempts before an alternative authentication (e.g., password or passcode) is required. In such examples, the electronic device (e.g., 100, 300, 500, 1700, 2800) foregoes reattempting biometric authentication after a certain number of attempts so as not to exceed a predetermined number of allowable attempts, even under conditions in which biometric authentication would otherwise be attempted (e.g., after a previous failure due to not detecting the presence of a respective type of biometric feature). Ceasing to use the biometric sensor (e.g., forgoing biometric authentication) before exhausting the allowed number of attempts after the repeat request avoids the user consuming the allowed number of attempts on repeat requests (e.g., repeat requests of the same type), saving at least one attempt for requests for other operations requiring biometric authentication. Saving at least one attempt enhances the operability of the device and makes the user-device interface more efficient (e.g., by avoiding exhausting authentication attempts on repeated similar requests), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
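The attempt-budget behavior described above (ceasing biometric attempts for repeated requests before the hard limit is exhausted, so that at least one attempt remains for other operations) can be sketched as follows. The `AttemptBudget` class and its default numbers are illustrative assumptions; the text does not specify concrete values.

```python
class AttemptBudget:
    """Track failed biometric attempts against a device-wide limit.

    Illustrative sketch: `limit` is the number of failed attempts after
    which an alternative form of authentication (e.g., a passcode) is
    required; `reserve` is the number of attempts held back so that
    repeated requests of the same type do not exhaust the budget.
    """
    def __init__(self, limit=5, reserve=1):
        self.limit = limit      # hard cutoff before passcode fallback
        self.reserve = reserve  # attempts kept for other operations
        self.failed = 0

    def record_failure(self):
        self.failed += 1

    def biometrics_available(self):
        # Hard cutoff: beyond this, alternative authentication is required.
        return self.failed < self.limit

    def allow_repeat_request(self):
        # Softer cutoff for repeated requests of the same type.
        return self.failed < self.limit - self.reserve
```

With the assumed defaults, a fourth failure still leaves biometrics available in general, but repeated identical requests are already refused, preserving the final attempt for a different operation.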
According to some examples, determining whether the biometric authentication criteria are satisfied using the one or more biometric sensors (e.g., 2803) in response to the second request (e.g., 2824) occurs automatically (e.g., without requiring input from the user) in response to receiving the second request (e.g., 2824) to perform the respective operation.
According to some examples, the one or more biometric sensors (e.g., 2803) are non-contact biometric sensors (e.g., 2803) (e.g., infrared cameras, visible light cameras, or a combination thereof) configured to perform biometric authentication without physical contact from a user.
According to some examples, in response to the second request (e.g., 2824) and in accordance with a determination that the biometric authentication criteria are satisfied in response to the second request (e.g., 2824), the electronic device (e.g., 100, 300, 500, 1700, 2800) performs a respective operation (e.g., the operation includes automatically populating, accessing data, unlocking the device, and/or making a payment).
According to some examples, the respective operation is automatically populating the one or more filable fields (e.g., 2812, 2814) with credential information (e.g., credit card information or login information). In some examples, the credit card information includes information associated with payment account information (e.g., credit card, bank account, or payment service information). In some examples, the login information includes information needed to login to an application, account, or website (e.g., 2862). Automatically populating credential information based on the request and successful authentication provides the user with the ability to populate the credential without further input (beyond the request). Performing the operation when a set of conditions has been met without further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
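A minimal sketch of the auto-populate operation, under the assumption that stored credentials are applied only after successful authentication and only to empty fillable fields. The function name, field names, and return convention are invented for illustration.

```python
def autofill_fields(fields, credentials, authenticated):
    """Populate fillable fields only after successful biometric authentication.

    `fields` maps field names to their current values; `credentials`
    supplies stored values (e.g., username/password). Returns a new
    mapping; on failed authentication the fields are left untouched,
    i.e., the device forgoes auto-populating.
    """
    if not authenticated:
        return dict(fields)      # forgo auto-populating
    filled = dict(fields)
    for name, value in credentials.items():
        if name in filled and not filled[name]:
            filled[name] = value  # only fill empty fillable fields
    return filled
```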
According to some examples, the respective operations are providing access to restricted content (e.g., logging in to a web page (e.g., 2862), displaying a list of passwords associated with the user, displaying credit card information).
According to some examples, the respective operation is to transition the electronic device (e.g., 100, 300, 500, 1700, 2800) from a locked state to an unlocked state. In some examples, transitioning the electronic device (e.g., 100, 300, 500, 1700, 2800) to the unlocked state includes enabling a display (e.g., 2802), the one or more biometric sensors (e.g., 2803), and/or a microphone of the electronic device.
According to some examples, the respective operation is to enable an electronic device (e.g., 100, 300, 500, 1700, 2800) to participate in a transaction (e.g., a financial transaction such as payment for a good or service).
According to some examples, in using the one or more biometric sensors (e.g., 2803) to determine whether biometric authentication criteria are met, an electronic device (e.g., 100, 300, 500, 1700, 2800) displays an indication on a display (e.g., 2802) that biometric authentication is being performed (e.g., a small indicator is displayed on the top, bottom, side, or corner). In some examples, the indicator is not displayed during biometric authentication. In some examples, the electronic device (e.g., 100, 300, 500, 1700, 2800) forgoes displaying an indication that biometric authentication is being performed while using the one or more biometric sensors (e.g., 2803) to determine whether biometric authentication criteria are satisfied. In some examples, if the biometric authentication criteria are not satisfied in response to the first request (e.g., 2806) because the one or more biometric sensors (e.g., 2803) did not detect the presence of the respective type of biometric feature, a first visual indication (e.g., 2822) is displayed. In some examples, if, in response to the first request (e.g., 2806), the biometric authentication criteria are not satisfied due to the one or more biometric sensors (e.g., 2803) detecting respective types of biometric features that do not correspond to authorized biometric features, a second visual indication (e.g., 2832) is displayed (e.g., the same as or different from the first visual indication). In some examples, if the biometric authentication criteria are met, a third visual indication (e.g., 2852) is displayed (e.g., a third visual indication that is different from the first visual indication and/or the second visual indication).
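The indication selection described above can be modeled as a simple mapping from authentication outcome to the displayed element. The mapping below is a hypothetical reading of the text: element numbers 2822, 2832, and 2852 are taken from the figures, and the `show_progress` switch models the examples in which the device forgoes displaying any indicator.

```python
def visual_indication(outcome, show_progress=True):
    """Select which visual indication to display for a biometric outcome.

    Illustrative mapping: 2822 when no feature was detected, 2832 when
    a detected feature did not match, 2852 on success. Returns None
    when the device forgoes displaying an indication.
    """
    if not show_progress:
        return None  # some examples forgo displaying an indication
    return {
        "no_feature": 2822,  # first visual indication
        "mismatch": 2832,    # second visual indication
        "success": 2852,     # third visual indication
    }[outcome]
```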
Note that the details of the processes described above with respect to method 2900 (e.g., fig. 29A-29B) may also be applied in a similar manner to the methods described below and above. For example, method 2900 optionally includes one or more of the features of the various methods described above with reference to methods 800, 1000, 1200, 1400, 1600, 2000, 2200, 2500, 2700, 3100, 3300, and 3500. For example, the registered biometric data described in method 1200 may be used to perform biometric authentication as described with respect to method 2900. As another example, the authentication cache of method 3100 can be based on successful authentication performed according to the re-performed biometric authentication as described with respect to method 2900. For the sake of brevity, these details are not repeated in the following.
Fig. 30A-30AL illustrate example user interfaces for cached biometric authentication, according to some examples. As described in more detail below, the example user interfaces shown in fig. 30A-30AL are used to illustrate the processes described below, including the processes in fig. 31A-31B.
Fig. 30A shows an electronic device 3000 (e.g., portable multifunction device 100, device 300, device 500, or device 1700). In the example shown in fig. 30A-30AL, the electronic device 3000 is a smartphone. In other examples, the electronic device 3000 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). Electronic device 3000 includes a display 3002, one or more input devices (e.g., a touch screen of display 3002, buttons 3004, and a microphone), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In fig. 30A, the electronic device includes a biometric sensor 3003. In some examples, the biometric sensor is one or more biometric sensors that may include a camera (such as an infrared camera, a thermal imaging camera, or a combination thereof). In some examples, biometric sensor 3003 is biometric sensor 703. In some examples, the one or more biometric sensors include one or more fingerprint sensors (e.g., a fingerprint sensor integrated into a button). In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light-emitting device is optionally used to illuminate a biometric feature (e.g., a face) while the one or more biometric sensors capture biometric data of the biometric feature.
In fig. 30A, the electronic device 3000 displays, on display 3002, a user interface 3006 of an application; the interface corresponds to a website (e.g., online.com). Electronic device 3000 detects a tap gesture 3008 on login affordance 3010. Electronic device 3000 recognizes tap gesture 3008 as a request to load login user interface 3012 (shown in fig. 30B), a web page (e.g., id.online.com) associated with the website. Electronic device 3000 also recognizes tap gesture 3008 as a request to automatically populate the fillable fields in login user interface 3012 (username field 3014 and password field 3016). The request for autofill requires biometric authentication before the fillable fields are auto-populated. In some examples, the request also includes a request to automatically log the user in, such that the user need not tap a submit affordance (e.g., 3030, 3046) in order to submit the credentials and log in.
In fig. 30B, in response to a tap gesture 3008 (e.g., a request to automatically populate a fillable field), the electronic device 3000 uses the biometric sensor 3003 to determine whether certain biometric authentication criteria have been met. The electronic device 3000 captures and processes (e.g., analyzes) the biometric data from the biometric sensor 3003 to determine, based on the biometric data, whether the biometric feature (or a portion thereof) meets biometric authentication criteria (e.g., determines whether the biometric data matches a biometric template within a threshold). The biometric sensor 3003 is contactless such that the sensor is configured to perform biometric authentication without physical input from the user (e.g., without any additional gesture after the tap gesture 3008). Thus, the electronic device 3000 initiates biometric authentication using the biometric sensor 3003 without receiving an explicit request from the user to initiate biometric authentication.
Performing biometric authentication includes displaying a biometric authentication interface 3018 having a biometric authentication glyph 3020. The biometric authentication glyph 3020 is a simulated representation of a biometric feature (e.g., a face). As shown in fig. 30B, biometric authentication interface 3018 is overlaid on at least a portion of login user interface 3012. The biometric authentication interface 3018 is optionally an operating-system-level interface (e.g., an interface generated by the operating system of the device), and login user interface 3012 is an application-level interface (e.g., a user interface generated by a third-party application separate from the operating system of the device).
In fig. 30C, the electronic device 3000 displays a portion of a biometric authentication animation that includes biometric authentication flag symbol 3022, which is displayed while the biometric sensor 3003 obtains biometric data. In some examples, the animation of which flag symbols 3020 and 3022 are a part indicates that the electronic device is attempting to recognize a particular type of biometric feature (e.g., recognize a face). Once the electronic device 3000 has obtained biometric data (e.g., obtained sufficient biometric data), the electronic device 3000 transitions to displaying biometric authentication flag symbol 3024, as shown in fig. 30D. The electronic device 3000 displays biometric authentication flag symbol 3024 to indicate that the biometric data is being processed. In some examples, the biometric authentication flag symbol 3024 includes multiple rings that rotate spherically when displayed.
In fig. 30E, it is determined that the biometric authentication is successful. Accordingly, the electronic device 3000 displays a biometric authentication flag symbol 3026 indicating that the biometric authentication was successful.
In fig. 30F, since the biometric authentication was successful, electronic device 3000 automatically populates username field 3014 and password field 3016 with credential information (e.g., a username and password that enable the user to successfully log into an account). The electronic device 3000 automatically populates the fields while the device is in the unlocked state. In some examples, it is instead determined that the biometric authentication failed (e.g., the biometric authentication criteria have not been met). In those examples, electronic device 3000 forgoes automatically populating the one or more fillable fields (e.g., username field 3014 and password field 3016).
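The fill-or-forgo branch described above can be sketched as a single function (field and credential names are hypothetical):

```python
def autofill_fields(fields, credentials, auth_succeeded):
    """Populate the fillable fields only when biometric authentication
    succeeded; otherwise forgo autofill and leave the fields unchanged."""
    if not auth_succeeded:
        return fields  # forgo automatically populating the fields
    return {name: credentials.get(name, value) for name, value in fields.items()}
```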
In FIG. 30G, electronic device 3000 detects a tap gesture 3028 on the submit affordance 3030. In response to detecting tap gesture 3028, electronic device 3000 submits credential information in user name field 3014 and password field 3016 for user authentication. Upon successful user authentication, the electronic device 3000 provides access to restricted content (e.g., content that is only visible once the user has logged into the user interface) in the account user interface 3032 of fig. 30H.
In FIG. 30I, the electronic device 3000 detects a tap gesture 3034 on store affordance 3036. Electronic device 3000 recognizes tap gesture 3034 as a request to load login user interface 3038 (shown in fig. 30J). User interface 3038 is a web page of a sub-domain of the website (e.g., shop.online.com). In some examples, such as the example of fig. 30J, the sub-domain corresponds to the web domain for which biometric authentication was previously performed. Electronic device 3000 also recognizes tap gesture 3034 as a request to automatically populate one or more fillable fields (e.g., username field 3040 and password field 3042) in login user interface 3038. As previously described, in some examples, the request also includes a request to automatically log in the user, such that the user need not tap the submit affordance (e.g., 3030, 3046) in order to submit the credentials and log in.
In response to the request to automatically populate the fillable fields, it is determined that cached authentication is available from the successful authentication that occurred in FIG. 30E. Thus, the electronic device 3000 forgoes re-performing biometric authentication and proceeds to automatically populate username field 3040 and password field 3042, as shown in fig. 30J. The electronic device 3000 automatically populates the fields regardless of whether the biometric sensor 3003 detects a biometric feature (e.g., a face or a finger). In FIG. 30K, electronic device 3000 detects a tap gesture 3044 on submit affordance 3046. In response to detecting tap gesture 3044, electronic device 3000 submits the credential information in username field 3040 and password field 3042 for user authentication. Upon successful user authentication, the electronic device 3000 provides access to restricted content (e.g., content that is visible only once the user has logged in) in the store user interface 3048 of FIG. 30L.
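One way to model the domain scope of a cached authentication is an exact-match-or-sub-domain rule; the text does not spell the rule out, so this is an assumption consistent with figs. 30I-30J and with the sub-domain discussion later in the section:

```python
def same_or_subdomain(candidate, authenticated_domain):
    """True when `candidate` is the authenticated web domain itself or one
    of its sub-domains (e.g. shop.example.com under example.com)."""
    return (candidate == authenticated_domain
            or candidate.endswith("." + authenticated_domain))

def needs_biometric_auth(login_domain, cached_auth_domain):
    # Re-authentication is skipped only when a cached authentication exists
    # for the same domain or for a parent domain of the login page.
    if cached_auth_domain is None:
        return True
    return not same_or_subdomain(login_domain, cached_auth_domain)
```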
Alternatively, in response to a request to automatically populate the fillable fields, it is determined that cached authentication is not available. As described below, figs. 30N-30V illustrate various examples of events that make cached authentication unavailable to the electronic device 3000, and figs. 30W-30Y depict that biometric authentication must be re-performed when cached authentication is not available.
In FIG. 30M, the electronic device 3000 displays the store user interface 3048 and detects a tap gesture 3050 on the link affordance 3052. In response to detecting the tap gesture 3050, the electronic device 3000 displays an account user interface 3032, as shown in fig. 30N.
Figs. 30N-30O depict one example of making a cached authentication unavailable to electronic device 3000. In fig. 30N, the electronic device 3000 detects an input (e.g., a single press) by the finger 3054 at the home button 3056. In response to detecting the input, the electronic device 3000 displays the home screen 3058, as shown in fig. 30O, and causes the application with the account user interface 3032 to enter an inactive state (e.g., a suspended state, a dormant state, a background state, and/or an inactive state). If the application has been in an inactive state for more than a threshold amount of time (e.g., two minutes and forty seconds) between when the fillable fields in the login user interface 3012 (e.g., fig. 30F) are automatically populated and when a request is received to automatically populate the fillable fields in the login user interface 3038 (e.g., fig. 30J), the cached authentication is not available to the electronic device 3000.
Figs. 30P-30T depict another example of making a cached authentication unavailable to the electronic device 3000. In fig. 30P, the electronic device 3000 detects an input (e.g., two presses) by the finger 3054 at the home button 3056. In response to detecting the input, the electronic device 3000 displays the recently used applications view 3060, as shown in FIG. 30Q. In figs. 30R-30S, the electronic device 3000 detects a swipe gesture 3062, which causes the application with the account user interface 3032 to close (e.g., terminate). Thus, the electronic device 3000 displays the recently used applications view 3060, which no longer includes the application having the account user interface 3032, as shown in FIG. 30T. If the application has been closed between when the fillable fields in the login user interface 3012 (e.g., fig. 30F) are automatically populated and when a request is received to automatically populate the fillable fields in the login user interface 3038 (e.g., fig. 30J), the cached authentication is not available to the electronic device 3000.
Figs. 30U-30V depict a further example of making a cached authentication unavailable to the electronic device 3000. In fig. 30U, the electronic device 3000 detects an input (e.g., a single press) by the finger 3064 at the button 3004. In response to detecting the input, electronic device 3000 transitions the device from the unlocked state to the locked state. Transitioning the device to the locked state includes deactivating (e.g., disabling) the display 3002, the one or more biometric sensors 3003, and/or the microphone of the electronic device 3000. In fig. 30V, the electronic device 3000 is in the locked state and does not display any content on the display 3002. If electronic device 3000 has transitioned to the locked state between when the fillable fields in login user interface 3012 (e.g., fig. 30F) are automatically populated and when a request is received to automatically populate the fillable fields in login user interface 3038 (e.g., fig. 30J), the cached authentication is not available to electronic device 3000.
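The three invalidation events walked through in figs. 30N-30V can be gathered into a small state object. This is only a sketch; the 160-second limit reflects the example threshold of two minutes and forty seconds mentioned above, and the event names are hypothetical:

```python
class CachedAuth:
    """Tracks whether an earlier successful authentication may be reused."""
    INACTIVITY_LIMIT = 160  # seconds: two minutes and forty seconds

    def __init__(self):
        self.available = True

    def on_device_locked(self):          # figs. 30U-30V: locking discards the cache
        self.available = False

    def on_app_closed(self):             # figs. 30P-30T: terminating the app discards it
        self.available = False

    def on_app_inactive(self, seconds):  # figs. 30N-30O: too long in the background
        if seconds > self.INACTIVITY_LIMIT:
            self.available = False
```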
In fig. 30W, the electronic device 3000 displays the account user interface 3032 after the cached authentication is no longer available (e.g., after one or more of the sequences of events discussed with respect to figs. 30N-30O, 30P-30T, or 30U-30V). The electronic device 3000 detects a tap gesture 3034 on the store affordance 3036. Electronic device 3000 recognizes tap gesture 3034 as a request to load login user interface 3038 (shown in fig. 30X). Electronic device 3000 also recognizes tap gesture 3034 as a request to automatically populate one or more fillable fields (e.g., username field 3040 and password field 3042) in login user interface 3038. As previously described, in some examples, the request also includes a request to automatically log in the user, such that the user need not tap the submit affordance to submit the credentials and log in.
In response to the request to automatically populate the one or more fillable fields, the electronic device 3000 determines that the cached authentication is not available. In fig. 30Y, the electronic device 3000 re-performs biometric authentication using the one or more biometric sensors 3003. Biometric authentication occurs automatically in response to receiving the request to automatically populate the fillable fields, such that no intermediate input from the user is required to initiate biometric authentication. If the biometric authentication is successful (e.g., meets the biometric authentication criteria), the electronic device automatically populates the fillable fields. If the biometric authentication is unsuccessful (e.g., the biometric authentication criteria are not satisfied), the electronic device 3000 forgoes automatically populating the fillable fields.
In FIG. 30Z, the user is not logged in and is presented with a user interface similar to that of FIG. 30A. The electronic device 3000 displays the user interface 3006 of the application on the display 3002. Electronic device 3000 detects a tap gesture 3008 on login affordance 3010. Electronic device 3000 recognizes tap gesture 3008 as a request to load login user interface 3012 (shown in FIG. 30AA). Electronic device 3000 also recognizes tap gesture 3008 as a request to automatically populate one or more fillable fields (e.g., username field 3014 and password field 3016) in login user interface 3012. The request to autofill requires biometric authentication in order to proceed with automatically populating the fillable fields. As previously described, in some examples, the request also includes a request to automatically log in the user, such that the user need not tap the submit affordance to submit the credentials and log in.
In fig. 30AA, in response to the request to automatically populate the fillable fields, the electronic device 3000 uses the biometric sensor 3003 to determine whether certain biometric authentication criteria have been met. The biometric sensor 3003 is contactless, such that the sensor is configured to perform biometric authentication without physical input from the user. Thus, the electronic device 3000 initiates biometric authentication using the biometric sensor 3003 without receiving an explicit request from the user to initiate biometric authentication. Here, the biometric authentication is unsuccessful (e.g., does not meet the biometric authentication criteria). Thus, when biometric authentication fails, the electronic device 3000 displays the failure interface 3066 in fig. 30AB, as described with respect to fig. 17M. Failure interface 3066 is displayed when the user has not yet reached the maximum number of failed biometric authentication attempts.
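The failure path can be sketched as a function of the running failure count. The limit of five attempts is a hypothetical value; the text only states that a maximum number of failed attempts exists:

```python
MAX_FAILED_ATTEMPTS = 5  # hypothetical limit; the text only says a maximum exists

def after_failed_biometric(failed_attempts):
    """Choose what to present after an unsuccessful biometric check."""
    if failed_attempts < MAX_FAILED_ATTEMPTS:
        return "failure_interface"   # e.g., interface 3066, offering a retry
    return "manual_entry_only"       # retries exhausted; fall back to typed credentials
```

This matches the figures that follow, where the user cancels out of the failure interface and types the credentials by hand.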
In FIG. 30AC, the electronic device 3000 detects a tap gesture 3068 on the cancel affordance 3070. In response to detecting the tap gesture 3068, electronic device 3000 displays login user interface 3012 (shown in FIG. 30 AD). In fig. 30AD, in response to detecting selection of the username field 3014, the electronic device 3000 displays a cursor 3072 in the username field 3014 and also displays a virtual keyboard 3074. Electronic device 3000 receives input to enter one or more characters corresponding to credential information in username field 3014 and password field 3016. In FIG. 30AE, electronic device 3000 detects a tap gesture 3028 on the submit affordance 3030. Thus, electronic device 3000 submits credential information in username field 3014 and password field 3016 for user authentication. Upon successful user authentication, the electronic device 3000 provides access to restricted content (e.g., content that is visible only once the user has logged into the user interface) in the account user interface 3032 of fig. 30 AF.
In fig. 30AG, electronic device 3000 detects a tap gesture 3034 on store affordance 3036. Electronic device 3000 recognizes tap gesture 3034 as a request to load login user interface 3038 (shown in fig. 30AH). Electronic device 3000 also recognizes tap gesture 3034 as a request to automatically populate one or more fillable fields (e.g., username field 3040 and password field 3042) in login user interface 3038. As previously described, in some examples, the request also includes a request to automatically log in the user, such that the user need not tap the submit affordance to submit the credentials and log in.
In response to the request to automatically populate the one or more fillable fields, the electronic device 3000 determines that the cached authentication is not available. In fig. 30AH, the electronic device 3000 performs biometric authentication using the biometric sensor 3003. Biometric authentication occurs automatically in response to receiving the request to automatically populate the fillable fields, such that no intermediate input from the user is required to initiate biometric authentication.
In fig. 30AI, it is determined that the biometric authentication is successful (e.g., meets the biometric authentication criteria). Accordingly, the electronic device 3000 displays the biometric authentication interface 3018 having the biometric authentication flag symbol 3026 indicating that the biometric authentication was successful. Upon successful biometric authentication, the electronic device 3000 automatically populates the username field 3040 and password field 3042 with credential information, as shown in fig. 30AJ.
In FIG. 30AJ, electronic device 3000 detects a tap gesture 3044 on submit affordance 3046. In response to detecting tap gesture 3044, electronic device 3000 submits the credential information in username field 3040 and password field 3042 for user authentication. Upon successful user authentication, the electronic device 3000 provides access to restricted content (e.g., content that is visible only once the user has logged in) in the store user interface 3048 of fig. 30AK.
In some examples, the electronic device 3000 displays a biometric authentication interface with a biometric authentication flag symbol indicating that biometric authentication is being performed. In some examples, the displayed biometric authentication interface is approximately centered along a horizontal and/or vertical axis, such as in fig. 30B-30E. In other examples, the electronic device 3000 displays a biometric authentication interface on the top, bottom, sides, or corners of the display 3002. For example, the electronic device 3000 displays a biometric authentication interface 3076 near the top of the display 3002, as shown in fig. 30 AL. In some examples, the electronic device 3000 does not display the biometric authentication interface while biometric authentication is being performed.
Figs. 31A-31B are flow diagrams illustrating methods for using an electronic device to determine whether biometric re-authentication is required or whether cached authentication is available, according to some examples. Method 3100 is performed at a device (e.g., 100, 300, 500, 1700, 3000) having one or more biometric sensors (e.g., 3003) (e.g., fingerprint sensors, contactless biometric sensors (e.g., biometric sensors that do not require physical contact, such as thermal or optical facial recognition sensors), iris scanners). In some examples, the one or more biometric sensors (e.g., 3003) include one or more cameras. The electronic device (e.g., 100, 300, 500, 1700, 3000) optionally includes a display (e.g., 3002). In some examples, the display (e.g., 3002) is a touch-sensitive display. In some examples, the display (e.g., 3002) is not a touch-sensitive display.
Some operations in method 3100 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted. As described below, the method 3100 provides an intuitive way to determine whether biometric re-authentication is required or whether cached authentication is available. The method reduces the cognitive burden on the user when authenticating on the electronic device, thereby creating a more efficient human-machine interface. For battery-driven computing devices, enabling users to authenticate faster and more efficiently conserves power and increases the interval between battery charges.
An electronic device (e.g., 100, 300, 500, 1700, 3000) receives (3102) a first request (e.g., 3008) to perform a first operation requiring authentication (e.g., automatically populating a password, unlocking the device, making a payment).
In response (3104) to receiving a first request (e.g., 3008) to perform a first operation, the electronic device (e.g., 100, 300, 500, 1700, 3000) proceeds to blocks 3106 through 3110.
The electronic device (e.g., 100, 300, 500, 1700, 3000) uses (3106) the one or more biometric sensors (e.g., 3003) to determine whether a first biometric authentication criterion is satisfied. The first biometric authentication criteria includes a requirement that a respective type of biometric feature (e.g., face or fingerprint) authorized to perform the first operation be detected by a biometric sensor (e.g., 3003).
In accordance with a determination that a first biometric authentication criterion is satisfied (e.g., at least a portion of the biometric feature determined based on data obtained from the one or more biometric sensors (e.g., 3003) corresponding to the biometric feature satisfies the biometric authentication criterion (e.g., the detected face is consistent with an authorized face)), the electronic device (e.g., 100, 300, 500, 1700, 3000) performs (3108) a first operation. Performing the first operation based on the request and the successful authentication provides the user with the ability to perform the first operation without further input (beyond the request). Performing the operation when a set of conditions has been met without further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In accordance with a determination that the biometric authentication criteria are not satisfied, the electronic device (e.g., 100, 300, 500, 1700, 3000) foregoes (3110) performing the first operation. Abandoning (or performing) the respective operation based on not satisfying the biometric authentication criteria provides security and may prevent unauthorized users from initiating sensitive operations. Providing improved security enhances the operability of the device and makes the user-device interface more efficient (e.g., by restricting unauthorized access), which in turn reduces power usage and extends the battery life of the device by limiting the performance of restricted operations.
After performing the first operation, the electronic device (e.g., 100, 300, 500, 1700, 3000) receives (3112) a second request (e.g., 3034) to perform a second operation (e.g., the same operation as or a different operation than the first operation) that requires authentication (e.g., automatically populating a password, unlocking the device, making a payment).
In response (3114) to receiving the second request (e.g., 3034), the electronic device (e.g., 100, 300, 500, 1700, 3000) proceeds to blocks 3116-3118.
In accordance with a determination that the re-authentication criteria have been met (e.g., the cached authentication is not allowed for the second operation or the cached authentication is not available), the electronic device (e.g., 100, 300, 500, 1700, 3000) uses (3116) the one or more biometric sensors (e.g., 3003) to determine whether the second biometric authentication criteria are met. The second biometric authentication criteria includes a requirement that a respective type of biometric feature (e.g., face or fingerprint) authorized to perform the second operation be detected by the biometric sensor (e.g., 3003). In some examples, the first biometric authentication criteria and the second biometric authentication criteria are the same. In some examples, the first biometric authentication criteria and the second biometric authentication criteria are different. Performing biometric authentication when cached authentication is not available provides security and may prevent unauthorized users from initiating sensitive operations. Providing improved security enhances the operability of the device and makes the user-device interface more efficient (e.g., by restricting unauthorized access), which in turn reduces power usage and extends the battery life of the device by limiting the performance of restricted operations.
In accordance with a determination that the re-authentication criteria have not been met (e.g., cached authentication is available), the electronic device (e.g., 100, 300, 500, 1700, 3000) performs (3118) a second operation without performing the biometric authentication and forgoes using the one or more biometric sensors (e.g., 3003) to determine whether the second biometric authentication criteria are met. Performing the second operation upon request without requiring re-authentication provides the user with the ability to perform the operation without requiring further input (beyond the request). Performing the operation when a set of conditions has been met without requiring further user input or re-authentication enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
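Blocks 3102-3118 reduce to three branches; a compact sketch, with the operation and the biometric check passed in as callables (the function names are hypothetical):

```python
def handle_auth_request(perform_operation, biometric_check, cached_auth_available):
    """Handle a request to perform an operation requiring authentication
    (a sketch of blocks 3102-3118 of method 3100)."""
    if cached_auth_available:
        return perform_operation()   # block 3118: reuse cached authentication
    if biometric_check():            # blocks 3106/3116: check the biometric criteria
        return perform_operation()   # block 3108: criteria met, perform the operation
    return None                      # block 3110: forgo the operation
```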
According to some examples, the first operation and the second operation occur while the electronic device (e.g., 100, 300, 500, 1700, 3000) is in an unlocked state. In some examples, determining whether the second biometric authentication criteria is satisfied using the one or more biometric sensors (e.g., 3003) occurs while the electronic device (e.g., 100, 300, 500, 1700, 3000) is in the unlocked state. Performing biometric authentication while the device is in the unlocked state enables the device to provide feedback by displaying an indication of the progress of biometric authentication. Providing the user with improved visual feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide appropriate input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
According to some examples, in response to receiving the second request (e.g., 3034) and in accordance with a determination that the second biometric authentication criteria is not satisfied, the electronic device (e.g., 100, 300, 500, 1700, 3000) foregoes performing the second operation.
According to some examples, in response to receiving the second request (e.g., 3034), performing the second operation occurs regardless of whether the biometric sensor (e.g., 3003) detects a respective type of biometric characteristic authorized to perform the second operation.
According to some examples, the first operation is to log in (e.g., 3028) a first web domain, and the second operation is to log in (e.g., 3044) a second web domain corresponding to the first web domain. In some examples, the second web domain is the same as the first web domain. In some examples, the second web domain is a sub-domain of the first web domain.
According to some examples, the re-authentication criteria include a requirement that the device (e.g., 100, 300, 500, 1700, 3000) has been in a locked state (e.g., cached authentication is not available) between when the first operation is performed and when the second request (e.g., 3034) is received (e.g., when the device has remained in an unlocked state between when the first operation is performed and when the second request is received, the re-authentication criteria are not satisfied and biometric authentication need not be performed again).
According to some examples, the first operation is performed in an application, and the re-authentication criteria include a requirement that the application has been closed (e.g., terminated) (e.g., cached authentication is not available) between when the first operation was performed and when the second request (e.g., 3034) was received (e.g., when the application has remained open between when the first operation was performed and when the second request was received, the re-authentication criteria are not satisfied and biometric authentication need not be performed again).
According to some examples, the re-authentication criteria include a requirement that the application has been in an inactive state (e.g., a suspended state, a dormant state, a background state, or an inactive state) for more than a threshold amount of time (e.g., 2 minutes 40 seconds) (e.g., cached authentication is not available) between when the first operation is performed and when the second request (e.g., 3034) is received (e.g., when the application has remained in an active state between when the first operation is performed and when the second request is received, the re-authentication criteria are not satisfied and biometric authentication need not be performed again).
According to some examples, determining whether the second biometric authentication criteria is satisfied using the one or more biometric sensors (e.g., 3003) occurs automatically (e.g., without requiring intermediate input from the user) in response to receiving a second request (e.g., 3034) to perform a second operation requiring authentication.
According to some examples, the one or more biometric sensors (e.g., 3003) are non-contact biometric sensors (e.g., infrared cameras, visible light cameras, or a combination thereof) configured to perform biometric authentication without physical contact from a user (e.g., the one or more biometric sensors (e.g., 3003) may perform biometric authentication without physical input (e.g., touch or button press) from a user).
According to some examples, in using the one or more biometric sensors (e.g., 3003) to determine whether the first or second biometric authentication criteria are met, the electronic device (e.g., 100, 300, 500, 1700, 3000) displays an indication (e.g., 3076) on the display (e.g., 3002) that biometric authentication is being performed (e.g., a small indicator is displayed on the top, bottom, side, or corner). Displaying a small indicator away from the center of the display provides the user with an indication of the progress of the biometric authentication without obstructing or cluttering the display and shifting the user's focus, thereby providing improved visual feedback during authentication. Providing the user with improved visual feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide appropriate input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the indicator is not displayed during biometric authentication. In some examples, the electronic device (e.g., 100, 300, 500, 1700, 3000) forgoes displaying an indication that biometric authentication is being performed while using the one or more biometric sensors (e.g., 3003) to determine whether biometric authentication criteria are satisfied.
Note that the details of the process described above with respect to method 3100 (e.g., fig. 31A-31B) may also be applied in a similar manner to the methods described below and above. For example, method 3100 optionally includes one or more of the features of the various methods described above with reference to methods 800, 1000, 1200, 1400, 1600, 2000, 2200, 2500, 2700, 2900, 3300, and 3500. For example, the registered biometric data described in method 1200 may be used to perform biometric authentication as described with respect to method 3100. As another example, the visibility criteria of method 3300 may be used with method 3100 to control when biometric authentication should be performed (or re-performed). For the sake of brevity, these details are not repeated in the following.
Fig. 32A-32W illustrate exemplary user interfaces for automatically populating a filable field based on visibility criteria, according to some examples. As described in more detail below, the exemplary examples of user interfaces shown in fig. 32A-32W are used to illustrate the processes described below, including the process in fig. 33.
Fig. 32A shows an electronic device 3200 (e.g., the portable multifunction device 100, the device 300, or the device 500). In the illustrative example shown in fig. 32A-32W, the electronic device 3200 is a smartphone. In other examples, the electronic device 3200 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 3200 includes a display 3202, one or more input devices (e.g., a touch screen and a microphone of the display 3202), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In fig. 32A, the electronic device includes a biometric sensor 3203. In some examples, the biometric sensor is one or more biometric sensors that may include a camera (such as an infrared camera, a thermal imaging camera, or a combination thereof). In some examples, biometric sensor 3203 is biometric sensor 703. In some examples, the one or more biometric sensors include one or more fingerprint sensors (e.g., fingerprint sensors integrated into buttons). In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
In fig. 32A, the electronic device 3200 displays a user interface 3204 of an application on the display 3202. The application corresponds to a website (e.g., airline.com). In figs. 32B-32D, the electronic device 3200 does not initiate biometric authentication because the login user interface has not met certain visibility criteria. For example, the visibility criteria can include whether a threshold amount of one or more fillable fields (e.g., fields corresponding to credential information) is displayed within a visible region of the user interface.
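A sketch of such a visibility check, assuming one-dimensional rectangular fields and a hypothetical 50% visibility threshold (the text does not fix a specific threshold):

```python
def visible_fraction(field_top, field_bottom, view_top, view_bottom):
    """Fraction of a field's height that lies inside the visible region."""
    overlap = min(field_bottom, view_bottom) - max(field_top, view_top)
    return max(0.0, overlap) / (field_bottom - field_top)

def visibility_criteria_met(fields, view_top, view_bottom, threshold=0.5):
    # Assumed rule: every fillable field must be at least `threshold`
    # visible before biometric authentication is initiated.
    return all(visible_fraction(top, bottom, view_top, view_bottom) >= threshold
               for top, bottom in fields)
```

Under this rule, scrolling in figs. 32B-32D would not trigger authentication until the username and password fields actually enter the visible region.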
In fig. 32B-32C, the electronic device 3200 detects a scroll gesture 3206 in an upward motion. In response to detecting the scroll gesture 3206, the electronic device 3200 causes the user interface 3204 to scroll downward. In fig. 32D, subsequent to the scroll gesture 3206, the electronic device 3200 displays a scrolled user interface 3208. Because it is determined that the visibility criteria have not been met, the electronic device 3200 has still not initiated biometric authentication.
In fig. 32E, the electronic device 3200 detects a tap gesture 3210 on a hidden menu affordance 3212. In response to detecting the tap gesture 3210, the electronic device 3200 displays a hidden menu 3214 that includes one or more fillable fields (e.g., a username field 3216 and a password field 3218), as shown in fig. 32F. The electronic device 3200 displays the hidden menu 3214 overlaid on the scrolled user interface 3208 such that the obscured portion of the scrolled user interface 3208 is no longer displayed. It is then determined whether the fillable fields meet certain visibility criteria.
If it is determined that the fillable fields satisfy certain visibility criteria, the electronic device 3200 receives a request to automatically populate the fillable fields in the hidden menu 3214 with credential information (e.g., a username and password that enable the user to successfully log in to the account). The request to autofill requires biometric authentication in order to proceed with autofilling the fillable fields. In some examples, the request also includes a request to automatically log the user in, such that the user need not tap the submit affordance (e.g., submit affordance 3232 in fig. 32L) in order to submit the credentials and log in.
In fig. 32G, upon determining that the one or more fillable fields do meet certain visibility criteria, the electronic device 3200 uses the biometric sensor 3203 to determine whether certain biometric authentication criteria have been met. The electronic device 3200 captures and processes (e.g., analyzes) biometric data from the biometric sensor 3203 to determine, based on the biometric data, whether the biometric feature (or a portion thereof) satisfies the biometric authentication criteria (e.g., determines whether the biometric data matches a biometric template within a threshold). Biometric authentication occurs automatically upon determining that the fillable fields satisfy the visibility criteria. The biometric sensor 3203 is contactless, such that the sensor is configured to perform biometric authentication without physical contact from the user. Thus, the electronic device 3200 initiates biometric authentication using the biometric sensor 3203 without receiving an explicit request from the user to initiate biometric authentication. In some examples, initiating biometric authentication includes detecting contact with one or more fingerprint sensors and determining whether the contact satisfies certain fingerprint authentication criteria (e.g., determining whether the fingerprint is consistent with an enrolled fingerprint, as discussed above with respect to the secure element 115, or whether the fingerprint matches stored information, as discussed above with respect to methods 1600, 1800, and 2200 and figs. 17O and 21). In some examples, determining whether a fingerprint is consistent with an enrolled fingerprint is performed according to one or more of the methods discussed in U.S. patent application publication No. 2015/0146945 (e.g., at paragraphs [0119] through [0121]). U.S. patent application publication No. 2015/0146945, and in particular its disclosure of methods for determining whether a fingerprint is consistent with an enrolled fingerprint, is hereby incorporated by reference.
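The idea of biometric data "matching a biometric template within a threshold" can be illustrated with a toy similarity check over feature vectors. Real systems use far more sophisticated models; the cosine-similarity measure, the 0.95 threshold, and the function names below are purely hypothetical:

```python
import math

def match_score(sample: list, template: list) -> float:
    """Toy similarity measure: cosine similarity between a captured
    feature vector and an enrolled template vector."""
    dot = sum(a * b for a, b in zip(sample, template))
    norm = (math.sqrt(sum(a * a for a in sample))
            * math.sqrt(sum(b * b for b in template)))
    return dot / norm if norm else 0.0

def biometric_criteria_met(sample: list, template: list,
                           threshold: float = 0.95) -> bool:
    """Criteria are met when the sample matches the enrolled template
    within the (hypothetical) similarity threshold."""
    return match_score(sample, template) >= threshold
```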
Alternatively, it may be determined that the one or more fillable fields do not meet certain visibility criteria. If the fillable fields do not satisfy the visibility criteria, the electronic device 3200 forgoes initiating biometric authentication.
Performing biometric authentication includes displaying a biometric authentication interface 3220 with a biometric authentication glyph 3222. The biometric authentication glyph 3222 is a simulated representation of a biometric feature (e.g., a face). As shown in fig. 32G, the biometric authentication interface 3220 is overlaid over at least a portion of the hidden menu 3214. The biometric authentication interface 3220 is optionally an operating system level interface (e.g., an interface generated by the device's operating system), and the hidden menu 3214 is an application level interface (e.g., a user interface generated by a third-party application separate from the device's operating system). In some examples, the displayed biometric authentication interface is approximately centered along a horizontal and/or vertical axis, such as in figs. 32G-32J. In some examples, the electronic device 3200 displays the biometric authentication interface at the top, bottom, sides, or corners of the display 3202. For example, the electronic device 3200 displays the biometric authentication interface near the top of the display 3202, as in fig. 30AL. In some examples, the electronic device 3200 does not display a biometric authentication interface while biometric authentication is being performed.
In fig. 32H, the electronic device 3200 displays a portion of a biometric authentication animation that includes a biometric authentication glyph 3224, which is used as part of an animation during which the biometric sensor 3203 obtains biometric data. Once the electronic device 3200 has obtained biometric data (e.g., has obtained sufficient biometric data), the electronic device 3200 transitions to displaying a biometric authentication glyph 3226, as shown in fig. 32I. The electronic device 3200 displays the biometric authentication glyph 3226 to indicate that the biometric data is being processed. In some examples, the biometric authentication glyph 3226 includes multiple rings that, for example, rotate spherically when displayed.

In fig. 32J, it is determined that the biometric authentication is successful. Accordingly, the electronic device 3200 displays a biometric authentication glyph 3228 indicating that the biometric authentication succeeded.
In fig. 32K, because the biometric authentication was successful, the electronic device 3200 automatically populates the one or more fillable fields (e.g., the username field 3216 and the password field 3218) with credential information (e.g., login information, such as a username and password that enable the user to successfully log in to the account). In some examples, the electronic device 3200 automatically populates the fillable fields with credit card information (e.g., information associated with a payment account).
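The fill-on-success / forgo-on-failure behavior described above and in the next paragraph can be summarized as a small pure function. The function name and the dictionary representation of fields and stored credentials are illustrative assumptions:

```python
def autofill(fields: dict, credentials: dict, authenticated: bool) -> dict:
    """Populate fillable fields with stored credentials only after a
    successful biometric authentication; otherwise forgo autofill and
    leave the fields untouched (the caller may show a failure interface)."""
    if not authenticated:
        return dict(fields)  # unchanged copy: autofill is forgone
    return {name: credentials.get(name, value) for name, value in fields.items()}
```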
Alternatively, it may be determined that the biometric authentication criteria have not been met. If the biometric authentication fails, the electronic device 3200 forgoes automatically populating the one or more fillable fields with credential information (e.g., login information or credit card information). Forgoing automatic population of the one or more fillable fields optionally includes displaying a failure interface, such as failure interface 2854 in fig. 28X or failure interface 2856 in fig. 28Y.
In fig. 32L, the electronic device detects a tap gesture 3230 on a submit affordance 3232. In response to detecting the tap gesture 3230, the electronic device 3200 submits credential information in a username field 3216 and a password field 3218 for user authentication. Upon successful authentication, the electronic device 3200 provides access to restricted content (e.g., content that is only visible once the user has logged into the user interface) in the member user interface 3234 of fig. 32M.
Figs. 32N-32W illustrate various scenarios in which certain visibility criteria are not initially met and are then met after user input.
In fig. 32N, the electronic device 3200 displays a user interface 3236 of an application on the display 3202. The application is a mobile browser application, and the interface corresponds to a website. The user interface 3236 includes one or more fillable fields (e.g., a username field 3238 and a password field 3240). The fillable fields are displayed within the viewable area of the user interface 3236 at a size below a threshold size (e.g., a threshold size that must be met or exceeded in order to meet certain visibility criteria).
In fig. 32O, the electronic device 3200 detects a zoom gesture 3242 while the user interface 3236 is displayed. In response to detecting the zoom gesture 3242, the electronic device 3200 displays an enlarged user interface 3244, as shown in fig. 32P. The enlarged user interface 3244 includes an enlarged username field 3246 and an enlarged password field 3248. It is determined that the zoom gesture 3242 does not cause the fillable fields to meet certain visibility criteria. For example, the fillable fields are still displayed within the viewable area of the enlarged user interface 3244 at a size below the threshold size. Upon determining that the visibility criteria have not yet been met, the electronic device 3200 forgoes initiating biometric authentication.
In fig. 32Q, the electronic device 3200 detects a zoom gesture 3250 while displaying the enlarged user interface 3244. In response to detecting the zoom gesture 3250, the electronic device 3200 displays a further enlarged user interface 3252, as shown in fig. 32R. The enlarged user interface 3252 includes an enlarged username field 3254 and an enlarged password field 3256. It is determined that the zoom gesture 3250 causes the fillable fields to satisfy the visibility criteria. For example, the fillable fields are now displayed within the viewable area of the enlarged user interface 3252 at a size at or above the threshold size. Upon determining that the visibility criteria are satisfied, the electronic device 3200 initiates biometric authentication and displays the biometric authentication interface 3220 with the biometric authentication glyph 3222, as described with respect to fig. 32G.
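The size-based visibility criterion in figs. 32O-32R amounts to comparing a field's rendered size (its base size scaled by the zoom factor) against a threshold. The 30-pixel threshold, the specific base height, and the function name below are arbitrary placeholders for illustration:

```python
def rendered_size_meets_threshold(base_height_px: float, zoom_scale: float,
                                  threshold_px: float = 30.0) -> bool:
    """A field satisfies the size criterion once zooming brings its
    rendered height up to the (placeholder) threshold."""
    return base_height_px * zoom_scale >= threshold_px
```

With a 12-pixel base height, a 1.5x zoom still fails the check (as in fig. 32P), while a 3x zoom passes it (as in fig. 32R).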
In fig. 32S, the electronic device 3200 displays a user interface 3258 of an application on the display 3202. The application is a mobile browser application, and the interface corresponds to a website (a news feed). User interface 3258 is a user interface area corresponding to a portion of an electronic document (e.g., an HTML document). The electronic document includes one or more fillable fields (e.g., username field 3268 and password field 3270 in fig. 32W) outside the visible area of the user interface 3258.
In fig. 32T, while the user interface 3258 is displayed, the electronic device 3200 detects a scroll gesture 3260 in an upward motion. In response to detecting the scroll gesture 3260, the electronic device 3200 causes the user interface 3258 to scroll downward. In fig. 32U, subsequent to the scroll gesture 3260, the electronic device 3200 displays a scrolled user interface 3262 that includes a portion of the one or more fillable fields (e.g., the username field 3268). It is determined that the scroll gesture 3260 does not cause the one or more fillable fields to satisfy certain visibility criteria. For example, the visibility criteria include whether a threshold amount of the one or more fillable fields is displayed within a visible area of the scrolled user interface 3262. Upon determining that the visibility criteria have not yet been met, the electronic device 3200 forgoes initiating biometric authentication.
In fig. 32V, while the scrolled user interface 3262 is displayed, the electronic device 3200 detects a scroll gesture 3264. In response to detecting the scroll gesture 3264, the electronic device 3200 causes the scrolled user interface 3262 to scroll further down. In fig. 32W, subsequent to the scroll gesture 3264, the electronic device 3200 displays a scrolled user interface 3266 that includes the one or more fillable fields (e.g., a username field 3268 and a password field 3270). It is determined that the scroll gesture 3264 causes the fillable fields to satisfy the visibility criteria. For example, a threshold amount of the one or more fillable fields is now displayed within the viewable area of the scrolled user interface 3266. Upon determining that the visibility criteria are satisfied, the electronic device 3200 initiates biometric authentication and displays the biometric authentication interface 3220 with the biometric authentication glyph 3222, as described with respect to fig. 32G.
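The scroll scenario in figs. 32T-32W can be modeled as checking what fraction of a field lies inside the viewport after applying a scroll offset. All numbers, names, and the full-visibility default threshold below are illustrative assumptions:

```python
def fraction_visible_after_scroll(field_top: float, field_height: float,
                                  scroll_y: float, viewport_height: float) -> float:
    """Fraction of a field visible in the viewport after scrolling by scroll_y."""
    top = field_top - scroll_y  # field position in viewport coordinates
    bottom = top + field_height
    visible = max(0.0, min(bottom, viewport_height) - max(top, 0.0))
    return visible / field_height if field_height else 0.0

def meets_scroll_visibility(field_top: float, field_height: float,
                            scroll_y: float, viewport_height: float,
                            threshold: float = 1.0) -> bool:
    """True once at least the threshold amount of the field is on screen."""
    return fraction_visible_after_scroll(
        field_top, field_height, scroll_y, viewport_height) >= threshold
```

A field only partially revealed by the first scroll fails the check (fig. 32U), while further scrolling brings it fully into view and passes it (fig. 32W).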
Fig. 33 is a flow diagram illustrating a method for using an electronic device to determine when to perform an authentication operation, according to some examples. Method 3300 is performed at a device (e.g., 100, 300, 500, 1700, 3200) having a display (e.g., 3202). In some examples, the display (e.g., 3202) is a touch-sensitive display. In some examples, the display (e.g., 3202) is not a touch-sensitive display.
Some operations in method 3300 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted. As described below, method 3300 provides an intuitive way to determine when to perform an authentication operation. The method reduces the cognitive burden on a user when performing an authentication operation, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to authenticate faster and more efficiently conserves power and increases the time between battery charges.
An electronic device (e.g., 100, 300, 500, 1700, 3200) receives (3302) a request (e.g., 3210) to display a first portion (e.g., 3214) of respective content (e.g., 3208) (e.g., a request to load a web page, scroll the web page, or zoom the web page).
In response (3304) to the request (e.g., 3210) to display the first portion (e.g., 3214) of the respective content (e.g., 3208), the electronic device (e.g., 100, 300, 500, 1700, 3200) proceeds to blocks 3306-3310.
The electronic device (e.g., 100, 300, 500, 1700, 3200) displays (3306) at least a first portion (e.g., 3214) of the respective content (e.g., 3208) (e.g., an area of an electronic document (e.g., an HTML document) having user-interactive elements) on the display (e.g., 3202). The respective content (e.g., 3208) includes an element (e.g., 3216, 3218) associated with an authentication operation (e.g., one or more fillable fields, such as a credit card input field or a login user interface element optionally including username and password fields for logging in to a service).
In accordance with a determination that the element (e.g., 3216, 3218) associated with the authentication operation satisfies the visibility criteria (e.g., at least a portion of the one or more fillable fields is displayed, the one or more fillable fields are displayed entirely, and/or the one or more fillable fields are greater than a threshold size), the electronic device (e.g., 100, 300, 500, 1700, 3200) initiates (3308) biometric authentication (e.g., as described with reference to figs. 17G-17H). In some examples, the region corresponds to a portion of an electronic document (e.g., an HTML document), and the one or more fillable fields are one or more elements of the electronic document having a property that causes the one or more elements to be rendered in a visible state (e.g., an HTML element having a "style. Initiating biometric authentication when the visibility criteria are satisfied provides the user with the ability to perform biometric authentication without requiring further input (beyond that which causes the visibility criteria to be satisfied).
Performing biometric authentication when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In accordance with a determination that the element (e.g., 3216, 3218) associated with the authentication operation does not satisfy the visibility criteria (e.g., the element associated with the authentication operation is entirely outside of the visible area of the content, at least a threshold amount of the element associated with the authentication operation is outside of the visible area of the content, the element associated with the authentication operation is displayed within the visible area of the content at a size below a threshold size, and/or the element associated with the operation is contained in a portion of the content that is hidden from view (such as contained in a folded menu area or other hidden element)), the electronic device (e.g., 100, 300, 500, 1700, 3200) forgoes (3310) initiating the biometric authentication. Forgoing initiation of biometric authentication when the visibility criteria are not satisfied prevents biometric authentication from occurring when the user does not intend for the device to initiate it. Avoiding unintended biometric authentication enhances the operability of the device and makes the user-device interface more efficient (e.g., by restricting unauthorized access), which in turn reduces power usage and extends the battery life of the device by limiting the performance of restricted operations.
According to some examples, a first portion (e.g., 3214) of the respective content (e.g., 3208) is displayed without displaying a second portion of the respective content on the display (e.g., 3202). In some examples, the second portion is displayed before the first portion is displayed.
According to some examples, in accordance with a determination that an element (e.g., 3216, 3218) associated with the authentication operation satisfies visibility criteria, biometric authentication occurs automatically (e.g., no intermediate input is required to initiate biometric authentication).
According to some examples, while displaying a first portion (e.g., 3236, 3258) of respective content, an electronic device (e.g., 100, 300, 500, 1700, 3200) detects an input (e.g., 3242, 3250, 3260, 3264) (e.g., zoom, scroll, menu display). In response to detecting the input (e.g., 3250, 3264) and in accordance with a determination that the input causes an element (e.g., 3246, 3248, 3268, 3270) associated with the authentication operation to satisfy the visibility criteria, the electronic device (e.g., 100, 300, 500, 1700, 3200) initiates biometric authentication. Initiating biometric authentication in response to an input and when a visibility criterion is satisfied provides the user with the ability to perform biometric authentication without requiring further input (beyond the input that causes the visibility criterion to be satisfied). Performing biometric authentication when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In response to detecting the input (e.g., 3242, 3260) and in accordance with a determination that the input does not cause an element (e.g., 3238, 3240, 3268, 3270) associated with the authentication operation to satisfy the visibility criteria, the electronic device (e.g., 100, 300, 500, 1700, 3200) forgoes initiating the biometric authentication. 
In some examples, the input (e.g., 3210, 3242, 3250, 3260, 3264) (e.g., selection of an affordance or a resize request) affects a visibility characteristic of the element (e.g., 3216, 3218, 3246, 3248, 3268, 3270) associated with the authentication operation such that the element transitions from not meeting the visibility criteria to meeting the visibility criteria. Forgoing initiation of biometric authentication when the visibility criteria are not satisfied prevents biometric authentication from occurring when the user does not intend for the device to initiate it. Avoiding unintended biometric authentication enhances the operability of the device and makes the user-device interface more efficient (e.g., by restricting unauthorized access), which in turn reduces power usage and extends the battery life of the device by limiting the performance of restricted operations.
According to some examples, the input (e.g., 3242, 3250) is a request to perform a scaling operation, and the visibility criteria includes a requirement that elements (3238, 3240, 3246, 3248, 3254, 3256) associated with the authentication operation have a size greater than a threshold size.
According to some examples, the input (3260, 3264) is a request to perform a scrolling operation, and the visibility criteria includes a requirement that at least a predetermined amount of an element (3268, 3270) associated with the authentication operation is displayed on a display (e.g., 3202).
According to some examples, the input (e.g., 3210) is a request to perform a hidden interface region display operation (e.g., a request to display a hidden menu or other hidden interface region), and the visibility criteria includes a requirement that an element (e.g., 3216, 3218) associated with the authentication operation not be specified for display in the hidden interface region.
According to some examples, the electronic device (e.g., 100, 300, 500, 1700, 3200) further includes one or more biometric sensors (e.g., 3203), and initiating biometric authentication includes initiating biometric authentication using the one or more biometric sensors (e.g., 3203).
According to some examples, the one or more biometric sensors (e.g., 3203) include one or more contactless biometric sensors (e.g., infrared cameras, visible light cameras, or a combination thereof) configured to perform biometric authentication without physical contact from a user (e.g., the one or more biometric sensors (e.g., 3203) may perform biometric authentication without physical input (e.g., touch or button press) from a user). Initiating biometric authentication occurs without receiving an explicit request to initiate biometric authentication.
According to some examples, the one or more biometric sensors (e.g., 3203) include one or more facial recognition sensors. Initiating biometric authentication includes using the one or more facial recognition sensors to determine whether facial authentication criteria have been met (e.g., as described with respect to fig. 23D-23F).
According to some examples, the one or more biometric sensors (e.g., 3203) include one or more fingerprint sensors. Initiating biometric authentication includes: detecting contact with the one or more fingerprint sensors, and determining whether the contact satisfies fingerprint authentication criteria (e.g., whether the fingerprint is consistent with an enrolled fingerprint or an authorized fingerprint).
According to some examples, initiating biometric authentication includes displaying a progress indicator (e.g., 3222, 3224, 3226, 3228) on a display (e.g., 3202) indicating a state of the biometric authentication process. In some examples, the progress indicator corresponds to a simulated progress indicator (e.g., a progress indicator having some or all of the features of a progress indicator, such as a plurality of progress elements distributed around a representation of the user's biometric feature, surrounding a simulated display of the biometric feature). In some examples, a small progress indicator is displayed at the top, bottom, side, or corner of the display. Displaying a small indicator away from the center of the display provides the user with an indication of the progress of the biometric authentication without obstructing or cluttering the display or shifting the user's focus, thereby providing improved visual feedback during authentication. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the progress indicator is not displayed during biometric authentication; that is, the electronic device (e.g., 100, 300, 500, 1700, 3200) forgoes displaying a progress indicator indicating the status of the biometric authentication process.
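The progress-indicator states shown in figs. 32G-32J (ready, capturing, processing, success) can be sketched as a small state machine. The state names and the linear transition order are assumptions for illustration, not part of the disclosed method:

```python
from enum import Enum, auto

class GlyphState(Enum):
    READY = auto()       # outline shown before capture (cf. 3222)
    CAPTURING = auto()   # animated while the sensor obtains data (cf. 3224)
    PROCESSING = auto()  # e.g., rotating rings while data is analyzed (cf. 3226)
    SUCCESS = auto()     # success indication (cf. 3228)

_NEXT = {
    GlyphState.READY: GlyphState.CAPTURING,
    GlyphState.CAPTURING: GlyphState.PROCESSING,
    GlyphState.PROCESSING: GlyphState.SUCCESS,
}

def advance(state: GlyphState) -> GlyphState:
    """Advance the indicator one step; SUCCESS is terminal."""
    return _NEXT.get(state, state)
```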
According to some examples, the element associated with the authentication operation is a fillable field (e.g., 3216, 3218, 3254, 3256, 3268, 3270) (e.g., a username field, a password field, a credential field, or a payment information entry field). In response to initiating biometric authentication and in accordance with a determination that the biometric authentication criteria have been satisfied, the electronic device (e.g., 100, 300, 500, 1700, 3200) automatically populates the fillable field (e.g., 3216, 3218, 3254, 3256, 3268, 3270) with credential information (e.g., populates the field with data stored by or accessible to the electronic device (e.g., 100, 300, 500, 1700, 3200), such as a username, password, credit card information, or other sensitive information). In response to initiating the biometric authentication and in accordance with a determination that the biometric authentication criteria have not been met, the electronic device (e.g., 100, 300, 500, 1700, 3200) forgoes automatically populating the fillable field (e.g., 3216, 3218, 3254, 3256, 3268, 3270) with the credential information.
According to some examples, in response to initiating biometric authentication and in accordance with a determination that the biometric authentication criteria have been met, the electronic device (e.g., 100, 300, 500, 1700, 3200) provides access to restricted content (e.g., logging in to a web page (e.g., 3234), displaying a password list associated with the user, or displaying credit card information). In response to initiating the biometric authentication and in accordance with a determination that the biometric authentication criteria have not been met, the electronic device (e.g., 100, 300, 500, 1700, 3200) forgoes providing access to the restricted content.
According to some examples, the credential information comprises login information (e.g., information required to log in to an application, account, or website).

According to some examples, the credential information comprises information associated with payment account information (e.g., credit card, bank account, or payment service information).
Note that the details of the process described above with respect to method 3300 (e.g., fig. 33) may also be applied in a similar manner to the methods described below and above. For example, method 3300 optionally includes one or more of the features of the various methods described above with reference to methods 800, 1000, 1200, 1400, 1600, 2000, 2200, 2500, 2700, 2900, 3100, and 3500. For example, the registered biometric data described in method 1200 may be used to perform biometric authentication as described with respect to method 3300. As another example, the re-authentication criteria of method 3100 can be used with method 3300 to control when biometric authentication should be performed (or re-performed). For the sake of brevity, these details are not repeated in the following.
Figs. 34A-34N illustrate example user interfaces for automatic login using biometric authentication, according to some examples. As described in more detail below, the user interfaces shown in figs. 34A-34N are used to illustrate the processes described below, including the process in fig. 35.
Fig. 34A shows an electronic device 3400 (e.g., the portable multifunction device 100, the device 300, the device 500, or the device 1700). In the illustrative example shown in figs. 34A-34N, the electronic device 3400 is a smartphone. In other examples, the electronic device 3400 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 3400 includes a display 3402, one or more input devices (e.g., a touch screen of the display 3402 and a microphone), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In fig. 34A, the electronic device includes a biometric sensor 3403. In some examples, the biometric sensor is one or more biometric sensors that may include a camera (such as an infrared camera, a thermal imaging camera, or a combination thereof). In some examples, the biometric sensor is the depth camera 175 of device 100, or a depth camera having one or more features and/or functions of a depth camera as described with respect to certain examples of device 700 or biometric sensor 703. In some examples, biometric sensor 3403 is a depth camera used with a visible light camera to determine depth maps of different portions of an object captured by the visible light camera, as described above with respect to biometric sensor 703.
As seen in fig. 34A, the electronic device 3400 also includes a fingerprint sensor 3414 (e.g., a biometric sensor) integrated into the button. In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. The light emitting device is optionally used to illuminate a biometric feature (e.g., a face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
In some examples, electronic device 3400 may share one or more features, elements, and/or components with devices 100, 300, 500, 700, 900, 1100, 1300, 1500, 1700, 1900, 2100, 2300, 2400, 2600, 2800, 3000, and 3200, and each of these devices may share one or more features, elements, and/or components of another of these devices (e.g., device 700 may include a component of device 3200, or vice versa). For example, the biometric sensor 3403 may be the biometric sensor 903, or the biometric sensor 1103 may be the biometric sensor 1303. As another example, button-integrated fingerprint sensor 3414 may be fingerprint sensor 1764. As another example, display 3402 may be display 1302, or display 1502 may be display 2102.
Prior to displaying the login user interface 3404 in fig. 34A, the electronic device 3400 detects a request to load the login user interface 3404. In response to detecting the request, it is determined whether biometric authentication using the fingerprint sensor 3414 is available. Upon determining that biometric authentication is available, the electronic device 3400 displays the login user interface 3404 with a prompt 3406 ("scan finger to login") in the submit affordance 3420 (e.g., an affordance that, upon selection, submits the credential information in one or more of the fillable fields (e.g., the username field or the password field)). Prompt 3406 indicates to the user that placing their finger on the fingerprint sensor 3414 will cause credential information (e.g., a username and password that enable the user to successfully log in to an account) to be submitted via the username field 3408 and the password field 3410. In addition, the username field 3408 is pre-populated with a default username (e.g., jj_appleseed@email.com), as shown in fig. 34A. In some examples, the username field is not pre-populated with a username.
In fig. 34B, the electronic device 3400 detects the finger 3412 using the fingerprint sensor 3414 while the login user interface 3404 is displayed. In response to detecting finger 3412, the finger is determined to satisfy certain biometric authentication criteria (e.g., the fingerprint is consistent with an enrolled fingerprint). Upon successful authentication, in fig. 34C, electronic device 3400 automatically enters credential information in username field 3408 and/or password field 3410 and submits the credential information for user authentication (e.g., submits the information without further input from the user). Upon successful user authentication, the electronic device 3400 provides access to restricted content (e.g., content that is only visible once the user has logged into the user interface) in the account user interface 3416 of fig. 34D.
Alternatively, if it is determined that the finger does not meet certain biometric authentication criteria, the electronic device 3400 forgoes entering and submitting credential information and displays a failure interface 3418, as shown in fig. 34E. When the user has reached the maximum number of failed biometric authentication attempts, a failure interface 3418 may be displayed. The fingerprint sensor 3414 may not be used for biometric authentication if a maximum number of failed biometric authentication attempts has been reached.
Fig. 34F shows a login user interface 3404-1 that is displayed when biometric authentication using the fingerprint sensor 3414 is not available (e.g., when such authentication is disabled via a user-selectable setting or when a maximum number of attempts has been exceeded). In response to detecting the request to display the login user interface, it is determined that biometric authentication using the fingerprint sensor 3414 is not available. Based on this determination, the electronic device 3400 displays a login user interface 3404-1 without a prompt 3406. The electronic device 3400 displays the submit affordance 3420-1 in its original, unmodified state with text 3422 ("login") displayed in the submit affordance 3420-1.
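The availability-dependent interface selection described in figs. 34A and 34F can be sketched as follows. This is an illustrative sketch only, not the patented implementation; all function names, dictionary keys, and strings are hypothetical.

```python
def build_login_ui(biometric_available, default_username=None):
    """Return a description of the login UI to display.

    If biometric authentication is available, the submit affordance
    carries the prompt (cf. 3406 in fig. 34A); otherwise the affordance
    is shown in its original, unmodified state (cf. 3420-1 in fig. 34F).
    """
    ui = {
        "username_field": default_username or "",  # optionally pre-populated
        "password_field": "",
        "prompt": None,
        "submit_affordance": None,
    }
    if biometric_available:
        ui["prompt"] = "Scan finger to login"
        ui["submit_affordance"] = ui["prompt"]
    else:
        ui["submit_affordance"] = "Login"
    return ui
```

A caller would run the availability check first (e.g., verifying the sensor is enabled and the retry limit has not been reached) and pass the result in as `biometric_available`.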
In some examples, the electronic device 3400 does not immediately display the prompt 3406 in response to a request to load the login user interface. Instead, the electronic device 3400 displays the prompt 3406 after receiving a selection of a fillable field for entering text. In fig. 34G, the electronic device 3400 initially displays a login user interface 3404-1 without a prompt 3406. The electronic device 3400 detects a tap gesture 3424 on the password field 3410. In response to detecting the tap gesture 3424, the electronic device 3400 displays a virtual keyboard 3426 (e.g., a keyboard for entering one or more characters) and a cursor 3428 in the password field 3410, as shown in fig. 34H. Further in response to the tap gesture 3424, the electronic device 3400 displays the login user interface 3404 with the prompt 3406 located in the submit affordance 3420.
In fig. 34I, the electronic device 3400 receives an input corresponding to an input of one or more characters (e.g., character 3430) via the virtual keyboard 3426. In response to receiving the input of the character 3430, the electronic device 3400 again displays the login user interface 3404-1 without the prompt 3406. The electronic device 3400 displays the submit affordance 3420-1 in its original, unmodified state with text 3422 displayed in the submit affordance 3420-1.
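The prompt's appearance and disappearance across figs. 34G to 34I amounts to a simple visibility condition: the prompt shows once a fillable field is selected, and hides as soon as the user types a character. A minimal sketch, assuming hypothetical parameter names:

```python
def prompt_visible(biometric_available, field_selected, chars_entered):
    """Decide whether the biometric prompt (cf. 3406) should be shown.

    Visible only when biometric authentication is available, a fillable
    field has been selected (cf. tap 3424), and the user has not yet
    entered any characters (cf. character 3430 hiding the prompt).
    """
    return biometric_available and field_selected and chars_entered == 0
```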
In fig. 34J, upon receiving input via the virtual keyboard 3426, the electronic device 3400 detects a tap gesture 3432 on the submit affordance 3420-1. In response to detecting the tap gesture 3432, the electronic device 3400 submits one or more characters in the username field 3408 and the password field 3410 for user authentication. Upon successful user authentication, the electronic device 3400 provides access to restricted content (e.g., content that is only visible once the user has logged into the user interface) in the account user interface 3416 of fig. 34K.
In some examples, successful user authentication via fingerprint authentication (as described with respect to fig. 34A-34D) results in a particular outcome (e.g., access to a restricted application, web page, or account). In some examples, successful user authentication by typing and submitting credential information (as described with respect to fig. 34F-34K) results in the same outcome (e.g., access to the same restricted application, web page, or account).
Fig. 34L illustrates that the prompt 3406 may be displayed in other locations on the login user interface, and that some elements discussed with respect to interface 3404 and interface 3404-1 may be omitted. In fig. 34L, in response to detecting the request to display the login user interface and if biometric authentication is available, the electronic device 3400 displays the login user interface 3404-2 with a prompt 3406 displayed in the password field 3410 and a submit affordance (e.g., 3420-1) not displayed.
In fig. 34M, the electronic device 3400 detects the finger 3412 using the fingerprint sensor 3414 while the login user interface 3404-2 is displayed. In response to detecting finger 3412, the finger is determined to satisfy certain biometric authentication criteria (e.g., the fingerprint is consistent with an enrolled fingerprint). Upon successful authentication, the electronic device 3400 automatically submits credential information for user authentication. Upon successful user authentication, the electronic device 3400 provides access to restricted content (e.g., content that is only visible once the user has logged into the user interface).
Alternatively, if the finger is determined not to satisfy certain biometric authentication criteria (e.g., the fingerprint is not consistent with the enrolled fingerprint), the electronic device 3400 forgoes submitting credential information. In addition, upon failure of biometric authentication, the electronic device 3400 displays a login user interface 3404-1 (as shown in fig. 34N) that includes the previously hidden submit affordance (e.g., 3420-1). Further, upon failure of biometric authentication, the electronic device 3400 prompts the user for manual input by displaying the cursor 3428 in a fillable field, such as the password field 3410.
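The failure branch above (reveal the previously hidden submit affordance, place the cursor in a fillable field to invite manual entry) can be sketched as below. This is illustrative only; the names are hypothetical and not taken from the patent.

```python
def handle_biometric_result(matched):
    """Return UI updates after a biometric attempt on interface 3404-2.

    On success, credentials are submitted automatically; on failure,
    the device falls back to manual entry (cf. figs. 34M and 34N).
    """
    if matched:
        return {"action": "submit_credentials"}
    return {
        "action": "await_manual_input",
        "submit_affordance_visible": True,   # reveal hidden 3420-1
        "cursor_in": "password_field",       # cf. cursor 3428 in 3410
    }
```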
Fig. 35 is a flow diagram illustrating a method for indicating availability of biometric authentication using an electronic device, according to some examples. Method 3500 is performed at a device (e.g., 100, 300, 500, 1700, 3400) having a display (e.g., 3402) and one or more biometric sensors (e.g., 3403, 3414) (e.g., a fingerprint sensor, a non-contact biometric sensor (e.g., a biometric sensor that does not require physical contact, such as a thermal or optical facial recognition sensor), an iris scanner). In some examples, the one or more biometric sensors (e.g., 3403) include one or more cameras. In some examples, the display (e.g., 3402) is a touch-sensitive display. In some examples, the display (e.g., 3402) is not a touch-sensitive display.
Some operations in method 3500 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted. As described below, method 3500 provides an intuitive way to indicate the availability of biometric authentication. The method reduces the cognitive burden on the user in determining the availability of biometric authentication, thereby creating a more efficient human-machine interface. For battery-driven computing devices, enabling a user to more quickly and efficiently identify the availability of biometric authentication conserves power and increases the interval between battery charges.
An electronic device (e.g., 100, 300, 500, 1700, 3400) detects (3502) a predefined operation (e.g., a request to load and/or display a user interface, a selection of a particular element of a user interface) corresponding to a credential submission (e.g., login) user interface (e.g., 3404) having a credential submission (e.g., login) user interface element (e.g., 3408, 3410) (e.g., a fillable field such as a username or password).
In response to (3504) detecting the predefined operation, the electronic device (e.g., 100, 300, 500, 1700, 3400) proceeds to blocks 3506 through 3516.
In response to (3504) detecting the predefined operation and in accordance with a determination (3506) that biometric authentication (e.g., Touch ID, Face ID) via the one or more biometric sensors (e.g., 3403, 3414) is available, the electronic device (e.g., 100, 300, 500, 1700, 3400) displays (3508), on the display (e.g., 3402), a credential submission (e.g., login) user interface (e.g., 3404) with a visual indication (e.g., 3406) that presenting a biometric feature (e.g., 3412) that satisfies biometric authentication criteria to the one or more biometric sensors (e.g., 3403, 3414) will cause the credential to be submitted via the credential submission user interface element (e.g., 3408, 3410).
In some examples, the credential submission user interface is generated based on an electronic document (e.g., an HTML document), and the credential submission user interface element is an input element (e.g., a login button) for submitting the credential. In some examples, if biometric authentication is available, the electronic device (e.g., 100, 300, 500, 1700, 3400) does not render and display the credential input element in the first state (e.g., a default state, a state displayed when biometric authentication is not available), but instead displays the biometric authentication element (e.g., instructions for providing the desired biometric authentication input (e.g., a fingerprint), in lieu of the credential submission user interface element). Displaying a prompt to the user indicating that placing their finger on the sensor results in an automatic login provides feedback to the user about the current state of the device (e.g., biometric authentication is available) and provides feedback to the user indicating valid options for login. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
According to some examples, in response to (3504) detecting the predefined operation and in accordance with a determination (3506) that biometric authentication via the one or more biometric sensors (e.g., 3403, 3414) is available, the electronic device (e.g., 100, 300, 500, 1700, 3400) forgoes (3516) displaying a credential submission affordance (e.g., 3420) on the display (e.g., 3402) (e.g., does not display a login button). Omitting the display of the login button encourages the user to use the available biometric login method, thereby providing improved feedback. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
According to some examples, in response to (3504) detecting the predefined operation and in accordance with a determination (3510) that biometric authentication via the one or more biometric sensors (e.g., 3403, 3414) is not available, the electronic device (e.g., 100, 300, 500, 1700, 3400) displays (3512) a credential submission (e.g., login) user interface (e.g., 3404-1) on the display (e.g., 3402) without displaying the visual indication (e.g., 3406). Forgoing display of a prompt for logging in via biometric authentication provides the user with feedback regarding the current state of the device, as it indicates to the user that logging in via biometric authentication is not available. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
According to some examples, further in response to (3504) detecting the predefined operation and in accordance with a determination (3510) that biometric authentication via the one or more biometric sensors (e.g., 3403, 3414) is unavailable, the electronic device (e.g., 100, 300, 500, 1700, 3400) displays (3514) a credential submission affordance (e.g., 3420) (e.g., a touch-activated login button associated with the one or more fillable fields) on the display (e.g., 3402). Receiving an input (e.g., 3432) corresponding to selection of the credential submission affordance (e.g., 3420) causes the credential to be submitted via the credential submission user interface element (e.g., 3408, 3410) (e.g., causes the credential to be submitted without using biometric authentication).
According to some examples, upon display of a credential submission user interface (e.g., 3404), an electronic device (e.g., 100, 300, 500, 1700, 3400) detects a respective type of biometric feature (e.g., 3412) via the one or more biometric sensors (e.g., 3403, 3414). In response to detecting the respective type of biometric feature (e.g., 3412), and in accordance with a determination that the biometric feature (e.g., 3412) satisfies the biometric authentication criteria, the electronic device (e.g., 100, 300, 500, 1700, 3400) submits the credential via the credential submission user interface element (e.g., 3408, 3410) (e.g., successful authentication results in submission of the credential). In response to detecting the respective type of biometric feature (e.g., 3412), and in accordance with a determination that the biometric feature (e.g., 3412) does not satisfy the biometric authentication criteria, the electronic device (e.g., 100, 300, 500, 1700, 3400) forgoes submission of the credential via the credential submission user interface element (e.g., 3408, 3410). Forgoing submission of credentials when the biometric authentication criteria are not satisfied provides security and can prevent unauthorized users from initiating sensitive operations. Providing improved security enhances the operability of the device and makes the user-device interface more efficient (e.g., by restricting unauthorized access), which in turn reduces power usage and extends the battery life of the device by limiting the performance of restricted operations.
According to some examples, a credential submission user interface element (e.g., 3408, 3410) includes one or more fillable fields.
According to some examples, displaying a credential submission user interface (e.g., 3404-1, 3404-2) includes displaying a credential submission user interface element that is pre-populated with a credential (e.g., pre-populated with a default username) to be submitted via the credential submission user interface element (e.g., 3408, 3410). Pre-populating a default username provides the user with the ability to log in with fewer inputs. Performing operations with a reduced number of inputs enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable inputs and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
According to some examples, an electronic device (e.g., 100, 300, 500, 1700, 3400) receives a selection of a fillable field (e.g., 3408, 3410) of the one or more fillable fields. In response to receiving the selection of the fillable field, the electronic device (e.g., 100, 300, 500, 1700, 3400) displays a character input interface (e.g., 3426) (e.g., a keypad or keyboard including character input keys for entering a password or passcode) on the display (e.g., 3402).
According to some examples, an electronic device (e.g., 100, 300, 500, 1700, 3400) receives input (e.g., 3430) corresponding to entering one or more characters in a fillable field via the character input interface (e.g., 3426) (e.g., via a character input key). In some examples, the character input interface includes character input keys. Subsequent to receiving the input, the electronic device (e.g., 100, 300, 500, 1700, 3400) receives a selection of a second credential submission affordance (e.g., 3420) (e.g., a login button). In response to receiving the selection of the second credential submission affordance, the electronic device (e.g., 100, 300, 500, 1700, 3400) submits the one or more characters in the fillable field (e.g., 3408, 3410) for credential verification.
According to some examples, the visual indication (e.g., 3406) that presenting a biometric feature (e.g., 3412) satisfying the biometric authentication criteria to the one or more biometric sensors (e.g., 3403, 3414) will cause the credential to be submitted via the credential submission user interface element (e.g., 3408, 3410) is displayed in a fillable field (e.g., 3410) of the one or more fillable fields (e.g., username field, password field). Displaying a prompt to the user indicating that placing their finger on the sensor results in an automatic login provides feedback to the user about the current state of the device (e.g., biometric authentication is available) and provides feedback to the user indicating valid options for login. Displaying the prompt in the password field provides feedback regarding the operation to be performed upon successful authentication (e.g., automatically populating the password field). Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
According to some examples, the predefined operation is a request to display a credential submission interface (e.g., 3404-1, 3404-2) (e.g., load a credential submission user interface, scroll the credential submission user interface into view, zoom in on the credential submission user interface, expose the credential submission user interface from hidden user interface elements) on a display (e.g., 3402). In some examples, the predefined operation that is a request to display the credential submission interface is also a request to display the first portion of the respective content, as described with respect to method 3300 (e.g., at 3302 of method 3300).
According to some examples, the predefined operation is detected when a credential submission interface (e.g., 3404-1) is displayed, and the predefined operation includes an input (e.g., 3424) directed to a portion of the credential submission user interface (e.g., a user input (e.g., a tap) on the credential submission user interface or a user input on a fillable field such as a username or password field).
According to some examples, in accordance with submitting the one or more characters in the fillable field (e.g., 3408, 3410) for credential validation and in response to receiving a selection of the second credential submission affordance (e.g., 3420), the electronic device (e.g., 100, 300, 500, 1700, 3400) provides a first result. In accordance with submitting the credential via the credential submission user interface element (e.g., 3408, 3410) and in response to determining that a biometric feature (e.g., 3412) detected via the one or more biometric sensors (e.g., 3403, 3414) satisfies the biometric authentication criteria, the electronic device (e.g., 100, 300, 500, 1700, 3400) provides the same first result. In some examples, submission (e.g., successful submission) of a credential (e.g., a valid credential) entered as a password or passcode (via the credential submission user interface element) results in the same outcome (e.g., access to a restricted application, web page, or account) as successful authentication via biometric authentication.
It is noted that the details of the process described above with respect to method 3500 (e.g., fig. 35) may also be applied in a similar manner to the methods described below and above. For example, method 3500 optionally includes one or more of the features of the various methods described above with reference to methods 800, 1000, 1200, 1400, 1600, 2000, 2200, 2500, 2700, 2900, 3100, and 3300. For example, the enrolled biometric data described in method 1200 may be used to perform biometric authentication as described with respect to method 3500. As another example, the re-authentication criteria of method 3100 can be used with method 3500 to control when biometric authentication is available. For the sake of brevity, these details are not repeated here.
In some examples (e.g., in some examples of methods 1600, 1800, 2000, 2200, 2900, 3100, 3300, and 3500), the electronic device limits the allowable number of biometric authentication attempts that can be made before biometric authentication is disabled (e.g., disabled until successful authentication occurs via an alternative means) (e.g., to provide improved security by thwarting brute-force attempts to bypass security and to conserve device resources). In some such examples, ceasing to use the biometric sensor (e.g., forgoing further biometric authentication retries) before the allowed/limited number of attempts is exhausted avoids the user consuming the allowed number of attempts on repeated requests (e.g., repeated requests of the same type), thereby saving at least one attempt for requests for other operations that require biometric authentication (e.g., requests for other, more critical operations). Saving at least one attempt enhances the operability of the device and makes the user-device interface more efficient (e.g., by avoiding exhausting authentication attempts on repeated similar requests), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. Additionally, saving at least one biometric authentication attempt can reduce situations in which the user must provide alternative non-biometric authentication (e.g., password or passcode authentication), which in turn can encourage the use of more secure (e.g., more complex) passwords/passcodes, as the user may otherwise be reluctant to use a more secure password/passcode if it must frequently be entered whenever biometric authentication becomes disabled due to exhaustion of the allowed attempts. Encouraging the use of more secure passwords/passcodes enhances the operability of the device by reducing the risk of unauthorized access.
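The attempt-limiting behavior described above can be illustrated with a toy counter that disables biometric retries once the limit is reached and reserves the final attempt by declining an automatic retry of a repeated request type. This is a hypothetical sketch, not the device's actual policy; the class, method names, and limit are invented for illustration.

```python
class BiometricGate:
    """Toy model of a biometric attempt limit with one reserved attempt."""

    def __init__(self, max_attempts=5):
        self.max_attempts = max_attempts
        self.failed = 0
        self.last_request = None

    def allow_retry(self, request_type):
        # Biometrics fully disabled once the limit is exhausted.
        if self.failed >= self.max_attempts:
            return False
        # Reserve the final attempt: decline an automatic retry of the
        # same request type when only one attempt remains, so a more
        # critical operation can still use biometric authentication.
        if self.failed == self.max_attempts - 1 and request_type == self.last_request:
            return False
        self.last_request = request_type
        return True

    def record_failure(self):
        self.failed += 1

    def record_success(self):
        # Successful authentication resets the counter.
        self.failed = 0
```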
Fig. 36A-36L illustrate example user interfaces for retrying biometric authentication at a credential entry user interface, according to some examples. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 37A-37B.
Fig. 36A shows an electronic device 3600 (e.g., portable multifunction device 100, device 300, device 500). In the illustrative example shown in fig. 36A-36L, electronic device 3600 is a smartphone. In other examples, the electronic device 3600 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 3600 includes a display 3602, one or more input devices (e.g., a touch screen of the display 3602, button 3604, and a microphone), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. At fig. 36A, the electronic device includes a biometric sensor 3603. In some examples, the biometric sensor is one or more biometric sensors, which may include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, biometric sensor 3603 is biometric sensor 703. In some examples, the one or more biometric sensors include one or more fingerprint sensors (e.g., a fingerprint sensor integrated into a button). In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. Optionally, the light-emitting device is used to illuminate the biometric feature (e.g., face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
At fig. 36A, electronic device 3600 displays a locked state user interface (UI) 3606 that includes a lock icon 3608, which provides an indication to the user that electronic device 3600 is in a locked state. Because electronic device 3600 is in the locked state, the user cannot view the restricted content of notification 3610A (e.g., a message from John Appleseed).
At fig. 36B, the user wishes to unlock the electronic device 3600 to access restricted content on the device (e.g., the message from John Appleseed, home screen 3628 of fig. 36L, a recently used application). Unlocking the device requires successful authentication of the user. To request to unlock the device, the user performs a swipe up starting within region 3612A, which is a predefined region adjacent to the bottom edge of display 3602.
While displaying the locked state UI 3606, the electronic device 3600 receives input 3614A (e.g., swipe up). While displaying the locked state UI 3606, electronic device 3600 requires a swipe up to start from within region 3612A to trigger a request to unlock the device. In response to receiving input 3614A, electronic device 3600 determines whether input 3614A begins within region 3612A.
Upon detecting input 3614A (e.g., an input beginning within region 3612A), electronic device 3600 initiates biometric authentication and determines whether biometric authentication is currently enabled (or available) on the device. Biometric authentication may not be available for various reasons, including, for example, that biometric authentication has failed more than a predetermined number of times (e.g., 5, 10, 15) since the last successful authentication with the device.
In some examples, instead of receiving input 3614A, electronic device 3600 receives any of inputs 3614B-E (e.g., a swipe input that moves a similar distance as input 3614A). Similar to input 3614A, each of inputs 3614B-E is a swipe up. In some examples, because input 3614B also begins within region 3612A, electronic device 3600 treats inputs 3614A and 3614B as the same input (e.g., electronic device 3600 has the same response to both inputs). In contrast, in some examples, electronic device 3600 responds differently to inputs 3614C-E than to inputs 3614A-B. In particular, in some examples, in response to receiving any of inputs 3614C-E, electronic device 3600 does not initiate biometric authentication, as described below with respect to fig. 36C. Instead, electronic device 3600 displays one or more notifications (e.g., 3610A-E) (e.g., unread notifications) while keeping the device in the locked state, as shown in fig. 36K (e.g., displays one or more notifications that were previously not visible on the display by scrolling through a portion of the wake-up screen user interface (e.g., 3606), where the amount of scrolling is optionally determined based on the magnitude or speed of movement of the contact).
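The region-based dispatch of an upward swipe (unlock request vs. notification scrolling) can be sketched as a hit test on the swipe's starting point. The region sizes below are arbitrary assumptions for illustration; the patent does not specify dimensions, and the function and constant names are hypothetical.

```python
def classify_swipe(start_y, screen_height, on_passcode_ui=False):
    """Classify an upward swipe by where it begins.

    start_y is measured from the top of the display, so the bottom
    edge corresponds to start_y == screen_height. Region 3612A is a
    narrow band adjacent to the bottom edge of the lock state UI;
    region 3612B on the passcode entry UI is larger (relaxed).
    The 10% / 25% fractions are invented for this sketch.
    """
    fraction = 0.25 if on_passcode_ui else 0.10
    region_height = fraction * screen_height
    if start_y >= screen_height - region_height:
        return "initiate_biometric_authentication"
    # Outside the region: on the lock state UI, scroll notifications
    # instead (cf. fig. 36K); on the passcode entry UI, do nothing.
    return "ignore" if on_passcode_ui else "scroll_notifications"
```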
At fig. 36C, in response to input 3614 and upon determining that biometric authentication is currently enabled, electronic device 3600 attempts to biometrically authenticate the user (e.g., attempts to match information about the user's face obtained using biometric sensor 3603 with stored authorized credentials). While attempting to biometrically authenticate the user, the electronic device 3600 displays (e.g., replaces the display of the lock state UI 3606 with) an interstitial interface 3616 having an authentication glyph 3618. The authentication glyph 3618 includes a plurality of rings that rotate about an axis parallel to the display such that they appear to rotate out of the plane of the display, providing an indication to the user that biometric authentication is being performed.
In attempting to authenticate a user biometrically, electronic device 3600 uses biometric sensor 3603 to determine whether certain biometric authentication criteria have been met. More specifically, electronic device 3600 captures and processes (e.g., analyzes) biometric data from biometric sensor 3603 to determine, based on the biometric data, whether a biometric feature (or a portion thereof) satisfies a biometric authentication criterion (e.g., determines whether the biometric feature data matches a biometric template (e.g., a stored authorized credential) within a threshold). After initiating the biometric authentication, the electronic device 3600 determines that the biometric authentication has failed (e.g., the biometric data from the biometric sensor 3603 does not match the stored authorized credentials).
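The criteria check above (captured biometric data matching a stored template within a threshold) can be illustrated with a toy similarity comparison. Real biometric matching is far more sophisticated; this sketch is purely hypothetical, and the feature vectors and threshold are invented.

```python
def satisfies_biometric_criteria(sample, template, threshold=0.9):
    """Toy match: fraction of equal feature values must reach threshold.

    sample and template stand in for captured biometric data and a
    stored biometric template (enrolled authorized credential).
    """
    if not template:
        return False
    matches = sum(1 for a, b in zip(sample, template) if a == b)
    return matches / len(template) >= threshold
```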
At fig. 36D, upon determining that the biometric authentication failed, the electronic device 3600 displays an animation of the lock icon 3608 alternating between different positions to simulate a "shake" effect. The "shake" animation provides an indication to the user that the electronic device 3600 cannot authenticate the user in a biometric manner.
As shown in fig. 36E, further in response to determining that biometric authentication has failed, the electronic device 3600 displays (e.g., replaces the display of the interstitial interface 3616 with) the passcode entry UI 3620, which provides an alternative (e.g., non-biometric) method of authenticating the user at the electronic device 3600. The passcode entry UI 3620 includes a lock icon 3608 and a plurality of input keys for entering a passcode (or password). The passcode entry UI 3620 also includes a prompt 3622A that prompts the user to swipe up to retry biometric authentication or to enter a passcode (or password) to authenticate (e.g., in a non-biometric manner). Upon determining that biometric authentication is currently enabled on the device, electronic device 3600 displays prompt 3622A. Additionally, upon determining that biometric authentication is currently enabled, electronic device 3600 displays an unlock indication 3624, which indicates the approximate location on display 3602 from which the user may swipe up to retry biometric authentication.
At fig. 36E, the user attempts to retry biometric authentication instead of entering a password or passcode for authentication. While displaying the passcode entry UI 3620, the electronic device 3600 receives an input 3614C, which begins outside of region 3612A of fig. 36B. However, input 3614C nevertheless triggers a retry of biometric authentication because it begins within region 3612B. Note that region 3612B is larger than region 3612A: the constraint on where an upward swipe must begin in order to initiate biometric authentication is relaxed on the passcode entry UI 3620 compared to the lock state UI 3606.
In some examples, instead of receiving input 3614C, the electronic device receives any of inputs 3614A-B and 3614D-E. In some examples, under the relaxed parameters of the passcode entry UI 3620, all of these inputs except input 3614E trigger a retry of biometric authentication. In some examples, in response to receiving input 3614E, electronic device 3600 determines that input 3614E does not begin within region 3612B and, in response, does not retry biometric authentication.
At fig. 36F, in response to receiving input 3614C, electronic device 3600 determines that input 3614C begins within region 3612B. Upon determining that input 3614C starts within region 3612B, electronic device 3600 retries biometric authentication. When retrying biometric authentication, the electronic device 3600 displays (e.g., replaces the display of the lock icon 3608 with) the authentication glyph 3618. The authentication glyph 3618 provides an indication to the user that biometric authentication is being performed.
Upon retrying the biometric authentication, the electronic device 3600 determines that the biometric authentication is successful (e.g., the biometric data obtained using the biometric sensor 3603 matches the stored authorized credentials).
At figs. 36G-36H, upon determining that biometric authentication is successful, the electronic device 3600 transitions from the locked state to the unlocked state. The electronic device 3600 provides an indication of the transition by displaying an animation of the lock icon 3608 transitioning to the unlock icon 3626 of fig. 36H, which provides an indication that the electronic device 3600 has transitioned to the unlocked state. Additionally, upon determining that the biometric authentication is successful and upon displaying the unlock icon 3626, the electronic device 3600 provides access to the restricted content. For example, the electronic device 3600 displays the home screen 3628 of fig. 36L or a recently used application (e.g., a user interface of the recently used application, such as the instant messaging application interface 2616 of fig. 26G).
In some examples, instead of determining at fig. 36C that the biometric authentication has failed, the electronic device 3600 determines that the biometric authentication is successful. In some examples, upon determining that the biometric authentication is successful, the electronic device 3600 transitions to an unlocked state and displays an unlock icon 3626, as described above with respect to figs. 36G-36H. Additionally, in some examples, after determining that the biometric authentication is successful and after displaying the unlock icon of fig. 36H, the electronic device 3600 displays the home screen 3628. (The home screen 3628 may include some or all of the features of the home screen interface 2614 of fig. 26D.) In some examples, upon determining that the biometric authentication is successful and after displaying the unlock icon 3626, the electronic device 3600 displays the most recently used application (e.g., the instant messaging application interface 2616 of fig. 26G).
In some examples, instead of determining that biometric authentication is enabled on the device as described above with respect to figs. 36B-36C, the electronic device 3600 determines that biometric authentication is not currently enabled (e.g., because biometric authentication has been (manually) disabled by the user, or because a predetermined number of unsuccessful biometric authentication attempts have been made). In some examples, upon determining that biometric authentication is not currently enabled, electronic device 3600 displays prompt 3622B of fig. 36I instead of prompt 3622A of fig. 36E. In contrast to prompt 3622A, prompt 3622B only prompts the user to enter a passcode (or password) to authenticate the user, and does not prompt the user to swipe up to retry biometric authentication. Additionally, in some examples, upon determining that biometric authentication is not currently enabled, electronic device 3600 does not display unlock indication 3624, as shown in fig. 36I. As described above, the unlock indication 3624 provides an indication of the approximate location on the display 3602 from which the user may start a swipe up to retry biometric authentication. In some examples, electronic device 3600 does not display unlock indication 3624 because biometric authentication is not currently enabled.
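The prompt selection just described reduces to a single branch on whether biometric authentication is currently enabled. The sketch below is hypothetical (the function name, dictionary keys, and prompt strings are invented); only the branching mirrors the behavior of prompts 3622A/3622B and unlock indication 3624.

```python
def credential_ui_elements(biometric_enabled: bool) -> dict:
    """Choose which prompt and hint the credential entry UI shows."""
    if biometric_enabled:
        return {
            # cf. prompt 3622A: offers both authentication paths
            "prompt": "Swipe up to retry biometric authentication, or enter passcode",
            # cf. unlock indication 3624: marks where the retry swipe may start
            "show_unlock_indication": True,
        }
    return {
        # cf. prompt 3622B: passcode only, no swipe hint
        "prompt": "Enter passcode",
        "show_unlock_indication": False,
    }
```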
In some examples, instead of determining at fig. 36F that the biometric authentication was successful, the electronic device 3600 determines that the biometric authentication has failed. In some examples, upon determining that biometric authentication failed, the electronic device 3600 displays an animation of the lock icon 3608 alternating between different positions to simulate a "shake" effect (rather than displaying a transition to an unlocked state as described above with respect to fig. 36G-36H). As described above, this "shake" animation provides an indication to the user that the electronic device 3600 cannot authenticate the user in a biometric manner.
Figs. 37A-37B are flow diagrams illustrating a method for retrying biometric authentication at a credential entry user interface using an electronic device, according to some examples. The method 3700 is performed at an electronic device (e.g., 100, 300, 500, 3600) having a touch-sensitive display (e.g., 3602) and one or more biometric sensors (e.g., 3603) (e.g., a first biometric sensor of a device having a plurality of biometric sensors) (e.g., a fingerprint sensor, a contactless biometric sensor (e.g., a biometric sensor that does not require physical contact, such as a thermal or optical facial recognition sensor), or an iris scanner). In some examples, the one or more biometric sensors include one or more cameras. Some operations in method 3700 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 3700 provides an intuitive way of retrying biometric authentication at a credential entry user interface using an electronic device. The method reduces the cognitive burden of the user to retry the biometric authentication, thereby creating a more efficient human-machine interface. For battery-powered computing devices, enabling a user to retry biometric authentication more quickly and efficiently conserves power and increases the interval between battery charges.
An electronic device (e.g., 100, 300, 500, 3600) displays (3706), on a touch-sensitive display (e.g., 3602), a credential entry user interface (e.g., 3620) with a plurality of character entry keys (e.g., for entering a passcode, password, or pattern). In some examples, the credential input user interface includes a virtual keypad or a virtual keyboard. In some examples, the virtual keypad or virtual keyboard includes a plurality of character input keys. In some examples, while displaying the credential input user interface (e.g., 3620), the electronic device receives input corresponding to one or more of the plurality of character input keys. In some examples, in response to (or subsequent to) receiving input corresponding to the plurality of character input keys, in accordance with a determination that the received input corresponds to (or matches) an authorized credential (e.g., a stored passcode or password), the electronic device transitions from a locked state (e.g., corresponding to 3606) to an unlocked state (e.g., corresponding to 3628).
While displaying a credential input user interface (e.g., 3620), the electronic device receives (3708) a touch gesture input (e.g., 3614A-B) (e.g., a swipe at a predefined location) via a touch-sensitive display (e.g., 3602), the touch gesture input including movement of a contact on the touch-sensitive display.
In response to (3712) receiving a touch gesture input comprising movement of a contact on the touch-sensitive display and in accordance with (3714) a determination that a first set of one or more criteria is satisfied, the electronic device attempts to biometrically authenticate a user of the electronic device based on biometric information captured using the one or more biometric sensors, wherein the first set of one or more criteria includes a requirement that biometric authentication is currently enabled on the electronic device. In some examples, the first set of one or more criteria includes only one criterion. In some examples, biometric authentication may become unavailable (or not enabled on the electronic device) when one or more of the following conditions are met: the electronic device has been powered on or restarted and has not yet been successfully authenticated; the electronic device has not been unlocked for more than a predetermined amount of time (e.g., 48 hours); a passcode has not been used to unlock the device for more than a predetermined amount of time (e.g., 156 hours); biometric authentication using a biometric feature (e.g., face, fingerprint) has not been used to unlock the device for more than a predetermined amount of time (e.g., 4 hours); the electronic device has received a remote lock command; biometric authentication has failed more than a predetermined number of times (e.g., 5, 10, 15) since the last successful authentication with the device; or the electronic device has received a power-off and/or emergency SOS command. In some examples, the touch gesture input is a request to unlock electronic device 3600. Providing the user with the ability to retry biometric authentication by performing a touch gesture input at a credential input user interface enhances operability of the device by providing additional control over the device without cluttering the UI with additional displayed controls.
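The availability conditions listed above can be evaluated as a simple disjunction: biometric authentication is disabled if any one condition holds. The sketch below is illustrative only; the state fields and the concrete thresholds (48 h, 156 h, 4 h, 5 failures) are taken from the examples in the text, and the data structure itself is an assumption.

```python
from dataclasses import dataclass

@dataclass
class DeviceState:
    authenticated_since_boot: bool       # successfully authenticated since power-on/restart
    hours_since_unlock: float            # time since the device was last unlocked
    hours_since_passcode_unlock: float   # time since a passcode unlock
    hours_since_biometric_unlock: float  # time since a biometric unlock
    failed_attempts_since_success: int   # consecutive biometric failures
    remote_lock_received: bool           # remote lock command received
    power_off_or_sos_requested: bool     # power-off / emergency SOS command received

def biometric_auth_enabled(s: DeviceState) -> bool:
    """Biometric authentication is unavailable if any listed condition holds."""
    disabling_conditions = [
        not s.authenticated_since_boot,
        s.hours_since_unlock > 48,
        s.hours_since_passcode_unlock > 156,
        s.hours_since_biometric_unlock > 4,
        s.remote_lock_received,
        s.failed_attempts_since_success > 5,
        s.power_off_or_sos_requested,
    ]
    return not any(disabling_conditions)
```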
Providing this functionality enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, in response to (3712) receiving a touch gesture input comprising movement of a contact on the touch-sensitive display, and in accordance with (3716) a determination that the first set of one or more criteria is not satisfied (e.g., because biometric authentication is not currently enabled on the device), the electronic device forgoes attempting to biometrically authenticate a user of the electronic device based on biometric information captured using the one or more biometric sensors (e.g., 3603). Forgoing attempts to biometrically authenticate a user when biometric authentication is not enabled improves the security of the device by preventing fraudulent use of the device.
In some examples, displaying the credential input user interface (e.g., 3620) occurs in response to receiving (3702) a request to perform an operation requiring authentication (e.g., a request to unlock the electronic device (e.g., 3614A-B) (e.g., a swipe starting from an edge (e.g., a bottom edge) of the display or from a predefined area (e.g., a lower portion) of the display)) and failing to biometrically authenticate a user of the electronic device based on biometric information captured using one or more biometric sensors. In some examples, an electronic device (e.g., 100, 300, 500, 3600) may not be able to biometrically authenticate a user of the electronic device when biometric information captured using one or more biometric sensors does not correspond to (or match) authorized credentials (e.g., stored information about biometric features (e.g., face, fingerprint) authorized for biometric authentication). Displaying the credential input user interface when a set of conditions is satisfied provides the user with the ability to authenticate via alternative methods without the user explicitly requesting that the credential input user interface be displayed. Performing the operation when a set of conditions has been met without further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the request to perform the operation requiring authentication (e.g., 3614A-B) is a second touch gesture input (3704) that includes movement of a contact on the touch-sensitive display (e.g., a swipe starting from an edge (e.g., bottom edge) of the display or from a predefined area (e.g., 3612A) (e.g., bottom) of the display). In some examples, the set of one or more criteria includes a requirement that the touch gesture input begin at a first region (e.g., 3612A) of the display (e.g., a region along a bottom edge of the display) and end at (or progress through) a second region of the display (e.g., a region above the region along the bottom edge of the display (e.g., 3612A)).
In some examples, in accordance with a determination that a second set of one or more criteria is satisfied, displaying the credential input user interface (e.g., 3620) includes displaying, on the touch-sensitive display (e.g., 3602), an indication (e.g., 3622A) (e.g., text, graphics, an icon) to perform a gesture on the touch-sensitive display to attempt to biometrically authenticate the user, wherein the second set of one or more criteria includes a requirement that biometric authentication is currently enabled on the electronic device. In some examples, in accordance with a determination that the second set of one or more criteria is not satisfied, the electronic device forgoes displaying the indication to perform a gesture on the touch-sensitive display to attempt to biometrically authenticate the user. In some examples, the second set of one or more criteria is the same as the first set of one or more criteria. In some examples, the second set of one or more criteria includes only one criterion. Displaying an indication to perform a gesture when biometric authentication is available provides feedback to the user about the current state of the device (e.g., that biometric authentication is available) and prompts the user to perform the gesture associated with the indication. Further, not displaying the indication to perform a gesture when biometric authentication is not available also provides feedback to the user regarding the current state of the device (e.g., that biometric authentication is not available). Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, in accordance with a determination that a third set of one or more criteria is satisfied, displaying the credential input user interface (e.g., 3620) includes displaying, on the touch-sensitive display (e.g., 3602), a user interface element (e.g., 3624) (e.g., a graphical element, a horizontal bar, a home affordance, an indication of the location where the user should begin a swipe to attempt biometric authentication) adjacent to an edge of the touch-sensitive display (e.g., a bottom edge of the display), wherein the third set of one or more criteria includes a requirement that biometric authentication is currently enabled on the electronic device. In some examples, the third set of one or more criteria is the same as the first set of one or more criteria. In some examples, the third set of one or more criteria includes only one criterion. In some examples, in accordance with a determination that the third set of one or more criteria is not satisfied, the electronic device (e.g., 100, 300, 500, 3600) forgoes displaying the user interface element. In some examples, the set of one or more criteria includes a requirement that the touch gesture input begin at a first region (e.g., 3612A) of the display (e.g., a region along a bottom edge of the display) and end at (or progress through) a second region of the display (e.g., a region above the region along the bottom edge of the display). In some examples, the user interface element (e.g., 3624) is displayed at a location corresponding to (e.g., within) the first region of the display. Displaying a user interface element adjacent to the edge of the display when biometric authentication is available provides feedback to the user about the current state of the device and how the user may request to unlock the device.
Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, in attempting to biometrically authenticate a user of the electronic device based on biometric information captured using the one or more biometric sensors, the electronic device (e.g., 100, 300, 500, 3600) displays, on the touch-sensitive display, an animation with one or more moving elements (e.g., 3618) indicating that biometric authentication is occurring (e.g., displays an animation that includes one or more rings (e.g., graphical icons) moving on the display). In some examples, the animation provides an indication that biometric information is being processed. In some examples, the animation includes a ring that rotates around an object (e.g., a sphere). In some examples, the sphere is visible. In some examples, the sphere is not visible. Displaying one or more moving elements indicating that biometric authentication is occurring provides feedback to the user about the current state of the device (e.g., that biometric authentication is being performed and that the user need not take any action at this time). Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, displaying the animation with the one or more moving elements includes transitioning from the animation with the one or more moving elements to a lock icon (e.g., 3608) (e.g., an icon indicating a locked state). In some examples, after transitioning from the animation having one or more moving elements (e.g., 3618) to the lock icon and in accordance with a determination that biometric information captured using the one or more biometric sensors corresponds to (e.g., matches) authorized credentials (e.g., stored information about biometric features (e.g., faces, fingerprints) authorized for biometric authentication), the electronic device transitions from a locked state to an unlocked state. In some examples, transitioning the electronic device from the locked state to the unlocked state includes displaying an animation of the lock icon transitioning to an unlock icon (e.g., 3626) (e.g., an icon indicating the unlocked state). In some examples, after transitioning from the animation with the one or more moving elements to the lock icon, and in accordance with a determination that the biometric information captured using the one or more biometric sensors does not correspond to (e.g., does not match) the authorized credentials, the electronic device displays, on the touch-sensitive display (e.g., 3602), an animation of the lock icon (e.g., 3608) alternating between a first location and a second location, wherein the second location is different from the first location. In some examples, the animation of the lock icon is an animation of the lock icon shaking (e.g., moving back and forth from side to side). In some examples, the electronic device displays the animation involving the lock icon to indicate that the biometric authentication has failed. In some examples, a tactile output is provided in conjunction with the shaking of the lock icon. In some examples, no haptic output is provided.
In some examples, the electronic device (e.g., 100, 300, 500, 3600) maintains the locked state of the electronic device in accordance with a determination that biometric information captured using the one or more biometric sensors does not correspond to (or match) the authorized credentials. Displaying the animation of the lock icon transitioning to the unlock icon provides feedback to the user regarding the current state of the device (e.g., that biometric authentication was successful, the device is unlocked, and the user can now access restricted content). In addition, displaying an animation of the lock icon shaking provides feedback to the user about the current state of the device (e.g., that biometric authentication failed) and prompts the user to take further action. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
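The success/failure branch described in the last few paragraphs can be summarized in a short sketch. It is illustrative only: the function, the state dictionary, and the event labels are invented stand-ins for the animations and state transitions named in the text, not real API calls.

```python
def handle_auth_result(matched: bool, state: dict) -> list:
    """Apply the described outcome of a biometric authentication attempt."""
    events = []
    if matched:
        # Success: transition to the unlocked state and animate
        # lock icon 3608 morphing into unlock icon 3626.
        state["locked"] = False
        events.append("animate: lock icon -> unlock icon")
        events.append("show: home screen or recently used application")
    else:
        # Failure: remain locked and play the side-to-side "shake"
        # animation of the lock icon (optionally with a tactile output).
        state["locked"] = True
        events.append("animate: lock icon shake")
        events.append("haptic: optional failure output")
    return events
```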
In some examples, a touch gesture input that includes movement of a contact on the touch-sensitive display begins at a location that is away from (e.g., not substantially near) an edge of the touch-sensitive display (e.g., a bottom edge of the display) (3710). In some examples, a location away from an edge of the touch-sensitive display (e.g., 3602) includes a location closer to the center of the display than to the edge. In some examples, a location away from an edge of the touch-sensitive display includes a location greater than a threshold distance from the edge of the display (or outside of a predefined area (e.g., 3612A)). In some examples, the threshold distance (or predefined area) is used on the wake screen to determine whether to dismiss the wake screen when a swipe input does not begin within the threshold distance from the edge of the display. In some examples, the threshold distance (or predefined area (e.g., 3612A)) is used to perform different operations on the wake screen, such as scrolling content (e.g., 3610A-E) on the wake screen when the swipe input begins more than the threshold distance from the edge of the display (or outside the predefined area (e.g., 3612A)). In some examples, the parameters for the location where the touch gesture input must begin are relaxed for the credential input user interface (e.g., 3620). Relaxing, at the credential input user interface, the parameters for the location where the touch gesture input must begin enhances the operability of the device by allowing a less precise gesture to initiate biometric authentication. The parameters are relaxed because the user has already expressed a desire to initiate biometric authentication; thus, there is less risk that the user does not intend to initiate biometric authentication at the credential input user interface.
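The idea that a single edge-distance threshold routes a swipe to different wake screen operations can be sketched as below. The function name, the default threshold, and the returned action labels are invented for illustration; only the routing mirrors the text (edge swipes attempt biometric authentication, swipes starting farther away scroll wake screen content).

```python
def wake_screen_action(start_distance_from_edge: float,
                       threshold: float = 40.0) -> str:
    """Route a swipe by how far from the bottom edge it starts."""
    if start_distance_from_edge <= threshold:
        # Swipe starts within the predefined edge area (cf. 3612A):
        # treat it as a request to unlock via biometric authentication.
        return "attempt-biometric-authentication"
    # Swipe starts beyond the threshold: scroll wake screen content
    # (cf. notifications 3610A-E) instead of attempting authentication.
    return "scroll-wake-screen-content"
```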
Allowing a less precise gesture to initiate biometric authentication after the user has expressed a desire to initiate biometric authentication enhances the operability of the device and makes the user-device interface more efficient, which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the electronic device transitions from the locked state to the unlocked state in response to receiving a touch gesture input (e.g., 3614C-E) that includes movement of a contact on the touch-sensitive display from a location away from an edge of the touch-sensitive display and determining, in accordance with (3718), that biometric information captured using the one or more biometric sensors corresponds to (e.g., matches) authorized credentials (e.g., stored information about biometric features (e.g., face, fingerprint) authorized for biometric authentication). In some examples, transitioning the electronic device to the unlocked state includes displaying an unlocked user interface (e.g., a user interface indicating the unlocked state, such as a home screen (e.g., 3628) or a recently used application).
In some examples, prior to displaying the credential input user interface, the electronic device (e.g., 100, 300, 500, 3600) displays a lock user interface (e.g., 3606) (e.g., a user interface indicating a locked state of the electronic device, a wake screen, a lock screen) on the touch-sensitive display (e.g., 3602). In some examples, the locking user interface is different from a credential input user interface (e.g., 3620). In some examples, while displaying the locked user interface, the electronic device receives a touch gesture input (e.g., 3614A-B) having a starting position (e.g., swipe up) via a touch-sensitive display (e.g., 3602). In response to receiving the touch gesture input having the start location and in accordance with a determination that a fourth set of one or more criteria are satisfied, the electronic device initiates biometric authentication, wherein the fourth set of one or more criteria includes a requirement that the start location of the touch gesture input is substantially near an edge of the touch-sensitive display (e.g., within a predefined area (e.g., 3612A) that is within a predefined distance from the edge; a location near the edge of the display rather than the center of the display). In some examples, the fourth set of one or more criteria includes only one criterion. In some examples, initiating biometric authentication includes attempting to biometrically authenticate a user of an electronic device based on biometric information captured using one or more biometric sensors. In some examples, in response to receiving the touch gesture input having the start location and in accordance with a determination that the fourth set of one or more criteria is not satisfied, the electronic device forgoes initiating biometric authentication.
It is noted that details of the process described above with respect to method 3700 (e.g., figs. 37A-37B) are also applicable in an analogous manner to the methods described below. For example, method 3900, method 4100, and/or method 4300 optionally include one or more features of the various methods described above with reference to method 3700. For example, the process for retrying biometric authentication described above with respect to method 3700 may be used to retry biometric authentication to authorize a payment, as described with respect to method 4100. Similarly, the process for retrying biometric authentication may be used to retry biometric authentication in the process described with respect to method 3900. For the sake of brevity, these details are not repeated below.
Figs. 38A-38AD illustrate exemplary user interfaces for providing an indication of an error condition during biometric authentication, according to some examples. The user interfaces in these figures are used to illustrate the processes described below, including the processes in figs. 39A-39B.
Fig. 38A shows an electronic device 3800 (e.g., portable multifunction device 100, device 300, device 500). In the illustrative example shown in figs. 38A-38AD, the electronic device 3800 is a smartphone. In other examples, the electronic device 3800 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 3800 includes a display 3802, one or more input devices (e.g., a touchscreen of the display 3802, a button 3804, and a microphone), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In fig. 38A, the electronic device includes a biometric sensor 3803. In some examples, the biometric sensor is one or more biometric sensors, which may include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, biometric sensor 3803 is biometric sensor 703. In some examples, the one or more biometric sensors include one or more fingerprint sensors (e.g., a fingerprint sensor integrated into a button). In some examples, the device further includes a light-emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. Optionally, the light-emitting device is used to illuminate the biometric feature (e.g., the face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
At fig. 38A, the user learns from the notification 3808 that she has received a message from John Appleseed. The user wishes to view the restricted content of notification 3808 (e.g., the message from John Appleseed), but cannot do so because electronic device 3800 is currently in a locked state. Electronic device 3800 displays a lock state user interface (UI) with a lock icon 3806 that provides an indication that electronic device 3800 is in a locked state. Viewing the restricted content of notification 3808 requires successful authentication (e.g., a determination that information (or data) regarding a biometric characteristic obtained using the biometric sensor 3803 corresponds to (or matches) stored authorized credentials). To view the restricted content of notification 3808, the user lifts (or raises) electronic device 3800 (e.g., from a substantially horizontal orientation to the orientation shown in the user's hand in fig. 38A). The electronic device 3800 detects the change in orientation of the electronic device 3800 and, in response, initiates biometric authentication. In some examples, after initiating biometric authentication, the electronic device 3800 determines that the biometric authentication was successful. In some examples, upon determining that the biometric authentication is successful, the electronic device 3800 transitions from the locked state to the unlocked state and displays the restricted content of the notification 3808.
After initiating biometric authentication (e.g., before successful authentication), the electronic device 3800 determines whether the biometric sensor 3803 has detected a face. At fig. 38B, upon determining that a face is detected, the electronic device 3800 displays an authentication glyph 3810 that includes a plurality of rings rotating around a sphere. The authentication glyph 3810 provides an indication that biometric authentication is being performed. In some examples, the electronic device 3800 displays an animation of the lock icon 3806 morphing into the authentication glyph 3810. In some examples, upon determining that no face is detected using the biometric sensor 3803, the electronic device 3800 remains in the locked state and does not display the authentication glyph 3810.
After detecting the presence of the face, the electronic device 3800 determines that the authentication was unsuccessful due to a failure to obtain sufficient information about the user's face using the biometric sensor 3803. In particular, as shown in fig. 38B, the user's face is outside of an acceptable distance range 3812 of biometric sensor 3803 (e.g., beyond a maximum threshold distance), resulting in an inability to obtain sufficient information about the user's face. Upon determining that the biometric authentication was unsuccessful because the user's face is outside the acceptable distance range 3812, the electronic device 3800 maintains the device in the locked state and does not display the restricted content of the notification 3808. In some examples, upon determining that the authentication was unsuccessful and that no error condition exists, the electronic device 3800 maintains the device in the locked state and does not display the restricted content of the notification 3808. In some examples, upon determining (e.g., because the captured biometric information does not match an authorized biometric information profile (e.g., stored authorized credentials)) that the authentication was unsuccessful and that an error condition does not exist (e.g., no state that prevents capture of sufficient biometric information), electronic device 3800 maintains the locked state and automatically retries the biometric authentication. In some examples, the electronic device 3800 continues to display the authentication glyph 3810 of fig. 38B while retrying the biometric authentication.
As shown in fig. 38C to 38G, upon determining that biometric authentication is unsuccessful because the user's face is outside the acceptable distance range 3812, the electronic device 3800 displays an animation in which the authentication glyph 3810 morphs into an error indication 3814A, such that the error indication 3814A replaces the display of the authentication glyph 3810. At fig. 38G, the electronic device 3800 displays the error indication 3814A, which prompts the user to take action to correct the error condition. Specifically, the error indication 3814A prompts the user to move her face closer to the biometric sensor 3803. The error indication 3814A also indicates to the user the cause of the error condition: the user's face is too far from the biometric sensor 3803. As long as the user's face is outside of the acceptable distance range 3812, the electronic device 3800 will continue to determine that the error condition exists. Upon determining that the error condition still exists, the electronic device 3800 does not attempt to retry the biometric authentication. Note that the electronic device 3800 displays the error indication 3814A at a position that coincides with the position of the lock icon 3806 in fig. 38A. Further, the electronic device 3800 displays the error indication 3814A on a portion of the display 3802 adjacent to the biometric sensor 3803 to suggest to the user that the error indication 3814A is associated with (or corresponds to) the biometric sensor 3803.
As shown in fig. 38H, after being prompted to correct the error condition, the user moves her face closer to the biometric sensor 3803 so that the user's face is within the acceptable distance range 3812. At fig. 38H, the electronic device determines that the error condition is no longer present. Upon determining that the error condition is no longer present, the electronic device 3800 enables biometric authentication on the device and automatically retries biometric authentication using the biometric sensor 3803.
In response to automatically retrying the biometric authentication, the electronic device 3800 displays the error indication 3814A with a flashing effect (e.g., animating the error indication such that one or more portions of the error indication move left and right to produce an effect in which the error indication appears to flash) to indicate that the electronic device 3800 is again attempting to biometrically authenticate the user. Fig. 38H to 38L depict the animation of the error indication 3814A with the flashing effect. In some examples, instead of displaying the error indication 3814A with a flashing effect, the electronic device 3800 displays the authentication glyph 3810 (e.g., replaces the display of the error indication 3814A with the authentication glyph 3810) to indicate that the electronic device 3800 is again attempting to biometrically authenticate the user. Thus, in some examples, the electronic device 3800 displays an animation of the authentication glyph 3810 morphing into the lock icon 3806, rather than an animation of the error indication 3814A morphing into the lock icon 3806.
At fig. 38L, after retrying the biometric authentication, the electronic device 3800 successfully biometrically authenticates the user. In response to the successful biometric authentication, the electronic device 3800 transitions the device from the locked state to the unlocked state. Upon transitioning from the locked state to the unlocked state, the electronic device 3800 displays an animation in which the error indication 3814A morphs into the lock icon 3806, as shown in fig. 38L to 38N. After displaying the animation of the error indication 3814A morphing into the lock icon 3806, the electronic device 3800 displays an animation of the lock icon 3806 transitioning to an unlock icon 3816, as shown in fig. 38N to 38O. The unlock icon 3816 provides an indication that the electronic device 3800 is in the unlocked state. In addition, as shown in fig. 38O, in response to the biometric authentication being successful, the electronic device 3800 displays the restricted content of the notification 3808 (e.g., "Is our meeting still in progress?").
At fig. 38P, instead of determining that the user's face is outside the acceptable distance range 3812 as discussed above with respect to fig. 38B, the electronic device 3800 determines that biometric authentication is not available on the device. Upon determining that biometric authentication is not available, the electronic device 3800 displays an error indication 3814B in fig. 38P that provides an indication that biometric authentication is not currently available on the device. Biometric authentication may be unavailable for various reasons, including biometric authentication having failed more than a predetermined number of times (e.g., 5, 10, 15) since the last successful authentication.
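The availability behavior just described, where biometric authentication becomes unavailable after a predetermined number of failures since the last successful authentication, can be sketched as follows; the class, method names, and default limit are hypothetical:

```python
class BiometricAvailability:
    """Tracks consecutive failed attempts; biometric authentication
    becomes unavailable once a predetermined limit is reached."""

    def __init__(self, max_failures: int = 5):  # limit is an assumption
        self.max_failures = max_failures
        self.failures = 0

    def record_failure(self) -> None:
        self.failures += 1

    def record_success(self) -> None:
        self.failures = 0  # a successful authentication resets the count

    @property
    def available(self) -> bool:
        return self.failures < self.max_failures
```

When `available` is False, the device would display an indication such as 3814B and require an alternative method (e.g., a passcode).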
Since biometric authentication is not available, the user must authenticate using an alternative method. For example, the user may authenticate by entering a passcode at the electronic device 3800. While displaying the error indication 3814B in fig. 38P, the electronic device 3800 receives an input 3820 at the error indication 3814B.
At fig. 38Q, in response to receiving input 3820 at error indication 3814B, electronic device 3800 displays a passcode entry UI 3822A having a plurality of entry affordances for entering a passcode (or password).
In some examples, instead of determining that the retried biometric authentication was successful, as discussed above with respect to fig. 38L to 38O, the electronic device 3800 determines that the authentication was unsuccessful. In some examples, upon determining that the authentication is unsuccessful, the electronic device 3800 maintains the locked state and displays an animation of the lock icon 3806 in fig. 38R alternating between different positions to simulate a "shake" effect. The shake animation provides an indication to the user that biometric authentication has failed and that the electronic device 3800 remains in the locked state.
After determining that the authentication is unsuccessful, the user may perform an action at the electronic device 3800 to trigger a retry of the biometric authentication. At fig. 38S, the user triggers a retry of biometric authentication by swiping upward from an area near the bottom edge of the display 3802. The electronic device 3800 receives input 3824 and, in response, retries biometric authentication. In some examples, after retrying the biometric authentication, the electronic device 3800 determines that the authentication was successful. In some examples, upon determining that the retried biometric authentication is successful, the electronic device 3800 transitions from the locked state to the unlocked state.
At fig. 38S to 38T, the electronic device determines that the biometric authentication retried in response to the input 3824 is unsuccessful. Upon determining that the retried biometric authentication is unsuccessful, the electronic device 3800 displays the passcode entry UI 3822B in fig. 38T. As described with respect to the process of fig. 37, the user can retry biometric authentication again at the passcode entry UI 3822B by performing a swipe up (e.g., input 3826). (Passcode entry UI 3822B includes some or all of the features of passcode entry UI 3620, including a relaxed criterion for the location at which the swipe up must start in order to initiate biometric authentication.)
At fig. 38U, the electronic device determines that the biometric authentication retried at the passcode entry UI 3822B is successful. Upon determining that the authentication is successful, the electronic device transitions from the locked state to the unlocked state, as shown in fig. 38U to 38W. In some examples, at fig. 38U, the electronic device determines that the biometric authentication retried at the passcode entry UI 3822B is unsuccessful. In some examples, the electronic device remains in the locked state upon making that determination.
Fig. 38X to 38AD show various error conditions that may be detected by the electronic device 3800 when attempting to biometrically authenticate a user. Instead of displaying the error indication 3814A as described above with respect to fig. 38G, the electronic device 3800 can display any of the error indications described below (e.g., error indications 3814C to 3814I). Fig. 38X to 38AD also depict the electronic device 3800 (e.g., via error indications 3814C to 3814I) instructing the user to take action to correct the detected error condition so that the electronic device 3800 can retry biometrically authenticating the user.
In fig. 38X, the user's face is positioned too close to the biometric sensor 3803. Accordingly, the electronic device 3800 determines that the user's face is positioned outside of the acceptable distance range 3812 (e.g., below a minimum threshold of the range). Upon determining that the user's face is positioned outside of the acceptable distance range 3812, the electronic device 3800 displays an error indication 3814C that prompts the user to move her face away from the biometric sensor 3803. The error indication 3814C also provides an indication of the cause of the error condition (e.g., an indication that the user's face is too close to the biometric sensor 3803).
In fig. 38Y, the user's hand covers the biometric sensor 3803. Accordingly, the electronic device 3800 determines that an object (e.g., the user's hand) covers the biometric sensor 3803 such that the sensor cannot obtain any information about the user's face. Upon determining that the object covers the biometric sensor 3803, the electronic device 3800 displays an error indication 3814D that prompts the user to move her hand away from the biometric sensor 3803. The error indication 3814D also provides an indication of the cause of the error condition (e.g., an indication that the biometric sensor 3803 is covered).
In fig. 38Z, the user does not view the electronic device 3800. Thus, the electronic device 3800 determines that the user's eyes are not viewing the device. Upon determining that the user's eyes are not viewing the device, the electronic device 3800 displays an error indication 3814E that prompts the user to view the device to correct the error condition. Error indication 3814E also provides an indication of the cause of the error condition (e.g., an indication that the user did not view the device).
In fig. 38AA, the user's face is within the field of view 3828, but the user is wearing a hat. Accordingly, electronic device 3800 determines that a portion of the user's face is occluded (or obstructed). For example, electronic device 3800 uses biometric sensor 3803 to obtain partial information about the user's face, where the partial information is below a threshold amount required for comparison with stored authorized credentials. Upon determining that a portion of the user's face is occluded, electronic device 3800 displays an error indication 3814F that prompts the user to remove the hat. Error indication 3814F also provides an indication of the cause of the error condition (e.g., an indication that a portion of the user's face is obscured).
In fig. 38AB, the user's face is outside the field of view 3828 of the biometric sensor 3803. Accordingly, the electronic device 3800 determines that the user's face is outside the field of view 3828 of the biometric sensor 3803. In some examples, the user's face is outside the field of view 3828 when more than a threshold portion of the face is outside the field of view. In some examples, the user's face is outside the field of view 3828 when no face is detected within the field of view. Upon determining that the user's face is outside the field of view 3828, the electronic device 3800 displays an error indication 3814G that prompts the user to move her face into the field of view 3828. The error indication 3814G also provides an indication of the cause of the error condition (e.g., an indication that the user's face is outside the field of view 3828).
In fig. 38AC, the user's face is within the field of view 3828, but is turned away from the biometric sensor 3803. Accordingly, the electronic device 3800 determines that the user's face is turned away from the biometric sensor 3803. Upon determining that the user's face is turned away from the biometric sensor 3803, the electronic device 3800 displays an error indication 3814H that prompts the user to turn her face toward the sensor. The error indication 3814H also provides an indication of the cause of the error condition (e.g., an indication that the user's face is turned away from the biometric sensor 3803).
In fig. 38AD, the user's face is properly positioned within the field of view and within the acceptable distance range of the biometric sensor 3803. However, the lighting conditions of the environment in which the user is located are not suitable for biometric authentication. Specifically, there is so much light that it interferes with biometric authentication. Accordingly, the electronic device 3800 determines (e.g., via one or more ambient light sensors) that the amount of light exceeds a predefined threshold. Upon determining that the amount of light exceeds the threshold, the electronic device 3800 displays an error indication 3814I prompting the user to seek improved lighting conditions with a lesser amount of light. The error indication 3814I also provides an indication of the cause of the error condition (e.g., an indication that the lighting conditions are not suitable for biometric authentication).
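The error conditions of fig. 38B and fig. 38X to 38AD can be summarized as a mapping from a detected condition to a user-facing prompt. The condition labels and prompt strings below are illustrative assumptions, not text from the disclosure:

```python
# Hypothetical labels and prompts for the conditions illustrated by
# error indications 3814A and 3814C-3814I.
ERROR_PROMPTS = {
    "face_too_far":    "Move your face closer",                # 3814A
    "face_too_close":  "Move your face farther away",          # 3814C
    "sensor_covered":  "Uncover the camera",                   # 3814D
    "no_attention":    "Look at the screen",                   # 3814E
    "face_obstructed": "Remove anything covering your face",   # 3814F
    "out_of_view":     "Move your face into view",             # 3814G
    "face_turned":     "Turn your face toward the camera",     # 3814H
    "too_bright":      "Find a place with less light",         # 3814I
}

def prompt_for(condition: str) -> str:
    """Return the user-facing prompt for a detected error condition."""
    return ERROR_PROMPTS.get(condition, "Authentication not available")
```

Each prompt both names the cause of the error condition and tells the user what action would correct it, mirroring the two roles the indications play in the figures.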
39A-39B are flow diagrams illustrating methods for providing an indication of an error condition during biometric authentication, according to some examples. The method 3900 is performed at an electronic device (e.g., 100, 300, 500, 3800) having a display (e.g., 3802) and one or more input devices (e.g., an accelerometer (e.g., 168), a touchscreen of the display (e.g., 3802)). In some examples, the electronic device includes one or more biometric sensors (e.g., fingerprint sensors, contactless biometric sensors (e.g., biometric sensors that do not require physical contact, such as thermal or optical facial recognition sensors), iris scanners). In some examples, the one or more biometric sensors include one or more cameras. Some operations in method 3900 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 3900 provides an intuitive way for providing an indication of an error condition during biometric authentication. The method reduces the cognitive burden of the user in biometric authentication, thereby creating a more efficient human-computer interface. For battery-driven computing devices, enabling a user to perform biometric authentication more quickly and efficiently conserves power and increases the interval between battery charges.
An electronic device (e.g., 100, 300, 500, 3800) receives (3902) a request to perform an operation requiring authentication (e.g., biometric authentication) via one or more input devices (e.g., an accelerometer (e.g., 168), a touchscreen of a display (e.g., 3802)). In some examples, the request to perform the operation requiring authentication includes a request to unlock the device (e.g., a swipe at a predefined location). In some examples, the request is triggered by lifting the device from a substantially horizontal position.
In response to (3904) receiving the request to perform the operation requiring authentication (e.g., biometric authentication) and in accordance with (3906) a determination that the authentication (e.g., biometric authentication) is successful, the electronic device performs the operation. In some examples, authentication is successful when the user input (e.g., data obtained from one or more biometric sensors corresponding to a biometric feature of the user (e.g., face, finger), or a passcode) corresponds to (e.g., matches) an authorized credential (e.g., an enrolled fingerprint, face, or passcode). In some examples, the user input corresponds to an authorized credential when the user input matches the authorized credential.
In response to (3904) receiving the request to perform the operation requiring authentication (e.g., biometric authentication) and in accordance with (3908) a determination that the authentication (e.g., biometric authentication) is unsuccessful and that a set of error condition criteria is satisfied (e.g., an error condition exists), the electronic device (e.g., 100, 300, 500, 3800) displays (3910) an indication (e.g., 3814A-I) of the error condition on the display (e.g., 3802) and forgoes (3916) performing the operation. The indication includes (3912) information about a cause of the error condition. In some examples, authentication is unsuccessful when the user input (e.g., data obtained from one or more biometric sensors corresponding to a biometric feature of the user (e.g., face, finger), or a passcode) does not correspond to (e.g., match) an authorized credential (e.g., an enrolled fingerprint, face, or passcode). In some examples, the user input does not correspond to an authorized credential when the user input does not match the authorized credential. In some examples, the set of error condition criteria includes only one criterion. Displaying an indication of the error condition provides feedback to the user regarding the current state of the device (e.g., that the error condition prevents successful biometric authentication) and prompts the user to take further action to correct the error condition. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
Furthermore, forgoing performance of the operation when biometric authentication fails and an error condition is detected enhances security and reduces instances of multiple resource-intensive retries of biometric authentication that are likely to fail due to the error condition. Providing improved security enhances the operability of the device and makes the user-device interface more efficient (e.g., by limiting unauthorized access), which in turn reduces power usage and extends the battery life of the device by limiting the performance of restricted operations.
In some examples, in response to (3904) receiving the request to perform the operation requiring authentication and in accordance with (3926) a determination that the authentication (e.g., biometric authentication) is unsuccessful and the set of error condition criteria is not satisfied, the electronic device (e.g., 100, 300, 500, 3800) forgoes (3928) displaying on the display (e.g., 3802) an indication of the error condition and forgoes (3930) performing the operation.
In some examples, the indication of the error condition (e.g., 3814A-I) includes (3914) an indication (e.g., a visual indication (e.g., a graphic or text)) of a user action that may be performed to correct the error condition (e.g., for a subsequent authentication attempt). In some examples, the indication of the user action may indicate how to correct the error condition for a subsequent authentication attempt. Displaying an indication of a user action that may be performed to correct the error condition provides feedback to the user as to what action to take so that the user may be biometrically authenticated in a subsequent authentication attempt. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the indicator is not displayed during biometric authentication.
In some examples, the indication of the error condition (e.g., 3814A-I) includes information about the cause of the error condition (e.g., a visual indication (e.g., a graphic or text) of a user action and/or a device state). Displaying an indication of the cause of the error condition provides feedback to the user as to what action to take so that the user can be biometrically authenticated in a subsequent authentication attempt. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the indicator is not displayed during biometric authentication.
In some examples, the set of error condition criteria includes a requirement to be met when a biometric feature (e.g., fingerprint, face) of a first type (e.g., a type corresponding to an authorized biometric feature) is detected using one or more biometric sensors (e.g., 3803) of the electronic device. In some examples, if a potentially valid biometric feature is not detected (e.g., indicating that the user is not currently engaged with the device), an indication of the error condition is not displayed (e.g., 3814A-I). Forgoing the display of the indication of the error condition when the biometric characteristic is not detected prevents possible confusion of the user, as the user may not intend to perform biometric authentication if the biometric characteristic is not detected. Thus, forgoing display of the indication in this case makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, in accordance with a determination that the authentication (e.g., biometric authentication) is successful, the electronic device (e.g., 100, 300, 500, 3800) foregoes displaying an indication of the error condition (e.g., 3814A-I) on the display (e.g., 3802).
In some examples, after displaying the indication of the error condition (e.g., 3814A-I) and in accordance with a determination that the set of error condition criteria continues to be satisfied, the electronic device (e.g., 100, 300, 500, 3800) forgoes (3918) attempting (and optionally disables further attempts of) biometric authentication on the electronic device (e.g., the biometric authentication function cannot be used on the device while the set of error condition criteria is satisfied). In some examples, after displaying the indication of the error condition and in accordance with a determination that the set of error condition criteria is no longer satisfied (e.g., the error condition is no longer present (e.g., has been corrected (e.g., due to user action to correct the error condition))), the electronic device enables (3922) retrying of the biometric authentication on the electronic device. Automatically retrying biometric authentication when the set of error condition criteria is no longer satisfied allows a user to quickly attempt to biometrically authenticate himself without requiring the user to explicitly request biometric authentication. Performing an operation automatically when a set of conditions has been met without requiring further user input enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
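The forgo-then-retry behavior just described can be sketched as a polling loop; an actual device would be event-driven rather than polled, and all names here are hypothetical:

```python
def retry_when_clear(check_error, authenticate, max_polls=10):
    """While the error condition persists, forgo retrying (block 3918).
    Once the criteria are no longer satisfied, enable and automatically
    retry authentication (blocks 3922-3924)."""
    for _ in range(max_polls):
        if check_error() is None:  # criteria no longer satisfied
            return authenticate()  # automatic retry
    return False  # condition never cleared within the polling budget
```

Here `check_error` returns the current error-condition label or None, and `authenticate` returns whether the retried biometric authentication succeeded.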
In some examples, after displaying the indication of the error condition and in response to determining that the set of error condition criteria is no longer satisfied, the electronic device retries (3924) the authentication (e.g., biometric authentication) (e.g., automatically retries the authentication). In some examples, retrying authentication includes attempting to match biometric information obtained by the one or more biometric sensors with authorized credentials (e.g., stored data that has been authorized for use in biometric authentication). In some examples, the determination that the error condition criteria are no longer satisfied occurs after (or in response to) receiving input that corrects the error condition. In some examples, the retry of authentication occurs (or only occurs) in accordance with a determination that the error condition criteria are no longer satisfied due to detection of a user input that causes the criteria to no longer be satisfied.
In some examples, after determining that the set of error condition criteria is no longer satisfied (e.g., detecting that the error condition has been corrected), the electronic device (e.g., 100, 300, 500, 3800) receives, via the one or more input devices, an input (e.g., 3824, 3826) corresponding to a request to retry authentication. In some examples, the input is a touch gesture input (e.g., a tap, a swipe (e.g., a swipe up)) or an activation of a hardware button (e.g., a power button). In some examples, the electronic device retries authentication (e.g., biometric authentication) in response to receiving the input corresponding to the request to retry authentication. In some examples, retrying authentication includes attempting to match biometric information obtained by the one or more biometric sensors with authorized credentials (e.g., stored data that has been authorized for use in biometric authentication). In some examples, retrying authentication includes using the one or more biometric sensors to obtain data about a biometric feature (e.g., face, fingerprint) of the user.
In some examples, displaying the indication of the error condition (e.g., 3814A-I) includes displaying the indication with an animation (e.g., flashing) indicating that an authentication attempt is being made. In some examples, making the authentication attempt includes attempting to detect biometric information using the one or more biometric sensors. Displaying a flashing animation indicating that an authentication attempt is being made provides feedback to the user regarding the current state of the device and indicates that no further action is required at this time. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the indicator is not displayed during biometric authentication.
In some examples, the electronic device (e.g., 100, 300, 500, 3800) performs authentication after (or in response to) receiving the request to perform the operation requiring authentication and before displaying the indication of the error condition (e.g., 3814A-I). In some examples, while performing authentication, the electronic device displays a first indication (e.g., 3810, 3814A-I) (e.g., a ring that rotates around a sphere, or a flashing user interface object, wherein the user interface object includes an indication of the error condition) on the display (e.g., 3802) indicating that the electronic device is using the one or more biometric sensors (e.g., 3803) of the electronic device to obtain information about the biometric characteristic. In some examples, displaying the indication of the error condition includes replacing the display of the first indication with the display of the indication of the error condition. Displaying an indication that biometric authentication is occurring provides feedback to the user about the current state of the device (e.g., that biometric authentication is being performed) and indicates that the user need not take any action at this time. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, while performing authentication, an electronic device (e.g., 100, 300, 500, 3800) displays a first lock icon (e.g., 3806) (e.g., an icon indicating a locked state of the electronic device) and a first animation transitioning from the first lock icon to the first indication on a display (e.g., 3802). In some examples, after displaying the indication of the error condition (e.g., and in accordance with a determination that the authentication is successful) and after displaying the first animation, the electronic device displays a second animation on the display (e.g., 3802) that transitions from the indication of the error condition to an unlock icon (e.g., 3816) (e.g., an icon indicating an unlocked state of the electronic device). In some examples, the first animation and the second animation display a morphing from one object to the next object. In some examples, the second animation includes displaying the first lock icon after the indication of the error condition and before the unlock icon.
In some examples, after displaying the indication of the error condition, the electronic device displays an animation on the display (e.g., 3802) that transitions from the indication of the error condition to a second lock icon (e.g., 3806) (e.g., an icon indicating a locked state of the electronic device), or from a second indication (e.g., 3810, 3814A-I) (e.g., a ring that rotates around a sphere) to the second lock icon, the second indication indicating that the electronic device used the one or more biometric sensors of the electronic device to obtain information about the biometric characteristic. In some examples, the second lock icon is the first lock icon. In some examples, the second indication is the first indication.
In some examples, after retrying authentication and after displaying the indication of the error condition and in accordance with a determination that the error condition is not present, the electronic device displays a third indication (e.g., 3810, 3814A-I) (e.g., a ring that rotates around a sphere, a flashing user interface object, wherein the user interface object includes an indication of the error condition) on the display indicating that the electronic device is using one or more biometric sensors of the electronic device to obtain information about the biometric characteristic. In some examples, the third indication is the first indication.
In some examples, prior to displaying the indication of the error condition, the electronic device (e.g., 100, 300, 500, 3800) displays, on a display (e.g., 3802), a third lock icon (e.g., 3806) at a location on the display (e.g., an icon indicating a locked state of the electronic device). In some examples, the indication of the error condition (e.g., 3814A-I) is displayed proximate (e.g., next to, adjacent to, at, within a predetermined distance of) a location on the display. In some examples, the third lock icon is the first lock icon and/or the second lock icon.
In some examples, when the electronic device (e.g., 100, 300, 500, 3800) is in a locked state while a request to perform an operation requiring authentication is received, and in accordance with a determination that authentication is successful, the electronic device transitions from the locked state to an unlocked state. In some examples, the operation requiring authentication transitions the electronic device from the locked state to the unlocked state. In some examples, when the electronic device is in the locked state while a request to perform an operation requiring authentication is received, and in accordance with a determination that authentication is unsuccessful, the electronic device remains in the locked state. Maintaining the device in a locked state when authentication is unsuccessful enhances device security by preventing fraudulent and/or unauthorized access to the device. Improving the security of the device enhances the operability of the device by preventing unauthorized access to content and operations, and in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more efficiently.
In some examples, when the electronic device is in a locked state, while receiving a request to perform an operation requiring authentication and in accordance with a determination that authentication is unsuccessful, the electronic device (e.g., 100, 300, 500, 3800) maintains the locked state and retries authentication (e.g., biometric authentication) (e.g., automatically retries authentication). In some examples, retrying authentication includes attempting to obtain information about a biometric feature (e.g., face, fingerprint) using one or more biometric sensors of the electronic device. In some examples, retrying authentication includes attempting to match biometric information obtained by the one or more biometric sensors with authorized credentials (e.g., stored data that has been authorized for biometric authentication). In some examples, after retrying the authentication and in accordance with a determination that the authentication resulting from retrying the authentication is successful, the electronic device transitions from the locked state to the unlocked state. In some examples, after retrying the authentication and in accordance with a determination that the authentication resulting from retrying the authentication is unsuccessful, the electronic device remains in the locked state.
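The locked-state behavior described above can be sketched as a small state machine. The sketch below is illustrative only and not the patented implementation; all names (`DeviceState`, `handle_auth_request`, the callbacks) are invented, and the number of automatic retries is an assumption.

```python
from enum import Enum

class DeviceState(Enum):
    LOCKED = "locked"
    UNLOCKED = "unlocked"

def handle_auth_request(state, capture_biometric, matches_credentials, max_retries=1):
    """Attempt biometric authentication for an operation that requires it.

    On failure the device stays locked and automatically retries, as
    described above. `capture_biometric` stands in for the one or more
    biometric sensors; `matches_credentials` stands in for comparison
    against stored authorized credentials.
    """
    if state is not DeviceState.LOCKED:
        return state
    for _ in range(1 + max_retries):      # initial attempt plus automatic retries
        sample = capture_biometric()
        if sample is not None and matches_credentials(sample):
            return DeviceState.UNLOCKED   # successful authentication unlocks
    return DeviceState.LOCKED             # unsuccessful: remain in the locked state

# Usage: a sample that matches only on the automatic retry still unlocks.
attempts = iter([None, "face-sample"])
result = handle_auth_request(
    DeviceState.LOCKED,
    capture_biometric=lambda: next(attempts),
    matches_credentials=lambda s: s == "face-sample",
)
```

The point of the sketch is that the retry happens without a new user request: the device remains locked and re-attempts capture and matching on its own.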
In some examples, an electronic device (e.g., 100, 300, 500, 3800) attempts authentication (e.g., biometric authentication) after (or in response to) receiving a request to perform an operation requiring authentication. In some examples, when attempting authentication, the electronic device displays a third indication (e.g., 3810, 3814A-I) (e.g., a ring that rotates around a sphere) on the display (e.g., 3802) indicating that the electronic device is using one or more biometric sensors of the electronic device to obtain information about biometric features (e.g., face, fingerprint). In some examples, the indication is a scan animation. In some examples, the third indication is the first indication and/or the second indication. In some examples, when retrying authentication, the electronic device maintains display of the third indication on the display (e.g., 3802).
In some examples, in accordance with a determination that authentication resulting from retrying authentication is unsuccessful, the electronic device displays on a display (e.g., 3802) an animation in which a lock icon (e.g., 3806) (e.g., an icon indicating a locked state of the electronic device) alternates between a first position and a second position, the second position different from the first position. In some examples, the animation with the lock icon is an animation of the lock icon shaking (e.g., rotating back and forth from side to side). In some examples, the electronic device displays an animation involving the lock icon to indicate that the biometric authentication has failed. In some examples, a tactile output is provided in conjunction with the shaking lock icon. In some examples, no tactile output is provided. In some examples, an electronic device (e.g., 100, 300, 500, 3800) maintains a locked state of the electronic device in accordance with a determination that biometric information captured using the one or more biometric sensors does not correspond to or match authorized credentials. Displaying an animation of the lock icon shaking provides feedback to the user about the current state of the device (e.g., biometric authentication failed) and prompts the user to take further action. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the electronic device (e.g., 100, 300, 500, 3800) includes a biometric sensor (e.g., 3803), and the set of error condition criteria includes one or more of the following error condition criteria:
the distance of the biometric feature from the biometric sensor exceeds a first predetermined threshold distance (e.g., the biometric feature (e.g., the face) is too far from the biometric sensor) or exceeds a maximum value of a range of distances (e.g., 3812). In some examples, exceeding a first predetermined threshold or maximum value of a range of distances is highly correlated with degraded or reduced accuracy of information about biometric features obtained by a biometric sensor. In some examples, the user may correct this error condition by moving the user's face closer to the biometric sensor.
The distance of the biometric feature from the biometric sensor is below a second predetermined threshold distance (e.g., the biometric feature (e.g., the face) is too close to the biometric sensor) or falls below a minimum value of the distance range (e.g., 3812). In some examples, a drop below a second predetermined threshold or minimum value of the range of distances is highly correlated with degraded or reduced accuracy of the information about the biometric characteristic obtained by the biometric sensor. In some examples, the user may correct this error condition by moving the user's face farther away from the biometric sensor.
A biometric sensor (e.g., 3803) is occluded (e.g., partially occluded, fully occluded, occluded to an extent sufficient to inhibit operation of the sensor) (e.g., occluded by a portion of a user (e.g., a hand) while interacting with the electronic device). In some examples, the user may correct this error condition by moving the user's hand away from the biometric sensor.
A sub-portion of the detected biometric feature (e.g., eyes of the detected face) is not oriented toward the biometric sensor (e.g., one or more eyes are not focused on the electronic device (e.g., biometric sensor)). In some examples, the user may correct this error condition by opening the user's eyes or viewing the electronic device (e.g., biometric sensor).
At least a portion of the detected biometric feature is occluded (e.g., partially occluded, fully occluded, occluded to a degree sufficient to obtain incomplete information about the biometric feature). In some examples, the user may correct this error condition by removing accessories (e.g., sunglasses) or clothing items (e.g., scarves, hats) that block the user's face.
No biometric features are detected within the field of view (e.g., 3828) of the biometric sensor.
A gesture (e.g., orientation relative to the biometric sensor) of the detected biometric feature is outside a threshold range (e.g., the biometric feature (e.g., face) is turned away from the biometric sensor). In some examples, a gesture outside the threshold range is highly correlated with degraded or reduced accuracy of information about the biometric characteristic obtained by the biometric sensor. In some examples, the user may correct the error condition by rotating the user's face toward the electronic device (e.g., biometric sensor).
The electronic device detects an amount of light (e.g., ambient light) that exceeds a predetermined light threshold (e.g., via one or more ambient light sensors) (e.g., exceeding the predetermined light threshold is highly correlated with degraded or reduced accuracy of information about biometric features obtained by the biometric sensor). In some examples, the user may correct the error condition by turning the user's back toward the sun in order to reduce the amount of light detected by the electronic device, or by moving to a new location (e.g., indoors) with less ambient light.
In some examples, the set of error condition criteria may be a subset of the error conditions listed above. For example, a first subset may include one or more error condition criteria selected from the group consisting of: the distance of the biometric feature exceeds a first predetermined threshold distance, the distance of the biometric feature is below a second predetermined threshold distance, the biometric feature is outside the field of view of the biometric sensor, and the gesture of the biometric feature is outside a threshold range. The first subset focuses on guiding the user to correct error conditions related to the positioning and/or orientation of the face. As another example, a second subset may include one or more error condition criteria selected from the group consisting of: the biometric sensor is occluded, and no biometric feature is detected within the field of view of the biometric sensor. The second subset focuses on guiding the user to correct error conditions where the biometric sensor cannot obtain any information about the biometric characteristic of the user. As another example, a third subset may include one or more error condition criteria selected from the group consisting of: the detected gesture of the biometric feature is outside a threshold range, and the biometric sensor is occluded. The third subset focuses on error conditions that may occur for a device of a certain form factor/size (e.g., a tablet device (e.g., iPad)).
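The error condition criteria enumerated above can be collected into a single check. The sketch below is a hypothetical illustration: the field names, numeric thresholds, and ordering of checks are invented, since the patent specifies the criteria only qualitatively.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorReading:
    # Hypothetical fields summarizing what the biometric sensors report.
    feature_detected: bool = True      # a face is in the field of view
    distance_cm: float = 30.0          # distance of the feature from the sensor
    sensor_occluded: bool = False      # e.g., covered by the user's hand
    eyes_toward_sensor: bool = True    # sub-portion (eyes) oriented at the sensor
    feature_occluded: bool = False     # e.g., face blocked by sunglasses or a scarf
    gesture_in_range: bool = True      # orientation within the threshold range
    ambient_light: float = 500.0       # detected light level (arbitrary units)

# Illustrative thresholds; the document gives no numeric values.
MIN_DISTANCE_CM, MAX_DISTANCE_CM = 20.0, 60.0
MAX_AMBIENT_LIGHT = 10_000.0

def detect_error_condition(r: SensorReading) -> Optional[str]:
    """Return the first matching error condition, or None if none apply."""
    if r.sensor_occluded:
        return "sensor occluded"
    if not r.feature_detected:
        return "no biometric feature in field of view"
    if r.distance_cm > MAX_DISTANCE_CM:
        return "too far from sensor"
    if r.distance_cm < MIN_DISTANCE_CM:
        return "too close to sensor"
    if not r.eyes_toward_sensor:
        return "eyes not oriented toward sensor"
    if r.feature_occluded:
        return "biometric feature partially occluded"
    if not r.gesture_in_range:
        return "gesture outside threshold range"
    if r.ambient_light > MAX_AMBIENT_LIGHT:
        return "too much ambient light"
    return None
```

A subset of criteria, as in the paragraph above, would correspond to keeping only some of these branches for a given device form factor.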
In some examples, an electronic device (e.g., 100, 300, 500, 3800) includes a biometric sensor (e.g., 3803) at a portion (e.g., a location) of the electronic device (e.g., a portion not on the display). In some examples, in response to a request to perform an operation requiring authentication, the electronic device displays on a display (e.g., 3802) a progress indicator (e.g., 3814A-I) proximate to (e.g., adjacent, next to, within a predetermined distance of) the portion of the electronic device, the progress indicator including an indication of an error condition. Displaying the progress indicator in proximity to the biometric sensor provides feedback to the user regarding the degree of association of the biometric sensor with the process occurring at the device (e.g., attempted authentication). In particular, the user is aware of the biometric sensor during biometric authentication, such that the user is less likely to perform actions that interfere with the biometric sensor, or alternatively prompt the user to take corrective action. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, in accordance with a determination that biometric authentication is not currently enabled on the electronic device, the indication of the error condition (e.g., 3814B) includes an indication that biometric authentication is not currently enabled on the electronic device. In some examples, biometric authentication may become unavailable (or not enabled on the electronic device) when one or more of the following conditions are met: the electronic device has not been successfully authenticated since being powered on or restarted; the electronic device has not been unlocked for more than a predetermined amount of time (e.g., 48 hours); a passcode has not been used to unlock the device for more than a predetermined amount of time (e.g., 156 hours); biometric authentication using a biometric feature (e.g., face, fingerprint) has not been used to unlock the device for more than a predetermined amount of time (e.g., 4 hours); the electronic device has received a remote lock command; biometric authentication has failed more than a predetermined number of times (e.g., 5, 10, 15) since the last successful authentication with the device; the electronic device has received a power-off and/or emergency SOS command; or the electronic device has detected a user request to disable biometric authentication. Displaying an indication that biometric authentication is not currently enabled provides feedback to the user on the current state of the device and prompts the user to seek alternative methods of authentication. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
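The availability conditions listed above can be expressed as one predicate. This is a sketch only; the flag names and the choice of a single failure limit are assumptions, and the example hour values mirror the ones given parenthetically above.

```python
from dataclasses import dataclass

HOURS = 3600  # seconds per hour

@dataclass
class SecurityStatus:
    # Hypothetical flags/counters mirroring the conditions listed above.
    authenticated_since_restart: bool = True
    seconds_since_unlock: float = 0.0
    seconds_since_passcode_unlock: float = 0.0
    seconds_since_biometric_unlock: float = 0.0
    remote_lock_received: bool = False
    failed_attempts_since_success: int = 0
    shutdown_or_sos_requested: bool = False
    user_disabled_biometrics: bool = False

def biometric_auth_enabled(s: SecurityStatus, max_failures: int = 5) -> bool:
    """Illustrative check of whether biometric authentication is available.

    Returns False if any one of the disabling conditions is met.
    """
    return not (
        not s.authenticated_since_restart
        or s.seconds_since_unlock > 48 * HOURS
        or s.seconds_since_passcode_unlock > 156 * HOURS
        or s.seconds_since_biometric_unlock > 4 * HOURS
        or s.remote_lock_received
        or s.failed_attempts_since_success > max_failures
        or s.shutdown_or_sos_requested
        or s.user_disabled_biometrics
    )
```

When this predicate is false, the device would show the "not currently enabled" indication and fall back to the credential input user interface described next.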
In some examples, the indication that biometric authentication is not currently enabled includes an affordance (e.g., 3814B) (e.g., the indication is an affordance). In some examples, an electronic device (e.g., 100, 300, 500, 3800) receives an input (e.g., 3820) corresponding to an affordance, and in response to receiving the input corresponding to the affordance, the electronic device (e.g., 100, 300, 500, 3800) displays a credential input user interface (e.g., 3822A) having a plurality of character input keys on a display (e.g., 3802). In some examples, the credential input user interface includes a virtual keypad or a virtual keyboard. In some examples, the virtual keypad or virtual keyboard includes a plurality of character input keys.
In some examples, an electronic device (e.g., 100, 300, 500, 3800) detects a condition that triggers an attempt to authenticate (e.g., biometric authentication). In some examples, the request to perform the operation requiring authentication includes a request to unlock the device (e.g., a swipe at a predefined location). In some examples, in response to detecting a condition that triggers an attempted authentication (e.g., biometric authentication) and in accordance with a determination that the condition corresponds to an alert (e.g., 3808) generated by the device without user input to the device (e.g., based on satisfaction of a criterion other than detection of the user input), while the biometric feature is available for detection by one or more biometric sensors (e.g., a face is detected in a field of view of one or more face detection sensors such as a depth camera), the electronic device displays a fifth indication (e.g., 3810) (e.g., a ring that rotates around a sphere) indicating that the electronic device is using one or more biometric sensors of the electronic device to obtain information about the biometric feature. In some examples, in accordance with a determination that the condition corresponds to an alert generated by the device without user input to the device (e.g., based on satisfaction of a criterion rather than detecting user input) while the biometric feature is not detectable by the one or more biometric sensors (e.g., no face is detected in a field of view of the one or more face detection sensors, such as a depth camera), the electronic device forgoes displaying a fifth indication (e.g., a ring that rotates around a sphere) indicating that the electronic device is using the one or more biometric sensors of the electronic device to obtain information about the biometric feature. 
In some examples, in accordance with a determination that the condition corresponds to a user input to the device (e.g., a request unrelated to the notification; a request in the form of a touch gesture input (e.g., a tap, a swipe (e.g., 3824) (e.g., a swipe up)), an activation of a hardware button (e.g., a power button), or sensor data indicating a movement (e.g., a lift) of the device), the electronic device displays the fifth indication that the electronic device is using one or more biometric sensors of the electronic device to obtain information about the biometric feature (e.g., regardless of whether the biometric feature is available for detection by the one or more biometric sensors). Forgoing display of the indication when no face is detected prevents possible confusion of the user, as the user may not intend to initiate biometric authentication if no biometric feature is detected. Thus, forgoing display of the indication in this case makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
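The branching in the two paragraphs above reduces to a small decision function. This is a sketch; the trigger labels are invented names for the two cases described (a device-generated alert versus an explicit user input).

```python
def should_show_scanning_indicator(trigger: str, face_detectable: bool) -> bool:
    """Decide whether to show the 'obtaining biometric information' indication.

    Per the behavior above: for a device-generated alert, show the
    indication only when a biometric feature is available for detection;
    for an explicit user input (tap, swipe, hardware button, device lift),
    show it regardless of whether a feature is detectable.
    """
    if trigger == "device_alert":
        return face_detectable
    if trigger == "user_input":
        return True
    raise ValueError(f"unknown trigger: {trigger}")
```

The asymmetry encodes intent: a user input is itself evidence that authentication is wanted, while an alert arriving with no face in view probably does not reflect a user's intent to authenticate.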
Note that the details of the processes described above with respect to method 3900 (e.g., fig. 39A-39B) can also be applied in a similar manner to the methods described below/above. For example, method 3700, method 4100, and/or method 4300 optionally include one or more features of the various methods described above with reference to method 3900. For example, an error indication (e.g., 3814A-I) as described with respect to method 3900 may be used to provide an indication of an error condition during biometric authentication performed in the process described with respect to method 3700 and method 4100. For the sake of brevity, these details are not repeated below.
Fig. 40A-40U illustrate example user interfaces for providing an indication of a biometric sensor during biometric authentication, according to some examples. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 41A through 41C.
Fig. 40A shows an electronic device 4000 (e.g., portable multifunction device 100, device 300, device 500). In the example shown in figs. 40A to 40U, the electronic device 4000 is a tablet computer. In other examples, the electronic device 4000 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 4000 includes a display 4002, one or more input devices (e.g., a touch screen of the display 4002, buttons 4004, and a microphone), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In fig. 40A, the electronic device includes a biometric sensor 4003. In some examples, the biometric sensor is one or more biometric sensors, which may include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, biometric sensor 4003 is biometric sensor 703. In some examples, the one or more biometric sensors include one or more fingerprint sensors (e.g., fingerprint sensors integrated into buttons). In some examples, the device further includes a light emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. Optionally, the light emitting device is for illuminating the biometric feature (e.g., face) during capture of biometric data of the biometric feature by the one or more biometric sensors.
At fig. 40A, a user wishes to purchase goods using payment information stored on the electronic device 4000. As shown in fig. 40A, the electronic device 4000 is in a split-screen (e.g., multitasking) mode. In the split-screen mode, the electronic device 4000 simultaneously displays an application store user interface (UI) 4006 in a left area 4007 of the display 4002 and a browser UI 4008 in a right area 4009 of the display 4002. While simultaneously displaying the application store UI 4006 and the browser UI 4008, the electronic device 4000 receives an input 4010 at the purchase affordance 4012.
At figure 40B, in response to receiving input 4010 at purchase affordance 4012, electronic device 4000 swaps applications displayed in left region 4007 and right region 4009 of display 4002. Specifically, the electronic apparatus 4000 displays a browser UI 4008 in a left area 4007 and an application store UI 4006 in a right area 4009. The electronic device 4000 exchanges applications in order to place the application associated with the purchased good in an area closer to the biometric sensor 4003. By placing the browser UI 4008 in the left area 4007, the electronic device 4000 provides the user with an indication of the location of the biometric sensor 4003, which is used to authenticate the user before authorizing payment for the purchase of the goods. As shown in fig. 40B, the exchange application also places the application associated with the purchased good in an area closer to button 4004. In some examples, when the button 4004 and the biometric sensor 4003 are not in close proximity (e.g., on the same side), the electronic device 4000 exchanges applications as necessary to place the application associated with the purchased good in an area closer to the biometric sensor 4003. In some examples, when the button 4004 and biometric sensor 4003 are not in close proximity (e.g., on the same side), the electronic device 4000 exchanges applications as necessary to place the application associated with the purchased good in an area closer to the button 4004.
Further, as shown in fig. 40B, in response to receiving the input 4010 at the purchase affordance 4012, the electronic device 4000 dims the browser UI 4008 while dimming the application store UI 4006 to a greater degree than the browser UI 4008. By making the browser UI 4008 less dark than the application store UI 4006, the electronic device 4000 indicates to the user which application is associated with the payment page interface 4014 and the good that the user wishes to purchase.
Further, in response to receiving the input 4010 at the purchase affordance 4012, the electronic device 4000 concurrently displays a payment page interface 4014 having information about the goods purchased and a prompt 4016 to prompt the user to double-click the button 4004 to initiate a process for authorizing payment of the goods. Further, in response to receiving input 4010 at purchase affordance 4012, electronic device 4000 displays dynamic indication 4018 to emphasize the location of button 4004. Upon display of the payment page interface 4014, the electronic device receives an input 4020 at button 4004 (e.g., a double press of button 4004). In some examples, the prompt 4016 includes some or all of the features of the prompt 2416. In some examples, dynamic indication 4018 includes some or all of the features of dynamic indication 2418. In some examples, payment page interface 4014 includes a name of an application to which it corresponds (e.g., a name of an application from which the user initiated a process for authorizing payments).
At fig. 40C, in response to receiving input 4020 at button 4004, electronic device 4000 initiates a process for authorizing payment of goods. Authorizing payment for goods requires successful authentication of the user. Thus, in response to receiving the input 4020, the electronic device 4000 initiates biometric authentication using the biometric sensor 4003. After initiating biometric authentication, the electronic device 4000 displays a facial symbol 4022 that provides an indication that the electronic device 4000 is attempting to authenticate the user in a biometric manner (e.g., is attempting to obtain biometric information about the user using the biometric sensor 4003). In some examples, facial symbol 4022 includes some or all of the features of symbol 2468 of fig. 24F. In some examples, in response to receiving the input 4020 at the button 4004, the electronic device displays an animation of the facial symbol 4022 moving from the location of the prompt 4016 to the location of the facial symbol 4022, as shown in fig. 40C. In some examples, the animation causes the facial symbol 4022 to appear to slide out of the prompt 4016.
At fig. 40D, after displaying the facial symbol 4022, the electronic device transitions to displaying an authentication symbol 4024 that provides an indication that the electronic device 4000 is attempting to authenticate the user biometrically (e.g., continuing to attempt to obtain biometric information, attempting to match the obtained information with stored authorized credentials). The authentication symbol 4024 includes a plurality of rings rotating around a sphere. In some examples, the authentication symbol 4024 includes some or all of the features of one or more of the rings 2470 of fig. 24G.
Upon displaying the authentication symbol 4024, the electronic device 4000 detects that an error condition exists (e.g., a state in which the biometric sensor 4003 is prevented from obtaining sufficient information about the user's face). Specifically, the electronic device 4000 detects that the biometric sensor 4003 is covered by a physical object (e.g., a user's hand). In some examples, the electronic device 4000 does not detect an error condition and is able to obtain sufficient information about the user's face. In some examples, after obtaining sufficient information about the user's face and while displaying the authentication symbol 4024, the electronic device 4000 determines whether the obtained information satisfies biometric authentication criteria (e.g., determines whether the obtained biometric information matches a biometric template associated with the user (e.g., a stored authorized credential) within a threshold). In some examples, upon determining that the biometric authentication is successful (e.g., meets biometric authentication criteria), the electronic device 4000 transitions to the unlocked state.
At fig. 40E, in response to detecting that an error condition exists, electronic device 4000 displays an error indication 4026 at a location at the top of display 4002 (e.g., the top with respect to the ground or with respect to the user). Error indication 4026 provides an indication of the error condition currently present. Further, in response to detecting the presence of an error condition, the electronic device 4000 displays an error icon 4028 at a location of the display 4002 adjacent to the biometric sensor 4003, thereby providing an indication of the location of the biometric sensor 4003. By providing an indication of the location of the biometric sensor 4003, the error icon 4028 suggests to the user the cause of the error condition. In some examples, in response to detecting the presence of the error condition, the electronic device 4000 displays an error indication 4026 at a location adjacent to the biometric sensor 4003. In some examples, the error indication 4026 includes some or all of the features of the error indication 3814A, including a blinking effect.
At fig. 40F, further in response to detecting that an error condition exists, electronic device 4000 displays an animation of payment page interface 4014 moving from its initial position in fig. 40E to a position in fig. 40F that is closer to biometric sensor 4003. By moving the payment page interface toward the biometric sensor 4003, the electronic device 4000 indicates to the user the presence of the error icon 4028 (thereby suggesting to the user the cause of the error condition) in addition to indicating the location of the biometric sensor 4003.
In some examples, error icons 4028 are displayed at different locations of display 4002 depending on the positioning of the user's hand on display 4002. As shown in fig. 40F, the user's hand covers a portion of the display 4002 adjacent to the biometric sensor 4003. When the user's hand is in contact with the display 4002, the electronic apparatus 4000 detects an input due to the contact of the user's hand. In response to detecting the input, the electronic device 4000 displays an error icon 4028 at a location where the input is not detected. As another example, in fig. 40G, the user's hand covers less of display 4002 than the user's hand does in fig. 40F. In some examples, in response to detecting input by the user's hand in fig. 40G, electronic device 4000 displays error icon 4028 at a location different from the location in fig. 40F, where the location in fig. 40G is closer to biometric sensor 4003 than the location in fig. 40F. As another example, in fig. 40H, the user's hand covers most of the upper left corner of the display 4002. In some examples, in response to detecting input by the user's hand in fig. 40H, electronic device 4000 displays error icon 4028 at a location different from the location in fig. 40F-40G. In particular, in some examples, the electronic device 4000 displays the error icon 4028 at a location proximate to (or substantially next to) the biometric sensor 4003 rather than at a location where input by the user's hand is detected.
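The placement behavior described above (showing the error icon as near the biometric sensor as possible while avoiding regions covered by the user's hand) can be sketched in one dimension. This is purely illustrative; the coordinate model, function name, and return convention are invented.

```python
def place_error_icon(sensor_x, touched_ranges, display_width=100.0):
    """Pick an x-coordinate for the error icon near the biometric sensor.

    Scans candidate positions ordered by distance from the sensor and
    returns the first one not covered by a detected touch input, so the
    icon lands as close to the sensor as the user's hand allows.
    `touched_ranges` is a list of (lo, hi) intervals where touch
    contact is detected.
    """
    candidates = sorted(range(int(display_width)), key=lambda x: abs(x - sensor_x))
    for x in candidates:
        if not any(lo <= x <= hi for lo, hi in touched_ranges):
            return x
    return None  # entire display covered; nowhere to place the icon
```

With the sensor at one edge and the hand covering the adjacent region, the icon is pushed just past the hand, matching the progression shown in figs. 40F-40H.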
At fig. 40I, the user removes her hand so that it no longer covers the biometric sensor 4003. While displaying the error indication 4026 and the error icon 4028, the electronic device 4000 detects that the error condition no longer exists.
At fig. 40J, the electronic device 4000 automatically retries biometric authentication in response to detecting that the error condition no longer exists. When retrying the biometric authentication, the electronic device 4000 displays the authentication symbol 4024. Upon displaying the authentication symbol 4024, the electronic device 4000 attempts to authenticate the user in a biometric manner. In particular, the electronic device 4000 acquires information about the user's face using the biometric sensor 4003 and determines whether the biometric authentication is successful (e.g., the acquired information matches a stored authorized credential).
When retrying the biometric authentication, the electronic device 4000 determines that the biometric authentication is successful. At fig. 40K, upon determining that the biometric authentication is successful, the electronic device 4000 displays a success symbol 4030, which provides an indication that the biometric authentication is successful. In some examples, success symbol 4030 includes some or all of the features of symbol 2474.
At fig. 40L, also in response to determining that the biometric authentication is successful, electronic device 4000 displays a processing indicator 4032 that provides an indication that the payment transaction is being processed (e.g., electronic device 4000 transmits payment information (e.g., credentials) to an external device (e.g., a server) to authorize the payment). In some examples, processing indicator 4032 has some or all of the features of processing indicator 2476.
At figure 40M, upon receiving an indication that payment has been completed (e.g., authorized), electronic device 4000 displays an indication 4034 of completion, which provides an indication that payment has been completed. In some examples, completed indication 4034 has some or all of the features of completed indication 2478 of fig. 24 AR.
Fig. 40N to 40S illustrate a technique for displaying the error indication 4026 and the error icon 4028 when the error indication 4026 and the error icon 4028 are to be displayed at about the same position. At FIG. 40N, the user wishes to unlock the device to access restricted content (e.g., home screen, most recently used application). Fig. 40N shows the electronic device 4000 in a portrait orientation relative to the ground, with the user covering the biometric sensor 4003 with her hand. In addition, the electronic device 4000 displays a locked state UI 4036 that includes a lock icon 4038. The lock icon 4038 provides an indication that electronic device 4000 is in a locked state.
While displaying locked state UI 4036, electronic device 4000 receives a request to unlock the device. For example, the electronic device 4000 detects that the user lifts the device from a substantially horizontal position.
At fig. 40O, in response to receiving a request to unlock the device, the electronic device 4000 attempts to biometrically authenticate the user. In an attempt to authenticate a user in a biometric manner, the electronic device 4000 displays an authentication symbol 4024. In addition, the electronic device 4000 detects that an error condition exists (e.g., a state in which the biometric sensor 4003 is prevented from obtaining sufficient information about the face of the user) when attempting to authenticate the user in a biometric manner. Specifically, the electronic device 4000 detects that the biometric sensor 4003 is covered by a physical object (e.g., a user's hand).
At fig. 40P, in response to detecting that the error condition exists, the electronic device 4000 displays an error icon 4028 at a location of the display 4002 that is near the biometric sensor 4003 (e.g., at the top of the display 4002). Further, in response to detecting that the error condition exists, the electronic device 4000 determines that the error indication 4026 is to be displayed at substantially the same location as the error icon 4028. Upon making this determination, the electronic device 4000 does not immediately display the error indication 4026, but instead displays it as part of an animation transitioning from the error icon 4028 to the error indication 4026 to the lock icon 4038, as described below with respect to figs. 40Q-40R.
At fig. 40Q, after displaying the error icon 4028, the electronic device 4000 displays (e.g., replaces the display of the error icon 4028 with) the error indication 4026 (as discussed above), providing an indication of the cause of the error condition.
Upon displaying the error indication 4026, the user removes her hand from the biometric sensor 4003 so that it no longer covers the biometric sensor 4003. In response to detecting that the error condition no longer exists, the electronic device 4000 automatically retries the biometric authentication.
At figs. 40R-40S, upon determining via the retried biometric authentication that authentication is successful, the electronic device 4000 transitions from the locked state to the unlocked state. In particular, the electronic device 4000 displays (e.g., replaces the display of the error indication 4026 with) an animation of the lock icon 4038 transitioning to the unlock icon 4040, which provides an indication to the user that the electronic device 4000 has transitioned to the unlocked state. In some examples, instead of succeeding, the retried biometric authentication is unsuccessful. In some examples, upon determining that the authentication is unsuccessful, electronic device 4000 displays a passcode entry UI with an affordance that, when activated, triggers a retry of the biometric authentication. In some examples, when retrying biometric authentication, electronic device 4000 dims all portions of display 4002 except for a user interface associated with retrying biometric authentication.
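The overlap-aware sequencing of figs. 40P-40S can be sketched as a simple decision: when the error indication (4026) would occupy substantially the same position as the error icon (4028), the indication is shown only as the middle step of a transition animation. This is an illustrative sketch under assumed coordinates and a hypothetical tolerance; none of these names come from the patent.

```python
# Hypothetical sketch of the sequencing logic described for figs. 40P-40S.
def overlaps(a, b, tolerance=20):
    """True if two (x, y) display positions are substantially the same."""
    return abs(a[0] - b[0]) <= tolerance and abs(a[1] - b[1]) <= tolerance


def error_display_sequence(icon_pos, indication_pos):
    if overlaps(icon_pos, indication_pos):
        # Fold the indication into a single animation:
        # error icon -> error indication -> lock icon (fig. 40Q-40R).
        return ["error_icon_4028", "error_indication_4026", "lock_icon_4038"]
    # Otherwise both elements can be shown at their own positions at once.
    return ["error_icon_4028 + error_indication_4026"]
```

Folding the indication into the animation avoids two elements competing for the same region of the display.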
Fig. 40T illustrates a technique for displaying the error icon 4028 when the error icon 4028 is to be displayed at approximately the same location as one of the notifications (e.g., 4044A-D) being displayed. In some examples, the user wishes to view restricted content of one or more notifications (e.g., 4044A-D) displayed while the electronic device 4000 is in the locked state. As shown in fig. 40T, the electronic device is in a portrait orientation with the biometric sensor 4003 located near the bottom of the device, and the user is covering the biometric sensor 4003 with her hand. In some examples, in attempting to biometrically authenticate the user to access the restricted content of the notifications, electronic device 4000 detects that an error condition exists as a result of the user covering biometric sensor 4003 with her hand. In response to detecting that the error condition exists, the electronic device 4000 determines that the error icon 4028 is to be displayed at substantially the same location as one of the notifications (e.g., 4044A-D). Upon making this determination and in response to detecting that the error condition exists, electronic device 4000 displays UI element 4042 (e.g., a background) concurrently with error icon 4028 to provide a background on which the display of error icon 4028 is superimposed. As shown in fig. 40T, UI element 4042 is opaque such that the notification (e.g., 4044D) on which error icon 4028 is superimposed is not visible to the user. In some examples, UI element 4042 is transparent so that the notification on which error icon 4028 is superimposed is visible to the user.
Fig. 40T also illustrates a technique for hiding the unlock indication 4044 of fig. 40U when the error icon 4028 is to be displayed at substantially the same location as the unlock indication 4044. In some examples, the electronic device 4000 displays the unlock indication 4044, which provides an indication of an approximate location on the display 4002 from which the user can swipe up to initiate biometric authentication. In some examples, while displaying the unlock indication 4044, the electronic device 4000 detects that an error condition exists due to the user covering the biometric sensor 4003 with her hand. In some examples, in response to detecting that an error condition exists, the electronic device 4000 determines that the error icon 4028 is displayed at substantially the same location as the unlock indication 4044. In some examples, when this determination is made and in response to detecting that an error condition exists, the electronic device 4000 stops displaying the unlock indication 4044 and displays the error icon 4028 at substantially the same location as the unlock indication 4044.
Upon displaying the error icon 4028, the electronic device 4000 detects that the error condition no longer exists (e.g., because the user removed her hand from the biometric sensor 4003). As shown in fig. 40U, the user has removed her hand from the biometric sensor 4003. At fig. 40U, upon detecting that the error condition no longer exists, the electronic device 4000 stops displaying the error icon 4028 and redisplays the unlock indication 4044 at the position where it was previously displayed.
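The hide-and-restore behavior of figs. 40T-40U can be expressed as a function of the error condition and the element positions. The sketch below is hypothetical (the function name, coordinates, and tolerance are assumptions for illustration only).

```python
# Hypothetical sketch of figs. 40T-40U: hide the unlock indication (4044)
# while the error icon (4028) occupies its position, and restore it once
# the error condition clears.
def visible_elements(error_condition, icon_pos, unlock_pos, tolerance=20):
    same_spot = (abs(icon_pos[0] - unlock_pos[0]) <= tolerance and
                 abs(icon_pos[1] - unlock_pos[1]) <= tolerance)
    if error_condition and same_spot:
        return ["error_icon_4028"]          # unlock indication hidden
    if error_condition:
        return ["error_icon_4028", "unlock_indication_4044"]
    return ["unlock_indication_4044"]       # icon removed, indication restored
```

Because the function is stateless with respect to the prior frame, the unlock indication naturally reappears at its previous position as soon as the error condition clears.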
Figs. 41A-41C are flow diagrams illustrating methods for providing an indication of a biometric sensor during biometric authentication, according to some examples. Method 4100 is performed at an electronic device (e.g., 100, 300, 500, 4000) having a display (e.g., 4002) and a biometric sensor (e.g., 4003) at a first portion of the electronic device (e.g., a portion that is not part of the display) (e.g., a first biometric sensor of a device having a plurality of biometric sensors) (e.g., a fingerprint sensor, a contactless biometric sensor (e.g., a biometric sensor that does not require physical contact, such as a thermal or optical facial recognition sensor), an iris scanner). In some examples, the biometric sensor includes one or more cameras. Some operations in method 4100 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 4100 provides an intuitive way for providing an indication of a biometric sensor during biometric authentication. The method reduces the cognitive burden of the user in biometric authentication, thereby creating a more efficient human-computer interface. For battery-driven computing devices, enabling a user to perform biometric authentication more quickly and efficiently conserves power and increases the interval between battery charges.
An electronic device (e.g., 100, 300, 500, 4000) detects (4102) (e.g., in response to a request to perform an operation requiring authentication) that there is an error condition that prevents the biometric sensor (e.g., a contactless biometric sensor such as a thermal or optical facial recognition sensor) from obtaining biometric information about the device user because the biometric sensor is occluded (e.g., partially occluded, fully occluded, occluded to an extent sufficient to inhibit operation of the sensor) (e.g., occluded by a portion of the user (e.g., a hand) while interacting with the electronic device).
In response to (4104) detecting the presence of the error condition, the electronic device (e.g., 100, 300, 500, 4000) displays an error indication (e.g., 4028) (e.g., a graphical icon) on a display (e.g., 4002). In some examples, the error indication includes text (e.g., indicating that the sensor is occluded). In some examples, the error indication does not include text. The error indication is displayed (4106) at a location adjacent to the first portion of the electronic device. In some examples, the location is at or near the portion of the display closest to the location of the biometric sensor (e.g., 4003). Displaying the error indication provides feedback to the user regarding the current state of the device (e.g., the error condition prevents successful biometric authentication) and prompts the user to take further action to correct the error condition. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. Displaying the error indication in proximity to the biometric sensor provides feedback to the user regarding the biometric sensor's relevance to a process (e.g., attempted authentication) occurring at the device. In particular, the user is made aware of the biometric sensor during biometric authentication, such that the user is less likely to perform actions that interfere with the biometric sensor, or is alternatively prompted to take corrective action.
Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the error indication (e.g., 4028) includes (4108) a biometric sensor occlusion icon and a reticle, the error indication providing an indication that the biometric sensor is occluded. In some examples, the error indication is associated with the electronic device performing biometric authentication (e.g., using the biometric sensor to obtain biometric information about a biometric feature (e.g., a face, a fingerprint)). Providing an indication that the biometric sensor is occluded provides feedback to the user about the current state of the device (e.g., that the biometric sensor is occluded). Thus, providing improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide appropriate input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In accordance with a determination (4110) that a user interface of the electronic device is in a first orientation with respect to the biometric sensor, the electronic device (e.g., 100, 300, 500, 4000) displays an error indication at a first location in the user interface that is proximate to (e.g., adjacent to, next to, within a predetermined distance of) a first portion of the electronic device.
In accordance with (4112) a determination that a user interface of the electronic device is in a second orientation relative to the biometric sensor, the electronic device (e.g., 100, 300, 500, 4000) displays an error indication (e.g., 4028) at a second location in the user interface that is proximate to (e.g., adjacent to, next to, within a predetermined distance of) the first portion of the electronic device, the first orientation being different from the second orientation.
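The orientation-dependent placement of (4110)/(4112) amounts to mapping the sensor's fixed physical edge into UI coordinates for each rotation. The following sketch is illustrative only; the orientation names and coordinate scheme are assumptions, not terms from the patent.

```python
# Hypothetical sketch of (4110)/(4112): the error indication is displayed
# at whichever UI-relative location is nearest the biometric sensor for
# the current orientation of the user interface.
def error_indication_location(ui_orientation):
    # Assume the sensor sits on the physical "top" edge in portrait;
    # map that edge into UI coordinates for each rotation.
    mapping = {
        "portrait":         "top",     # first orientation -> first location
        "portrait_flipped": "bottom",  # second orientation -> second location
        "landscape_left":   "right",
        "landscape_right":  "left",
    }
    return mapping[ui_orientation]
```

Whatever the rotation, the indication stays pinned to the portion of the display nearest the sensor, which is the property the claim language describes.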
In some examples, when attempting (4114) to obtain biometric information using a biometric sensor (e.g., 4003), an electronic device (e.g., 100, 300, 500, 4000) displays (4116) a first progress indicator (e.g., 4024, 4026, 4038, 4040) on a display (e.g., 4002). In some examples, the first progress indicator provides an indication of a current state of the electronic device (e.g., a locked state, an unlocked state, performing biometric authentication, an error state, an error condition). In some examples, in accordance with (4118) a determination that a user interface (e.g., 4006, 4008) of the electronic device is in a third orientation relative to the biometric sensor, the user interface in the third orientation having a first top side, the electronic device displays the first progress indicator adjacent to (e.g., next to, within a predetermined distance of) the first top side of the user interface in the third orientation. In some examples, in accordance with (4120) a determination that the user interface of the electronic device is in a fourth orientation with respect to the biometric sensor, the user interface in the fourth orientation having a second top side, the electronic device displays the first progress indicator adjacent to (e.g., next to, within a predetermined distance of) the second top side of the user interface in the fourth orientation, the third orientation being different from the fourth orientation. In some examples, the first progress indicator is displayed on the display at a location closest or proximate to (e.g., adjacent, next to, within a predetermined distance of) the biometric sensor. Regardless of orientation, displaying the first progress indicator near the top of the display ensures that the user is more likely to notice the feedback (e.g., the progress indicator) provided to the user.
Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the indicator is not displayed during biometric authentication.
In some examples, an electronic device (e.g., 100, 300, 500, 4000) displays a second progress indicator (e.g., 4024, 4026, 4038, 4040) of the electronic device on a display (e.g., 4002). In some examples, the second progress indicator provides an indication of a current state of the electronic device (e.g., a locked state, an unlocked state, performing biometric authentication, an error state). In some examples, the first progress indicator is a second progress indicator. In some examples, the second progress indicator is an animation having a first portion (e.g., an indication (e.g., a rotating ring) that the electronic device performs biometric authentication using a biometric sensor (e.g., 4024)) and a second portion different from the first portion (e.g., an indication of an error condition or error state (e.g., 4026), an indication of a current locked or unlocked state of the electronic device (e.g., a locking icon (e.g., 4038), an unlocking icon (e.g., 4040))). In some examples, in accordance with a determination to display the second progress indicator at a location proximate to the first portion of the electronic device, the electronic device displays an error indication (e.g., 4028) as part of the animation that follows the first portion and precedes the second portion.
In some examples, an electronic device (e.g., 100, 300, 500, 4000) displays a home affordance (e.g., 4044) on a display (e.g., 4002) at a third location in a user interface (e.g., a location adjacent to a side (e.g., a bottom side) of the user interface) (e.g., an indication of a location of a gesture that, when executed, results in display of the home screen, such as a swipe-up gesture from an edge of the display or a tap gesture on the affordance). In some examples, in accordance with a determination that the error indication (e.g., 4028) is displayed at the third location, the electronic device stops displaying the home affordance (e.g., 4044) while displaying the error indication at the third location. Ceasing to display the home affordance while displaying the error indication enables the user to quickly recognize that the home affordance is inaccessible because of the error and prompt the user to take further action to correct the error condition. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the indicator is not displayed during biometric authentication.
In some examples, after ceasing to display the home affordance (e.g., 4044), the electronic device (e.g., 100, 300, 500, 4000) detects a correction of an error condition that prevents the biometric sensor (e.g., 4003) from obtaining biometric information about the device user. In some examples, the electronic device detects the presence of an error condition after displaying the error indication (e.g., 4028) at the third location. In some examples, in response to detecting correction of the error condition, the electronic device displays a home affordance (e.g., and stops displaying the error indication (e.g., 4028)) at a third location in the user interface on the display (e.g., 4002).
In some examples, an electronic device (e.g., 100, 300, 500, 4000) detects an input (e.g., a palm, a finger) at a location that is proximate to (e.g., adjacent, next to, within a predetermined distance of) the first portion of the electronic device. In some examples, in response to detecting an input at a location proximate to the first portion of the electronic device, the electronic device displays the error indication (e.g., 4028) at a different location on the display. In some examples, the different location is a location where no input was detected. In some examples, the electronic device determines the different location based on the location of the input relative to the display before displaying the error indication at the new location. In some examples, the different location is adjacent to the location adjacent to the first portion of the electronic device. In some examples, the error indication is moved to the different location after being initially displayed at the first location proximate to the first portion of the electronic device. In some examples, the error indication is initially displayed at a selected location so as to be remote from any area of the display known to be occluded (e.g., by the detected touch input). Displaying the error indication at a different location according to an input location (e.g., a user's hand) provides feedback to the user regarding the current state of the device (e.g., an error condition prevents successful biometric authentication) and prompts the user to take further action to correct the error condition. Furthermore, by adjusting the position, the device ensures that the error indication is visible to the user, so the user is more likely to take corrective action at the device.
Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
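The relocation behavior described above (moving the error indication away from a detected touch) can be sketched as choosing the nearest non-occluded candidate position. This is an illustrative sketch under assumed coordinates; the function name, candidate list, and occlusion radius are hypothetical.

```python
# Hypothetical sketch: if a touch (e.g., a palm) occludes the default,
# sensor-adjacent error-indication position, pick the nearest candidate
# position the touch does not occlude.
def place_error_indication(default_pos, touch_pos, candidates, radius=60):
    def occluded(pos):
        dx, dy = pos[0] - touch_pos[0], pos[1] - touch_pos[1]
        return dx * dx + dy * dy <= radius * radius

    if not occluded(default_pos):
        return default_pos
    clear = [c for c in candidates if not occluded(c)]
    # Prefer the clear candidate closest to the default (sensor-adjacent) spot.
    return min(clear, key=lambda c: (c[0] - default_pos[0]) ** 2 +
                                    (c[1] - default_pos[1]) ** 2)
```

Keeping the fallback position as close as possible to the default preserves the spatial association between the indication and the biometric sensor.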
In some examples, an electronic device (e.g., 100, 300, 500, 4000) displays a first transaction interface (e.g., 4014) (e.g., a transaction (or payment) interface separate from (or superimposed on top of) a user interface and including transaction information such as a credit card number, billing address, etc.) on a display (e.g., 4002) at a location that is proximate to (e.g., adjacent to, next to, within a predetermined distance of) a first portion of the electronic device. In some examples, the first transaction interface is displayed in response to receiving an input (e.g., 4010) corresponding to an affordance (e.g., 4012) of a user interface (e.g., 4008), e.g., an affordance for making a payment or completing a transaction.
In some examples, displaying the first transaction interface (e.g., 4014) includes displaying an animation of the first transaction interface transitioning (e.g., panning) from an initial position to a position proximate to the first portion of the electronic device, the initial position being substantially centered with respect to the display. In some examples, the animation includes displaying the first transaction interface (e.g., maintaining display of the first transaction interface) while the first transaction interface transitions (e.g., translates) from an initial position to a position adjacent to the first portion of the electronic device. In some examples, the animation includes a visual effect in which the first transaction interface appears to float while transitioning.
In some examples, an electronic device (e.g., 100, 300, 500, 4000) displays a prompt (e.g., 4016) on a display (e.g., 4002) to provide one or more activations of a hardware button (e.g., 4004) of the electronic device. In some examples, the electronic device prompts the user by displaying a "double click" instruction (e.g., for Apple Pay). In some examples, the prompt is displayed adjacent to the button. In some examples, the prompt is displayed when the device is displaying the transaction user interface area (e.g., 4014) without having received any indication that a transaction terminal is nearby and requesting a transaction credential (e.g., displaying a prompt to provide one or more activations of the button before the device is placed in the NFC field of an NFC reader requesting payment information). In some examples, the hardware button is a mechanical button or a solid state button. In some examples, the button is another type of switch. In some examples, the button has a fixed position relative to the electronic device, and in particular relative to the display of the electronic device, such that the electronic device can display the prompt based on the position of the button. In some examples, the button is a solid state button that operates according to capacitive and/or resistive touch and/or in response to a change in input intensity, without requiring a mechanical switch to be pressed to activate the button; instead, the device monitors whether the intensity of the input is above an intensity threshold corresponding to activation of the solid state button. In some examples, an electronic device (e.g., 100, 300, 500, 4000) receives one or more activations of a hardware button (e.g., 4020) of the electronic device, and in response to receiving the one or more activations of the hardware button, the electronic device displays an authentication progress indicator (e.g., 4022, 4024, 4030, 4032, 4034) on a display (e.g., 4002).
In some examples, displaying the authentication progress indicator includes displaying an animation of the authentication progress indicator transitioning from a location of the prompt (e.g., 4016) to a final location of the authentication progress indicator. In some examples, the authentication indicator provides a status of the authentication (e.g., in progress, successful, unsuccessful). In some examples, the animation includes displaying the authentication progress indicator (e.g., maintaining display of the authentication progress indicator) while the authentication progress indicator transitions (e.g., translates) from the prompted position to a final position. In some examples, the animation includes a visual effect in which the authentication progress indicator appears to slide out of the prompt. In some examples, the authentication progress indicator is displayed with (or superimposed on) a user interface (e.g., 4014) (or a transaction user interface region). Prompting the user to activate a hardware button directs the user to perform an action at the device in order to complete the transaction. Prompting the user in this manner enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide appropriate input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. Displaying an authentication progress indicator provides feedback to the user regarding the status of the authentication. 
The improved feedback enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, an electronic device (e.g., 100, 300, 500, 4000) simultaneously displays (4122) a first application (e.g., corresponding to 4006, 4008) in a first area (e.g., 4007, 4009) and a second application (e.g., corresponding to 4006, 4008) in a second area (e.g., 4007, 4009) that is adjacent to (e.g., next to, proximate to, within a predetermined distance of) the first area on a display (e.g., 4002). In some examples, the electronic device displays (4124) the second transaction interface (e.g., 4014) on the display. In some examples, the second transaction interface is the first transaction interface. In some examples, the second transaction interface is displayed overlaid on the first application and/or the second application. In some examples, in accordance with (4126) a determination that the second transaction interface corresponds to the first application, the electronic device modifies a first visual characteristic of the first application (e.g., obscuring, darkening, blurring). In some examples, the second transaction interface corresponds to the first application when the first application includes information about a good or service (or transaction) purchased (or completed) using (or through) the second transaction interface. In some examples, the determination is made while displaying the second transaction interface. In some examples, in accordance with (4130) a determination that the second transaction interface corresponds to the second application, the electronic device (e.g., 100, 300, 500, 4000) modifies a first visual characteristic (e.g., obscuring, darkening, blurring) of the second application. In some examples, the second transaction interface corresponds to the second application when the second application includes information about a good or service (or transaction) purchased (or completed) using (or through) the second transaction interface. In some examples, the determination is made while displaying the second transaction interface.
In some examples, modifying the first visual characteristic of the first application includes modifying a second visual characteristic of the second application. In some examples, modifying the second visual characteristic of the second application includes increasing a darkening and/or increasing a blur radius of a blur effect applied to the second application to a greater extent (or amount) than the first application. In some examples, modifying the first visual characteristic of the second application includes modifying a second visual characteristic of the first application. In some examples, modifying the second visual characteristic of the first application includes increasing a darkening and/or increasing a blur radius of a blur effect applied to the first application to a greater extent (or amount) than the second application. Modifying the visual characteristics of one application to a greater extent than another application provides feedback to the user as to which application is more relevant at the time. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the indicator is not displayed during biometric authentication.
In some examples, modifying the first visual characteristic of the first application includes displaying (4128) the first application in the second region in accordance with a determination that the second region is closer (e.g., closer) to the first portion of the electronic device (e.g., the biometric sensor) than the first region. In some examples, displaying the first application in the second area includes ceasing to display the first application in the first area. In some examples, modifying the first visual characteristic of the second application includes displaying (4132) the second application in the first region in accordance with a determination that the first region is closer (e.g., closer) to a first portion of the electronic device (e.g., a biometric sensor) than the second region. In some examples, displaying the second application in the first area includes ceasing to display the second application in the second area. In some examples, the electronic device displays an animation of the first application swapping positions with the second application.
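The region-swap and differential-dimming behavior of (4126)-(4132) can be sketched as follows. This is an illustrative sketch only; the dictionary layout, effect names, and function name are hypothetical, not taken from the patent.

```python
# Hypothetical sketch of (4126)-(4132): in a two-region split view, move
# the application the transaction interface corresponds to into the region
# nearer the biometric sensor, and dim/blur the other application more
# heavily than the transaction application.
def arrange_for_transaction(regions, transaction_app):
    """regions: dict {'near_sensor': app, 'far_from_sensor': app}."""
    if regions["far_from_sensor"] == transaction_app:
        # Swap positions so the relevant app sits closer to the sensor.
        regions["near_sensor"], regions["far_from_sensor"] = (
            regions["far_from_sensor"], regions["near_sensor"])
    other = regions["far_from_sensor"]
    # Both apps are visually modified; the non-transaction app more so.
    effects = {regions["near_sensor"]: "dim", other: "dim+blur"}
    return regions, effects
```

Dimming the non-transaction application more strongly gives the user the "which application is relevant" feedback described above, while the swap keeps the relevant content near the sensor-adjacent transaction interface.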
In some examples, in accordance with a determination that the second transaction interface (e.g., 4014) corresponds to the first application, the second transaction interface includes an indication of the first application (e.g., a name of the first application). In some examples, in accordance with a determination that the second transaction interface corresponds to a second application, the second transaction interface includes an indication of the second application (e.g., a name of the second application).
Note that the details of the processes described above with respect to method 4100 (e.g., fig. 41A-41C) may also be applied in a similar manner to the methods described below/above. For example, method 3700, method 3900, and/or method 4300 optionally include one or more features of the various methods described above with reference to method 4100. For example, error icon 4028, as described in method 4100, can be used to indicate that the biometric sensor is blocked when biometric authentication is performed in the process described in connection with methods 3700 and 3900. For the sake of brevity, these details are not repeated below.
Fig. 42A-42P illustrate exemplary user interfaces for orienting a device to register a biometric feature (e.g., a face for later use in biometric authentication), according to some examples. The user interfaces in these figures are used to illustrate the processes described below, including the processes in fig. 43A through 43C.
Fig. 42A shows an electronic device 4200 (e.g., portable multifunction device 100, device 300, device 500). In the illustrative example shown in fig. 42A-42P, the electronic device 4200 is a tablet computer. In other examples, the electronic device 4200 may be a different type of electronic device, such as a wearable device (e.g., a smart watch). The electronic device 4200 includes a display 4202, one or more input devices (e.g., a touchscreen of the display 4202, buttons, and a microphone), and a wireless communication radio. In some examples, the electronic device includes a plurality of cameras. In some examples, the electronic device includes only one camera. In fig. 42A, the electronic device includes a biometric sensor 4203. In some examples, the biometric sensor 4203 is one or more biometric sensors, which may include a camera, such as an infrared camera, a thermal imaging camera, or a combination thereof. In some examples, biometric sensor 4203 is biometric sensor 703. In some examples, the one or more biometric sensors include one or more fingerprint sensors (e.g., a fingerprint sensor integrated into a button). In some examples, the device further includes a light emitting device (e.g., a light projector), such as an IR floodlight, a structured light projector, or a combination thereof. Optionally, the light emitting device is used to illuminate the biometric feature (e.g., face) while biometric data of the biometric feature is captured by the one or more biometric sensors.
At fig. 42A, a user wishes to set biometric (e.g., facial) authentication on the electronic device 4200. Successfully setting biometric authentication on a device enables a user to perform an operation on the device requiring authentication (e.g., unlocking the device) by presenting the user's face for biometric authentication. To set biometric authentication on an electronic device, a user must first register her face. The process for registering a face may include some or all of the features (or processes) in fig. 11A-11O.
As shown in fig. 42A, the electronic device 4200 displays an introduction user interface (UI) 4206 with an initiation affordance 4208. The electronic device 4200 receives an input 4210 at the initiation affordance 4208 to begin the process of registering the user's face for biometric authentication.
At fig. 42B, in response to receiving the input 4210 at the initiation affordance 4208, the electronic device 4200 determines that the orientation of the device is not suitable for registering the user's face. In some examples, a suitable orientation for registering the user's face is a vertical (e.g., upright) portrait orientation, where the portrait orientation is one in which the biometric sensor 4203 is located at the top of the device (e.g., on the side of the device furthest from the ground). In response to determining that the orientation of the device is not suitable for registering the user's face, the electronic device 4200 displays (e.g., replaces display of the introduction UI 4206 with) one or more prompts to prompt the user to orient the electronic device 4200 to a suitable orientation. More specifically, the electronic device 4200 determines that the electronic device 4200 is in a substantially horizontal orientation (e.g., substantially parallel to the ground). Accordingly, as shown in fig. 42B, the electronic device 4200 displays a prompt 4212A to prompt the user to lift the electronic device 4200 to a vertical position.
In some examples, in response to receiving the input 4210 at the initiation affordance 4208, the electronic device determines that the orientation of the device is suitable for registering the user's face. In some examples, upon determining that the orientation is suitable for registering the user's face, the electronic device 4200 automatically initiates the process for registering the user's face, as described below in connection with fig. 42D.
At fig. 42C, in response to determining that the electronic device 4200 is in a vertical position but not in a portrait orientation (e.g., the user has lifted the device off the desktop in response to the prompt 4212A), the electronic device 4200 displays (e.g., replaces the display of the prompt 4212A with) the prompt 4212B to prompt the user to rotate the electronic device 4200 to the portrait orientation (e.g., with the biometric sensor 4203 at the top). In particular, the prompt 4212B prompts the user to rotate the device in a particular direction such that a minimum amount of rotation is required to achieve the desired (or suitable) orientation. For example, in fig. 42C, rotating the electronic device 4200 clockwise requires less rotation than rotating it counterclockwise to achieve the desired orientation.
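The patent does not give an algorithm for choosing the rotation direction; a minimal sketch (the angle convention is an assumption, with 0° denoting the portrait orientation with the biometric sensor at the top) might compute the signed rotation of smallest magnitude:

```python
def minimal_rotation(current_deg: float, target_deg: float = 0.0) -> float:
    """Return the signed rotation (in degrees) of smallest magnitude that
    brings the device from current_deg to target_deg; the sign indicates
    the direction in which the prompt should ask the user to rotate."""
    delta = (target_deg - current_deg) % 360.0
    # Anything over a half-turn is shorter in the opposite direction.
    if delta > 180.0:
        delta -= 360.0
    return delta
```

For a device held a quarter-turn away from portrait, `minimal_rotation(90.0)` yields -90.0 rather than +270.0, so the prompt would indicate the quarter-turn direction.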
At fig. 42D, in response to determining that the electronic device 4200 is in a suitable orientation, the electronic device 4200 automatically initiates a process for registering a user's face. As shown in fig. 42D to 42F, after the process for registering the face of the user is initiated, the electronic device 4200 displays the face registration UI 4214. The face registration UI 4214 includes some or all of the features of the face registration UI 1104.
At fig. 42G, upon successful completion of registration of the user's face, the electronic device 4200 displays (e.g., replaces display of the face registration UI 4214 with) the scan completion interface 4216, which includes the continuation affordance 4218. The scan completion interface 4216 includes some or all of the features of the scan completion interface 1130.
After the registration of the user's face is completed, a second iteration of the registration process is performed without the user reorienting the device. As shown in fig. 42G, while the scan completion interface 4216 is displayed, the electronic device 4200 receives an input 4220 at the continuation affordance 4218 to initiate a second iteration of the registration process.
At fig. 42H, in response to receiving the input 4220 at the continuation affordance 4218, the electronic device 4200 initiates a second iteration of the registration process, similar to the process described above in connection with fig. 42D-42F. The electronic device 4200 initiates the second iteration without prompting the user to reorient the device to a different orientation than its current orientation. Initiating a second iteration of the registration process includes displaying a second facial registration UI 4222. The second face registration UI 4222 includes some or all of the features of the second face registration UI 1138.
At fig. 42I, after successfully completing the second iteration of the registration process, the electronic device 4200 displays (e.g., replaces the display of the second facial registration UI 4222 with) a second scan completion interface 4224 that includes a continuation affordance 4226. The second scan completion interface 4224 includes some or all of the features of the second scan completion interface 1156. As shown in fig. 42I, the electronic device 4200 receives an input 4228 at the continuation affordance 4226.
At fig. 42J, in response to receiving the input 4228 at the continuation affordance 4226, the electronic device 4200 displays (e.g., replaces display of the second scan completion interface 4224 with) the registration completion interface 4230, providing an indication to the user that the biometric authentication has been successfully set on the electronic device 4200. Registration completion interface 4230 includes some or all of the features of registration completion interface 1166.
At fig. 42K, after biometric authentication has been set up on the electronic device 4200, the user may unlock the electronic device 4200 (e.g., transition the device from a locked state to an unlocked state) by presenting the user's face to the biometric sensor 4203 and using biometric authentication. In some examples, the user initiates biometric authentication to unlock the device by lifting (or raising) the electronic device 4200 (e.g., from a substantially horizontal orientation). When the electronic device 4200 is lifted, the electronic device 4200 detects the change in device orientation and, in response, initiates biometric authentication to unlock the device. Note that while the electronic device 4200 is in the locked state, the electronic device 4200 displays a lock state interface 4232 that includes a biometric sensor indicator 4234, which provides an indication to the user of the location of the biometric sensor 4203, and a lock icon 4236, which provides an indication that the electronic device 4200 is in the locked state. In some examples, the electronic device 4200 does not display the biometric sensor indicator 4234 while the electronic device 4200 is in the locked state.
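As a rough illustration of this lift-to-initiate behavior (the class name, tilt representation, and thresholds are assumptions for illustration, not taken from the patent), a toy state machine over device-tilt samples might look like:

```python
class RaiseToAuthenticate:
    """Toy state machine: signal that biometric authentication should start
    when the device transitions from a roughly horizontal pose (e.g., flat
    on a desk) to a raised pose."""

    HORIZONTAL_MAX_TILT = 15.0  # degrees from flat; assumed threshold
    RAISED_MIN_TILT = 45.0      # degrees from flat; assumed threshold

    def __init__(self) -> None:
        self.was_horizontal = False
        self.auth_started = False

    def on_tilt_sample(self, tilt_deg: float) -> bool:
        """Feed one tilt sample; return True exactly when authentication
        should be initiated (a horizontal-to-raised transition)."""
        if tilt_deg <= self.HORIZONTAL_MAX_TILT:
            self.was_horizontal = True
            self.auth_started = False
        elif (tilt_deg >= self.RAISED_MIN_TILT
              and self.was_horizontal and not self.auth_started):
            self.auth_started = True
            return True
        return False
```

Feeding a sample near flat (5°) followed by a raised sample (50°) triggers authentication once; further raised samples do not retrigger it.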
As shown in fig. 42L, when the electronic device 4200 initiates biometric authentication, the user is holding the electronic device 4200 such that the user's face is outside the field of view 4238 of the biometric sensor 4203. In some examples, the user's face is outside the field of view 4238 when more than a threshold portion of the face is outside the field of view. In some examples, the user's face is outside of the field of view 4238 when no face is detected within the field of view. When attempting to biometrically authenticate a user's face, the electronic device 4200 cannot obtain sufficient information about the user's face using the biometric sensor 4203. Thus, electronic device 4200 does not have sufficient information to compare with the stored authorized credentials that result from the registration process described above with respect to fig. 42D-42J.
In fig. 42M, upon determining that the user's face is outside of the field of view 4238, the electronic device 4200 displays an error indication 4240 that provides an indication to the user that the user's face is outside of the field of view 4238. The error indication 4240 includes some or all of the features of the error indication 3814G. Additionally, the electronic device 4200 does not automatically retry authentication upon determining that the user's face is outside the field of view 4238. In some examples, the electronic device 4200 also displays the biometric sensor indicator 4234. In some examples, the electronic device 4200 automatically retries biometric authentication if sufficient information has been obtained but authentication still fails (e.g., the obtained information does not match the stored authorized credentials).
As shown in fig. 42N, after learning from the error indication 4240 that the user's face is outside the field of view 4238 of the biometric sensor 4203, the user moves her face such that it is within the field of view 4238. In response to detecting that the cause of the error indication 4240 has been corrected (e.g., detecting more than a threshold amount of the user's face), the electronic device 4200 automatically retries biometric authentication. Upon determining that the authentication resulting from retrying biometric authentication is successful (e.g., the information obtained using the biometric sensor 4203 matches the stored authorized credentials), the electronic device 4200 transitions from the locked state to the unlocked state. After transitioning to the unlocked state, the electronic device 4200 displays the unlocked state interface 4242.
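The outcomes described in figs. 42L-42N can be sketched as a single decision per authentication pass (the coverage threshold and return labels are illustrative assumptions, not from the patent):

```python
FACE_COVERAGE_THRESHOLD = 0.8  # assumed fraction of the face that must be in view

def authentication_outcome(face_fraction_in_view: float,
                           matches_credentials: bool) -> str:
    """One authentication pass: 'error' if too little of the face is in the
    field of view (display the error indication, no automatic retry until
    the cause is corrected); 'unlock' when the captured information matches
    the stored credentials; 'retry' when enough of the face was captured
    but it did not match."""
    if face_fraction_in_view < FACE_COVERAGE_THRESHOLD:
        return "error"
    if matches_credentials:
        return "unlock"
    return "retry"
```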
In some examples, while the unlocked state interface 4242 is displayed, the electronic device 4200 receives a request (e.g., a swipe up starting from within a region adjacent to the bottom edge of the display 4202) to access restricted content on the device (e.g., the home screen 4244 of fig. 42O, a recently used application). In response to receiving the request to access restricted content, the electronic device 4200 displays the home screen 4244, which includes a plurality of icons that, when activated, cause the device to launch the application corresponding to the activated icon. In some examples, instead of displaying the home screen 4244, the electronic device 4200 displays the most recently used application (e.g., a user interface for the application). Note that the process described above with respect to figs. 42K-42O is performed while the electronic device 4200 is in a landscape orientation. However, in some examples, some or all of the processes described above with respect to figs. 42K-42N are performed while the electronic device 4200 is in a portrait orientation.
In some examples, instead of transitioning to the unlocked state described with respect to fig. 42N, the electronic device 4200 remains in the locked state if the obtained information does not match the stored authorized credentials. In some examples, as shown in fig. 42P, upon determining that the obtained information does not match the stored authorized credentials, the electronic device 4200 displays the lock state interface 4232 while alternating the position of the lock icon 4236 so that it simulates a "shake" effect, thereby providing an indication to the user that the electronic device 4200 remains in the locked state.
Fig. 43A-43C are flow diagrams illustrating methods for orienting a device to register a biometric feature (e.g., a face for later use in biometric authentication), according to some examples. The method 4300 is performed at an electronic device (e.g., 100, 300, 500, 4200) having a display (e.g., 4202) and one or more biometric sensors (e.g., a first biometric sensor of a device having a plurality of biometric sensors) (e.g., a fingerprint sensor, a contactless biometric sensor (e.g., a biometric sensor that does not require physical contact, such as a thermal or optical facial recognition sensor), an iris scanner). In some examples, the one or more biometric sensors include one or more cameras. Some operations in the method 4300 are optionally combined, the order of some operations is optionally changed, and some operations are optionally omitted.
As described below, the method 4300 provides an intuitive way for prompting a user to orient a device to register a biometric feature. The method reduces the cognitive burden on the user when registering biometric features (e.g., faces for later use in biometric authentication), thereby creating a more efficient human-machine interface. For battery-driven computing devices, enabling users to more quickly and efficiently enroll biometric features conserves power and increases the interval between battery charges.
An electronic device (e.g., 100, 300, 500, 4200) displays (4302) a biometric enrollment user interface (e.g., 4206) on a display (e.g., 4202) for initiating a biometric enrollment with one or more biometric sensors.
While displaying (4304) the biometric enrollment user interface, the electronic device receives an input (e.g., 4210) (e.g., a touch gesture (e.g., a tap), a spoken user input) corresponding to a request to initiate biometric enrollment.
In response to (4306) receiving the input (e.g., 4210) and in accordance with a determination (4308) that the orientation of the electronic device (e.g., the current orientation, the orientation of the electronic device at (or near) the time of the input) satisfies a set of enrollment criteria, the electronic device initiates a process for enrolling a biometric feature using the one or more biometric sensors (e.g., 4203). In some examples, the set of enrollment criteria includes whether the electronic device is in a portrait orientation with respect to a frame of reference (e.g., earth, ground), whether the one or more biometric sensors are positioned on a particular side of the electronic device in the portrait orientation (e.g., the side furthest from the ground), or whether the electronic device is oriented such that it is substantially non-parallel with respect to the ground. In some examples, the set of enrollment criteria includes whether the electronic device is in a certain (e.g., suitable) orientation relative to the biometric feature (e.g., face) (e.g., a major plane of the device (e.g., a plane defined by a display of the device) faces the biometric feature). In some examples, initiating the process for enrolling the biometric feature includes capturing data corresponding to the user's face using the one or more biometric sensors.
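A minimal sketch of checking such a set of enrollment criteria (the pitch threshold and the `sensor_side` encoding are assumptions for illustration, not from the patent):

```python
def orientation_satisfies_enrollment_criteria(pitch_deg: float,
                                              sensor_side: str) -> bool:
    """Illustrative check of the enrollment criteria described above:
      - the device is substantially non-parallel to the ground
        (pitch well above horizontal), and
      - the device is in portrait orientation with the biometric
        sensor on the side furthest from the ground ("top")."""
    SUBSTANTIALLY_VERTICAL_MIN_PITCH = 60.0  # degrees from flat; assumed
    return (pitch_deg >= SUBSTANTIALLY_VERTICAL_MIN_PITCH
            and sensor_side == "top")
```

A device lying flat on a desk (`pitch_deg` near 0), or held upright in landscape with the sensor on one side, fails the check and would trigger the prompts described next.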
In response to (4306) receiving the input (e.g., 4210) and in accordance with (4322) a determination that the orientation of the electronic device does not satisfy the set of enrollment criteria, the electronic device outputs one or more prompts (e.g., 4212A-B) (e.g., visual, audio, and/or tactile prompts) to change the orientation of the electronic device to a different orientation that satisfies the set of enrollment criteria. Outputting one or more prompts when the set of enrollment criteria is not satisfied provides feedback to the user as to what corrective action to take to continue enrolling the biometric feature. Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently. In some examples, the indicator is not displayed during biometric authentication.
In some examples, outputting the one or more prompts includes outputting (4324) a first prompt (e.g., 4212A) to orient the electronic device to an initial orientation. In some examples, the initial orientation is an orientation in which the electronic device is substantially non-parallel with respect to the ground. In some examples, the initial orientation is an orientation in which the electronic device is substantially parallel to gravity. In some examples, the set of enrollment criteria includes requiring that a major plane of the device be substantially aligned with a predetermined plane (e.g., a plane substantially perpendicular to the ground) such that the display of the device is substantially vertical. In some examples, the set of enrollment criteria includes requiring that the major plane of the device not be substantially aligned with a (second) predetermined plane (e.g., a plane substantially parallel to the ground) such that the device does not rest on a horizontal surface when attempting to enroll the biometric feature. In some examples, outputting the one or more prompts includes outputting (4326), after outputting the first prompt (e.g., 4212A), a second prompt (e.g., 4212B) to orient the electronic device to a different orientation that satisfies the set of enrollment criteria, the first prompt being different from the second prompt. In some examples, the electronic device outputs the first prompt without outputting the second prompt. In some examples, the electronic device stops outputting the first prompt when the orientation of the electronic device changes to the initial orientation. In some examples, the electronic device outputs the second prompt when the orientation of the electronic device changes to the initial orientation. In some examples, the electronic device outputs the second prompt without outputting the first prompt (e.g., when the electronic device is already in the initial orientation).
In some examples, the set of enrollment criteria includes whether the electronic device is in a portrait orientation with respect to a frame of reference (e.g., earth, ground), whether the one or more biometric sensors are positioned on a particular side of the electronic device in the portrait orientation (e.g., the side furthest from the ground), or whether the electronic device is oriented such that it is substantially non-parallel with respect to the ground. Outputting the first prompt without outputting the second prompt provides improved feedback to the user because it reduces the likelihood of confusion as the user takes corrective action to enable enrollment of the biometric feature. Providing improved visual feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, outputting the one or more prompts includes outputting a third prompt (e.g., 4212B) to rotate the electronic device (e.g., about an axis perpendicular to the electronic device) to a different orientation that satisfies the set of enrollment criteria, the third prompt being based on the orientation of the electronic device when the input is received. In some examples, the third prompt is the second prompt. In some examples, in accordance with a determination that the electronic device is in a first orientation, the electronic device outputs a first rotation prompt to rotate the electronic device to a different orientation that satisfies the set of enrollment criteria. In some examples, in accordance with a determination that the electronic device is in a second orientation different from the first orientation, the electronic device outputs a second rotation prompt to rotate the electronic device to a different orientation that satisfies the set of enrollment criteria, the second rotation prompt being different from the first rotation prompt. In some examples, the first rotation prompt or the second rotation prompt is the second prompt. In some examples, the set of enrollment criteria includes whether the electronic device is in a portrait orientation with respect to a frame of reference (e.g., earth, ground), whether the one or more biometric sensors are positioned on a particular side of the electronic device in the portrait orientation (e.g., the side furthest from the ground), or whether the electronic device is oriented such that it is substantially non-parallel with respect to the ground. Outputting the prompt based on the orientation of the device provides the user with feedback on an efficient way to achieve the proper device orientation for enrolling the biometric feature.
Providing improved feedback to the user enhances the operability of the device and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, outputting the one or more prompts includes outputting a fourth prompt (e.g., 4212B) to rotate the electronic device (e.g., about an axis parallel to a major plane of the device (e.g., a plane defined by the display of the device)) to a different orientation that satisfies the set of enrollment criteria, the fourth prompt being based on an alignment of the major plane of the device with a predetermined plane (e.g., a plane substantially perpendicular to the ground; a plane substantially parallel to the ground). In some examples, the electronic device outputs the fourth prompt in accordance with a determination that the electronic device is oriented substantially parallel to the ground. In some examples, the set of enrollment criteria includes requiring that the major plane of the device be substantially aligned with a predetermined plane (e.g., a plane substantially perpendicular to the ground) such that the display of the device is substantially vertical. In some examples, the set of enrollment criteria includes requiring that the major plane of the device not be substantially aligned with a (second) predetermined plane (e.g., a plane substantially parallel to the ground) such that the device does not rest on a horizontal surface when attempting to enroll the biometric feature.
In some examples, after initiating the process for enrolling a biometric feature (e.g., after successfully enrolling the biometric feature), an electronic device (e.g., 100, 300, 500, 4200) receives a request to perform an operation requiring authentication (e.g., a request to unlock the device (e.g., a swipe at a predefined location)). In some examples, after performing (or completing) biometric enrollment, the electronic device receives the request to perform an operation requiring authentication. In some examples, after outputting one or more prompts (e.g., 4212A-B) (e.g., visual, audio, and/or tactile prompts) to change the orientation of the electronic device to a different orientation that satisfies the set of enrollment criteria, the electronic device receives the request to perform an operation that requires authentication. In some examples, in response to receiving the request to perform an operation requiring authentication, the electronic device attempts authentication using the one or more biometric sensors (e.g., 4203) (e.g., including acquiring data with the one or more biometric sensors). In some examples, after attempting (e.g., unsuccessfully attempting) authentication using the one or more biometric sensors and in accordance with a determination that the data acquired by the one or more biometric sensors corresponds to less than a threshold amount of the biometric feature (e.g., a portion of a face/fingerprint rather than an entire face/fingerprint) (e.g., because the face is out of view (e.g., 4238)), the electronic device forgoes retrying authentication. In some examples, the electronic device forgoes automatically retrying authentication. In some examples, after attempting authentication using the one or more biometric sensors, the electronic device forgoes retrying authentication because biometric authentication has failed more than a predetermined number of times (e.g., 5, 10, 15 times) since the last successful authentication on the device.
In some examples, the electronic device forgoes retrying authentication absent an explicit request to perform an operation requiring authentication, such as a request to unlock the device (e.g., a swipe at a predefined location). In some examples, after an unsuccessful initial authentication attempt, the electronic device retries biometric authentication if it is not determined that the data obtained by the one or more biometric sensors corresponds to only a portion of the biometric feature. Forgoing retrying authentication when less than a threshold amount of the biometric feature is obtained avoids consuming the user's allowed number of attempts on repeated requests (e.g., repeated requests of the same type), thereby preserving at least one attempt for a request for another operation requiring biometric authentication. Preserving at least one attempt enhances the operability of the device and makes the user-device interface more efficient (e.g., by avoiding exhausting authentication attempts on repeated similar requests), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
In some examples, the electronic device retries authentication after attempting (e.g., unsuccessfully attempting) authentication using the one or more biometric sensors and in accordance with a determination that the data obtained by the one or more biometric sensors corresponds to no less than (e.g., more than) the threshold amount of biometric features. Automatically retrying authentication when a threshold amount of biometric features are obtained enables a user to attempt authentication when conditions are appropriate without the user explicitly requesting to retry authentication. Performing the operation when a set of conditions has been met without further user input enhances the operability of the device (e.g., increases the chances of successful authentication) and makes the user-device interface more efficient (e.g., by helping the user provide suitable input and reducing user error in operating/interacting with the device), which in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more quickly and efficiently.
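The retry policy described in the preceding paragraphs can be summarized in one sketch (the capture threshold and attempt cap are illustrative assumptions; the patent mentions caps of, e.g., 5, 10, or 15 failures):

```python
MAX_FAILED_ATTEMPTS = 5  # assumed cap since last successful authentication

def should_retry_authentication(captured_fraction: float,
                                failed_attempts: int,
                                capture_threshold: float = 0.8) -> bool:
    """Decide whether to automatically retry biometric authentication after
    an unsuccessful attempt:
      - forgo retrying when less than a threshold amount of the biometric
        feature was captured (e.g., the face was out of view), and
      - forgo retrying once the device has failed the allowed number of
        times since the last successful authentication."""
    if captured_fraction < capture_threshold:
        return False
    if failed_attempts >= MAX_FAILED_ATTEMPTS:
        return False
    return True
```

Under this policy a near-miss with the full face in view retries automatically, while an out-of-view capture or an exhausted attempt budget waits for the user to act.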
In some examples, in accordance with a determination that the authentication resulting from retrying authentication is successful, the electronic device (e.g., 100, 300, 500, 4200) performs the operation corresponding to the request. In some examples, in accordance with a determination that the authentication resulting from retrying authentication is unsuccessful, the electronic device forgoes performing the operation corresponding to the request. In some examples, authentication is successful when biometric information captured using the one or more biometric sensors corresponds to (or matches) authorized credentials (e.g., stored information about a biometric feature (e.g., face, fingerprint) authorized for use in biometric authentication). In some examples, authentication is unsuccessful when biometric information captured using the one or more biometric sensors does not correspond to (or does not match) the authorized credentials. Forgoing performing the operation when authentication is unsuccessful enhances device security by preventing spoofing and/or unauthorized access to the device. Improving the security of the device enhances the operability of the device by preventing unauthorized access to content and operations, and in turn reduces power usage and extends the battery life of the device by enabling the user to use the device more efficiently.
In some examples, after outputting one or more prompts (e.g., 4212A-B) (e.g., visual, audio, and/or tactile prompts) to change the orientation of the electronic device to a different orientation that satisfies the set of registration criteria, the electronic device (e.g., 100, 300, 500, 4200) detects (4328) that the current orientation of the electronic device satisfies the set of registration criteria. In some examples, in response to (4330) determining that the current orientation of the electronic device satisfies the set of enrollment criteria, the electronic device initiates a process for enrolling biometric features using one or more biometric sensors. In some examples, the set of registration criteria includes whether the electronic device is oriented longitudinally with respect to a frame of reference (e.g., earth, ground), whether the one or more biometric sensors are oriented (or positioned) in a longitudinal direction on a particular side of the electronic device (e.g., the side furthest from the ground), or whether the electronic device is oriented such that it is substantially non-parallel with respect to the ground. In some examples, the set of registration criteria includes requiring that a major plane of the device be substantially aligned with a predetermined plane (e.g., a plane substantially perpendicular to the ground) such that a display of the device is substantially vertical. In some examples, the set of enrollment criteria includes requiring that a major plane of the device not be substantially aligned with a (second) predetermined plane (e.g., a plane substantially parallel to the ground) such that the device does not rest on a horizontal surface when attempting to enroll the biometric feature. 
In some examples, the set of registration criteria includes whether the electronic device is in a certain (e.g., appropriate) orientation relative to the biometric feature (e.g., face) (e.g., a major plane of the device (e.g., a face defined by a display of the device) faces the biometric feature).
In some examples, initiating the process for enrolling the biometric feature using the one or more biometric sensors includes successfully enrolling the biometric feature. In some examples, after successfully enrolling the biometric feature, the electronic device (e.g., 100, 300, 500, 4200) outputs (4312) a prompt (e.g., corresponding to 4222) to re-enroll the biometric feature using the one or more biometric sensors. In some examples, the electronic device outputs the prompt to enroll the biometric feature without prompting a change in the orientation of the electronic device.
In some examples, initiating the process for enrolling the biometric feature using the one or more biometric sensors includes (4310) successfully enrolling the biometric feature. In some examples, after successfully enrolling the biometric feature, the electronic device (e.g., 100, 300, 500, 4200) receives (4314) a request to perform an operation that requires authentication (e.g., a request to unlock the device (e.g., performing a swipe at a predefined location) or a request to access a home screen (e.g., 4244)). In some examples, in response to (4316) receiving the request to perform the operation that requires authentication and in accordance with (4318) a determination that data obtained by the one or more biometric sensors corresponds to (e.g., matches) the enrolled biometric feature, the electronic device performs the operation that requires authentication. In some examples, in response to receiving the request to perform the operation that requires authentication, the electronic device performs authentication (or attempts authentication) using the one or more biometric sensors (e.g., 4203). In some examples, in response to (4316) receiving the request to perform the operation that requires authentication and in accordance with (4320) a determination that the data obtained by the one or more biometric sensors does not correspond to (e.g., does not match) the enrolled biometric feature, the electronic device forgoes performing the operation that requires authentication.
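The enroll-then-authenticate flow described above (enroll a feature, receive a request requiring authentication, perform the operation on a match, forgo it otherwise) can be sketched minimally. The class and method names here are assumptions for illustration and do not appear in the disclosure; a real matcher would compare feature vectors within a tolerance rather than by equality.

```python
from dataclasses import dataclass
from typing import Any, Callable, Optional

@dataclass
class BiometricAuthenticator:
    """Illustrative sketch of the enroll-then-authenticate flow."""
    enrolled_template: Optional[Any] = None

    def enroll(self, sensor_data: Any) -> None:
        # Successful enrollment stores a template of the biometric feature.
        self.enrolled_template = sensor_data

    def matches(self, sensor_data: Any) -> bool:
        # Placeholder comparison; real systems match features within a tolerance.
        return self.enrolled_template is not None and sensor_data == self.enrolled_template

    def handle_request(self, sensor_data: Any, operation: Callable[[], Any]) -> Any:
        # Perform the operation only when the captured data corresponds to
        # the enrolled biometric feature; otherwise forgo it.
        if self.matches(sensor_data):
            return operation()
        return None  # operation forgone
```

For example, after enrolling a face template, a request accompanied by matching sensor data performs the unlock operation, while non-matching data leaves the operation unperformed.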
Note that the details of the process described above with respect to method 4300 (e.g., figs. 43A-43C) are also applicable in an analogous manner to the methods described above and below. For example, method 3700, method 3900, and/or method 4100 optionally include one or more of the characteristics of the various methods described above with reference to method 4300. For example, the process described in method 4300 for enrolling a biometric feature can be used to enroll a face for later use in biometric authentication, such as retrying biometric authentication at a password entry user interface, as described in method 3700. As another example, as described in method 4100, the enrolled face can be used to authorize payment for goods. For brevity, these details are not repeated below.
The foregoing description, for purposes of explanation, has been provided with reference to specific examples. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The examples were chosen and described in order to best explain the principles of the techniques and their practical applications. Those skilled in the art are thereby enabled to best utilize the techniques and various examples with various modifications as are suited to the particular use contemplated.
Although the present disclosure and examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. It is to be understood that such changes and modifications are to be considered as included within the scope of the disclosure and examples as defined by the following claims.
As described above, one aspect of the present technology is the gathering and use of data available from various sources to improve the delivery to users of invitational content or any other content that may be of interest to them. The present disclosure contemplates that, in some instances, this gathered data may include personal information data that uniquely identifies, or can be used to contact or locate, a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, home addresses, or any other identifying information.
The present disclosure recognizes that the use of such personal information data in the present technology can be used to the benefit of users. For example, the personal information data can be used to deliver targeted content that is of greater interest to the user. Accordingly, use of such personal information data enables calculated control of the delivered content. Further, the present disclosure contemplates other uses for which personal information data is beneficial to the user.
The present disclosure also contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently adhere to privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining the privacy and security of personal information data. For example, personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection should occur only after receiving the informed consent of the users. Additionally, such entities should take any needed steps to safeguard and secure access to such personal information data, and to ensure that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.
Despite the foregoing, the present disclosure also contemplates examples in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of advertisement delivery services, the present technology can be configured to allow users to select to "opt in" or "opt out" of participation in the collection of personal information data during registration for services. In another example, users can select not to provide location information for targeted content delivery services. In yet another example, users can select not to provide precise location information, but permit the transfer of location zone information.
Therefore, although the present disclosure broadly covers use of personal information data to implement one or more of the various disclosed examples, the present disclosure also contemplates that the various examples can also be implemented without the need for accessing such personal information data. That is, the various examples of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, content can be selected and delivered to users by inferring preferences based on non-personal information data or a bare minimum amount of personal information, such as the content being requested by the device associated with a user, other non-personal information available to the content delivery services, or publicly available information.

Claims (42)

1. An electronic device, comprising:
a display;
a biometric sensor located at a first portion of the electronic device;
one or more processors; and
memory storing one or more programs configured for execution by the one or more processors, the one or more programs including instructions for:
detecting whether an error condition exists that prevents the biometric sensor from obtaining biometric information about a user of the device while the device is locked;
in response to detecting the presence of the error condition, concurrently displaying on the display:
a first error indication in a first region of a user interface; and
a second error indication, wherein the second error indication is displayed at a location proximate to the first portion of the electronic device, comprising:
in accordance with a determination that the user interface of the electronic device is in a first orientation relative to the biometric sensor, displaying the second error indication at a first location in the user interface proximate to the first portion of the electronic device; and
in accordance with a determination that the user interface of the electronic device is in a second orientation relative to the biometric sensor, displaying the second error indication at a second location in the user interface proximate to the first portion of the electronic device, the first orientation being different from the second orientation; and
after the error condition has been corrected, obtaining biometric information about a user of the device, and in response to obtaining the biometric information about the user of the device, in accordance with a determination that the biometric information matches registered biometric information, unlocking the device and displaying an indication in the first region of the user interface that the device has been unlocked.
2. The electronic device of claim 1, wherein the second error indication includes a biometric sensor occlusion icon and a reticle, the second error indication providing an indication that the biometric sensor is occluded.
3. The electronic device of claim 1, the one or more programs further comprising instructions for:
in attempting to obtain biometric information using the biometric sensor:
displaying a first progress indicator on the display, comprising:
in accordance with a determination that the user interface of the electronic device is in a third orientation relative to the biometric sensor, wherein the user interface in the third orientation has a first top side, displaying the first progress indicator proximate to the first top side of the user interface in the third orientation; and
in accordance with a determination that the user interface of the electronic device is in a fourth orientation relative to the biometric sensor, wherein the user interface in the fourth orientation has a second top side, displaying the first progress indicator proximate to the second top side of the user interface in the fourth orientation, wherein the third orientation is different from the fourth orientation.
4. The electronic device of claim 1, the one or more programs further comprising instructions for:
displaying a second progress indicator of the electronic device on the display, the second progress indicator being an animation having a first portion and a second portion different from the first portion, comprising:
in accordance with a determination that the second progress indicator is displayed at the location proximate to the first portion of the electronic device, displaying the second error indication as a portion of the animation that follows the first portion and precedes the second portion.
5. The electronic device of claim 1, the one or more programs further comprising instructions for:
displaying a home affordance at a third location in the user interface on the display; and
in accordance with a determination that the second error indication is displayed at the third location, displaying the second error indication at the third location while ceasing to display the home affordance.
6. The electronic device of claim 5, the one or more programs further comprising instructions for:
after ceasing to display the home affordance, detecting a correction to the error condition that prevents the biometric sensor from obtaining biometric information about the user of the device; and
in response to detecting the correction to the error condition, displaying the home affordance on the display at the third location in the user interface.
7. The electronic device of claim 1, the one or more programs further comprising instructions for:
detecting an input at the location proximate to the first portion of the electronic device; and
in response to detecting the input at the location proximate to the first portion of the electronic device, displaying the second error indication at a different location on the display, wherein the different location is a location at which the input was not detected.
8. The electronic device of claim 1, the one or more programs further comprising instructions for:
displaying a first transaction interface on the display at a location proximate to the first portion of the electronic device.
9. The electronic device of claim 8, wherein displaying the first transaction interface includes displaying an animation of the first transaction interface transitioning from an initial position that is substantially centered with respect to the display to the position proximate to the first portion of the electronic device.
10. The electronic device of claim 1, the one or more programs further comprising instructions for:
displaying, on the display, a prompt to provide one or more activations of a hardware button of the electronic device; receiving the one or more activations of the hardware button of the electronic device; and
in response to receiving the one or more activations of the hardware button, displaying an authentication progress indicator on the display, wherein displaying the authentication progress indicator comprises displaying an animation of the authentication progress indicator transitioning from a position of the prompt to a final position of the authentication progress indicator.
11. The electronic device of claim 1, the one or more programs further comprising instructions for:
concurrently displaying a first application in a first area and a second application in a second area on the display, the second application adjacent to the first application;
displaying a second transaction interface on the display; in accordance with a determination that the second transaction interface corresponds to the first application, modifying a first visual characteristic of the first application; and
in accordance with a determination that the second transaction interface corresponds to the second application, modifying a first visual characteristic of the second application.
12. The electronic device of claim 11, wherein:
modifying the first visual characteristic of the first application comprises modifying a second visual characteristic of the second application, and modifying the first visual characteristic of the second application comprises modifying a second visual characteristic of the first application.
13. The electronic device of claim 11, wherein:
modifying the first visual characteristic of the first application includes, in accordance with a determination that the second area is closer to the first portion of the electronic device than the first area, displaying the first application in the second area, and
modifying the first visual characteristic of the second application includes, in accordance with a determination that the first region is closer to the first portion of the electronic device than the second region, displaying the second application in the first region.
14. The electronic device of claim 11, wherein displaying the second transaction interface comprises:
in accordance with a determination that the second transaction interface corresponds to the first application, the second transaction interface includes an indication of the first application, and
in accordance with a determination that the second transaction interface corresponds to the second application, the second transaction interface includes an indication of the second application.
15. A non-transitory computer-readable storage medium storing one or more programs configured for execution by one or more processors of an electronic device with a display and a biometric sensor located at a first portion of the electronic device, the one or more programs comprising instructions for:
detecting whether an error condition exists that prevents the biometric sensor from obtaining biometric information about a user of the device while the device is locked;
in response to detecting the presence of the error condition, concurrently displaying on the display:
a first error indication in a first region of a user interface; and
a second error indication, wherein the second error indication is displayed at a location proximate to the first portion of the electronic device, comprising:
in accordance with a determination that the user interface of the electronic device is in a first orientation relative to the biometric sensor, displaying the second error indication at a first location in the user interface proximate to the first portion of the electronic device; and
in accordance with a determination that the user interface of the electronic device is in a second orientation relative to the biometric sensor, displaying the second error indication at a second location in the user interface proximate to the first portion of the electronic device, the first orientation being different from the second orientation; and
after the error condition has been corrected, obtaining biometric information about a user of the device, and in response to obtaining the biometric information about the user of the device, in accordance with a determination that the biometric information matches registered biometric information, unlocking the device and displaying an indication in the first region of the user interface that the device has been unlocked.
16. The non-transitory computer readable storage medium of claim 15, wherein the second error indication includes a biometric sensor occlusion icon and a reticle, the second error indication providing an indication that the biometric sensor is occluded.
17. The non-transitory computer readable storage medium of claim 15, the one or more programs further comprising instructions for:
in attempting to obtain biometric information using the biometric sensor:
displaying a first progress indicator on the display, comprising:
in accordance with a determination that the user interface of the electronic device is in a third orientation relative to the biometric sensor, wherein the user interface in the third orientation has a first top side, displaying the first progress indicator proximate to the first top side of the user interface in the third orientation; and
in accordance with a determination that the user interface of the electronic device is in a fourth orientation relative to the biometric sensor, wherein the user interface in the fourth orientation has a second top side, displaying the first progress indicator proximate to the second top side of the user interface in the fourth orientation, wherein the third orientation is different from the fourth orientation.
18. The non-transitory computer readable storage medium of claim 15, the one or more programs further comprising instructions for:
displaying a second progress indicator of the electronic device on the display, the second progress indicator being an animation having a first portion and a second portion different from the first portion, comprising:
in accordance with a determination that the second progress indicator is displayed at the location proximate to the first portion of the electronic device, displaying the second error indication as a portion of the animation that follows the first portion and precedes the second portion.
19. The non-transitory computer readable storage medium of claim 15, the one or more programs further comprising instructions for:
displaying a home affordance at a third location in the user interface on the display; and
in accordance with a determination that the second error indication is displayed at the third location, displaying the second error indication at the third location while ceasing to display the home affordance.
20. The non-transitory computer readable storage medium of claim 19, the one or more programs further comprising instructions for:
after ceasing to display the home affordance, detecting a correction to the error condition that prevents the biometric sensor from obtaining biometric information about the user of the device; and
in response to detecting the correction to the error condition, displaying the home affordance on the display at the third location in the user interface.
21. The non-transitory computer readable storage medium of claim 15, the one or more programs further comprising instructions for:
detecting an input at the location proximate to the first portion of the electronic device; and
in response to detecting the input at the location proximate to the first portion of the electronic device, displaying the second error indication at a different location on the display, wherein the different location is a location at which the input was not detected.
22. The non-transitory computer readable storage medium of claim 15, the one or more programs further comprising instructions for:
displaying a first transaction interface on the display at a location proximate to the first portion of the electronic device.
23. The non-transitory computer-readable storage medium of claim 22, wherein displaying the first transaction interface includes displaying an animation of the first transaction interface transitioning from an initial position that is substantially centered with respect to the display to the position proximate to the first portion of the electronic device.
24. The non-transitory computer readable storage medium of claim 15, the one or more programs further comprising instructions for:
displaying, on the display, a prompt to provide one or more activations of a hardware button of the electronic device; receiving the one or more activations of the hardware button of the electronic device; and
in response to receiving the one or more activations of the hardware button, displaying an authentication progress indicator on the display, wherein displaying the authentication progress indicator comprises displaying an animation of the authentication progress indicator transitioning from a position of the prompt to a final position of the authentication progress indicator.
25. The non-transitory computer readable storage medium of claim 15, the one or more programs further comprising instructions for:
concurrently displaying a first application in a first area and a second application in a second area on the display, the second application adjacent to the first application;
displaying a second transaction interface on the display; in accordance with a determination that the second transaction interface corresponds to the first application, modifying a first visual characteristic of the first application; and
in accordance with a determination that the second transaction interface corresponds to the second application, modifying a first visual characteristic of the second application.
26. The non-transitory computer-readable storage medium of claim 25, wherein:
modifying the first visual characteristic of the first application comprises modifying a second visual characteristic of the second application, and modifying the first visual characteristic of the second application comprises modifying a second visual characteristic of the first application.
27. The non-transitory computer-readable storage medium of claim 25, wherein:
modifying the first visual characteristic of the first application includes, in accordance with a determination that the second area is closer to the first portion of the electronic device than the first area, displaying the first application in the second area, and
modifying the first visual characteristic of the second application includes, in accordance with a determination that the first region is closer to the first portion of the electronic device than the second region, displaying the second application in the first region.
28. The non-transitory computer-readable storage medium of claim 25, wherein displaying the second transaction interface comprises:
in accordance with a determination that the second transaction interface corresponds to the first application, the second transaction interface includes an indication of the first application, and
in accordance with a determination that the second transaction interface corresponds to the second application, the second transaction interface includes an indication of the second application.
29. A method, comprising:
at an electronic device having a display and a biometric sensor located at a first portion of the electronic device:
detecting whether an error condition exists that prevents the biometric sensor from obtaining biometric information about a user of the device while the device is locked;
in response to detecting the presence of the error condition, concurrently displaying on the display:
a first error indication in a first region of a user interface; and
a second error indication, wherein the second error indication is displayed at a location proximate to the first portion of the electronic device, comprising:
in accordance with a determination that the user interface of the electronic device is in a first orientation relative to the biometric sensor, displaying the second error indication at a first location in the user interface proximate to the first portion of the electronic device; and
in accordance with a determination that the user interface of the electronic device is in a second orientation relative to the biometric sensor, displaying the second error indication at a second location in the user interface proximate to the first portion of the electronic device, the first orientation being different from the second orientation; and
after the error condition has been corrected, obtaining biometric information about a user of the device, and in response to obtaining the biometric information about the user of the device, in accordance with a determination that the biometric information matches registered biometric information, unlocking the device and displaying an indication in the first region of the user interface that the device has been unlocked.
30. The method of claim 29, wherein the second error indication includes a biometric sensor occlusion icon and a reticle, the second error indication providing an indication that the biometric sensor is occluded.
31. The method of claim 29, further comprising:
in attempting to obtain biometric information using the biometric sensor:
displaying a first progress indicator on the display, comprising:
in accordance with a determination that the user interface of the electronic device is in a third orientation relative to the biometric sensor, wherein the user interface in the third orientation has a first top side, displaying the first progress indicator proximate to the first top side of the user interface in the third orientation; and
in accordance with a determination that the user interface of the electronic device is in a fourth orientation relative to the biometric sensor, wherein the user interface in the fourth orientation has a second top side, displaying the first progress indicator proximate to the second top side of the user interface in the fourth orientation, wherein the third orientation is different from the fourth orientation.
32. The method of claim 29, further comprising:
displaying a second progress indicator of the electronic device on the display, the second progress indicator being an animation having a first portion and a second portion different from the first portion, comprising:
in accordance with a determination that the second progress indicator is displayed at the location proximate to the first portion of the electronic device, displaying the second error indication as a portion of the animation that follows the first portion and precedes the second portion.
33. The method of claim 29, further comprising:
displaying a home affordance at a third location in the user interface on the display; and
in accordance with a determination that the second error indication is displayed at the third location, displaying the second error indication at the third location while ceasing to display the home affordance.
34. The method of claim 33, further comprising:
after ceasing to display the home affordance, detecting a correction to the error condition that prevents the biometric sensor from obtaining biometric information about the user of the device; and
in response to detecting the correction to the error condition, displaying the home affordance on the display at the third location in the user interface.
35. The method of claim 29, further comprising:
detecting an input at the location proximate to the first portion of the electronic device; and
in response to detecting the input at the location proximate to the first portion of the electronic device, displaying the second error indication at a different location on the display, wherein the different location is a location at which the input was not detected.
36. The method of claim 29, further comprising:
displaying a first transaction interface on the display at a location proximate to the first portion of the electronic device.
37. The method of claim 36, wherein displaying the first transaction interface includes displaying an animation of the first transaction interface transitioning from an initial position that is substantially centered with respect to the display to the position proximate to the first portion of the electronic device.
38. The method of claim 29, further comprising:
displaying, on the display, a prompt to provide one or more activations of a hardware button of the electronic device; receiving the one or more activations of the hardware button of the electronic device; and
in response to receiving the one or more activations of the hardware button, displaying an authentication progress indicator on the display, wherein displaying the authentication progress indicator comprises displaying an animation of the authentication progress indicator transitioning from a position of the prompt to a final position of the authentication progress indicator.
39. The method of claim 29, further comprising:
concurrently displaying a first application in a first area and a second application in a second area on the display, the second application adjacent to the first application;
displaying a second transaction interface on the display; in accordance with a determination that the second transaction interface corresponds to the first application, modifying a first visual characteristic of the first application; and
in accordance with a determination that the second transaction interface corresponds to the second application, modifying a first visual characteristic of the second application.
40. The method of claim 39, wherein:
modifying the first visual characteristic of the first application comprises modifying a second visual characteristic of the second application, and modifying the first visual characteristic of the second application comprises modifying a second visual characteristic of the first application.
41. The method of claim 39, wherein:
modifying the first visual characteristic of the first application includes, in accordance with a determination that the second area is closer to the first portion of the electronic device than the first area, displaying the first application in the second area, and
modifying the first visual characteristic of the second application includes, in accordance with a determination that the first region is closer to the first portion of the electronic device than the second region, displaying the second application in the first region.
42. The method of claim 39, wherein displaying the second transaction interface comprises:
in accordance with a determination that the second transaction interface corresponds to the first application, the second transaction interface includes an indication of the first application, and
in accordance with a determination that the second transaction interface corresponds to the second application, the second transaction interface includes an indication of the second application.
CN201911199010.1A 2017-09-09 2018-09-01 Implementation of biometric authentication Pending CN111258461A (en)

Applications Claiming Priority (27)

Application Number Priority Date Filing Date Title
US201762556413P 2017-09-09 2017-09-09
US62/556,413 2017-09-09
US201762557130P 2017-09-11 2017-09-11
US62/557,130 2017-09-11
DKPA201770714 2017-09-22
DKPA201770712A DK201770712A1 (en) 2017-09-09 2017-09-22 Implementation of biometric authentication
DKPA201770713A DK201770713A1 (en) 2017-09-09 2017-09-22 Implementation of biometric authentication
DKPA201770713 2017-09-22
DKPA201770714A DK179695B1 (en) 2017-09-09 2017-09-22 Implementation of biometric authentication
DKPA201770715 2017-09-22
DKPA201770712 2017-09-22
DKPA201770715A DK179696B1 (en) 2017-09-09 2017-09-22 Implementation of biometric authentication
US201762581025P 2017-11-02 2017-11-02
US62/581,025 2017-11-02
PCT/US2018/015603 WO2018226265A1 (en) 2017-09-09 2018-01-26 Implementation of biometric authentication
USPCT/US2018/015603 2018-01-26
US15/894,221 2018-02-12
US15/894,221 US10410076B2 (en) 2017-09-09 2018-02-12 Implementation of biometric authentication
US15/903,456 US10395128B2 (en) 2017-09-09 2018-02-23 Implementation of biometric authentication
US15/903,456 2018-02-23
US201862679955P 2018-06-03 2018-06-03
US62/679,955 2018-06-03
DKPA201870370 2018-06-12
DKPA201870370A DK179714B1 (en) 2017-09-09 2018-06-12 Implementation of biometric authentication
DKPA201870371A DK179715B1 (en) 2017-09-09 2018-06-12 Implementation of biometric authentication
DKPA201870371 2018-06-12
CN201880003211.7A CN110100249A (en) 2017-09-09 2018-09-01 Implementation of biometric authentication

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201880003211.7A Division CN110100249A (en) 2017-09-09 2018-09-01 Implementation of biometric authentication

Publications (1)

Publication Number Publication Date
CN111258461A true CN111258461A (en) 2020-06-09

Family

ID=65817653

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201880003211.7A Pending CN110100249A (en) 2017-09-09 2018-09-01 Implementation of biometric authentication
CN201911199010.1A Pending CN111258461A (en) 2017-09-09 2018-09-01 Implementation of biometric authentication

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201880003211.7A Pending CN110100249A (en) 2017-09-09 2018-09-01 Implementation of biometric authentication

Country Status (5)

Country Link
EP (1) EP3528173A1 (en)
JP (2) JP6792056B2 (en)
KR (2) KR102103866B1 (en)
CN (2) CN110100249A (en)
AU (3) AU2018312629B2 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10810293B2 (en) * 2018-10-16 2020-10-20 Motorola Solutions, Inc. Method and apparatus for dynamically adjusting biometric user authentication for accessing a communication device
CN112449088A (en) * 2019-08-30 2021-03-05 Huawei Technologies Co., Ltd. Camera control method and device and terminal equipment
CN111027978B (en) * 2019-12-10 2023-05-02 Tencent Technology (Shenzhen) Co., Ltd. Payment method, device, computer-readable storage medium and computer apparatus
JP7174730B2 (en) * 2020-03-17 2022-11-17 Yahoo Japan Corporation Terminal device, information processing method and information processing program
KR20210121548A (en) * 2020-03-30 2021-10-08 삼성전자주식회사 Electronic device for displaying guide information for inducing user input on set location and a method for the same
KR102199137B1 (en) * 2020-05-12 2021-01-06 스티븐 상근 오 Managing method, apparatus and program for management object using dual biometric authentication
WO2021261267A1 (en) * 2020-06-26 2021-12-30 Sony Group Corporation Information processing device, information processing method, information processing program, and information processing system
CN112580434B (en) * 2020-11-25 2024-03-15 Orbbec Inc. Face false detection optimization method and system based on depth camera and face detection equipment
US20240094895A1 (en) * 2021-04-28 2024-03-21 Google Llc Systems and methods for efficient multimodal input collection with mobile devices
WO2023230290A1 (en) * 2022-05-26 2023-11-30 Apple Inc. Devices, methods, and graphical user interfaces for user authentication and device management
WO2024003989A1 (en) * 2022-06-27 2024-01-04 NEC Corporation Information processing system, information processing method, and recording medium

Citations (6)

Publication number Priority date Publication date Assignee Title
CN104361302A (en) * 2014-10-21 2015-02-18 Tianjin Samsung Electronics Co., Ltd. Method for protecting private information based on communication equipment and communication equipment
CN104539924A (en) * 2014-12-03 2015-04-22 Shenzhen Estar Technology Group Co., Ltd. Holographic display method and holographic display device based on eye tracking
US20160294557A1 (en) * 2015-04-01 2016-10-06 Northrop Grumman Systems Corporation System and method for providing an automated biometric enrollment workflow
CN106020436A (en) * 2015-03-31 2016-10-12 Fujitsu Limited Image analyzing apparatus and image analyzing method
CN106503514A (en) * 2016-09-28 2017-03-15 Beijing Yonyou Government Affairs Software Co., Ltd. Unlocking method and system for an electronic terminal device based on iris recognition
AU2017100556A4 (en) * 2016-06-12 2017-06-15 Apple Inc. User interfaces for transactions

Family Cites Families (24)

Publication number Priority date Publication date Assignee Title
JP4963388B2 (en) * 2006-09-12 2012-06-27 Hitachi-Omron Terminal Solutions, Corp. Biometric authentication device and operation guidance notification method
JP5110983B2 (en) * 2007-06-29 2012-12-26 Hitachi-Omron Terminal Solutions, Corp. Biometric authentication processing system
JP5084712B2 (en) * 2008-12-24 2012-11-28 Hitachi-Omron Terminal Solutions, Corp. User authentication terminal, authentication system, user authentication method, and user authentication program
JP5816677B2 (en) * 2009-10-16 2015-11-18 Hitachi-Omron Terminal Solutions, Corp. Biometric authentication device and biometric authentication method
JP2011097287A (en) * 2009-10-28 2011-05-12 Nikon Corp Camera
US8736561B2 (en) * 2010-01-06 2014-05-27 Apple Inc. Device, method, and graphical user interface with content display modes and display rotation heuristics
JP5023193B2 (en) * 2010-06-28 2012-09-12 Toshiba Corporation Information processing device
US10489570B2 (en) * 2011-09-09 2019-11-26 Google Llc Preventing computing device from timing out
JP5869316B2 (en) * 2011-11-25 2016-02-24 Kyocera Corporation Portable electronic devices
JP5715968B2 (en) * 2012-01-23 2015-05-13 Fujitsu Frontech Limited Bidding apparatus, bidding system, and bidding method
KR101443960B1 (en) * 2012-02-22 2014-11-03 주식회사 팬택 Electronic device and method for user identification
US9177130B2 (en) * 2012-03-15 2015-11-03 Google Inc. Facial feature detection
US8254647B1 (en) * 2012-04-16 2012-08-28 Google Inc. Facial image quality assessment
US10073541B1 (en) * 2012-06-22 2018-09-11 Amazon Technologies, Inc. Indicators for sensor occlusion
US9202099B2 (en) 2012-06-29 2015-12-01 Apple Inc. Fingerprint sensing and enrollment
US9832189B2 (en) * 2012-06-29 2017-11-28 Apple Inc. Automatic association of authentication credentials with biometrics
KR101443021B1 (en) 2013-03-08 2014-09-22 주식회사 슈프리마 Apparatus and method for registering face, and Apparatus for guiding pose, and Apparatus for recognizing face
US9898642B2 (en) * 2013-09-09 2018-02-20 Apple Inc. Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs
KR20150029495A (en) * 2013-09-10 2015-03-18 삼성전자주식회사 Method and apparatus for outputting recognized error of sensor in a electronic device
KR101773233B1 (en) * 2014-07-25 2017-09-12 이동광 Managing method for repeating fail of biometric recognition
US20160224973A1 (en) * 2015-02-01 2016-08-04 Apple Inc. User interface for payments
KR102338864B1 (en) * 2015-02-12 2021-12-13 삼성전자주식회사 Electronic device and method for registration finger print
US10210318B2 (en) 2015-12-09 2019-02-19 Daon Holdings Limited Methods and systems for capturing biometric data
JP2017138846A (en) * 2016-02-04 2017-08-10 レノボ・シンガポール・プライベート・リミテッド Information processing apparatus, display method by the same, and computer-executable program


Also Published As

Publication number Publication date
EP3528173A1 (en) 2019-08-21
JP2019204494A (en) 2019-11-28
AU2018312629A1 (en) 2019-03-28
JP6792056B2 (en) 2020-11-25
JP6812483B2 (en) 2021-01-13
KR20190029742A (en) 2019-03-20
AU2020200795B2 (en) 2020-03-12
AU2020200795A1 (en) 2020-02-20
AU2018312629B2 (en) 2019-11-21
KR102103866B1 (en) 2020-04-24
KR102056033B1 (en) 2019-12-13
JP2020500344A (en) 2020-01-09
KR20190029706A (en) 2019-03-20
CN110100249A (en) 2019-08-06
AU2020203899A1 (en) 2020-07-02

Similar Documents

Publication Publication Date Title
CN111274562B (en) Electronic device, method, and medium for biometric enrollment
JP6945697B2 (en) Implementation of biometrics
US11393258B2 (en) Implementation of biometric authentication
US20220342972A1 (en) Implementation of biometric authentication
KR102103866B1 (en) Implementation of biometric authentication
CN112243510A (en) Implementation of biometric authentication
CN110032849B (en) Implementation of biometric authentication
JP7386214B2 (en) Implementation of biometric authentication
DK179714B1 (en) Implementation of biometric authentication
AU2022203027A1 (en) Implementation of biometric authentication

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination