WO2017132061A1 - Authentication via photoplethysmography - Google Patents


Info

Publication number
WO2017132061A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
user
cardiac
computing device
electronic device
Prior art date
Application number
PCT/US2017/014312
Other languages
French (fr)
Inventor
Ari Juels
Original Assignee
Pcms Holdings, Inc.
Priority date
Filing date
Publication date
Application filed by Pcms Holdings, Inc. filed Critical Pcms Holdings, Inc.
Publication of WO2017132061A1 publication Critical patent/WO2017132061A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0062Arrangements for scanning
    • A61B5/0064Body surface scanning
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02416Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/117Identification of persons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/34User authentication involving the use of external additional devices, e.g. dongles or smart cards
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/06Authentication
    • H04W12/065Continuous authentication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/30Security of mobile devices; Security of mobile applications
    • H04W12/33Security of mobile devices; Security of mobile applications using wearable devices, e.g. using a smartwatch or smart-glasses
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/318Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A61B5/346Analysis of electrocardiograms
    • A61B5/349Detecting specific parameters of the electrocardiograph cycle
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/681Wristwatch-type devices
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6898Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0861Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan

Definitions

  • Sensitive information is increasingly being saved electronically and is accessible through various computing devices, such as computers, tablets, and smart phones. Access to sensitive information is controlled via authentication methods, such as usernames and passwords, fingerprints, facial recognition, two-part authentication, and the like.
  • a method includes authenticating a wearer of a wearable device; operating the wearable device to collect a first set of cardiac data from the wearer; capturing video data of a user of a target computing device; extracting a second set of cardiac data from the captured video data; and authenticating the user of the target computing device by a method that includes comparing the first set of cardiac data to the second set of cardiac data.
  • Another embodiment takes the form of a method of authentication, the method including: measuring, via a wearable electronic device, a first set of biometric parameter data; measuring, via a camera associated with a computing device, a second set of biometric parameter data; comparing the first and second sets of biometric parameter data; and, responsive to the first and second sets of biometric parameter data being associated with the same user, outputting an authentication message.
  • FIG. 1A depicts an example communications system in which one or more disclosed embodiments may be implemented.
  • FIG. 1B depicts an example electronic device that may be used within the communications system of FIG. 1A.
  • FIG. 1C depicts an example network entity 190 that may be used within the communication system 100 of FIG. 1A.
  • FIG. 2 depicts a first method, in accordance with an embodiment.
  • FIG. 3 depicts a block diagram of a system, in accordance with an embodiment.
  • FIG. 4A depicts an example method to recover cardiac data, in accordance with an embodiment.
  • FIGs. 4B-C depict example steps for the recovery of cardiac data from video data that may be used in some embodiments.
  • FIG. 5 depicts a user in relation to a computing device and a wearable electronic device, in accordance with an embodiment.
  • FIG. 6 is a functional block diagram illustrating components of an authentication system as described herein.
  • FIG. 7 is a flow diagram illustrating a method performed by the exemplary system of FIG. 6.
  • FIG. 8 depicts a second method, in accordance with an embodiment.
  • FIG. 1A is a diagram of an example communications system 100 in which one or more disclosed embodiments may be implemented.
  • the communications system 100 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, and the like, to multiple wireless users.
  • the communications system 100 may enable multiple wired and wireless users to access such content through the sharing of system resources, including wired and wireless bandwidth.
  • the communications systems 100 may employ one or more channel-access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), and the like.
  • the communications systems 100 may also employ one or more wired communications standards (e.g., Ethernet, DSL, radio frequency (RF) over coaxial cable, fiber optics, and the like).
  • the communications system 100 may include electronic devices 102a, 102b, 102c, and/or 102d, Radio Access Networks (RAN) 103/104/105, a core network 106/107/109, a public switched telephone network (PSTN) 108, the Internet 110, and other networks 112, and communication links 115/116/117, and 119, though it will be appreciated that the disclosed embodiments contemplate any number of electronic devices, base stations, networks, and/or network elements.
  • Each of the electronic devices 102a, 102b, 102c, 102d may be any type of device configured to operate and/or communicate in a wired or wireless environment.
  • the electronic device 102a is depicted as a tablet computer
  • the electronic device 102b is depicted as a smart phone
  • the electronic device 102c is depicted as a computer
  • the electronic device 102d is depicted as a television, although certainly other types of devices could be utilized.
  • the communications systems 100 may also include a base station 114a and a base station 114b.
  • Each of the base stations 114a, 114b may be any type of device configured to wirelessly interface with at least one of the electronic devices 102a, 102b, 102c, 102d to facilitate access to one or more communication networks, such as the core network 106/107/109, the Internet 110, and/or the networks 112.
  • the base stations 114a, 114b may be a base transceiver station (BTS), a Node-B, an eNode B, a Home Node B, a Home eNode B, a site controller, an access point (AP), a wireless router, and the like. While the base stations 114a, 114b are each depicted as a single element, it will be appreciated that the base stations 114a, 114b may include any number of interconnected base stations and/or network elements.
  • the base station 114a may be part of the RAN 103/104/105, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, and the like.
  • the base station 114a and/or the base station 114b may be configured to transmit and/or receive wireless signals within a particular geographic region, which may be referred to as a cell (not shown).
  • the cell may further be divided into sectors.
  • the cell associated with the base station 114a may be divided into three sectors.
  • the base station 114a may include three transceivers, i.e., one for each sector of the cell.
  • the base station 114a may employ multiple-input multiple-output (MIMO) technology and, therefore, may utilize multiple transceivers for each sector of the cell.
  • the base stations 114a, 114b may communicate with one or more of the electronic devices 102a, 102b, 102c, and 102d over an air interface 115/116/117, or communication link 119, which may be any suitable wired or wireless communication link (e.g., radio frequency (RF), microwave, infrared (IR), ultraviolet (UV), visible light, and the like).
  • the air interface 115/116/117 may be established using any suitable radio access technology (RAT).
  • the communications system 100 may be a multiple access system and may employ one or more channel-access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like.
  • the base station 114a in the RAN 103/104/105 and the electronic devices 102a, 102b, 102c may implement a radio technology such as Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (UTRA), which may establish the air interface 115/116/117 using wideband CDMA (WCDMA).
  • WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+).
  • HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA).
  • the base station 114a and the electronic devices 102a, 102b, 102c may implement a radio technology such as Evolved UMTS Terrestrial Radio Access (E-UTRA), which may establish the air interface 115/116/117 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A).
  • the base station 114a and the electronic devices 102a, 102b, 102c may implement radio technologies such as IEEE 802.16 (i.e., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 1X, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and the like.
  • the base station 114b in FIG. 1A may be a wired router, a wireless router, a Home Node B, or a Home eNode B, as examples.
  • the base station 114b and the electronic devices 102c, 102d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN).
  • the base station 114b and the electronic devices 102c, 102d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN).
  • the base station 114b and the electronic devices 102c, 102d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, and the like) to establish a picocell or femtocell.
  • the base station 114b communicates with electronic devices 102a, 102b, 102c, and 102d through communication links 119. As shown in FIG. 1A, the base station 114b may have a direct connection to the Internet 110. Thus, the base station 114b may not be required to access the Internet 110 via the core network 106/107/109.
  • the RAN 103/104/105 may be in communication with the core network 106/107/109, which may be any type of network configured to provide voice, data, applications, and/or voice over internet protocol (VoIP) services to one or more of the electronic devices 102a, 102b, 102c, 102d.
  • the core network 106/107/109 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, and the like, and/or perform high-level security functions, such as user authentication.
  • the RAN 103/104/105 and/or the core network 106/107/109 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 103/104/105 or a different RAT.
  • the core network 106/107/109 may also be in communication with another RAN (not shown) employing a GSM radio technology.
  • the core network 106/107/109 may also serve as a gateway for the electronic devices 102a, 102b, 102c, 102d to access the PSTN 108, the Internet 110, and/or other networks 112.
  • the PSTN 108 may include circuit-switched telephone networks that provide plain old telephone service (POTS).
  • the Internet 110 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP) and IP in the TCP/IP Internet protocol suite.
  • the networks 112 may include wired and/or wireless communications networks owned and/or operated by other service providers.
  • the networks 112 may include another core network connected to one or more RANs, which may employ the same RAT as the RAN 103/104/105 or a different RAT.
  • Some or all of the electronic devices 102a, 102b, 102c, 102d in the communications system 100 may include multi-mode capabilities, i.e., the electronic devices 102a, 102b, 102c, 102d may include multiple transceivers for communicating with different wired or wireless networks over different communication links.
  • the electronic device 102c shown in FIG. 1A may be configured to communicate with the base station 114a, which may employ a cellular-based radio technology, and with the base station 114b, which may employ an IEEE 802 radio technology.
  • FIG. 1B depicts an example electronic device that may be used within the communications system of FIG. 1A.
  • FIG. 1B is a system diagram of an example electronic device 102.
  • the electronic device 102 may include a processor 118, a transceiver 120, a transmit/receive element 122, a speaker/microphone 124, a keypad 126, a display/touchpad 128, a non-removable memory 130, a removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and other peripherals 138.
  • the electronic device 102 may represent any of the electronic devices 102a, 102b, 102c, and 102d, and include any sub-combination of the foregoing elements while remaining consistent with an embodiment.
  • the base stations 114a and 114b, and/or the nodes that base stations 114a and 114b may represent, such as but not limited to a base transceiver station (BTS), a Node-B, a site controller, an access point (AP), a home node-B, an evolved node-B (eNodeB), a home evolved node-B (HeNB), a home evolved node-B gateway, and proxy nodes, among others, may include some or all of the elements depicted in FIG. 1B and described herein.
  • the processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like.
  • the processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the electronic device 102 to operate in a wired or wireless environment.
  • the processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122. While FIG. 1B depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.
  • the transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 114a) over the air interface 115/116/117 or communication link 119.
  • the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals.
  • the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, as examples.
  • the transmit/receive element 122 may be configured to transmit and receive both RF and light signals.
  • the transmit/receive element may be a wired communication port, such as an Ethernet port.
  • the transmit/receive element 122 may be configured to transmit and/or receive any combination of wired or wireless signals.
  • Although the transmit/receive element 122 is depicted in FIG. 1B as a single element, the electronic device 102 may include any number of transmit/receive elements 122. More specifically, the electronic device 102 may employ MIMO technology. Thus, in one embodiment, the electronic device 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 115/116/117.
  • the transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122.
  • the electronic device 102 may have multi-mode capabilities.
  • the transceiver 120 may include multiple transceivers for enabling the electronic device 102 to communicate via multiple RATs, such as UTRA and IEEE 802.11, as examples.
  • the processor 118 of the electronic device 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit).
  • the processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128.
  • the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132.
  • the non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device.
  • the removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like.
  • the processor 118 may access information from, and store data in, memory that is not physically located on the electronic device 102, such as on a server or a home computer (not shown).
  • the processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the electronic device 102.
  • the power source 134 may be any suitable device for powering the electronic device 102.
  • the power source 134 may include one or more dry cell batteries (e.g., nickel- cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), and the like), solar cells, fuel cells, a wall outlet and the like.
  • the processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the electronic device 102.
  • the electronic device 102 may receive location information over the air interface 115/116/117 from a base station (e.g., base stations 114a, 114b) and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the electronic device 102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment. In accordance with an embodiment, the electronic device 102 does not comprise a GPS chipset and does not acquire location information.
  • the processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity.
  • the peripherals 138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands-free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, a thermometer, a barometer, an altimeter, an air sampler, a light detector, a compass, a humidity detector, and the like.
  • the various peripherals may be configured to detect surrounding events in order to capture video and audio streams and associated contextual information.
  • FIG. 1C depicts an example network entity 190 that may be used within the communication system 100 of FIG. 1A.
  • network entity 190 includes a communication interface 192, a processor 194, and non-transitory data storage 196, all of which are communicatively linked by a bus, network, or other communication path 198.
  • Communication interface 192 may include one or more wired communication interfaces and/or one or more wireless-communication interfaces. With respect to wired communication, communication interface 192 may include one or more interfaces such as Ethernet interfaces, as an example. With respect to wireless communication, communication interface 192 may include components such as one or more antennae, one or more transceivers/chipsets designed and configured for one or more types of wireless (e.g., LTE) communication, and/or any other components deemed suitable by those of skill in the relevant art. And further with respect to wireless communication, communication interface 192 may be equipped at a scale and with a configuration appropriate for acting on the network side (as opposed to the client side) of wireless communications (e.g., LTE communications, Wi-Fi communications, and the like). Thus, communication interface 192 may include the appropriate equipment and circuitry (perhaps including multiple transceivers) for serving multiple mobile stations, UEs, or other access terminals in a coverage area.
  • Processor 194 may include one or more processors of any type deemed suitable by those of skill in the relevant art, some examples including a general-purpose microprocessor and a dedicated DSP.
  • Data storage 196 may take the form of any non-transitory computer-readable medium or combination of such media, some examples including flash memory, read-only memory (ROM), and random-access memory (RAM), to name but a few; any one or more types of non-transitory data storage deemed suitable by those of skill in the relevant art could be used.
  • data storage 196 contains program instructions 197 executable by processor 194 for carrying out various combinations of the various network-entity functions described herein.
  • the network-entity functions described herein are carried out by a network entity having a structure similar to that of network entity 190 of FIG. 1C. In some embodiments, one or more of such functions are carried out by a set of multiple network entities in combination, where each network entity has a structure similar to that of network entity 190 of FIG. 1C.
  • network entity 190 is, or at least includes, one or more of the encoders, (one or more entities in) RAN 103, (one or more entities in) RAN 104, (one or more entities in) RAN 105, (one or more entities in) core network 106, (one or more entities in) core network 107, (one or more entities in) core network 109, base station 114a, base station 114b, Node-B 140a, Node-B 140b, Node-B 140c, RNC 142a, RNC 142b, MGW 144, MSC 146, SGSN 148, GGSN 150, eNode-B 160a, eNode-B 160b, eNode-B 160c, MME 162, serving gateway 164, PDN gateway 166, base station 180a, base station 180b, base station 180c, ASN gateway 182, MIP-HA 184, AAA 186, and gateway 188. And certainly other network entities and/or combinations of network entities could be used.
  • FIG. 2 depicts a first method, in accordance with an embodiment.
  • the method 200 includes measuring a first set of biometric parameter data via a wearable electronic device at step 202, measuring a second set of biometric parameter data via a video camera associated with a computing device at step 204, comparing the first and second sets of biometric parameter data at step 206, and responsive to a match between the first and second sets of biometric parameter data, outputting an authentication message at step 208.
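  • The four steps of method 200 can be sketched in code. The helper names and interval values below are illustrative assumptions, not taken from the patent; each set of biometric parameter data is modeled as a list of inter-pulse intervals in milliseconds, and the comparison uses a simple mean absolute difference against a tuned threshold.

```python
# Sketch of method 200 (hypothetical helper names; the patent does not
# prescribe a concrete API or metric).

def match_score(wearable_ipi, video_ipi):
    """Mean absolute difference between paired inter-pulse intervals (ms)."""
    diffs = [abs(a - b) for a, b in zip(wearable_ipi, video_ipi)]
    return sum(diffs) / len(diffs)

def authenticate(wearable_ipi, video_ipi, threshold_ms=25.0):
    """Steps 206-208: compare the two data sets and output a message."""
    if match_score(wearable_ipi, video_ipi) <= threshold_ms:
        return "authenticated"
    return "not authenticated"

# Step 202: first set, measured by the wearable device (illustrative values).
wearable = [812, 798, 845, 820, 790]
# Step 204: second set, extracted from the computing device's camera video.
video = [815, 801, 840, 818, 795]

print(authenticate(wearable, video))   # prints "authenticated"
```

A real system would also need to align the two series in time before comparing them; that alignment step is omitted here for brevity.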
  • the first set of biometric parameter data is measured by a wearable electronic device, such as a wrist-worn fitness bracelet, a smart watch, or a heart-rate monitor.
  • the measured biometric parameters include data relating to the cardiac activity of the person wearing the electronic device.
  • the wearable electronic device uses a built-in sensor to record the user's heartbeats and constructs a time series of the heartbeat data.
  • the time series of heartbeat data may include, for example, a series of values representing the peak time of each pulse.
  • the time values may be absolute or relative time values.
  • each time value may represent the actual time of day at which the pulse was detected, the elapsed time since the start of measurement, or the elapsed time since the previous pulse. Other representations of the time series data may also be used.
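  • As a concrete illustration of these representations, the same series of pulse peaks can be converted between absolute times, elapsed times, and inter-pulse intervals; the millisecond values below are illustrative, not from the patent.

```python
# Convert a series of absolute pulse-peak times (ms) into the two
# relative representations described above.

peaks_abs = [100, 910, 1705, 2530]          # absolute peak times (ms)

# Elapsed time since the start of measurement.
peaks_rel = [t - peaks_abs[0] for t in peaks_abs]

# Elapsed time since the previous pulse (inter-pulse intervals).
ipi = [b - a for a, b in zip(peaks_abs, peaks_abs[1:])]

print(peaks_rel)   # [0, 810, 1605, 2430]
print(ipi)         # [810, 795, 825]
```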
  • the time series of heartbeat data may include a time series of measured values that correlate with a user's pulse, sampled at a rate substantially higher than a user's heartrate.
  • the time series may represent a value of skin resistance, a value of red light intensity, or a value representing strain on a strain gauge, as a function of time.
  • These values may be filtered values (e.g. with components substantially outside the frequency range of ordinary heartrate signals being filtered out).
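  • A minimal sketch of such filtering follows, using a difference of two moving averages as a stand-in for a formal bandpass design (the patent does not specify a filter; the window sizes and signal values are assumptions for illustration).

```python
import math

def moving_average(x, w):
    """Moving average with window w, truncated at the edges."""
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - w // 2), min(len(x), i + w // 2 + 1)
        window = x[lo:hi]
        out.append(sum(window) / len(window))
    return out

def crude_bandpass(x, short_w=3, long_w=15):
    """Smooth with a short window to drop high-frequency noise, then
    subtract a long-window average to drop slow baseline drift."""
    smooth = moving_average(x, short_w)
    baseline = moving_average(x, long_w)
    return [s - b for s, b in zip(smooth, baseline)]

# A drifting, pulse-like signal (illustrative values only).
raw = [10 + 0.1 * i + math.sin(i / 2.0) for i in range(60)]
filtered = crude_bandpass(raw)
```

After filtering, the slow drift and DC offset are removed, leaving the oscillatory component that correlates with the pulse.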
  • the user is associated with the wearable electronic device.
  • a user may be associated with the wearable device in many different ways.
  • the user links a user profile to the wearable electronic device.
  • the user profile may be linked to the wearable electronic device by the user logging into wearable electronic device either directly, or via a computer communicatively coupled to the wearable electronic device.
  • the user is associated with the wearable device by placing the wearable electronic device on the user's body. When placing the wearable electronic device on the user's body, the user may enter a password or a PIN (personal identification number), scan a fingerprint, or the like.
  • the association may be continual for as long as the user is wearing the electronic device, ending when removal is sensed, e.g., by the clasp becoming undone or by an interruption in the sensed biometric parameters.
  • the association between the user and the electronic device occurs when the photoplethysmography authentication occurs.
  • the second set of biometric parameters are measured with an imaging video camera associated with a computing device and a physiological parameter extraction module.
  • the computing device may be a laptop computer with a built-in camera, a desktop computer with a wired or wirelessly connected camera, a smartphone or tablet, or the like.
  • the camera may be an RGB imaging camera.
  • the second set of biometric parameters is measured via non-contact photoplethysmography (PPG), the optical sensing of cardiac data.
  • the camera associated with the computing device is configured to operate as a video camera to detect cardiac activity of a user in the field of view of the camera.
  • An exemplary technique for the detection of cardiac data using a video camera and a physiological parameter extraction module is described in, for example, FIGs. 4A-4B and their accompanying description. Other techniques that may be used for the detection of cardiac data include those described in the references of Goldin, Jonathan, and Poh.
  • a user's cardiac activity includes inter-pulse intervals that are known to vary randomly over time, due to the non-linearly interacting processes in the parasympathetic nervous system that governs heart rhythms, as discussed in the references by Brownley and Nagel.
  • the cardiac data may be extracted from the video by the computing device, or the computing device may transmit the video data to a remote computer for extraction of the cardiac data.
  • the first and second sets of biometric parameter data are compared.
  • the comparison between the first and second sets of biometric parameters determines, under an appropriate distance metric, whether the first and second sets of biometric parameters are associated with the same user. This comparison determines whether the user visible in the computing device's camera is the same as the user of the wearable electronic device.
  • a positive authentication message is output in response to a determination that the first and second sets of biometric parameter data are associated with the same user.
  • the positive authentication message may be used by a variety of different services.
  • a service may provide local access to information on the computing device or may permit the user access to information stored remotely in response to a positive authentication message.
  • the matching module will output a negative authentication message in response to the two sets of biometric parameter data not matching. Responsive to receiving the negative authentication message, a security module associated with the computing device may prohibit access to the user, or an alternate form of authentication may be triggered.
  • FIG. 3 depicts a block diagram of a system, in accordance with an embodiment.
  • the system 300 includes a wearable electronic device 302 that includes a biometric sensor 312, a computing device 304 that includes a video camera 314, a matching module 306, and communication links 308-310.
  • the components of system 300 may be used to perform the method 200.
  • both the wearable electronic device 302 and the computing device 304 may be configured to operate and communicate with other devices similar to the electronic device 102 discussed in FIGs. 1A and 1B.
  • the wearable electronic device 302 may be any of a number of wearable electronic devices that is capable of measuring biometric parameters. To perform step 202 of the method 200, the wearable electronic device 302 measures the biometric data associated with the user. One example of measuring biometric parameters is measuring cardiac data associated with the user wearing the wearable electronic device 302. For example, the wearable electronic device 302 may include the biometric sensor 312 that is located adjacent to the user's skin and is configured to detect cardiac activity.
  • the computing device 304 may be used to perform step 204 of the method 200.
  • the computing device 304 may be any number of computing devices, including, but not limited to a desktop computer, a laptop computer, a tablet computer, a smart phone, an in-car computing system, and the like.
  • the computing device 304 also includes the video camera 314 that is associated with the computing device.
  • the video camera 314 may be a part of the computing device 304, for example a built-in webcam in a laptop, or the video camera 314 may be communicatively coupled to the computing device 304.
  • the video camera 314 may be a component of a camera conferencing system connected via a USB cable or wirelessly to the computing device 304.
  • the computing device 304 is configured to detect a user's cardiac activity via the associated camera via PPG.
  • the cardiac activity is detected by detecting the effects of a cardiovascular pulse wave on the user.
  • the wave changes the volume of a person's blood vessels and in turn affects the absorption of light by the person's skin.
  • the matching module 306 may be used to perform steps 206 and 208 of the method 200.
  • the matching module 306 receives the first set of biometric data from the wearable electronic device 302 via the communication link 308 and receives the second set of biometric data from the computing device 304 via the communication link 310.
  • the matching module 306 may be located as part of the computing device 304 or be part of a remote server, or other similar location as known by those with skill in the art.
  • the matching module 306 may receive either raw biometric data, or the biometric data may be further processed via the communication links 308-310.
  • the first set of biometric data received by the matching module 306 via the communication link 308 includes times of detected heartbeats associated with the user wearing the wearable electronic device 302.
  • the second set of biometric data received by the matching module 306 via the communication link 310 includes times of detected heartbeats associated with the user within the camera's field of view.
  • the conversion of raw data of detected biometric parameters into data used by the matching module may occur locally at the devices 302-304.
  • the matching module outputs an authentication message indicating whether or not the biometric data in the first set is a sufficient match to the biometric data in the second set.
  • the authentication message may be received by a security module associated with another computer that is responsible for executing security protocols for access to information stored on a computer. Additional biometric parameter data that may be compared by the matching module includes respiratory data.
  • electrooculography (EOG) may be used by a wearable electronic device (e.g. smart glasses or other headset) to measure electrical potentials caused by eye movements, and a separate computing device may use a video camera to measure eye movements, such that sufficient correlation between the EOG signals and eye movements detected in the video image leads to authentication of the user.
  • the matching module compares two different types of biometric parameters.
  • the wearable electronic device 302 detects and measures a user's cardiac activity and the computing device 304 detects and measures a user's respiratory activity.
  • the user's cardiac activity measured by the wearable electronic device 302 may be converted to an estimate of the user's respiratory activity based on detection of the respiratory sinus arrhythmia (RSA) (according to which heartrate increases during inhalation and decreases during exhalation).
  • the matching module compares the estimated respiratory activity (based on heartrate) with the measured respiratory activity to determine whether there is a sufficient match to authenticate the user.
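The respiratory-sinus-arrhythmia estimate described above might be sketched as follows: treat the inter-pulse-interval series as a signal, resample it onto a uniform grid, and take the dominant frequency in a respiratory band. The 4 Hz resampling rate, the 0.1-0.5 Hz band, and the function name are illustrative assumptions, not values from the source.

```python
# Sketch: estimate respiratory rate from heartbeat peak times via respiratory
# sinus arrhythmia (heartrate rises on inhalation, falls on exhalation), so
# the dominant low-frequency oscillation of the IPI series tracks breathing.
import numpy as np

def respiration_hz(peak_times, fs=4.0, band=(0.1, 0.5)):
    peak_times = np.asarray(peak_times, dtype=float)
    ipis = np.diff(peak_times)            # inter-pulse intervals
    stamps = peak_times[1:]               # timestamp of each interval
    # Resample the irregularly spaced IPI series at fs Hz
    grid = np.arange(stamps[0], stamps[-1], 1.0 / fs)
    uniform = np.interp(grid, stamps, ipis)
    uniform = uniform - uniform.mean()    # remove the DC component
    spectrum = np.abs(np.fft.rfft(uniform))
    freqs = np.fft.rfftfreq(len(uniform), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[in_band][np.argmax(spectrum[in_band])]
```

The matching module could then compare this estimate against the respiratory rate the camera measures directly.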
  • the communication links 308-310 may also be used to transmit additional authentication information to the matching module.
  • the additional authentication information may include information regarding the user authenticating with the wearable electronic device 302 or the computing device 304.
  • the matching module 306 may also match the additional authentication information.
  • the additional authentication information includes an identity of the user associated with the wearable electronic device 302.
  • the biometric data may be transmitted in batches, on a series basis, or a similar manner as known by those with skill in the art.
  • the electronic devices can measure and transmit biometric parameter data in any order, but a time-based overlap of the two biometric parameters is used to perform the match.
  • the wearable electronic device 302 does not continuously stream biometric parameter data.
  • the biometric parameter data may be transmitted from the wearable electronic device 302 in response to receiving a trigger from the computing device 304, the trigger indicating that the user is attempting to authenticate via PPG.
  • the communication links 308-310, and any other communication links between devices may be secure channels providing confidentiality and integrity between devices. Different channel-authentication requirements may be used between different devices.
  • the matching module 306 receives from a wearable electronic device 302 both verified credentials associated with the user and a first set of biometric data.
  • the matching module 306 receives from the computing device 304 a second set of biometric data.
  • the matching module 306 compares the first and second sets of biometric data, and responsive to the data matching outputs both an authentication message and the identity associated with the user of the wearable electronic device.
  • FIG. 4A depicts an example method to recover cardiac data, in accordance with an embodiment.
  • FIG. 4A depicts the example method 400.
  • the example method 400 includes a video camera detecting a region of interest at step 402, decomposing the region of interest into separate color channels at step 404, detrending and normalizing the raw color channel data at step 406, and applying independent component analysis at step 408.
  • the exemplary independent component analysis generates three separate source signals (indicated by reference number 458 in FIG. 4B), one of which (“Separated Source 2”) is determined to be a cardiac signal, specifically a blood volume pulse (BVP) signal.
  • the example method 400 may be accomplished by a video camera capable of capturing video data, communicatively coupled with a physiological parameter extraction module.
  • FIGs. 4B and 4C depict example steps of a method to recover cardiac data from video data in accordance with an embodiment.
  • FIGs. 4B and 4C are adapted from Poh, Ming-Zher, Daniel J. McDuff, and Rosalind W. Picard, "Advancements in noncontact, multiparameter physiological measurements using a webcam," Biomedical Engineering, IEEE Transactions on 58.1 (2011): 7-11, the entirety of which is incorporated herein by reference.
  • Other techniques for capturing cardiac data such as IPI times using video imaging may alternatively be used.
  • FIG. 4B depicts the set of steps 450, which illustrate the example method 400.
  • the video camera detects a region of interest 452.
  • the region of interest 452 is detected by software that is configured to identify the coordinates of a face's location in a video frame.
  • the video is captured in color, such as 24-bit RGB with three channels and 8 bits/channel at 15 frames per second with pixel resolution of 640 x 480.
  • the region of interest 452 is decomposed into separate color channels.
  • the separate color channels include the red channel 454a, the green channel 454b, and the blue channel 454c.
  • the data from each raw color channel is detrended and normalized to form raw color signals 456a-c for each color.
  • Detrending and normalizing each channel includes spatially averaging over all pixels in the region of interest to yield a red, blue, and green measurement point for each frame.
  • the detrending of each channel includes a procedure based on a smoothness priors approach.
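The spatial averaging and detrending/normalization steps above can be sketched as follows. This is a minimal sketch assuming the RGB frames arrive as a NumPy array; SciPy's linear `detrend` stands in for the smoothness-priors detrending procedure, and the function name is a hypothetical illustration.

```python
# Sketch of forming raw colour signals: spatially average the region of
# interest per colour channel for each frame, then detrend and normalise
# each channel trace to zero mean and unit variance.
import numpy as np
from scipy.signal import detrend

def raw_color_signals(roi_frames):
    """roi_frames: array (n_frames, height, width, 3) of RGB pixel values.
    Returns an (n_frames, 3) array of detrended, normalised channel traces."""
    # Spatial average over all ROI pixels -> one R, G, B value per frame
    traces = roi_frames.mean(axis=(1, 2))            # shape (n_frames, 3)
    traces = detrend(traces, axis=0)                 # remove slow drift
    traces = (traces - traces.mean(axis=0)) / traces.std(axis=0)
    return traces
```

The resulting three traces are the inputs to the independent component analysis described next.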
  • independent component analysis (ICA) 457 is performed to extract cardiac data.
  • the ICA 457 is based on the joint approximate diagonalization of eigenmatrices (JADE) algorithm.
  • the ICA 457 is able to perform motion-artifact removal by separating the fluctuations caused predominantly by the blood volume pulse (BVP) (illustrated as "Separated Source 2") from other fluctuations.
  • the BVP source signal is smoothed using a five-point moving average filter and bandpass filtered.
  • the signal is interpolated with a cubic spline function at a sampling frequency of 256 Hz to refine the BVP peak fiducial point.
  • the BVP peaks in the interpolated signal are detected to obtain the inter-pulse intervals (IPIs) 462.
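The post-processing chain above (five-point moving average, bandpass filtering, cubic-spline interpolation at 256 Hz, and peak detection yielding inter-pulse intervals) might be sketched with SciPy as below. The filter order, the 0.7-4 Hz passband, and the minimum peak spacing are assumed, illustrative choices rather than values from the source.

```python
# Sketch: recover inter-pulse intervals (IPIs) from a BVP source signal.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks
from scipy.interpolate import CubicSpline

def bvp_to_ipis(bvp, fs):
    # Five-point moving average smoothing
    smoothed = np.convolve(bvp, np.ones(5) / 5, mode="same")
    # Bandpass filter; 0.7-4 Hz (~42-240 bpm) is an assumed passband
    b, a = butter(3, [0.7, 4.0], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, smoothed)
    # Cubic-spline interpolation at 256 Hz to refine peak fiducial points
    t = np.arange(len(filtered)) / fs
    t256 = np.arange(0, t[-1], 1.0 / 256)
    interp = CubicSpline(t, filtered)(t256)
    # Detect BVP peaks; require at least 0.25 s between peaks (assumption)
    peaks, _ = find_peaks(interp, distance=256 // 4)
    return np.diff(t256[peaks])            # IPIs in seconds
```

With a typical 15-30 fps webcam signal, the spline step is what allows peak times finer than the frame interval.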
  • FIG. 5 depicts a user in relation to a computing device and a wearable electronic device, in accordance with an embodiment.
  • FIG. 5 depicts the view 500.
  • the view 500 includes a user 502, the wearable electronic device 302, the computing device 304, a video camera 504, a camera field of view 506.
  • the user 502 is wearing the wearable electronic device 302 on the wrist and is interacting with the computing device 304.
  • the wearable electronic device is measuring a first set of biometric parameters, here the user's cardiac activity.
  • the camera 504 associated with the computing device 304 detects a human in the camera's field of view 506.
  • the camera 504 is used to measure the second set of biometric parameters, the user 502's cardiac activity via PPG.
  • the computing device 304 measures biometric parameters via PPG in response to detecting a human face in the camera's field of view 506. Depending on security policies, the user may need to satisfy a user-engagement check before cardiac activity will be measured. In an alternative embodiment, the computing device 304 also outputs a detected level of user engagement. Access to information may then be responsive to both a match via the PPG authentication and satisfaction of the user-engagement check.
  • the user-engagement check may take any number of forms.
  • the user-engagement check may be satisfied only when a single face is present in the camera's field of view 506, when a face is of sufficient size (a close enough distance to the camera 504), when the closest face to the camera 504 is identified, or the like. If no face is present, the authentication method is aborted and the matching module fails to transmit an authentication message or alternatively transmits a failed authentication message.
  • the authentication process may also include a facial recognition check in parallel with the PPG authentication.
  • the matching module 306 receives the first and second sets of biometric parameters and matches the users via an appropriate distance metric.
  • the distance metric compares two sets of biometric parameter data in the form of a time-series (t1, t2, ..., tn), where ti denotes the time of the i-th pulse. The time of the pulse may be the peak time or another suitable point in the pulse.
  • the time-series A and B can be compared on the basis of absolute time.
  • the distance between A and B can be computed according to any number of distance metrics for time-series.
  • Example distance metrics that may be used in embodiments include: a statistical hypothesis-testing framework described in the reference by Rostami, a time- series similarity query as described in the reference of Goldin, a dynamic time warping distance metric, and a Euclidian distance metric, among others.
  • the match may be made on the basis of relative time if the clocks are not synchronized.
  • a distance metric that measures the similarity of inter-pulse intervals (IPIs) would be suitable.
  • a match using this distance metric finds an alignment of A and B that minimizes differences in IPIs between the two time-series.
  • a technique of capturing IPI measurements using a webcam, which may be used in some embodiments, is described in the reference by Poh 2011.
  • Responsive to the time-series distance being less than a threshold distance, the matching module 306 outputs the authentication message.
  • the distance metric is selected to be robust against time-series "stuffing" attacks, in which a malicious user creates a dense time-series to achieve a low apparent distance. For example, a metric that sums the minimum distance between every point in A and the closest point in B would be vulnerable to a "stuffing" attack against B.
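As an illustration of IPI-based matching, the sketch below uses an assumed, deliberately simple metric (mean absolute difference over aligned, equal-length IPI runs) with an illustrative threshold. Because intervals are compared one-to-one rather than each point to its nearest neighbor, padding one series with extra pulses shifts its IPIs and raises the distance, rather than lowering it as in the vulnerable nearest-point metric above.

```python
# Sketch: compare two inter-pulse-interval series under a simple distance
# metric. Metric choice and threshold are illustrative assumptions.
def ipi_distance(a, b):
    """Mean absolute difference over the aligned leading IPIs (seconds)."""
    n = min(len(a), len(b))
    return sum(abs(x - y) for x, y in zip(a[:n], b[:n])) / n

def is_match(a, b, threshold=0.05):
    return ipi_distance(a, b) < threshold

wearable_ipis = [0.84, 0.86, 0.83, 0.88, 0.85]   # seconds, illustrative
camera_ipis   = [0.85, 0.86, 0.82, 0.89, 0.84]
print(is_match(wearable_ipis, camera_ipis))
```

A production metric would also search over alignments (e.g. dynamic time warping) to handle missed or spurious beats; this sketch assumes the two series are already aligned.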
  • the matching module compares biometric data other than cardiac data.
  • the matching may be based on respiration.
  • a wearable electronic device with a respiration sensor such as an ultrasonic respiration rate measurement as described in the reference by Min, measures a user's respiration rate.
  • a computing device with a camera is also able to measure a user's respiration rate via PPG.
  • a matching module is configured to perform a time-series distance measurement on the two sets of respiratory data to determine if they match the same user.
  • the PPG authentication method is used to authenticate a user on a local computer.
  • a computing device includes both a video camera and a matching module.
  • the matching module establishes a communication link with the wearable electronic device to receive biometric data and additional authorization data.
  • the communication link may be a secure communication link.
  • the matching module is local to the computing device, and does not transmit biometric data over a network.
  • the matching module outputs an authentication message to a security module located at the computing device.
  • the security module permits the user associated with the received biometric data to access and use the computing device.
  • the security module may permit access to the computing device along a set security profile that varies access to the portions of the computing device based on the identity associated with the user profile.
  • authentication based on PPG provides for continual authentication.
  • the continual authentication maintains a user authenticated as long as certain parameters are met.
  • Example parameters include a continuous, periodic, or random time interval matches of multiple sets of biometric parameters.
  • the matching module initially authenticates the user via matching two sets of biometric data. The authentication remains valid until the user is no longer in close proximity to the computing device, as determined by the user's face being visible by the camera associated with the computing device, a received signal strength from the wearable electronic device, or the like.
  • authentication based on PPG may be combined with other authentication methods, serving as a primary, secondary, or even a tertiary form of authentication.
  • the authentication based on PPG may be used with facial identity, password protection, biometric identification, PIN verification, RSA tokens, and proximity-based authentication mechanisms, among others.
  • a security level associated with the authentication based on PPG can be increased by sampling longer periods of cardiac data. Approximately four (4) bits of entropy can be extracted per IPI. The longer period of time can be selected to match the minimum-entropy levels of user-selected passwords.
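A back-of-the-envelope illustration of the entropy claim above: at roughly 4 bits per inter-pulse interval, how many heartbeats match a typical user-chosen password? The 28-bit password figure is an assumed illustrative value, not from the source.

```python
# Sketch: how long must cardiac data be sampled to match a password's entropy?
import math

BITS_PER_IPI = 4                 # entropy extractable per inter-pulse interval
password_entropy_bits = 28       # assumed strength of a typical user password

ipis_needed = math.ceil(password_entropy_bits / BITS_PER_IPI)
print(ipis_needed)               # 7: about 7 heartbeats, roughly 7 s at 60 bpm
```

Stronger security targets simply scale the sampling window linearly.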
  • an anonymous credential is used. The stable, individual component of a person's pulse can be used to serve as a biometric identity. A fully anonymous credential system would have the matching module reside on the computing device or be well secured at a remote server.
  • authentication via PPG can be used to grant a person physical access to a space.
  • a wearable electronic device is worn by a user and measures a first set of biometric parameters, such as a time series of heartbeats.
  • the user approaches a camera near a physical barrier, such as a locked door, a turnstile, or the like.
  • the camera measures a second set of biometric parameters, which may also be a time series of heartbeats.
  • a matching module compares the sets of biometric parameters. Responsive to the sets of biometric parameters matching, an authentication message is output. Responsive to receiving the authentication message, physical access to the space is granted to the user, through unlocking a door, or the like.
  • a user with a tablet computer is attempting to access a financial services application that has access to information stored on a remote server. Access to the financial services application, and data available through it, is restricted to users that can successfully pass authentication.
  • the authentication can include successfully passing authentication via PPG.
  • the wearable electronic device can transmit the measured biometric parameter data to the matching module, which may be software running on the tablet computer or a remote server. The transmission may be to the tablet computer via a wireless connection, such as Bluetooth.
  • the user launches the financial services application on the tablet computer.
  • the user is also wearing and identified with a wearable electronic device, such as a fitness bracelet or a smartwatch, that has the ability to measure biometric parameters such as cardiac activity.
  • a user-profile is stored on the wearable electronic device.
  • the wearable electronic device can also transmit the identity associated with the user profile when transmitting biometric parameters.
  • the tablet computer is equipped with a camera and a physiological parameter extraction module that together are configured to measure biometric parameters, such as the cardiac activity of a user via PPG.
  • the biometric parameters are converted to a time-series of data.
  • a matching module on the tablet computer receives the two time-series of data representing biometric parameter data from both the tablet computer and the wearable electronic device. Responsive to the comparison of the time- series data being a successful match, the matching module outputs the authentication verification message to the remote server, and the user is granted access to the financial services application and the data available through it.
  • This example use case preserves the privacy of the user by not transmitting biometric parameter data via the Internet to the remote server.
  • the second example use case can be modified by locating the matching module at the remote server.
  • the matching module then receives the time-series data from the wearable electronic device and the tablet computer to perform the comparison.
  • This method provides greater resilience to malware on the tablet computer affecting the matching module to falsely output a match.
  • the time-series data can be extracted from the raw data of the biometric parameters at the wearable electronic device, at the tablet computer, or transmitted to a trusted remote server for extraction and forwarding to the matching module.
  • an alternative form of authentication may be used, or access may remain restricted. Additional forms of authentication can include passwords, requiring the user to transcribe digits from the wearable electronic device, answer personal questions, respond to an SMS text message, or the like.
  • the user can maintain access to the financial services application until access is terminated. Access may be terminated in response to the biometric parameters failing to match, the camera not detecting the user for a predetermined period of time, or the like.
  • an alternative computing system that authenticates a user based only on the proximity of the user's wearable computing device (e.g., access to the system is permitted only when the user's smartwatch is within Bluetooth range). Such a system would still allow access even when the user is, for example, across the room (or possibly even in the next office).
  • exemplary systems and methods described herein permit access only when the user herself is present at the computing system (e.g. is sitting within the field of view of the computer's webcam).
  • FIG. 6 depicts an example system in accordance with an embodiment.
  • FIG. 6 depicts the system 600 that includes a user U 602, a wearable electronic device W 604, a target computing device T 606, a matching module MM 608, a relying party R 610, and a user session 612.
  • the system of FIG. 6 may be used in conjunction with steps 718-720 of method 700 illustrated in FIG. 7.
  • the user 602 provides authentication information to the wearable electronic device W 604, such as a PIN number.
  • the authentication of the user to the wearable device W remains valid until the device W detects that it has been detached from the user, e.g. as detected by a sensor associated with a clasp closure.
  • the user U 602 also provides a video presence to the target computing device T 606.
  • W 604 provides authentication data to the matching module MM 608 that may include identity information of the user U 602 and a physiological value (PV) series of data A.
  • the target computing device T 606 obtains and provides the PV series B data to the matching module MM 608.
  • the wearable electronic device W 604 authenticates the user U 602 to which it is attached. For example, if the wearable electronic device W 604 is a wrist-worn fitness device, the user U 602 may authenticate to the wearable electronic device W 604 when placing it on her body via a biometric or by entering a PIN. This authentication remains valid until the wearable electronic device W 604 is detached, as sensed by a clasp. Alternatively, the wearable electronic device W 604 may explicitly authenticate the user when prompted by MM 608 using, e.g., a pulse-based biometric.
  • the wearable electronic device W 604 provides user authenticating information to the matching module MM 608 and asserts user identity U 602 to the matching module MM 608.
  • the wearable electronic device W 604 stores credentials for the user U 602 and can authenticate on U 602's behalf to the matching module MM 608.
  • the matching module MM 608 resides on the target computing device T 606 and a secure channel is established between the wearable electronic device W 604 and the target computing device T 606. Then the wearable electronic device W 604 may transmit a user-specific password to the matching module MM 608 as a means of authenticating the user U 602, thereby asserting that the user U 602 is wearing the wearable electronic device W 604.
  • the wearable electronic device W 604 measures a time series A of cardiac data.
  • the wearable electronic device W 604 uses a built-in sensor to record, e.g., the user U 602's heartbeats.
  • the wearable electronic device W 604 can then construct and transmit a time series of heartbeat data.
  • the wearable electronic device W 604 transmits the physiological value (PV) series A to the matching module MM 608.
  • the wearable electronic device W 604 may do so in batches or on a streaming basis. It should be understood that steps 702-708 may be performed in orders other than those described above.
  • the target computing device T 606 identifies a face in the field of view of its video camera. Depending on the precise security goal of the system, different engagement policies may be implemented.
  • An engagement policy specifies the conditions under which a visible face is deemed valid and, in the case that multiple faces are visible, which face should be identified as that of the user.
  • Exemplary policies are: "Identify the closest face" or "Identify each face larger than a given size (within a certain distance of the camera)." In the latter case, the system may attempt to authenticate each visible user in turn. The fact that face-recognition technologies are rapidly approaching human levels of accuracy offers the possibility of support for a rich range of engagement policies. If no valid face is present, the target computing device T 606 may abort the protocol and fail to authenticate the user U 602.
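The "identify the closest face" engagement policy above might be sketched as follows, using face bounding-box area as a proxy for proximity. The box format (x, y, width, height) and the minimum-size threshold are assumptions for illustration, not from the source.

```python
# Sketch of an engagement policy: pick the largest detected face box as the
# closest face, ignoring faces below a minimum size; return None (abort the
# protocol) when no valid face is present.
def select_face(boxes, min_area=40 * 40):
    """boxes: list of (x, y, width, height) face detections."""
    valid = [b for b in boxes if b[2] * b[3] >= min_area]
    if not valid:
        return None                      # no valid face: abort authentication
    return max(valid, key=lambda b: b[2] * b[3])
```

The "each face larger than a given size" policy would instead return the whole `valid` list and attempt authentication for each entry in turn.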
  • the target computing device T 606 extracts a time series B of photoplethysmographic data from the subject.
  • the target computing device T 606 extracts cardiac data, e.g., heartbeat times, from the identified face via the video camera.
  • the target computing device T 606 transmits the PV series B data to the matching module MM 608.
  • the PV series B data may be transmitted either in batches or in a streaming manner. If the matching module MM 608 resides with the relying party R 610, then the target computing device T 606 transmits the series B data over the channel by which the user has established a session with the relying party R 610. For an internet session, this channel might, for instance, be HTTPS.
  • the matching module MM 608 compares series A data with series B data.
  • the matching module MM 608 determines whether the data from series A and B are sufficiently close, under an appropriate distance metric, to constitute a match and thus authenticate the user U 602. Any of a number of distance metrics may be appropriate for the matching operation.
  • the matching module MM 608 outputs to the relying party R 610 the result of the matching operation. If a match is obtained, then the matching module MM 608 "accepts" the user U 602 at step 720, meaning that it validates the presence of the user U 602 in the field of view of the video camera in the target computing device T 606 under the engagement policy for the system. Otherwise, at step 720, a rejection of the authentication occurs.
  • FIG. 8 depicts a second method, in accordance with an embodiment.
  • FIG. 8 depicts the method 800 that includes authenticating a wearer of a wearable device at 802, operating a wearable device to collect a first set of cardiac data at 804, capturing video data of the user of a target computing device at 806, extracting a second set of cardiac data from the captured video data at 808, and authenticating the user of the target device by comparing the first and second sets of cardiac data at 810.
  • a wearer of a wearable device is authenticated.
  • the authentication comprises authenticating the wearer to the wearable device and authenticating the wearable device to the target computing device.
  • the wearable device is a smart watch or a wrist-worn fitness device.
  • capturing video data of the user of the target computing device comprises capturing video data using a camera on the target computing device, in some embodiments.
  • the authentication may further include performing facial recognition on the video data.
  • access to a user device may be terminated in response to a failure to authenticate the user of the target computing device.
  • the first and second sets of cardiac data comprise heartbeat times or inter-pulse interval times.
  • comparing the first and second set of cardiac data comprises determining a distance metric between the first and second sets of cardiac data.
  • the authentication may occur only in response to the distance metric being less than a threshold.
  • the systems described herein may be implemented with modules that carry out (i.e., perform, execute, and the like) the various functions that are described herein in connection with the respective modules.
  • a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation.
  • Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and it is noted that those instructions could take the form of or include hardware (i.e., hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as those commonly referred to as RAM, ROM, etc.
  • Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks, and digital versatile disks (DVDs).
  • a processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.


Abstract

The present disclosure describes systems and methods for authenticating via photoplethysmography. In accordance with an embodiment a method includes authenticating a wearer of a wearable device; operating the wearable device to collect a first set of cardiac data from the wearer; capturing video data of a user of a target computing device; extracting a second set of cardiac data from the captured video data; and authenticating the user of the target computing device by a method that includes comparing the first set of cardiac data to the second set of cardiac data.

Description

AUTHENTICATION VIA PHOTOPLETHYSMOGRAPHY
CROSS-REFERENCE TO RELATED APPLICATION
[0002] [0001] The present application is a non-provisional filing of, and claims benefit under 35 U.S.C. §119(e) from, U.S. Provisional Patent Application Serial No. 62/288,399, entitled "Authentication via Photoplethysmography", filed January 28, 2016, the entirety of which is incorporated herein by reference.
BACKGROUND
[0002] Sensitive information is increasingly being saved electronically and is accessible through various different computing devices, such as computers, tablets, and smart phones. Access to sensitive information is controlled via authentication methods, such as username and passwords, fingerprints, facial recognition, two-part authentication, and the like.
[0003] Many of the authentication methods are cumbersome for users attempting to access the information, are error prone, or have vulnerabilities that may allow unauthorized access to the information.
SUMMARY
[0004] Described herein are systems and methods for authentication via photoplethysmography. In accordance with an embodiment a method includes authenticating a wearer of a wearable device; operating the wearable device to collect a first set of cardiac data from the wearer; capturing video data of a user of a target computing device; extracting a second set of cardiac data from the captured video data; and authenticating the user of the target computing device by a method that includes comparing the first set of cardiac data to the second set of cardiac data.
[0005] Another embodiment takes the form of a method of authentication, the method including: measuring, via a wearable electronic device, a first set of biometric parameter data; measuring, via a camera associated with a computing device, a second set of biometric parameter data; comparing the first and second sets of biometric parameter data; and, responsive to the first and second sets of biometric parameter data being associated with the same user, outputting an authentication message.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] A more detailed understanding may be had from the following description, presented by way of example in conjunction with the accompanying drawings, wherein:
[0007] FIG. 1A depicts an example communications system in which one or more disclosed embodiments may be implemented.
[0008] FIG. 1B depicts an example electronic device that may be used within the communications system of FIG. 1A.
[0009] FIG. 1C depicts an example network entity 190 that may be used within the communication system 100 of FIG. 1A.
[0010] FIG. 2 depicts a first method, in accordance with an embodiment.
[0011] FIG. 3 depicts a block diagram of a system, in accordance with an embodiment.
[0012] FIG. 4A depicts an example method to recover cardiac data, in accordance with an embodiment.
[0013] FIGs. 4B-C depict example steps for the recovery of cardiac data from video data that may be used in some embodiments.
[0014] FIG. 5 depicts a user in relation to a computing device and a wearable electronic device, in accordance with an embodiment.
[0015] FIG. 6 is a functional block diagram illustrating components of an authentication system as described herein.
[0016] FIG. 7 is a flow diagram illustrating a method performed by the exemplary system of FIG. 6.
[0017] FIG. 8 depicts a second method, in accordance with an embodiment.
DETAILED DESCRIPTION
[0018] A detailed description of illustrative embodiments will now be provided with reference to the various figures. Although this description provides detailed examples of possible implementations, it should be noted that the provided details are intended to be by way of example and in no way limit the scope of the application. The systems and methods relating to authentication via photoplethysmography may be used with the wired and wireless communication systems described with respect to FIGS. 1A-1C. As an initial matter, these wired and wireless systems will be described.
[0019] FIG. 1A is a diagram of an example communications system 100 in which one or more disclosed embodiments may be implemented. The communications system 100 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, and the like, to multiple wireless users. The communications system 100 may enable multiple wired and wireless users to access such content through the sharing of system resources, including wired and wireless bandwidth. For example, the communications systems 100 may employ one or more channel-access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), and the like. The communications systems 100 may also employ one or more wired communications standards (e.g., Ethernet, DSL, radio frequency (RF) over coaxial cable, fiber optics, and the like).
[0020] As shown in FIG. 1A, the communications system 100 may include electronic devices 102a, 102b, 102c, and/or 102d, Radio Access Networks (RAN) 103/104/105, a core network 106/107/109, a public switched telephone network (PSTN) 108, the Internet 110, and other networks 112, and communication links 115/116/117, and 119, though it will be appreciated that the disclosed embodiments contemplate any number of electronic devices, base stations, networks, and/or network elements. Each of the electronic devices 102a, 102b, 102c, 102d may be any type of device configured to operate and/or communicate in a wired or wireless environment. By way of example, the electronic device 102a is depicted as a tablet computer, the electronic device 102b is depicted as a smart phone, the electronic device 102c is depicted as a computer, and the electronic device 102d is depicted as a television, although certainly other types of devices could be utilized.
[0021] The communications systems 100 may also include a base station 114a and a base station 114b. Each of the base stations 114a, 114b may be any type of device configured to wirelessly interface with at least one of the WTRUs 102a, 102b, 102c, 102d to facilitate access to one or more communication networks, such as the core network 106/107/109, the Internet 110, and/or the networks 112. By way of example, the base stations 114a, 114b may be a base transceiver station (BTS), a Node-B, an eNode B, a Home Node B, a Home eNode B, a site controller, an access point (AP), a wireless router, and the like. While the base stations 114a, 114b are each depicted as a single element, it will be appreciated that the base stations 114a, 114b may include any number of interconnected base stations and/or network elements.
[0022] The base station 114a may be part of the RAN 103/104/105, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, and the like. The base station 114a and/or the base station 114b may be configured to transmit and/or receive wireless signals within a particular geographic region, which may be referred to as a cell (not shown). The cell may further be divided into sectors. For example, the cell associated with the base station 114a may be divided into three sectors. Thus, in one embodiment, the base station 114a may include three transceivers, i.e., one for each sector of the cell. In another embodiment, the base station 114a may employ multiple-input multiple-output (MIMO) technology and, therefore, may utilize multiple transceivers for each sector of the cell.
[0023] The base stations 114a, 114b may communicate with one or more of the electronic devices 102a, 102b, 102c, and 102d over an air interface 115/116/117, or communication link 119, which may be any suitable wired or wireless communication link (e.g., radio frequency (RF), microwave, infrared (IR), ultraviolet (UV), visible light, and the like). The air interface 115/116/117 may be established using any suitable radio access technology (RAT).
[0024] More specifically, as noted above, the communications system 100 may be a multiple access system and may employ one or more channel-access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like. For example, the base station 114a in the RAN 103/104/105 and the electronic devices 102a, 102b, 102c may implement a radio technology such as Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (UTRA), which may establish the air interface 115/116/117 using wideband CDMA (WCDMA). WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+). HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA).
[0025] In another embodiment, the base station 114a and the electronic devices 102a, 102b, 102c may implement a radio technology such as Evolved UMTS Terrestrial Radio Access (E-UTRA), which may establish the air interface 115/116/117 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A).
[0026] In other embodiments, the base station 114a and the electronic devices 102a, 102b, 102c may implement radio technologies such as IEEE 802.16 (i.e., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 IX, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and the like.
[0027] The base station 114b in FIG. 1A may be a wired router, a wireless router, Home Node B, Home eNode B, or access point, as examples, and may utilize any suitable wired transmission standard or RAT for facilitating wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, and the like. In one embodiment, the base station 114b and the electronic devices 102c, 102d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN). In another embodiment, the base station 114b and the electronic devices 102c, 102d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN). In yet another embodiment, the base station 114b and the electronic devices 102c, 102d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, and the like) to establish a picocell or femtocell. In yet another embodiment, the base station 114b communicates with electronic devices 102a, 102b, 102c, and 102d through communication links 119. As shown in FIG. 1A, the base station 114b may have a direct connection to the Internet 110. Thus, the base station 114b may not be required to access the Internet 110 via the core network 106/107/109.
[0028] The RAN 103/104/105 may be in communication with the core network 106/107/109, which may be any type of network configured to provide voice, data, applications, and/or voice over internet protocol (VoIP) services to one or more of the electronic devices 102a, 102b, 102c, 102d. As examples, the core network 106/107/109 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, and the like, and/or perform high-level security functions, such as user authentication. Although not shown in FIG. 1A, it will be appreciated that the RAN 103/104/105 and/or the core network 106/107/109 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 103/104/105 or a different RAT. For example, in addition to being connected to the RAN 103/104/105, which may be utilizing an E-UTRA radio technology, the core network 106/107/109 may also be in communication with another RAN (not shown) employing a GSM radio technology.
[0029] The core network 106/107/109 may also serve as a gateway for the electronic devices 102a, 102b, 102c, 102d to access the PSTN 108, the Internet 110, and/or other networks 112. The PSTN 108 may include circuit-switched telephone networks that provide plain old telephone service (POTS). The Internet 110 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP) and IP in the TCP/IP Internet protocol suite. The networks 112 may include wired and/or wireless communications networks owned and/or operated by other service providers. For example, the networks 112 may include another core network connected to one or more RANs, which may employ the same RAT as the RAN 103/104/105 or a different RAT.
[0030] Some or all of the electronic devices 102a, 102b, 102c, 102d in the communications system 100 may include multi-mode capabilities, i.e., the electronic devices 102a, 102b, 102c, 102d may include multiple transceivers for communicating with different wired or wireless networks over different communication links. For example, the electronic device 102c shown in FIG. 1A may be configured to communicate with the base station 114a, which may employ a cellular-based radio technology, and with the base station 114b, which may employ an IEEE 802 radio technology.
[0031] FIG. 1B depicts an example electronic device that may be used within the communications system of FIG. 1A. In particular, FIG. 1B is a system diagram of an example electronic device 102. As shown in FIG. 1B, the electronic device 102 may include a processor 118, a transceiver 120, a transmit/receive element 122, a speaker/microphone 124, a keypad 126, a display/touchpad 128, a non-removable memory 130, a removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and other peripherals 138. It will be appreciated that the electronic device 102 may represent any of the electronic devices 102a, 102b, 102c, and 102d, and include any sub-combination of the foregoing elements while remaining consistent with an embodiment. Also, embodiments contemplate that the base stations 114a and 114b, and/or the nodes that base stations 114a and 114b may represent, such as but not limited to a base transceiver station (BTS), a Node-B, a site controller, an access point (AP), a home node-B, an evolved home node-B (eNodeB), a home evolved node-B (HeNB), a home evolved node-B gateway, and proxy nodes, among others, may include some or all of the elements depicted in FIG. 1B and described herein.
[0032] The processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the electronic device 102 to operate in a wired or wireless environment. The processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122. While FIG. 1B depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.
[0033] The transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 114a) over the air interface 115/116/117 or communication link 119. For example, in one embodiment, the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals. In another embodiment, the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR,
UV, or visible light signals, as examples. In yet another embodiment, the transmit/receive element
122 may be configured to transmit and receive both RF and light signals. In yet another embodiment, the transmit/receive element may be a wired communication port, such as an
Ethernet port. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wired or wireless signals.
[0034] In addition, although the transmit/receive element 122 is depicted in FIG. 1B as a single element, the electronic device 102 may include any number of transmit/receive elements 122. More specifically, the electronic device 102 may employ MIMO technology. Thus, in one embodiment, the electronic device 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 115/116/117.
[0035] The transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122. As noted above, the electronic device 102 may have multi-mode capabilities. Thus, the transceiver 120 may include multiple transceivers for enabling the electronic device 102 to communicate via multiple RATs, such as UTRA and IEEE 802.11, as examples.
[0036] The processor 118 of the electronic device 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit). The processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128. In addition, the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132. The non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 118 may access information from, and store data in, memory that is not physically located on the electronic device 102, such as on a server or a home computer (not shown).
[0037] The processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the electronic device 102. The power source 134 may be any suitable device for powering the electronic device 102. As examples, the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), and the like), solar cells, fuel cells, a wall outlet and the like.
[0038] The processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the electronic device 102. In addition to, or in lieu of, the information from the GPS chipset 136, the electronic device 102 may receive location information over the air interface 115/116/117 from a base station (e.g., base stations 114a, 114b) and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the electronic device 102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment. In accordance with an embodiment, the electronic device 102 does not comprise a GPS chipset and does not acquire location information.
[0039] The processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, a thermometer, a barometer, an altimeter, an air sampler, a light detector, an accelerometer, a compass, a humidity detector, and the like. The various peripherals may be configured to detect surrounding events in order to capture video and audio streams and associated contextual information.
[0040] FIG. 1C depicts an example network entity 190 that may be used within the communication system 100 of FIG. 1A. As depicted in FIG. 1C, network entity 190 includes a communication interface 192, a processor 194, and non-transitory data storage 196, all of which are communicatively linked by a bus, network, or other communication path 198.
[0041] Communication interface 192 may include one or more wired communication interfaces and/or one or more wireless-communication interfaces. With respect to wired communication, communication interface 192 may include one or more interfaces such as Ethernet interfaces, as an example. With respect to wireless communication, communication interface 192 may include components such as one or more antennae, one or more transceivers/chipsets designed and configured for one or more types of wireless (e.g., LTE) communication, and/or any other components deemed suitable by those of skill in the relevant art. And further with respect to wireless communication, communication interface 192 may be equipped at a scale and with a configuration appropriate for acting on the network side— as opposed to the client side— of wireless communications (e.g., LTE communications, Wi-Fi communications, and the like). Thus, communication interface 192 may include the appropriate equipment and circuitry (perhaps including multiple transceivers) for serving multiple mobile stations, UEs, or other access terminals in a coverage area.
[0042] Processor 194 may include one or more processors of any type deemed suitable by those of skill in the relevant art, some examples including a general-purpose microprocessor and a dedicated DSP.
[0043] Data storage 196 may take the form of any non-transitory computer-readable medium or combination of such media, some examples including flash memory, read-only memory (ROM), and random-access memory (RAM) to name but a few, as any one or more types of non- transitory data storage deemed suitable by those of skill in the relevant art could be used. As depicted in FIG. 1C, data storage 196 contains program instructions 197 executable by processor 194 for carrying out various combinations of the various network-entity functions described herein.
[0044] In some embodiments, the network-entity functions described herein are carried out by a network entity having a structure similar to that of network entity 190 of FIG. 1C. In some embodiments, one or more of such functions are carried out by a set of multiple network entities in combination, where each network entity has a structure similar to that of network entity 190 of FIG. 1C. In various different embodiments, network entity 190 is— or at least includes— one or more of the encoders, one or more of (one or more entities in) RAN 103, (one or more entities in) RAN 104, (one or more entities in) RAN 105, (one or more entities in) core network 106, (one or more entities in) core network 107, (one or more entities in) core network 109, base station 114a, base station 114b, Node-B 140a, Node-B 140b, Node-B 140c, RNC 142a, RNC 142b, MGW 144, MSC 146, SGSN 148, GGSN 150, eNode-B 160a, eNode-B 160b, eNode-B 160c, MME 162, serving gateway 164, PDN gateway 166, base station 180a, base station 180b, base station 180c, ASN gateway 182, MIP-HA 184, AAA 186, and gateway 188. And certainly other network entities and/or combinations of network entities could be used in various embodiments for carrying out the network-entity functions described herein, as the foregoing list is provided by way of example and not by way of limitation.
[0045] FIG. 2 depicts a first method, in accordance with an embodiment. In particular, FIG. 2 depicts the method 200. The method 200 includes measuring a first set of biometric parameter data via a wearable electronic device at step 202, measuring a second set of biometric parameter data via a video camera associated with a computing device at step 204, comparing the first and second sets of biometric parameter data at step 206, and responsive to a match between the first and second sets of biometric parameter data, outputting an authentication message at step 208.
[0046] In step 202, the first set of biometric parameter data is measured by a wearable electronic device, such as a wrist-worn fitness bracelet, a smart watch, or a heart-rate monitor. The measured biometric parameters include data relating to the cardiac activity of the person wearing the electronic device.
[0047] In an exemplary embodiment, the wearable electronic device uses a built-in sensor to record the user's heartbeats and constructs a time series of the heartbeat data. The time series of heartbeat data may include, for example, a series of values representing the peak time of each pulse. The time values may be absolute or relative time values. For example, each time value may represent the actual time of day at which the pulse was detected, they may represent the elapsed time since the start of measurement, or they may represent the elapsed time since the previous pulse. Other representations of the time series data may also be used. In some embodiments, the time series of heartbeat data may include a time series of measured values that correlate with a user's pulse, sampled at a rate substantially higher than a user's heartrate. For example, the time series may represent a value of skin resistance, a value of red light intensity, or a value representing strain on a strain gauge, as a function of time. These values may be filtered values (e.g. with components substantially outside the frequency range of ordinary heartrate signals being filtered out).
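The alternative time representations described above (absolute time of day, elapsed time since the start of measurement, or elapsed time since the previous pulse) are simple transformations of one another. The sketch below is only illustrative; the helper names are ours, not taken from the disclosure:

```python
def to_elapsed(peak_times):
    """Re-express absolute pulse-peak times as time elapsed since measurement start."""
    start = peak_times[0]
    return [t - start for t in peak_times]

def to_intervals(peak_times):
    """Re-express pulse-peak times as elapsed time since the previous pulse
    (i.e., the inter-pulse intervals)."""
    return [b - a for a, b in zip(peak_times, peak_times[1:])]

# Pulse peaks recorded at absolute times (seconds since some epoch)
peaks = [100.0, 100.8, 101.7, 102.5]
```

Here `to_elapsed(peaks)` gives approximately [0.0, 0.8, 1.7, 2.5] and `to_intervals(peaks)` gives approximately [0.8, 0.9, 0.8], up to floating-point rounding.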
[0048] In some embodiments, the user is associated with the wearable electronic device. A user may be associated with the wearable device in many different ways. In one embodiment, the user links a user profile to the wearable electronic device. The user profile may be linked to the wearable electronic device by the user logging into the wearable electronic device either directly, or via a computer communicatively coupled to the wearable electronic device. In another embodiment, the user is associated with the wearable device by placing the wearable electronic device on the user's body. When placing the wearable electronic device on the user's body, the user may enter a password or a PIN (personal identification number), scan a fingerprint, or the like. The association may be continual for as long as the user is wearing the electronic device, and may be terminated when removal is sensed, e.g., by the clasp becoming undone or by an interruption in the sensed biometric parameters. In another embodiment, the association between the user and the electronic device occurs when the photoplethysmography authentication occurs.
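The association lifecycle described above (established with a password, PIN, or fingerprint when the device is put on, and broken when the clasp opens or the biometric signal is interrupted) can be sketched as a small state holder. The class and method names below are illustrative assumptions, not part of the disclosure:

```python
class WearerAssociation:
    """Tracks whether an authenticated user remains associated with a worn device."""

    def __init__(self):
        self.user = None

    def put_on(self, user, credential_ok):
        """Associate the user only if a password, PIN, or fingerprint check passed."""
        if credential_ok:
            self.user = user

    def on_clasp_opened(self):
        """Removal of the device breaks the association."""
        self.user = None

    def on_signal_interrupted(self):
        """An interruption in the sensed biometric parameters also breaks it."""
        self.user = None

    def is_associated(self):
        return self.user is not None
```

In this sketch, the association persists continually while the device stays on the body, mirroring the "continual" association described in the paragraph above.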
[0049] At step 204, the second set of biometric parameters is measured with an imaging video camera associated with a computing device and a physiological parameter extraction module. The computing device may be a laptop computer with a built-in camera, a desktop computer with a wired or wirelessly connected camera, a smartphone or tablet, or the like. The camera may be an RGB imaging camera. The second set of biometric parameters is measured via non-contact photoplethysmography (PPG), the optical sensing of cardiac data. The camera associated with the computing device is configured to operate as a video camera to detect cardiac activity of a user in the field of view of the camera. An exemplary technique for the detection of cardiac data using a video camera and a physiological parameter extraction module is described in, for example, FIGs. 4A-4B and their accompanying description. Other techniques that may be used for the detection of cardiac data include those described in the references of Goldin, Jonathan, and Poh.
[0050] A user's cardiac activity includes inter-pulse intervals that are known to vary randomly over time, due to the non-linearly interacting processes in the parasympathetic nervous system that governs heart rhythms, as discussed in the references by Brownley and Nagel. The cardiac data may be extracted from the video by the computing device, or the computing device may transmit the video data to a remote computer for extraction of the cardiac data.
[0051] At step 206, the first and second sets of biometric parameter data are compared. The comparison between the first and second sets of biometric parameters determines, under an appropriate distance metric, whether the first and second sets of biometric parameters are associated with the same user. This comparison determines whether the user visible in the computing device's camera is the same as the user of the wearable electronic device.
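One concrete way to realize the comparison of step 206 is to reduce each set of cardiac data to inter-pulse intervals and apply a simple distance metric with an acceptance threshold. This is only an illustrative sketch: the disclosure does not fix a particular metric, and the mean-absolute-difference metric and threshold value below are our assumptions:

```python
def inter_pulse_intervals(beat_times):
    """Convert heartbeat peak times (seconds) to inter-pulse intervals."""
    return [b - a for a, b in zip(beat_times, beat_times[1:])]

def ipi_distance(times_a, times_b):
    """Mean absolute difference between two aligned inter-pulse-interval series."""
    ipi_a = inter_pulse_intervals(times_a)
    ipi_b = inter_pulse_intervals(times_b)
    n = min(len(ipi_a), len(ipi_b))
    return sum(abs(x - y) for x, y in zip(ipi_a[:n], ipi_b[:n])) / n

def same_user(wearable_times, video_times, threshold=0.05):
    """Declare a match only when the distance metric falls below the threshold."""
    return ipi_distance(wearable_times, video_times) < threshold
```

Two recordings of the same heart differ only by small sensor offsets, so their interval series stay close and the comparison succeeds, while a different rhythm produces a large distance and fails, even though neither recording shares an absolute time base with the other.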
[0052] At step 208, a positive authentication message is output in response to a determination that the first and second sets of biometric parameter data are associated with the same user. The positive authentication message may be used by a variety of different services. For example, a service may provide local access to information on the computing device or may permit the user access to information stored remotely in response to a positive authentication message.
[0053] In some embodiments, the matching module will output a negative authentication message in response to the two sets of biometric parameter data not matching. Responsive to receiving the negative authentication message, a security module associated with the computing device may prohibit access to the user, or an alternate form of authentication may be triggered.
[0054] FIG. 3 depicts a block diagram of a system, in accordance with an embodiment. In particular, FIG. 3 depicts the system 300. The system 300 includes a wearable electronic device 302 that includes a biometric sensor 312, a computing device 304 that includes a video camera 314, a matching module 306, and communication links 308-310. The components of system 300 may be used to perform the method 200. Additionally, both the wearable electronic device 302 and the computing device 304 may be configured to operate and communicate with other devices similar to the electronic device 102 discussed in FIGs. 1A and 1B.
[0055] The wearable electronic device 302 may be any of a number of wearable electronic devices that is capable of measuring biometric parameters. To perform step 202 of the method 200, the wearable electronic device 302 measures the biometric data associated with the user. One example of measuring biometric parameters is measuring cardiac data associated with the user wearing the wearable electronic device 302. For example, the wearable electronic device 302 may include the biometric sensor 312 that is located adjacent to the user's skin and is configured to detect cardiac activity.
[0056] The computing device 304 may be used to perform step 204 of the method 200. The computing device 304 may be any number of computing devices, including, but not limited to a desktop computer, a laptop computer, a tablet computer, a smart phone, an in-car computing system, and the like. The computing device 304 also includes the video camera 314 that is associated with the computing device. The video camera 314 may be a part of the computing device 304, for example a built-in webcam in a laptop, or the video camera 314 may be communicatively coupled to the computing device 304. For example, the video camera 314 may be a component of a camera conferencing system connected via a USB cable or wirelessly to the computing device 304.
[0057] The computing device 304 is configured to detect a user's cardiac activity via the associated camera via PPG. The cardiac activity is detected by detecting the effects of a cardiovascular pulse wave on the user. The wave changes the volume of a person's blood vessels and in turn affects the absorption of light by the person's skin.
[0058] The matching module 306 may be used to perform steps 206 and 208 of the method 200. The matching module 306 receives the first set of biometric data from the wearable electronic device 302 via the communication link 308 and receives the second set of biometric data from the computing device 304 via the communication link 310. The matching module 306 may be located as part of the computing device 304, as part of a remote server, or at another similar location as known by those with skill in the art. The matching module 306 may receive raw biometric data, or biometric data that has been further processed, via the communication links 308-310.
[0059] In one embodiment, the first set of biometric data received by the matching module
306 via the communication link 308 includes times of detected heartbeats associated with the user wearing the wearable electronic device 302. The second set of biometric data received by the matching module 306 via the communication link 310 includes times of detected heartbeats associated with the user within the camera's field of view. The conversion of raw data of detected biometric parameters into data used by the matching module may occur locally at the devices 302-304 or at a separate entity. In performing step 208, the matching module outputs an authentication message indicating whether or not the biometric data in the first set is a sufficient match to the biometric data in the second set. The authentication message may be received by a security module associated with another computer that is responsible for executing security protocols for access to information stored on a computer. Additional biometric parameter data that may be compared by the matching module includes respiratory data. In an alternative embodiment, electrooculography (EOG) may be used by a wearable electronic device (e.g. smart glasses or other headset) to measure electrical potentials caused by eye movements, and a separate computing device may use a video camera to measure eye movements, such that sufficient correlation between the EOG signals and eye movements detected in the video image leads to authentication of the user.
[0060] In some embodiments, the matching module compares two different types of biometric parameters. In such an embodiment, the wearable electronic device 302 detects and measures a user's cardiac activity and the computing device 304 detects and measures a user's respiratory activity. The user's cardiac activity measured by the wearable electronic device 302 may be converted to an estimate of the user's respiratory activity based on detection of the respiratory sinus arrhythmia (RSA) (according to which heartrate increases during inhalation and decreases during exhalation). The matching module compares the estimated respiratory activity (based on heartrate) with the measured respiratory activity to determine whether there is a sufficient match to authenticate the user.
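By way of non-limiting illustration, the RSA-based conversion of cardiac activity into an estimated respiratory rate may be sketched as follows. The sketch assumes heartbeat times are available as timestamps in seconds and takes the respiratory rate to be the dominant heart-rate oscillation within a typical 0.1-0.5 Hz breathing band; the function and parameter names are illustrative rather than part of this disclosure.

```python
import numpy as np

def estimate_respiration_rate(beat_times, fs=4.0, band=(0.1, 0.5)):
    """Estimate respiratory rate (Hz) from heartbeat times via RSA:
    heart rate rises during inhalation and falls during exhalation, so
    the dominant oscillation of instantaneous heart rate within the
    typical breathing band approximates the respiration frequency."""
    beat_times = np.asarray(beat_times, dtype=float)
    ipis = np.diff(beat_times)          # inter-pulse intervals (seconds)
    hr = 1.0 / ipis                     # instantaneous heart rate (Hz)
    mid = beat_times[1:]                # time at the end of each IPI
    # Resample heart rate onto a uniform grid for spectral analysis
    t = np.arange(mid[0], mid[-1], 1.0 / fs)
    hr_u = np.interp(t, mid, hr)
    hr_u -= hr_u.mean()
    # Locate the dominant frequency inside the breathing band
    spec = np.abs(np.fft.rfft(hr_u))
    freqs = np.fft.rfftfreq(len(hr_u), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[mask][np.argmax(spec[mask])]
```

The estimated rate can then be compared by the matching module against the respiratory activity measured directly by the computing device's camera.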
[0061] The communication links 308-310 may also be used to transmit additional authentication information to the matching module. The additional authentication information may include information regarding the user authenticating with the wearable electronic device 302 or the computing device 304. The matching module 306 may also match the additional authentication information. In another embodiment, the additional authentication information includes an identity of the user associated with the wearable electronic device 302. The biometric data may be transmitted in batches, on a streaming basis, or in a similar manner as known by those with skill in the art. The electronic devices can measure and transmit biometric parameter data in any order, but a time-based overlap of the two biometric parameters is used to perform the match.
[0062] In some embodiments, the wearable electronic device 302 does not continuously stream biometric parameter data. The biometric parameter data may be transmitted from the wearable electronic device 302 in response to receiving a trigger from the computing device 304, the trigger indicating that the user is attempting to authenticate via PPG.
[0063] The communication links 308-310, and any other communication links between devices may be secure channels providing confidentiality and integrity between devices. Different channel-authentication requirements may be used between different devices.
[0064] In one embodiment, the matching module 306 receives from a wearable electronic device 302 both verified credentials associated with the user and a first set of biometric data. The matching module 306 receives from the computing device 304 a second set of biometric data. The matching module 306 compares the first and second sets of biometric data, and responsive to the data matching outputs both an authentication message and the identity associated with the user of the wearable electronic device.
[0065] FIG. 4A depicts an example method to recover cardiac data, in accordance with an embodiment. FIG. 4A depicts the example method 400. The example method 400 includes a video camera detecting a region of interest at step 402, decomposing the region of interest into separate color channels at step 404, detrending and normalizing the raw color channel data at step 406, and applying independent component analysis at step 408. The exemplary independent component analysis generates three separate source signals (indicated by reference number 458 in FIG. 4B), one of which ("Separated Source 2") is determined to be a cardiac signal, specifically a blood volume pulse (BVP) signal. Smoothing and filtering of the BVP signal waveform is performed at step 410, and peaks are extracted from the BVP waveform to form inter-pulse intervals (IPIs) at step 412. The example method 400 may be accomplished by a video camera that is able to capture video data that is communicatively coupled with a physiological parameter extraction module.
[0066] FIGs. 4B and 4C depict example steps of a method to recover cardiac data from video data in accordance with an embodiment. FIGs. 4B and 4C are adapted from Poh, Ming-Zher, Daniel J. McDuff, and Rosalind W. Picard, "Advancements in noncontact, multiparameter physiological measurements using a webcam," Biomedical Engineering, IEEE Transactions on 58.1 (2011): 7-11, the entirety of which is incorporated herein by reference. Other techniques for capturing cardiac data such as IPI times using video imaging may alternatively be used.
[0067] In particular, FIG. 4B depicts the set of steps 450, which illustrates the steps of the example method 400. At step 402, the video camera detects a region of interest 452. The region of interest 452 is detected by software that is configured to identify the coordinates of a face's location in a video frame. The video is captured in color, such as 24-bit RGB with three channels and 8 bits/channel, at 15 frames per second with a pixel resolution of 640 x 480.
[0068] At step 404, the region of interest 452 is decomposed into separate color channels. The separate color channels include the red channel 454a, the green channel 454b, and the blue channel 454c.
[0069] At step 406, the data from each raw color channel is detrended and normalized to form raw color signals 456a-c for each color. Detrending and normalizing each channel includes spatially averaging over all pixels in the region of interest to yield a red, blue, and green measurement point for each frame. The detrending of each channel includes a procedure based on a smoothness priors approach. A normalized raw trace y_i(t) is produced per equation (1) for each color channel:

y_i(t) = (y_i'(t) - μ_i) / σ_i     (1)

[0070] where y_i'(t) is the spatial average of the raw signal for channel i, μ_i is its mean, and σ_i is its standard deviation.
[0071] At step 408, independent component analysis (ICA) 457 is performed to extract cardiac data. The ICA 457 is based on the joint approximate diagonalization of eigenmatrices (JADE) algorithm. The ICA 457 is able to perform motion-artifact removal by separating the fluctuations caused predominantly by the Blood Volume Pulse (BVP) (illustrated as "Separated Source 2") from other fluctuations.
[0072] At step 410, the BVP source signal is smoothed using a five-point moving average filter and bandpass filtered.
[0073] At step 412, the signal is interpolated with a cubic spline function at a sampling frequency of 256 Hz to refine the BVP peak fiducial point. The BVP peaks in the interpolated signal are detected to obtain the inter-pulse intervals (IPIs) 462.
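Steps 406-412 above may be sketched in code as follows. This is a non-limiting sketch under stated assumptions: FastICA stands in for the JADE algorithm, a linear detrend stands in for the smoothness-priors detrend, linear interpolation stands in for the cubic spline, and a simple spectral heuristic stands in for the determination that "Separated Source 2" is the BVP signal. The input is assumed to be an array of spatially averaged R, G, B values per frame; all names are illustrative.

```python
import numpy as np
from scipy.signal import butter, detrend, filtfilt, find_peaks
from sklearn.decomposition import FastICA

def recover_ipis(rgb_traces, fps=15.0):
    """Recover inter-pulse intervals from spatially averaged RGB traces
    of a face region of interest, per steps 406-412 (sketch)."""
    x = np.asarray(rgb_traces, dtype=float)
    # Step 406: detrend and z-score normalize each color channel
    x = detrend(x, axis=0)
    x = (x - x.mean(axis=0)) / x.std(axis=0)
    # Step 408: ICA to separate source signals (FastICA here stands in
    # for the JADE algorithm used in the Poh reference)
    sources = FastICA(n_components=3, random_state=0).fit_transform(x)
    # Heuristic: pick the source with the strongest spectral peak in a
    # plausible cardiac band (0.75-4 Hz, i.e. 45-240 bpm)
    freqs = np.fft.rfftfreq(len(sources), d=1.0 / fps)
    band = (freqs >= 0.75) & (freqs <= 4.0)
    power = [np.abs(np.fft.rfft(s))[band].max() for s in sources.T]
    bvp = sources[:, int(np.argmax(power))]
    # Step 410: five-point moving average, then bandpass filter
    bvp = np.convolve(bvp, np.ones(5) / 5, mode="same")
    b, a = butter(3, [0.75 / (fps / 2), 4.0 / (fps / 2)], btype="band")
    bvp = filtfilt(b, a, bvp)
    # Step 412: interpolate to 256 Hz, detect BVP peaks, and difference
    # the peak times to obtain the inter-pulse intervals
    t = np.arange(len(bvp)) / fps
    t_hi = np.arange(t[0], t[-1], 1.0 / 256.0)
    bvp_hi = np.interp(t_hi, t, bvp)   # linear stand-in for cubic spline
    peaks, _ = find_peaks(bvp_hi, distance=int(256 * 0.33))
    return np.diff(t_hi[peaks])        # IPIs in seconds
```

Applied to synthetic traces containing a 1.2 Hz pulse mixed into all three channels, the recovered median IPI approximates the true 0.83 s period.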
[0074] FIG. 5 depicts a user in relation to a computing device and a wearable electronic device, in accordance with an embodiment. In particular, FIG. 5 depicts the view 500. The view 500 includes a user 502, the wearable electronic device 302, the computing device 304, a video camera 504, and a camera field of view 506.
[0075] The user 502 is wearing the wearable electronic device 302 on the wrist and is interacting with the computing device 304. The wearable electronic device is measuring a first set of biometric parameters, here the user's cardiac activity. The camera 504 associated with the computing device 304 detects a human in the camera's field of view 506. The camera 504 is used to measure the second set of biometric parameters, the user 502's cardiac activity via PPG.
[0076] In some embodiments, the computing device 304 measures biometric parameters via PPG in response to detecting a human face in the camera's field of view 506. Depending on security policies, the user may need to satisfy a user-engagement check before cardiac activity will be measured. In an alternative embodiment, the computing device 304 also outputs a detected level of user engagement. Access to information may then be responsive to both a match via the PPG authentication and satisfaction of the user-engagement check.
[0077] The user-engagement check may take any number of forms. The user-engagement check may be satisfied only when a single face is present in the camera's field of view 506, when a face is of sufficient size (a close enough distance to the camera 504), when the closest face to the camera 504 is identified, or the like. If no face is present, the authentication method is aborted and the matching module fails to transmit an authentication message or alternatively transmits a failed authentication message. The authentication process may also include a facial recognition check in parallel with the PPG authentication.
[0078] In accordance with an embodiment, the matching module 306 receives the first and second sets of biometric parameters and matches the users via an appropriate distance metric. In one embodiment, the distance metric compares two sets of biometric parameter data in the form of a time-series (t1, t2, ..., tn), where ti denotes the time of the ith pulse. The time of the pulse may be the peak time, or another suitable time of the pulse. The first set of biometric parameters sensed by the wearable electronic device 302 is denoted as the time-series A = a1, a2, ..., an and the second set of biometric parameters sensed by the computing device 304 is denoted as the time-series B = b1, b2, ..., bn.
[0079] Assuming synchronization between clocks associated with the wearable electronic device 302 and the computing device 304, the time-series A and B can be compared on the basis of absolute time. The distance between A and B can be computed according to any number of distance metrics for time-series. Example distance metrics that may be used in embodiments include: a statistical hypothesis-testing framework described in the reference by Rostami, a time-series similarity query as described in the reference of Goldin, a dynamic time warping distance metric, and a Euclidean distance metric, among others.
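By way of non-limiting illustration, an absolute-time comparison under a simple distance metric may be sketched as follows. The mean absolute offset used here is an illustrative stand-in for the richer metrics cited above (hypothesis testing, dynamic time warping, and the like), and the 50 ms threshold is an assumed value rather than one specified by this disclosure.

```python
import numpy as np

def beats_match(a, b, threshold=0.05):
    """Compare two absolute-time heartbeat series (clocks assumed
    synchronized). Truncate to the common length and use the mean
    absolute offset in seconds as a simple distance metric."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n = min(len(a), len(b))
    dist = float(np.mean(np.abs(a[:n] - b[:n])))
    return dist <= threshold, dist
```

A genuine pair of series, differing only by sensor noise, falls under the threshold, while a series from a different heart drifts away as a random walk and is rejected.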
[0080] Alternatively, the match may be made on the basis of relative time if the clocks are not synchronized. A distance metric that measures the similarity of inter-pulse intervals (IPIs) would be suitable. A match using this distance metric finds an alignment of A and B that minimizes differences in IPIs between the two time-series. A technique of capturing IPI measurements using a webcam, which may be used in some embodiments, is described in the reference by Poh 2011.
[0081] Responsive to the time-series distance being less than a threshold distance, the matching module 306 outputs the authentication message.
[0082] In one embodiment, the distance metric is selected to be robust against time-series "stuffing" attacks, in which a malicious user creates a dense time-series to achieve a low apparent distance. For example, a metric that sums the minimum distance between every point in A and the closest point in B would be vulnerable to a "stuffing" attack against B.
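The "stuffing" vulnerability described above can be demonstrated concretely. In the following non-limiting sketch, asym_dist is the vulnerable metric (the sum over each point of the distance to its nearest neighbor in the other series), and sym_dist is one assumed countermeasure that scores in both directions so that a stuffed series inflates its own term; the names and toy data are illustrative only.

```python
def asym_dist(a, b):
    """Sum over each point in A of the distance to the nearest point in
    B. Vulnerable: a dense (stuffed) B makes this near zero for ANY A."""
    return sum(min(abs(x - y) for y in b) for x in a)

def sym_dist(a, b):
    """Symmetric variant: the B-to-A term penalizes stuffing."""
    return asym_dist(a, b) + asym_dist(b, a)

legit = [0.8 * k for k in range(1, 11)]        # genuine beat times (s)
stuffed = [0.01 * k for k in range(1, 1001)]   # attacker's dense series
```

Against the stuffed series, asym_dist is essentially zero while sym_dist is large, so a threshold on the symmetric metric rejects the attack.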
[0083] In an example embodiment, the matching module compares biometric data other than cardiac data. For example, the matching may be based on respiration. In an example of matching based on respiration, a wearable electronic device with a respiration sensor, such as an ultrasonic respiration rate measurement as described in the reference by Min, measures a user's respiration rate. A computing device with a camera is also able to measure a user's respiration rate via PPG. A matching module is configured to perform a time-series distance measurement on the two sets of respiratory data to determine if they match the same user.
[0084] In an example embodiment, the PPG authentication method is used to authenticate a user on a local computer. In this embodiment, a computing device includes both a video camera and a matching module. The matching module establishes a communication link with the wearable electronic device to receive biometric data and additional authorization data. The communication link may be a secure communication link. In this embodiment, the matching module is local to the computing device, and does not transmit biometric data over a network. In response to biometric parameters from the wearable electronic device matching the biometric parameters detected by the video camera, the matching module outputs an authentication message to a security module located at the computing device. In response to receiving the authentication message, the security module permits the user associated with the received biometric data to access and use the computing device. When coupled with a user profile associated with the wearable electronic device, the security module may permit access to the computing device according to a set security profile that varies access to portions of the computing device based on the identity associated with the user profile.
[0085] In one embodiment, authentication based on PPG provides for continual authentication. The continual authentication keeps a user authenticated as long as certain parameters are met. Example parameters include continuous, periodic, or random-interval matches of multiple sets of biometric parameters. In another embodiment, the matching module initially authenticates the user via matching two sets of biometric data. The authentication remains valid until the user is no longer in close proximity to the computing device, as determined by the user's face being visible to the camera associated with the computing device, a received signal strength from the wearable electronic device, or the like.
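A minimal sketch of the continual-authentication loop described above follows. It assumes a match_fn callable that gathers fresh, time-overlapping samples from both devices and returns whether they still match; the interval bounds and function names are illustrative, and the sleep parameter exists only so the loop can be driven in tests.

```python
import random
import time

def continual_auth(match_fn, interval=(20.0, 40.0), sleep=time.sleep):
    """Re-run the biometric match at random intervals and revoke the
    session on the first failed match. match_fn is assumed to perform
    one fresh match of the two biometric parameter sets."""
    while match_fn():                      # initial and repeated matches
        sleep(random.uniform(*interval))   # wait a random interval
    return "session revoked"
```

Randomizing the interval makes it harder for an attacker to time an impersonation attempt between checks.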
[0086] In one embodiment, authentication based on PPG may be combined with other authentication methods, serving as a primary, secondary, or even a tertiary form of authentication. The authentication based on PPG may be used with facial identity, password protection, biometric identification, PIN number verification, RSA tokens, and proximity-based authentication mechanisms, among others.
[0087] A security level associated with the authentication based on PPG can be increased by sampling longer periods of cardiac data. Approximately four (4) bits of entropy can be extracted per IPI. The length of the sampling period can be selected to match the minimum-entropy levels of user-selected passwords. [0088] In one embodiment, an anonymous credential is used. The stable, individual component of a person's pulse can be used to serve as a biometric identity. A fully anonymous credential system would include the matching module residing on the computing device, or being well secured at a remote server.
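The entropy arithmetic of paragraph [0087] can be illustrated as follows, taking the approximately four bits per IPI stated above as given; the 28-bit password strength in the usage line is an assumed example figure, not a value from this disclosure.

```python
import math

def ipis_needed(password_bits, bits_per_ipi=4.0):
    """Number of IPIs to sample so that the cardiac entropy matches a
    password's minimum entropy, at ~4 bits of entropy per IPI."""
    return math.ceil(password_bits / bits_per_ipi)

def seconds_needed(password_bits, mean_hr_bpm=70.0, bits_per_ipi=4.0):
    """Approximate sampling time: one IPI elapses per heartbeat."""
    return ipis_needed(password_bits, bits_per_ipi) * 60.0 / mean_hr_bpm
```

For example, matching a password with roughly 28 bits of minimum entropy would require ceil(28/4) = 7 IPIs, about 6 seconds of cardiac data at 70 beats per minute.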
[0089] In a first exemplary use case, authentication via PPG can be used to grant a person physical access to a space. In this embodiment, a wearable electronic device is worn by a user and measures a first set of biometric parameters, such as a time series of heartbeats. The user approaches a camera near a physical barrier, such as a locked door, a turnstile, or the like. The camera measures a second set of biometric parameters, which may also be a time series of heartbeats. A matching module compares the sets of biometric parameters. Responsive to the sets of biometric parameters matching, an authentication message is output. Responsive to receiving the authentication message, physical access to the space is granted to the user, through unlocking a door, or the like.
[0090] In a second example use case, a user with a tablet computer is attempting to access a financial services application that has access to information stored on a remote server. Access to the financial services application, and the data available through it, is restricted to users that successfully pass authentication. The authentication can include successfully passing authentication via PPG. The wearable electronic device can transmit the measured biometric parameter data to the matching module, which may be software running on the tablet computer or a remote server. The transmission may be to the tablet computer via a wireless connection, such as Bluetooth.
[0091] In this example use case, the user launches the financial services application on the tablet computer. The user is also wearing and identified with a wearable electronic device, such as a fitness bracelet or a smartwatch, that has the ability to measure biometric parameters such as cardiac activity. To identify the user with the wearable electronic device, a user profile is stored on the wearable electronic device. The wearable electronic device can also transmit the identity associated with the user profile when transmitting biometric parameters. The tablet computer is equipped with a camera and a physiological parameter extraction module that together are configured to measure biometric parameters, such as the cardiac activity of a user via PPG. The biometric parameters are converted to a time-series of data. A matching module on the tablet computer receives the two time-series of data representing biometric parameter data from both the tablet computer and the wearable electronic device. Responsive to the comparison of the time-series data being a successful match, the matching module outputs the authentication verification message to the remote server, and the user is granted access to the financial services application and the data available through it.
[0092] This example use case preserves the privacy of the user by not transmitting biometric parameter data via the Internet to the remote server.
[0093] The second example use case can be modified by locating the matching module at the remote server. The matching module then receives the time-series data from the wearable electronic device and the tablet computer to perform the comparison. This method provides greater resilience against malware on the tablet computer causing the matching module to falsely output a match. The time-series data can be extracted from the raw data of the biometric parameters at the wearable electronic device or at the tablet computer, or the raw data can be transmitted to a trusted remote server for extraction and forwarding to the matching module.
[0094] If the user fails to authenticate via PPG, an alternative form of authentication may be used, or access may remain restricted. Additional forms of authentication can include passwords, requiring the user to transcribe digits from the wearable electronic device, answering personal questions, responding to an SMS text message, or the like.
[0095] The user can maintain access to the financial services application until access is terminated. Access may be terminated in response to the biometric parameters failing to match, the camera not detecting the user for a predetermined period of time, or the like. By way of comparison, consider an alternative computing system that authenticates a user based only on the proximity of the user's wearable computing device (e.g., access to the system is permitted only when the user's smartwatch is within Bluetooth range). Such a system would still allow access even when the user is, for example, across the room (or possibly even in the next office). In contrast, exemplary systems and methods described herein permit access only when the user herself is present at the computing system (e.g. is sitting within the field of view of the computer's webcam). The systems and methods disclosed herein cannot be defeated by a still image or recorded video of the user's face (which may otherwise defeat a facial recognition system) because a still image or recorded video would not provide the proper time series of physiological data. Systems disclosed herein thus help to prevent others from taking advantage of a user who has stepped away from the terminal but neglected to log out of all sensitive accounts.
[0096] FIG. 6 depicts an example system in accordance with an embodiment. In particular, FIG. 6 depicts the system 600 that includes a user U 602, a wearable electronic device W 604, a target computing device T 606, a matching module MM 608, a relying party R 610, and a user session 612. The system of FIG. 6 may be used in conjunction with steps 718-720 of method 700 illustrated in FIG. 7. [0097] In the system 600, the user 602 provides authentication information to the wearable electronic device W 604, such as a PIN number. In some embodiments, the authentication of the user to the wearable device W remains valid until the device W detects that it has been detached from the user, e.g. as detected by a sensor associated with a clasp closure. The user U 602 also provides a video presence to the target computing device T 606. The wearable electronic device
W 604 provides authentication data to the matching module MM 608 that may include identity information of the user U 602 and a physiological value (PV) series of data A. The target computing device T 606 obtains and provides the PV series B data to the matching module MM
608.
[0098] At step 702 of FIG. 7, the wearable electronic device W 604 authenticates the user U 602 to which it is attached. For example, if the wearable electronic device W 604 is a wrist-worn fitness device, the user U 602 may authenticate to the wearable electronic device W 604 when placing it on her body via a biometric or by entering a PIN. This authentication remains valid until the wearable electronic device W 604 is detached, as sensed by a clasp. Alternatively, the wearable electronic device W 604 may explicitly authenticate the user when prompted by the matching module MM 608 using, e.g., a pulse-based biometric.
[0099] At step 704, the wearable electronic device W 604 provides user authenticating information to the matching module MM 608 and asserts the user identity U 602 to the matching module MM 608. The wearable electronic device W 604 stores credentials for the user U 602 and can authenticate on behalf of the user U 602 to the matching module MM 608. As a simple example, suppose that the matching module MM 608 resides on the target computing device T 606 and a secure channel is established between the wearable electronic device W 604 and the target computing device T 606. Then the wearable electronic device W 604 may transmit a user-specific password to the matching module MM 608 as a means of authenticating the user U 602, thereby asserting that the user U 602 is wearing the wearable electronic device W 604.
[0100] At step 706, the wearable electronic device W 604 measures a time series A of cardiac data. The wearable electronic device W 604 uses a built-in sensor to record, e.g., the heartbeats of the user U 602. The wearable electronic device W 604 can then construct and transmit a time series of heartbeat data.
[0101] At step 708, the wearable electronic device W 604 transmits the physiological value (PV) series A to the matching module MM 608. The wearable electronic device W 604 may do so in batches or on a streaming basis. It should be understood that steps 702-708 may be performed in orders other than those described above. [0102] At step 710, the target computing device T 606 identifies a face in the field of view of its video camera. Depending on the precise security goal of the system, different engagement policies may be implemented. An engagement policy specifies the conditions under which a visible face is deemed valid and, in the case that multiple faces are visible, which face should be identified as that of the user. Exemplary policies are: "Identify the closest face" or "Identify each face larger than a given size (within a certain distance of the camera)." In the latter case, the system may attempt to authenticate each visible user in turn. The fact that face-recognition technologies are rapidly approaching human levels of accuracy offers the possibility of support for a rich range of engagement policies. If no valid face is present, the target computing device T 606 may abort the protocol and fail to authenticate the user U 602.
[0103] At step 712, the target computing device T 606 extracts a time series B of photoplethysmographic data from the subject. The target computing device T 606 extracts cardiac data, e.g., heartbeat times, from the identified face via the video camera.
[0104] At step 714, the target computing device T 606 transmits the PV series B data to the matching module MM 608. As for PV series A, this may be done either in batches or in a streaming manner. If the matching module MM 608 resides with the relying party R 610, then the target computing device T 606 transmits series B data over the channel by which the user has established a session with the relying party R 610. For an Internet session, this channel might, for instance, be HTTPS.
[0105] At step 716, the matching module MM 608 compares series A data with series B data. The matching module MM 608 determines whether the data from series A and B are sufficiently close, under an appropriate distance metric, to constitute a match and thus authenticate the user U 602. Any of a number of distance metrics may be appropriate for the matching operation.
[0106] At step 718, the matching module MM 608 outputs to the relying party R 610 the result of the matching operation. If a match is obtained, then the matching module MM 608 "accepts" the user U 602 at step 720, meaning that it validates the presence of the user U 602 in the field of view of the video camera in the target computing device T 606 under the engagement policy for the system. Otherwise, at step 720, a rejection of the authentication occurs.
[0107] FIG. 8 depicts a second method, in accordance with an embodiment. In particular, FIG. 8 depicts the method 800 that includes authenticating a wearer of a wearable device at 802, operating a wearable device to collect a first set of cardiac data at 804, capturing video data of the user of a target computing device at 806, extracting a second set of cardiac data from the captured video data at 808, and authenticating the user of the target device by comparing the first and second sets of cardiac data at 810. [0108] At 802, a wearer of a wearable device is authenticated. In some embodiments, the authentication comprises authenticating the wearer to the wearable device and authenticating the wearable device to the target computing device.
[0109] In some embodiments, the wearable device is a smart watch or a wrist-worn fitness device.
[0110] At 806, capturing video data of the user of the target computing device comprises capturing video data using a camera on the target computing device, in some embodiments.
[0111] At 810, the authentication may further include performing facial recognition on the video data.
[0112] In some embodiments, access to a user device may be terminated in response to a failure to authenticate the user of the target computing device.
[0113] In some embodiments, the first and second sets of cardiac data comprise heartbeat times or inter-pulse interval times.
[0114] In some embodiments, comparing the first and second set of cardiac data comprises determining a distance metric between the first and second sets of cardiac data. The authentication may occur only in response to the distance metric being less than a threshold.
[0115] Note that various hardware elements of one or more of the described embodiments are referred to as "modules" that carry out (i.e., perform, execute, and the like) various functions that are described herein in connection with the respective modules. As used herein, a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field-programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation. Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module. Those instructions could take the form of or include hardware (i.e., hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as RAM, ROM, and the like.
[0116] Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable media include electronic signals (transmitted over wired or wireless connections) and computer-readable storage media. Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.
REFERENCES
[0117] Brownley, Hurwitz, and Schneiderman. Cardiovascular psychophysiology. Handbook of psychophysiology, 2:224-264, 2000.
[0118] Bao, Poon, Zhang, and Shen. Using the timing information of heartbeats as an entity identifier to secure body sensor network. IEEE Trans. on Info. Tech. in Biomedicine, 12(6):772-779, 2008.
[0119] Cho and Lee. Biometric based secure communications without pre-deployed key for biosensor implanted in body sensor networks. In Information Security Applications, pages 203-218, 2012.
[0120] Francis, Hancke, Mayes, and Markantonakis. Practical NFC peer-to-peer relay attack using mobile phones. In Radio Frequency Identification: Security and Privacy Issues (pp. 35-49). Springer Berlin Heidelberg, 2010.
[0121] Goldin and Kanellakis. On similarity queries for time-series data: constraint specification and implementation. In Principles and Practice of Constraint Programming— CP'95 (pp. 137-153). Springer Berlin Heidelberg, January 1995.
[0122] Grimaldi, Kurylyak, Lamonaca, and Nastro. Photoplethysmography detection by smartphone's videocamera. In Intelligent Data Acquisition and Advanced Computing Systems (IDAACS), 2011 IEEE 6th International Conference on (Vol. 1, pp. 488-491). IEEE, September 2011.
[0123] Hancke. A practical relay attack on ISO 14443 proximity cards. Technical report, University of Cambridge Computer Laboratory, 59, 382-385, 2005.
[0124] Jonathan and Leahy. Investigating a smartphone imaging unit for photoplethysmography. Physiological measurement, 31(11), N79, 2010.
[0125] Hu, Cheng, Zhang, Wu, Liao, and Chen. OPFKA: Secure and efficient ordered-physiological-feature-based key agreement for wireless body area networks. In IEEE INFOCOM, 2013.

[0126] Min, Kim, Shin, Yun, Lee, and Lee. Noncontact respiration rate measurement system using an ultrasonic proximity sensor. IEEE Sensors Journal, 10(11):1732-1739, 2010.
[0127] Nagel, Han, Hurwitz, and Schneiderman. Assessment and diagnostic applications of heart rate variability. Biomedical engineering-applications, basis & communications, 5:147-158, 1993.
[0128] Poh, McDuff, and Picard. Advancements in noncontact, multiparameter physiological measurements using a webcam. IEEE Transactions on Biomedical Engineering, 58(1):7-11, January 2011.
[0129] Poh, McDuff, and Picard. Non-contact, automated cardiac pulse measurements using video imaging and blind source separation. Optics Express, 18: 10762-10774, 2010.
[0130] Poon, Zhang, and Bao. A novel biometrics method to secure wireless body area sensor networks for telemedicine and m-health. IEEE Communications Magazine, 44(4):73-81, 2006.
[0131] Rostami, Juels, and Koushanfar. Heart-to-heart (H2H): authentication for implanted medical devices. In Proceedings of the 2013 ACM SIGSAC Conference on Computer & Communications Security. ACM, 2013.
[0132] Taigman, Yang, Ranzato, and Wolf. Deepface: Closing the gap to human-level performance in face verification. In Computer Vision and Pattern Recognition (CVPR), 2014 IEEE Conference on (pp. 1701-1708). IEEE, June 2014.
[0133] Venkatasubramanian, Banerjee, and Gupta. Plethysmogram-based secure inter-sensor communication in body area networks. In Military Communications Conference (MILCOM 2008). IEEE, 2008.
[0134] Venkatasubramanian and Gupta. Physiological value-based efficient usable security solutions for body sensor networks. ACM Trans. Sensor Networks, 6(4):31:1-31:36, July 2010.

Claims

We claim:
1. A method comprising:
authenticating a wearer of a wearable device;
operating the wearable device to collect a first set of cardiac data from the wearer;
capturing video data of a user of a target computing device;
extracting a second set of cardiac data from the captured video data; and
authenticating the user of the target computing device by a method that includes comparing the first set of cardiac data to the second set of cardiac data.
2. The method of claim 1, wherein authenticating the wearer of the wearable device comprises:
authenticating the wearer to the wearable device; and
authenticating the wearable device to the target computing device.
3. The method of any of claims 1-2, wherein the wearable device is selected from the group consisting of a smart watch and a wrist-worn fitness device.
4. The method of any of claims 1-3, wherein authenticating the user further comprises performing facial recognition on the video data.
5. The method of any of claims 1-4, the method further comprising terminating access of the user to the target computing device in response to a failure to authenticate the user of the target computing device.
6. The method of any of claims 1-5, wherein capturing video data comprises capturing video data using a camera on the target computing device.
7. The method of any of claims 1-6, wherein the first and second set of cardiac data comprise heartbeat times.
8. The method of any of claims 1-7, wherein the first and second set of cardiac data comprise inter-pulse interval times.
9. The method of any of claims 1-8, wherein comparing the first and second set of cardiac data comprises determining a distance metric between the first and second sets of cardiac data.
10. The method of claim 9, wherein the user of the target computing device is authenticated only if the distance metric is less than a threshold.
11. A method of performing user authentication comprising:
receiving a first plurality of cardiac activity measurements derived from sensor data from a wearable device worn by a user;
receiving a second plurality of cardiac activity measurements derived from video data from a video camera;
comparing the first and second pluralities of cardiac activity measurements; and
responsive to a determination that the compared pluralities of cardiac measurements are derived from a common user, outputting an authentication message.
12. The method of claim 11, wherein at least one of the first and second plurality of cardiac activity measurements comprises interpulse interval data.
13. The method of any of claims 11-12, wherein the second plurality of cardiac activity measurements is derived from the video data using photoplethysmography.
14. The method of any of claims 11-13, wherein the first and second pluralities of cardiac activity measurements are continually compared.
15. The method of any of claims 11-14, further comprising, responsive to a determination that the first and second pluralities of cardiac data are not associated with the common user, outputting an authentication-failure message.
16. The method of any of claims 11-15, wherein outputting an authentication message comprises outputting the authentication message to the computing device.
17. The method of any of claims 11-16, the method further comprising detecting a face in the video data, wherein the comparison is performed responsive to detecting the face.
18. The method of any of claims 11-17, wherein at least one of the first and second pluralities of cardiac activity data comprises time series data of heartbeat times.
19. The method of any of claims 11-18, wherein at least one of the first and second pluralities of cardiac activity data comprises time series data of interpulse interval times.
20. A computing device comprising a processor and a non-transitory storage medium storing instructions operative, when performed on the processor, to perform a set of functions, the set of functions including:
receiving a first plurality of cardiac activity measurements derived from sensor data from a wearable device worn by a user;
receiving a second plurality of cardiac activity measurements derived from video data from the camera on the computing device;
comparing the first and second pluralities of cardiac activity measurements; and
responsive to a determination that the compared pluralities of cardiac measurements are derived from a common user, outputting an authentication message.
PCT/US2017/014312 2016-01-28 2017-01-20 Authentication via photoplethysmography WO2017132061A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662288399P 2016-01-28 2016-01-28
US62/288,399 2016-01-28

Publications (1)

Publication Number Publication Date
WO2017132061A1 true WO2017132061A1 (en) 2017-08-03

Family

ID=57956412

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/014312 WO2017132061A1 (en) 2016-01-28 2017-01-20 Authentication via photoplethysmography

Country Status (1)

Country Link
WO (1) WO2017132061A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150135310A1 (en) * 2013-10-04 2015-05-14 Salutron, Inc. Persistent authentication using sensors of a user-wearable device
US20150161371A1 (en) * 2013-03-18 2015-06-11 Kabushiki Kaisha Toshiba Electronic device and authentication control method
US20150378433A1 (en) * 2014-06-27 2015-12-31 Amazon Technologies, Inc. Detecting a primary user of a device


Non-Patent Citations (19)

* Cited by examiner, † Cited by third party
Title
BAO; POON; ZHANG; SHEN: "Using the timing information of heartbeats as an entity identifier to secure body sensor network", IEEE TRANS. ON INFO. TECH. IN BIOMEDICINE, vol. 12, no. 6, 2008, pages 772 - 779, XP011345509, DOI: doi:10.1109/TITB.2008.926434
BROWNLEY; HURWITZ; SCHNEIDERMAN: "Handbook of psychophysiology", vol. 2, 2000, article "Cardiovascular psychophysiology.", pages: 224 - 264
CHO; LEE: "Biometric based secure communications without pre-deployed key for biosensor implanted in body sensor networks", INFORMATION SECURITY APPLICATIONS, 2012, pages 203 - 218, XP019173666
FRANCIS; HANCKE; MAYES; MARKANTONAKIS: "Radio Frequency Identification: Security and Privacy Issues", 2010, SPRINGER, article "Practical NFC peer-to-peer relay attack using mobile phones", pages: 35 - 49
GOLDIN; KANELLAKIS: "Principles and Practice of Constraint Programming— CP'95", January 1995, SPRINGER, article "On similarity queries for time-series data: constraint specification and implementation", pages: 137 - 153
GRIMALDI; KURYLYAK; LAMONACA; NASTRO: "Intelligent Data Acquisition and Advanced Computing Systems (IDAACS), 2011 IEEE 6th International Conference", vol. 1, September 2011, IEEE, article "Photoplethysmography detection by smartphone's videocamera", pages: 488 - 491
HANCKE.: "A practical relay attack on ISO 14443 proximity cards", TECHNICAL REPORT, UNIVERSITY OF CAMBRIDGE COMPUTER LABORATORY, vol. 59, 2005, pages 382 - 385
HU; CHENG; ZHANGAND; WUAND; LIAO; CHEN: "OPFKA: Secure and efficient ordered-physiological-feature-based key agreement for wireless body area networks", IEEE INFOCOM, 2013
JONATHAN; LEAHY: "Investigating a smartphone imaging unit for photoplethysmography", PHYSIOLOGICAL MEASUREMENT, vol. 31, no. 11, 2010, pages N79, XP020200084, DOI: doi:10.1088/0967-3334/31/11/N01
MIN; KIM; SHIN; YUN; LEE; LEE: "Noncontact respiration rate measurement system using an ultrasonic proximity sensor. Sensors Journal", IEEE, vol. 10, no. 11, 2010, pages 1732 - 1739
NAGEL; HAN; HURWITZ; SCHNEIDERMAN: "Assessment and diagnostic applications of heart rate variability", BIOMEDICAL ENGINEERING-APPLICATIONS, BASIS & COMMUNICATIONS, vol. 5, 1993, pages 147 - 158
POH, MING-ZHER; DANIEL J. MCDUFF; ROSALIND W. PICARD: "Advancements in noncontact, multiparameter physiological measurements using a webcam", BIOMEDICAL ENGINEERING, IEEE TRANSACTIONS, vol. 58.1, 2011, pages 7 - 11, XP011372860, DOI: doi:10.1109/TBME.2010.2086456
POH; MCDUFF; PICARD: "Non-contact, automated cardiac pulse measurements using video imaging and blind source separation", OPTICS EXPRESS, vol. 18, 2010, pages 10762 - 10774, XP002686060, DOI: doi:10.1364/OE.18.010762
POH; MCDUFF; RICARD: "Advancements in Noncontact, Multiparameter Physiological Measurements Using a Webcam", IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, vol. 58, no. 1, January 2011 (2011-01-01), XP011372860, DOI: doi:10.1109/TBME.2010.2086456
POON; ZHANG; BAO: "A novel biometrics method to secure wireless body area sensor networks for telemedicine and m-health", IEEE COMMUNICATIONS MAGAZINE, vol. 44, no. 4, 2006, pages 73 - 81, XP055013440, DOI: doi:10.1109/MCOM.2006.1632652
ROSTAMI; MASOUD; JUELS; KOUSHANFAR: "Proceedings of the 2013 ACM SIGSAC conference on Computer & communications security", 2013, ACM, article "Heart-to-heart (H2H): authentication for implanted medical devices"
TAIGMAN; YANG; RANZATO; WOLF: "Computer Vision and Pattern Recognition (CVPR), 2014 IEEE Conference on", June 2014, IEEE, article "Deepface: Closing the gap to human-level performance in face verification", pages: 1701 - 1708
VENKATASUBRAMANIAN; GUPTA: "Physiological value-based efficient usable security solutions for body sensor networks", ACM TRANS. SENSOR NETWORKS, vol. 6, no. 4, July 2010 (2010-07-01), pages 31.1 - 31.36, XP058184573, DOI: doi:10.1145/1777406.1777410
VENKATASUBRAMANIAN; KRISHNA; BANERJEE; GUPTA: "Military Communications Conference, 2008. MILCOM 2008", 2008, IEEE, article "Plethysmogram-based secure inter-sensor communication in body area networks"

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019032706A1 (en) * 2017-08-10 2019-02-14 Riaan Conradie User verification by comparing physiological sensor data with physiological data derived from facial video
US11488250B2 (en) 2017-08-10 2022-11-01 Lifeq Global Limited User verification by comparing physiological sensor data with physiological data derived from facial video
CN107669252A (en) * 2017-11-09 2018-02-09 京东方科技集团股份有限公司 A kind of wearable device and its control method
US11593469B2 (en) 2020-03-30 2023-02-28 Tata Consultancy Services Limited Continuously validating a user during an established authenticated session using photoplethysmogram and accelerometer data

Similar Documents

Publication Publication Date Title
Yaacoub et al. Securing internet of medical things systems: Limitations, issues and recommendations
US11580203B2 (en) Method and apparatus for authenticating a user of a computing device
US9871779B2 (en) Continuous authentication confidence module
Wu et al. Access control schemes for implantable medical devices: A survey
CN111758096A (en) Live user authentication apparatus, system and method
US10652237B2 (en) Continuous authentication system and method based on BioAura
CN114846527A (en) User state monitoring system and method using motion, and user access authorization system and method employing the same
WO2017132061A1 (en) Authentication via photoplethysmography
Habib et al. A novel authentication framework based on biometric and radio fingerprinting for the IoT in eHealth
Ojala et al. Wearable authentication device for transparent login in nomadic applications environment
TW201926101A (en) Login method and apparatus, and electronic device
TW201931382A (en) User verification by comparing physiological sensor data with physiological data derived from facial video
KR20180061819A (en) Multiple biometric identity authentication apparatus and authentication system, multiple biometric identity authentication method therewith
Liu et al. Cardiocam: Leveraging camera on mobile devices to verify users while their heart is pumping
US20220229895A1 (en) Live user authentication device, system and method and fraud or collusion prevention using same
Wang et al. Continuous user authentication by contactless wireless sensing
Shah et al. Smart user identification using cardiopulmonary activity
CN115606218A (en) Monitoring system and method related to user activity, and user access authorization system and method adopting same
US20170078281A1 (en) User Login Method and System Capable of Analyzing User Face Validity
Liu et al. Leveraging breathing for continuous user authentication
Li et al. Video is all you need: Attacking PPG-based biometric authentication
US11544360B2 (en) Masking biometric markers by sensor path control
EP3448078B1 (en) Electronic device, system and method for data communication
WO2018009692A1 (en) Methods and systems for augmenting security of biometric user authentication
Blasco et al. Wearables security and privacy

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17702735

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17702735

Country of ref document: EP

Kind code of ref document: A1