WO2016183541A1 - Augmented reality systems and methods for tracking biometric data - Google Patents

Augmented reality systems and methods for tracking biometric data

Info

Publication number
WO2016183541A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
biometric data
eye
data
transaction
Prior art date
Application number
PCT/US2016/032583
Other languages
English (en)
French (fr)
Inventor
Gary R. Bradski
Original Assignee
Magic Leap, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magic Leap, Inc. filed Critical Magic Leap, Inc.
Priority to EP16793671.5A priority Critical patent/EP3295347A4/en
Priority to CA2984455A priority patent/CA2984455C/en
Priority to JP2017558979A priority patent/JP6863902B2/ja
Priority to CN201680027161.7A priority patent/CN107533600A/zh
Priority to AU2016262579A priority patent/AU2016262579B2/en
Priority to KR1020177035995A priority patent/KR102393271B1/ko
Priority to NZ736861A priority patent/NZ736861B2/en
Publication of WO2016183541A1 publication Critical patent/WO2016183541A1/en
Priority to IL255325A priority patent/IL255325B/en

Classifications

    • G: PHYSICS
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06F: ELECTRIC DIGITAL DATA PROCESSING
                • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
                • G06F 21/30: Authentication, i.e. establishing the identity or authorisation of security principals
                • G06F 21/31: User authentication
                • G06F 21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
            • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
                • G06Q 20/00: Payment architectures, schemes or protocols
                • G06Q 20/38: Payment protocols; Details thereof
                • G06Q 20/40: Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
                • G06Q 20/401: Transaction verification
                • G06Q 20/4014: Identity check for transactions
                • G06Q 20/40145: Biometric identity checks
            • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 19/00: Manipulating 3D models or images for computer graphics
                • G06T 19/006: Mixed reality
            • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
                • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
                • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
                • G06V 40/18: Eye characteristics, e.g. of the iris
    • H: ELECTRICITY
        • H04: ELECTRIC COMMUNICATION TECHNIQUE
            • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
                • H04L 63/00: Network architectures or network communication protocols for network security
                • H04L 63/08: Network architectures or network communication protocols for network security for authentication of entities
                • H04L 63/0861: Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
                • H04L 63/16: Implementing security features at a particular protocol layer
                • H04L 63/168: Implementing security features at a particular protocol layer above the transport layer
                • H04L 67/00: Network arrangements or protocols for supporting network services or applications
                • H04L 67/50: Network services
                • H04L 67/535: Tracking the activity of the user
            • H04W: WIRELESS COMMUNICATION NETWORKS
                • H04W 12/00: Security arrangements; Authentication; Protecting privacy or anonymity
                • H04W 12/06: Authentication
                • H04W 12/30: Security of mobile devices; Security of mobile applications
                • H04W 12/33: Security of mobile devices; Security of mobile applications using wearable devices, e.g. using a smartwatch or smart-glasses

Definitions

  • the present disclosure relates to systems and methods for utilizing biometric data to facilitate business transactions conducted through an augmented reality (AR) device.
  • a virtual reality, or "VR", scenario typically involves presentation of digital or virtual image information without transparency to other actual real-world visual input.
  • An augmented reality, or "AR", scenario typically involves presentation of digital or virtual image information as an augmentation to visualization of the actual world around the user.
  • an augmented reality scene is depicted wherein a user of an AR technology sees a real-world park-like setting featuring people, trees, buildings in the background, and a concrete platform 1120.
  • the user of the AR technology also perceives a robot statue 1110 standing upon the real-world platform 1120, and a cartoon-like avatar character 2 flying by, even though these elements (2, 1110) do not exist in the real world.
  • the human visual perception system is very complex, and producing such an augmented reality scene that facilitates a comfortable, natural-feeling, rich presentation of virtual image elements amongst other virtual or real- world imagery elements is challenging.
  • an AR device may be used to present all types of virtual content to the user.
  • the AR devices may be used in the context of various gaming applications, enabling users to participate in single-player or multi-player video/augmented reality games that mimic real-life situations. For example, rather than playing a video game at a personal computer, the AR user may play the game on a larger scale in conditions that very closely resemble real life (e.g., "true-to-scale" 3D monsters may appear from behind a real building when the AR user is taking a walk in the park, etc.). Indeed, this greatly enhances the believability and enjoyment of the gaming experience.
  • Fig. 1 illustrates the potential of AR devices in the context of gaming applications.
  • AR devices may be used in a myriad of other applications, and may be anticipated to take the place of everyday computing devices (e.g., personal computers, cell phones, tablet devices etc.).
  • the AR device may be thought of as a walking personal computer that allows the user to perform a variety of computing tasks (e.g., check email, look up a term on the web, teleconference with other AR users, watch a movie, etc.) while at the same time being connected to the user's physical environment.
  • the AR user may be "on the go" (e.g., on a walk, on a daily commute, at a physical location other than his/her office, be away from his/her computer, etc.), but still be able to pull up a virtual email screen to check email, for example, or have a video conference with a friend by virtually populating a screen on the AR device, or in another example, be able to construct a virtual office at a make-shift location.
  • a myriad of similar virtual reality/augmented reality scenarios may be envisioned.
  • To present an augmented reality scene such as the ones described above in a manner that is sensitive to the physiological limitations of the human visual system, the AR device must be aware of the user's physical surroundings in order to project desired virtual content in relation to one or more real objects in the user's physical environment.
  • the AR device is typically equipped with various tracking devices (e.g., eye-tracking devices, GPS, etc.), cameras (e.g., field-of view cameras, infrared cameras, depth cameras, etc.) and sensors (e.g., accelerometers, gyroscopes, etc.) to assess the user's position, orientation, distance, etc. in relation to various real objects in the user's surroundings, to detect and identify objects of the real world and other such functionalities.
  • the AR devices may be configured to allow users to seamlessly perform many types of transactions without requiring the user to perform the onerous procedures described above.
  • Embodiments of the present invention are directed to devices, systems and methods for facilitating virtual reality and/or augmented reality interaction for one or more users.
  • a method of conducting a transaction through an augmented reality device comprises capturing biometric data from a user, determining, based at least in part on the captured biometric data, an identity of the user, and authenticating the user for the transaction based on the determined identity.
  • the method further comprises transmitting a set of data regarding the transaction to a financial institution.
  • the biometric data is an iris pattern.
  • the biometric data is a voice recording of the user.
  • the biometric data is a retinal signature.
  • the biometric data is a characteristic associated with the user's skin.
  • the biometric data is captured through one or more eye tracking cameras that capture a movement of the user's eyes.
  • the biometric data is a pattern of movement of the user's eyes.
  • the biometric data is a blinking pattern of the user's eyes.
  • the augmented reality device is head mounted, and the augmented reality device is individually calibrated for the user.
  • the biometric data is compared to predetermined data pertaining to the user.
  • the predetermined data is a known signature movement of the user's eyes.
  • the predetermined data is a known iris pattern. In one or more embodiments, the predetermined data is a known retinal pattern. In one or more embodiments, the method further comprises detecting a desire of the user to make a transaction, requesting the biometric data from the user based at least in part on the detected desire, and comparing the biometric data with predetermined biometric data to generate a result, wherein the user is authenticated based at least in part on the result (a rough sketch of this flow is given below).
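As a rough, hedged sketch of how the claimed steps (detect a desired transaction, request biometric data, compare it against predetermined data, authenticate) might fit together, the following fragment uses entirely hypothetical names (BiometricSample, matches, authorize_transaction); it illustrates the flow, not an implementation disclosed in the patent.

```python
from dataclasses import dataclass

@dataclass
class BiometricSample:
    kind: str    # e.g., "iris", "retina", "eye_movement", "voice"
    data: bytes

def matches(sample: BiometricSample, enrolled: BiometricSample) -> bool:
    # Placeholder comparison; a real matcher would tolerate sensor noise.
    return sample.kind == enrolled.kind and sample.data == enrolled.data

def authorize_transaction(capture, enrolled: BiometricSample) -> bool:
    """Capture biometric data, determine identity, authenticate the user."""
    sample = capture()                 # request/capture biometric data from the user
    if not matches(sample, enrolled):  # determine identity from the captured data
        return False                   # comparison failed: block the transaction
    return True                        # user authenticated for the transaction

# Example usage with canned data standing in for an eye-tracking camera.
enrolled = BiometricSample("iris", b"enrolled-template")
print(authorize_transaction(lambda: BiometricSample("iris", b"enrolled-template"),
                            enrolled))  # True
```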
  • the transaction is a business transaction.
  • the method further comprises communicating an authentication of the user to a financial institution associated with the user, wherein the financial institution releases payment on behalf of the user based at least in part on the authentication.
  • the financial institution transmits the payment to one or more vendors indicated by the user.
  • the method further comprises detecting an interruption event or transaction event associated with the augmented reality device. In one or more embodiments, the method further comprises capturing new biometric data from the user in order to re-authenticate the user based at least in part on the detected event. In one or more embodiments, the interruption of activity is detected based at least in part on a removal of the augmented reality device from the user's head.
  • the interruption of activity is detected based at least in part on a loss of connectivity of the augmented reality device with a network.
  • the transaction event is detected based at least in part on an express approval of a transaction by the user.
  • the transaction event is detected based at least in part on a heat map associated with the user's gaze.
  • the transaction event is detected based at least in part on user input received through the augmented reality device.
  • the user input comprises an eye gesture.
  • the user input comprises a hand gesture.
  • an augmented reality display system comprises a biometric data tracking device to capture biometric data from a user, a processor operatively coupled to the biometric data tracking device to process the captured biometric data, and to determine an identity of the user based at least in part on the captured biometric data, and a server to communicate with at least a financial institution to authenticate the user for a transaction.
  • the biometric data is eye movement data. In one or more embodiments, the biometric data corresponds to an image of an iris of the user. In one or more embodiments, the server also transmits a set of data regarding the transaction to a financial institution. In one or more embodiments, the biometric data is an iris pattern.
  • the biometric data is a voice recording of the user. In one or more embodiments, the biometric data is a retinal signature. In one or more embodiments, the biometric data is a characteristic associated with the user's skin. In one or more embodiments, the biometric tracking device comprises one or more eye tracking cameras to capture a movement of the user's eyes. In one or more embodiments, the biometric data is a pattern of movement of the user's eyes.
  • the biometric data is a blinking pattern of the user's eyes.
  • the augmented reality display system is head mounted, and the augmented reality display system is individually calibrated for the user.
  • the processor also compares the biometric data to predetermined data pertaining to the user.
  • the predetermined data is a known signature movement of the user's eyes.
  • the predetermined data is a known iris pattern.
  • the processor detects that a user desires to make a transaction, and further comprising a user interface to request the biometric data from the user based at least in part on the detection, the processor comparing the biometric data with predetermined biometric data, and authenticating the user based at least in part on the comparison.
  • the transaction is a business transaction.
  • the processor communicates the authentication of the user to a financial institution associated with the user, and wherein the financial institution releases payment on behalf of the user based at least in part on the authentication.
  • the financial institution transmits the payment to one or more vendors indicated by the user.
  • the processor detects an interruption event or transaction event associated with the augmented reality device, and wherein the biometric tracking device captures new biometric data from the user in order to re-authenticate the user based at least in part on the detected event.
  • the interruption of activity is detected based at least in part on a removal of the augmented reality device from the user's head.
  • the interruption of activity is detected based at least in part on a loss of connectivity of the augmented reality device with a network.
  • the transaction event is detected based at least in part on an express approval of a transaction by the user.
  • the transaction event is detected based at least in part on a heat map associated with the user's gaze.
  • the transaction event is detected based at least in part on user input received through the augmented reality device.
  • the user input comprises an eye gesture.
  • the user input comprises a hand gesture.
  • the biometric tracking device comprises an eye tracking system. In one or more embodiments, the biometric tracking device comprises a haptic device. In one or more embodiments, the biometric tracking device comprises a sensor that measures physiological data pertaining to a user's eye.
  • FIG. 1 illustrates an example augmented reality scene being displayed to a user.
  • FIGS. 2A-2D illustrate various configurations of an example augmented reality device.
  • FIG. 3 illustrates an augmented reality device communicating with one or more servers in the cloud, according to one embodiment.
  • FIGS. 4A-4D illustrate various eye and head measurements taken in order to configure the augmented reality device for a particular user.
  • FIG. 5 shows a plan view of various components of an augmented reality device according to one embodiment.
  • FIG. 6 shows a system architecture of the augmented reality system for conducting business transactions, according to one embodiment.
  • FIG. 7 is an example flowchart depicting a method for conducting a business transaction through the augmented reality device.
  • FIGS. 8A and 8B illustrate an example eye-identification method to identify a user, according to one embodiment.
  • FIG. 9 illustrates an example flowchart depicting a method of using eye-movements to authenticate a user, according to one embodiment.
  • FIGS. 10A-10I illustrate a series of process flow diagrams depicting an example scenario of conducting a business transaction using an augmented reality device.
  • the AR device may utilize eye identification techniques (e.g., iris patterns, eye vergence, eye motion, patterns of cones and rods, patterns in eye movements, etc.) to authenticate a user for a purchase.
  • this type of user authentication minimizes friction costs in conducting business transactions, and allows the user to make purchases (e.g., brick and mortar stores, online stores, in response to an advertisement, etc.) seamlessly with minimal effort and/or interruption.
  • FIG. 2A an AR system user 60 is depicted wearing a frame 64 structure coupled to an AR display system 62 positioned in front of the eyes of the user.
  • a speaker 66 is coupled to the frame 64 in the depicted configuration and positioned adjacent the ear canal of the user (in one embodiment, another speaker, not shown, is positioned adjacent the other ear canal of the user to provide for stereo / shapeable sound control).
  • the display 62 is operatively coupled 68, such as by a wired lead or wireless connectivity, to a local processing and data module 70 which may be mounted in a variety of configurations, such as fixedly attached to the frame 64, fixedly attached to a helmet or hat 80 as shown in the embodiment of Figure 2B, embedded in headphones, removably attached to the torso 82 of the user 60 in a backpack-style configuration as shown in the embodiment of Figure 2C, or removably attached to the hip 84 of the user 60 in a belt-coupling style configuration as shown in the embodiment of Figure 2D.
  • the local processing and data module 70 may comprise a power-efficient processor or controller, as well as digital memory, such as flash memory, both of which may be utilized to assist in the processing, caching, and storage of data a) captured from sensors which may be operatively coupled to the frame 64, such as image capture devices (such as cameras), microphones, inertial measurement units, accelerometers, compasses, GPS units, radio devices, and/or gyros; and/or b) acquired and/or processed using the remote processing module 72 and/or remote data repository 74, possibly for passage to the display 62 after such processing or retrieval.
  • the local processing and data module 70 may be operatively coupled (76, 78), such as via a wired or wireless communication links, to the remote processing module 72 and remote data repository 74 such that these remote modules (72, 74) are operatively coupled to each other and available as resources to the local processing and data module 70.
  • the remote processing module 72 may comprise one or more relatively powerful processors or controllers configured to analyze and process data and/or image information.
  • the remote data repository 74 may comprise a relatively large-scale digital data storage facility, which may be available through the Internet or other networking configuration in a "cloud" resource configuration. In one embodiment, all data is stored and all computation is performed in the local processing and data module, allowing fully autonomous use without any remote modules.
  • the AR system continually receives input from various devices that collect data about the AR user and the surrounding environment. Referring now to Fig. 3, the various components of an example augmented reality display device will be described. It should be appreciated that other embodiments may have additional components. Nevertheless, Fig. 3 provides a basic idea of the various components of the system.
  • a schematic illustrates coordination between the cloud computing assets 46 and local processing assets (308, 120).
  • the cloud 46 assets are operatively coupled, such as via wired or wireless networking (wireless being preferred for mobility, wired being preferred for certain high-bandwidth or high-data-volume transfers that may be desired), directly to (40, 42) one or both of the local computing assets (120, 308), such as processor and memory configurations which may be housed in a structure configured to be coupled to a user's head mounted device 120 or belt 308.
  • These computing assets local to the user may be operatively coupled to each other as well, via wired and/or wireless connectivity configurations 44.
  • primary transfer between the user and the cloud 46 may be via the link between the belt-based subsystem 308 and the cloud, with the head mounted subsystem 120 primarily data-tethered to the belt-based subsystem 308 using wireless connectivity, such as ultra-wideband ("UWB") connectivity, as is currently employed, for example, in personal computing peripheral connectivity applications.
  • the AR display system 120 may interact with one or more AR servers 110 hosted in the cloud.
  • the various AR servers 110 may have communication links 115 that allow the servers 110 to communicate with one another.
  • a map of the world is continually updated at a storage location which may partially reside on the user's AR system and partially reside in the cloud resources.
  • the map (also referred to as a passable world model) may be a large database comprising raster imagery, 3D and 2D points, parametric information and other information about the real world. As more and more AR users continually capture information about their real environment (e.g., through cameras, sensors, IMUs, etc.), the map becomes more and more accurate.
  • AR systems similar to those described in Figs. 2A-2D provide unique access to a user's eyes, which may be advantageously used to uniquely identify the user based on a set of biometric data tracked through the AR system.
  • This unprecedented access to the user's eyes naturally lends itself to various applications.
  • Because the AR device interacts crucially with the user's eye to allow the user to perceive 3D virtual content, and in many embodiments tracks various biometrics related to the user's eyes (e.g., eye vergence, eye motion, cones and rods, patterns of eye movements, etc.), the resultant tracked data may be advantageously used in user identification and authentication for various transactions, as will be described in further detail below.
  • the AR device is typically fitted for a particular user's head, and the optical components are aligned to the user's eyes. These configuration steps may be used in order to ensure that the user is provided with an optimum augmented reality experience without causing any physiological side-effects, such as headaches, nausea, discomfort, etc.
  • the AR device is configured (both physically and digitally) for each individual user, and may be calibrated specifically for the user.
  • a loose fitting AR device may be used comfortably by a variety of users.
  • the AR device knows the distance between the user's eyes, the distance between the head-worn display and the user's eyes, and the curvature of the user's forehead. All of these measurements may be used to provide the appropriate head-worn display system for a given user. In other embodiments, such measurements may not be necessary in order to perform the identification and authentication functions described in this application.
  • the AR device may be customized for each user.
  • the user's head shape 402 may be taken into account when fitting the head-mounted AR system, in one or more embodiments, as shown in Fig. 4A.
  • the eye components 404 (e.g., optics, structure for the optics, etc.) may be adjusted both horizontally and vertically, or rotated, for the user's comfort, as shown in Fig. 4B.
  • a rotation point 406 of the head set with respect to the user's head may be adjusted based on the structure of the user's head.
  • the inter-pupillary distance (IPD) 408 (i.e., the distance between the user's eyes) may also be measured and compensated for when configuring the device.
  • this aspect of the head-worn AR devices is crucial because the system already possesses a set of measurements about the user's physical features (e.g., eye size, head size, distance between eyes, etc.) and other data that may be used to easily identify the user and allow the user to complete one or more business transactions. Additionally, the AR system may easily be able to detect when the AR system is being worn by a different AR user other than a user that is authorized to use the AR system. This allows the AR system to constantly monitor the user's eyes, and thus be aware of the user's identity as needed (a sketch of such a wearer check is given below).
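Because the system already stores fit measurements for the authorized user, a change of wearer could in principle be flagged by comparing freshly measured values against the calibrated ones. A minimal sketch, assuming hypothetical measurement names and a 2 mm tolerance (neither comes from the patent):

```python
# Stored at fitting/calibration time for the authorized user (illustrative values).
CALIBRATED = {"ipd_mm": 64.0, "eye_relief_mm": 18.0}
TOLERANCE_MM = 2.0  # assumed per-measurement tolerance

def same_wearer(measured: dict) -> bool:
    """Compare fresh measurements against the stored calibration."""
    return all(abs(measured[k] - v) <= TOLERANCE_MM
               for k, v in CALIBRATED.items())

print(same_wearer({"ipd_mm": 63.5, "eye_relief_mm": 18.4}))  # True: same user
print(same_wearer({"ipd_mm": 70.0, "eye_relief_mm": 18.0}))  # False: different wearer
```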
  • the AR device may be configured to track a set of biometric data about the user.
  • the system may track eye movements, eye movement patterns, blinking patterns, eye vergence, eye color, iris patterns, retinal patterns, fatigue parameters, changes in eye color, changes in focal distance, and many other parameters that may be used in providing an optimal augmented reality experience to the user.
  • Fig. 5 shows a suitable user display device 62 comprising a display lens 106 which may be mounted to a user's head or eyes by a housing or frame 108.
  • the display lens 106 may comprise one or more transparent mirrors positioned by the housing 108 in front of the user's eyes 20 and configured to bounce projected light 38 into the eyes 20 and facilitate beam shaping, while also allowing for transmission of at least some light from the local environment.
  • two wide-field-of-view machine vision cameras 16 are coupled to the housing 108 to image the environment around the user; in one embodiment these cameras 16 are dual capture visible light / infrared light cameras.
  • the depicted embodiment also comprises a pair of scanned-laser shaped-wavefront (i.e., for depth) light projector modules 18 (e.g., spatial light modulators such as DLP, fiber scanning devices (FSDs), LCDs, etc.) with display mirrors and optics configured to project light 38 into the eyes 20 as shown.
  • the depicted embodiment also comprises two miniature infrared cameras 24 paired with infrared light sources 26, such as light emitting diodes ("LEDs"), which are configured to be able to track the eyes 20 of the user to support rendering and user input.
  • the display system 62 further features a sensor assembly 39, which may comprise X, Y, and Z axis accelerometer capability as well as a magnetic compass and X, Y, and Z axis gyro capability, preferably providing data at a relatively high frequency, such as 200 Hz.
  • the depicted system 62 also comprises a head pose processor 36, such as an ASIC (application specific integrated circuit), FPGA (field programmable gate array), and/or ARM processor (advanced reduced-instruction-set machine), which may be configured to calculate real or near-real time user head pose from wide field of view image information output from the cameras 16.
  • the head pose processor 36 is operatively coupled (90, 92, 94; e.g., via wired or wireless connectivity) to the cameras 16 and the rendering engine 34.
  • Also shown is another processor 32 configured to execute digital and/or analog processing to derive pose from the gyro, compass, and/or accelerometer data from the sensor assembly 39.
  • the depicted embodiment also features a GPS 37 subsystem to assist with pose and positioning.
  • the depicted embodiment comprises a rendering engine 34 which may feature hardware running a software program configured to provide rendering information local to the user to facilitate operation of the scanners and imaging into the eyes of the user, for the user's view of the world.
  • the rendering engine 34 is operatively coupled (105, 94, 100/102, 104; i.e., via wired or wireless connectivity) to the sensor pose processor 32, the image pose processor 36, the eye tracking cameras 24, and the projecting subsystem 18 such that rendered light 38 is projected using a scanned laser arrangement 18 in a manner similar to a retinal scanning display.
  • the wavefront of the projected light beam 38 may be bent or focused to coincide with a desired focal distance of the projected light 38.
  • the mini infrared cameras 24 may be utilized to track the eyes to support rendering and user input (i.e., where the user is looking and at what depth he is focusing; as discussed below, eye vergence may be utilized to estimate depth of focus).
  • the GPS 37, gyros, compass, and accelerometers 39 may be utilized to provide coarse and/or fast pose estimates.
  • the camera 16 images and pose data, in conjunction with data from an associated cloud computing resource, may be utilized to map the local world and share user views with a virtual or augmented reality community.
  • all of the components of the system 62 featured in Figure 5 are directly coupled to the display housing 108 except for the image pose processor 36, sensor pose processor 32, and rendering engine 34, and communication between the latter three and the remaining components of the system may be by wireless communication, such as ultra wideband, or wired communication.
  • the depicted housing 108 preferably is head-mounted and wearable by the user. It may also feature speakers, such as those which may be inserted into the ears of a user and utilized to provide sound to the user.
  • the AR device may comprise many components that are configured to collect data from the user and his/her surroundings. For example, as described above, some embodiments of the AR device collect GPS information to determine a location of the user. In other embodiments, the AR device comprises infrared cameras to track the eyes of the user. In yet other embodiments, the AR device may comprise field-of-view cameras to capture images of the user's environment, which may, in turn, be used to construct a map (contained in one of the servers 110, as described in Figure 3) of the user's physical space, which allows the system to render virtual content in relation to appropriate real-life objects, as described briefly with respect to Figure 3.
  • the mini cameras 24 may be utilized to measure where the centers of a user's eyes 20 are geometrically verged to, which, in general, coincides with a position of focus, or "depth of focus", of the eyes 20.
  • a three-dimensional surface of all points the eyes verge to is called the "horopter".
  • the focal distance may take on a finite number of depths, or may be infinitely varying. Light projected from the vergence distance appears to be focused to the subject eye 20, while light in front of or behind the vergence distance is blurred.
  • the eye vergence may be tracked with the mini cameras 24, and the rendering engine 34 and projection subsystem 18 may be utilized to render all objects on or close to the horopter in focus, and all other objects at varying degrees of defocus (i.e., using intentionally-created blurring).
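The vergence geometry described here can be made concrete. Assuming symmetric vergence, the depth of focus follows from the inter-pupillary distance and the vergence angle, and the blur applied to an object off the horopter can be made proportional to its dioptric distance from that depth. The numbers below are illustrative, not values from the patent:

```python
import math

def vergence_depth_m(ipd_m: float, vergence_angle_rad: float) -> float:
    """Distance at which the two eyes' lines of sight converge.

    Simple symmetric-vergence geometry: depth = (IPD / 2) / tan(angle / 2).
    """
    return (ipd_m / 2.0) / math.tan(vergence_angle_rad / 2.0)

def defocus_blur(object_depth_m: float, focus_depth_m: float) -> float:
    """Relative blur for an object, proportional to its dioptric distance
    from the current depth of focus (objects on the horopter get ~0 blur)."""
    return abs(1.0 / object_depth_m - 1.0 / focus_depth_m)

# Example: 64 mm IPD and ~3.7 degrees of vergence puts focus at about 1 m.
focus = vergence_depth_m(0.064, math.radians(3.7))
print(round(focus, 2))              # ~0.99 (meters)
print(defocus_blur(2.0, focus))     # object at 2 m gets ~0.5 diopters of blur
```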
  • a see-through light guide optical element configured to project coherent light into the eye may be provided by suppliers such as Lumus, Inc.
  • the system 62 renders to the user at a frame rate of about 60 frames per second or greater.
  • the mini cameras 24 may be utilized for eye tracking, and software may be configured to pick up not only vergence geometry but also focus location cues to serve as user inputs.
  • a system is configured with brightness and contrast suitable for day or night use.
  • such a system preferably has latency of less than about 20 milliseconds for visual object alignment, less than about 0.1 degree of angular alignment, and about 1 arc minute of resolution, which is approximately the limit of the human eye.
  • the display system 62 may be integrated with a localization system, which may involve GPS elements, optical tracking, compass, accelerometers, and/or other data sources, to assist with position and pose determination; localization information may be utilized to facilitate accurate rendering in the user's view of the pertinent world (i.e., such information would facilitate the glasses to know where they are with respect to the real world).
  • the traditional model(s) for conducting business transactions tend to be inefficient and onerous, and often have the effect of deterring users from engaging in transactions. For example, consider a user at a department store. In traditional models, the user is required to physically go to a store, select items, stand in line, wait for the cashier, provide payment information and/or identification, and authorize payment. Even online shopping, which is arguably less cumbersome, comes with its share of drawbacks. Although the user does not have to physically be at the store location and can easily select items of interest, payment still often requires credit card information and authentication. With the advent of AR devices, however, the traditional models of payment (e.g., cash, credit card, monetary tokens, etc.) may be rendered unnecessary, because the AR device can easily confirm the user's identity and authenticate a business transaction.
  • an AR user may leisurely stroll into a retail store and pick up an item.
  • the AR device may confirm the user's identity and confirm whether the user wants to make the purchase, allowing the user to simply walk out of the store.
  • the AR device may interface with a financial institution that will transfer money from the user's account to an account associated with the retail store based on the confirmed purchase.
  • the AR user may watch an advertisement for a particular brand of shoes.
  • the user may indicate, through the AR device, that the user wants to purchase the shoes.
  • the AR device may confirm the identity of the user, and authenticate the purchase.
  • an order may be placed at the retailer of the brand of shoes, and the retailer may simply ship a pair of the desired shoes to the user.
  • the AR device since the AR device "knows" the identity of the user (and AR devices are typically built and customized for every individual user), financial transactions are easily authenticated, thereby greatly reducing the friction costs typically associated with conducting business.
  • the AR device may periodically perform an identification test of the user for privacy and security purposes.
  • this periodic identification and authentication of the user is necessary for security purposes especially in the context of conducting business transactions, or for privacy purposes to ensure that the AR device is not being used by unknown users and being linked to the AR user's account on the cloud.
  • This application describes systems and methods for ensuring security for financial/business transactions, in which case user identification and authentication is paramount. Similarly, these steps are equally important to ensure user privacy. In fact, these identification steps may be used prior to opening any personal/private user account (e.g., email, social network, financial account, etc.) through the AR device.
  • the AR device may identify the user to ensure that the AR device hasn't been stolen. If the AR device detects an unknown user, the AR device may immediately send captured information about the user, and the location of the AR device to the AR server. Or, in other embodiments, if it is detected that the AR device is being used by someone who is not identified, the AR device may shut down entirely and automatically delete all contents in the memory of the AR device such that no confidential information is leaked or misused. These security measures may prevent thefts of the AR device, because the AR device is able to capture many types of information about a wearer of the AR device.
  • the embodiments described below may enable an AR device to tele-operate a shopping robot. For example, once a user of the AR device has been identified, the AR device may connect to a shopping robot through a network of a particular store or franchise, and communicate a transaction with the shopping robot. Thus, even if the user is not physically in the store the AR device may conduct transactions through a proxy, once the user has been authenticated. Similarly, many other security and/or privacy applications may be envisioned.
  • the tracked biometric data may be eye-related biometric data such as patterns in eye movements, iris patterns, eye vergence information, etc. In essence, rather than requiring the user to remember a password or present some type of identification, the AR device automatically verifies identity through the use of the tracked biometric data. Given that the AR device has constant access to the user's eyes, it is anticipated that the tracked data will provide highly accurate and individualized identification.
  • the AR system architecture comprises a head-worn AR device 62, a local processing module 660 of the AR device 62, a network 650, an AR server 612, a financial institution 620 and one or more vendors (622A, 622B, etc.).
  • the head-worn AR device 62 comprises many sub-components, some of which are configured to capture and/or track information associated with the user and/or surroundings of the user. More particularly, in one or more embodiments, the head-worn AR device 62 may comprise one or more eye tracking cameras. The eye tracking cameras may track the user's eye movements, eye vergence, etc.
  • the eye tracking cameras may be configured to capture a picture of the user's iris.
  • the head-worn AR device 62 may comprise other cameras configured to capture other biometric information.
  • associated cameras may be configured to capture an image of the user's eye shape.
  • cameras (or other tracking devices) may capture data regarding the user's eye lashes.
  • the tracked biometric information (e.g., eye data, eye-lash data, eye-shape data, eye movement data, head data, sensor data, voice data, fingerprint data, etc.) may be communicated to a local processing module 660 for processing.
  • the local processing module 660 may be part of a belt pack of the AR device 62.
  • the local processing module may be part of the housing of the head-worn AR device 62.
  • the head-worn AR device 62 of a user 680 interfaces with the local processing module 660 to provide the captured data.
  • the local processing module 660 comprises a processor 664 and other components 652 (e.g., memory, power source, telemetry circuitry, etc.) that enable the AR system to perform a variety of computing tasks.
  • the local processing module 660 may also comprise an identification module 614 to identify a user based on information tracked by the one or more tracking devices of the head-worn AR device 62.
  • the identification module 614 comprises a database 652 to store a set of data with which to identify and/or authenticate a user.
  • the database 652 comprises a mapping table 670 that may store a set of predetermined data and/or predetermined authentication details or patterns.
  • the captured data may be compared against the predetermined data stored at the mapping table 670 to determine the identity of the user 680.
  • the database 652 may comprise other data to be used in performing the identification. For example, the database 652 may store one or more eye tests to verify the identity of the user, as will be described in detail further below.
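A minimal sketch of such a mapping-table lookup, with a placeholder byte-similarity score standing in for a real biometric matcher (all names and the threshold are assumptions, not details from the patent):

```python
# Stand-in for the mapping table 670 of predetermined data (illustrative only).
PREDETERMINED = {
    "iris":        b"enrolled iris template",
    "eye_pattern": b"enrolled signature eye movement",
}

def similarity(a: bytes, b: bytes) -> float:
    """Placeholder score: fraction of matching bytes over the shorter length."""
    n = min(len(a), len(b))
    if n == 0:
        return 0.0
    return sum(x == y for x, y in zip(a, b)) / n

def identify(kind: str, captured: bytes, threshold: float = 0.9) -> bool:
    """Compare captured data against the predetermined entry for this modality."""
    enrolled = PREDETERMINED.get(kind)
    if enrolled is None:
        return False
    return similarity(captured, enrolled) >= threshold

print(identify("iris", b"enrolled iris template"))  # True for a perfect match
```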
  • the local processing module 660 communicates with an AR server 612 through a cloud network 650.
  • the AR server 612 comprises many components/circuitry that are crucial to providing a realistic augmented reality experience to the user 680.
  • the AR server 612 comprises a map 690 of the physical world that is frequently consulted by the local processing module 660 of the AR device 62 to render virtual content in relation to physical objects of the real world.
  • the AR server 612 builds upon information captured through numerous users to build an ever-growing map 690 of the real world.
  • the AR server 612 may simply host the map 690 which may be built and maintained by a third party.
  • the AR server 612 may also host an individual user's account, where the user's private captured data is channeled. This captured data may be stored in a database 654, in one or more embodiments.
  • the database 654 may store user information 610, historical data 615 about the user 680, user preferences 616 and other entity authentication information 618. Indeed, the AR system may comprise many other types of information individual to the user.
  • the user information 610 may comprise a set of personal biographical information (e.g., name, age, gender, address, location, etc.), in one or more embodiments.
  • Historical data 615 about the user 680 may refer to previous purchases and/or transactions performed by the user.
  • user preferences 616 may comprise a set of interests (e.g., shopping, activities, travel, etc.) and/or purchasing preferences (e.g., accessories, brands of interest, shopping categories, etc.) about the user.
  • behavioral data of the AR user may be used to inform the system of the user's preferences and/or purchasing patterns.
  • Other entity authentication information 618 may refer to authentication credentials of the user to verify that the user has been successfully authenticated to access outside accounts (e.g., banking authentication information, account authentication information of various websites, etc.).
  • the data captured through the AR device 62, data tracked through past activity, business data associated with the user, etc. may be analyzed to recognize patterns and/or to understand a behavior of the user. These functions may be performed by a third party, in one or more embodiments, in a privacy and security-sensitive manner.
  • the database 654 may also store other entity authentication information 618 that allows the AR server 612 to communicate with financial institutions and/or third party entities particular to the user.
  • the other entity authentication information 618 may refer to the user's banking information (e.g., bank name, account information, etc.). This information may, in turn, be used to communicate with financial institutions, third-party entities, vendors, etc.
  • the AR server 612 may communicate with one or more financial institutions 620 in order to complete transactions.
  • the financial institution may have the user's financial information.
  • the financial institution may perform a second verification of the user's authentication information for security purposes.
  • the AR server 612 may be authenticated to communicate with the financial institution 620. If the user is authenticated for a particular purchase of an item 630, the financial institution 620 may directly communicate with one or more vendors (622A, 622B, etc.) to directly transmit money to the vendors once the user has been authenticated. In other embodiments (not shown), the AR server 612 may communicate directly with the vendors as well to communicate data regarding one or more purchases.
  • the user 680 may not need to connect to the AR server 612 to proceed with one or more financial transactions.
  • the AR device 62 may allow "offline browsing" of a plurality of e-commerce sites, etc., and the user 680 may be able to select one or more items of interest through an offline ID.
  • the financial institution 620 or vendor 622 may have a random number generated for that particular transaction, which may be later verified once the AR device 62 is connected to the network 650.
  • the system may validate the transaction offline, and then use additional information (e.g., random generated number) to verify the purchase at a later time. This allows the AR device 62 to participate in necessary commercial transactions even if the user is not currently connected to the AR server 612 and/or financial institutions or vendors.
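One plausible reading of this offline flow, sketched below with assumed field names: each offline purchase is recorded together with a random nonce, queued locally, and submitted for verification once connectivity to the network 650 returns.

```python
import json
import secrets
import time

def record_offline_transaction(item_id: str, amount: float) -> dict:
    """Record a purchase made while disconnected, tagged with a random nonce."""
    return {
        "item_id": item_id,
        "amount": amount,
        "timestamp": time.time(),
        "nonce": secrets.token_hex(16),  # later checked by the institution/vendor
    }

pending = [record_offline_transaction("shoes-1234", 79.99)]

def flush_when_online(send) -> None:
    """Submit queued records once connectivity returns; 'send' posts one record."""
    while pending:
        send(json.dumps(pending.pop()))

flush_when_online(print)  # stand-in for a post to the financial institution
```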
  • the vendors (622A, 622B, etc.) may have a pre-established relationship with the AR server 612 and/or the financial institution(s) 620 that enables this new paradigm of making purchases through the AR device 62. It should be appreciated that the embodiments described above are provided for illustrative purposes only, and other embodiments may comprise greater or fewer components.
  • an input may be received regarding a transaction.
  • the user may explicitly indicate (e.g., through a command, a gesture, etc.) an interest in purchasing an item.
  • the AR system may suggest a purchase to the AR user based on past purchases, user interests, etc., and receive a confirmation from the user.
  • the AR system may assess interest based on "heat maps" of various items.
  • the AR device 62 may be able to determine how long a user has looked at various virtual and/or real objects, in order to determine the user's interest in an item. For example, if the user is viewing a virtual advertisement for a particular brand, the AR device 62 may gauge the user's interest by determining how long the user has looked at a particular product. In one or more embodiments, the AR system may generate heat maps based on how long one or more users have looked at a particular product. If the heat map indicates interest in a particular product (e.g., amount of time spent looking at a particular item exceeds a predetermined threshold amount of time), the AR device 62 may request confirmation from the AR user about purchase of the product.
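A dwell-time accumulator is one simple way to realize such a gaze heat map. In the sketch below, the dwell threshold and sampling rate are assumptions rather than values from the patent; an item is flagged for purchase confirmation once accumulated gaze time exceeds the threshold.

```python
from collections import defaultdict

DWELL_THRESHOLD_S = 5.0          # assumed "interest" threshold
dwell = defaultdict(float)       # item id -> accumulated gaze time (seconds)

def on_gaze_sample(item_id: str, dt_s: float) -> bool:
    """Call once per gaze sample; returns True when interest crosses the threshold."""
    dwell[item_id] += dt_s
    return dwell[item_id] >= DWELL_THRESHOLD_S

# Example: 60 Hz gaze samples landing on an advertised product.
for _ in range(400):
    if on_gaze_sample("brand-x-shoes", 1 / 60):
        print("Request purchase confirmation for brand-x-shoes")
        break
```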
  • the AR device 62 may perform a user-identification protocol. There may be many types of user-identification protocols, as will be described in further detail below.
  • the AR device 62 may request a "password" based simply on eye movements to determine if the user is verified.
  • the AR device 62 may capture a picture of the user's iris, and confirm whether the user is the valid user of the AR device 62 (and the accounts linked to the AR device 62).
  • the AR device 62 may monitor a continuity of the AR device 62 remaining on the user's head (e.g., if the user has not removed the AR device 62 at all, it is likely that the user is the same).
  • the AR device 62 may, based on the user- identification protocol, periodically capture iris images, or periodically perform tests to ensure that the user is the verified user of the AR device 62. As discussed here, there are many ways to identify the user through biometric data, and some example methods will be described further below.
  • the identification protocol may be a constant identification (e.g., movement patterns of the eye, contact with skin, etc.) of the user.
  • the identification protocol may simply be a one-time identification (through any identification method). Thus, in some embodiments, once the AR system has identified a user once, the same user may not need to be identified unless an intervening event occurs (e.g., user removes AR device 62, interruption in network connectivity, etc.).
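This one-time identification that holds until an intervening event amounts to a small session state machine: a flag is set by a successful identification and cleared by interruption events. A hedged sketch with hypothetical event names:

```python
authenticated = False

def on_event(event: str) -> None:
    """Clear the session when an intervening event occurs."""
    global authenticated
    if event in ("device_removed", "network_lost"):
        authenticated = False

def ensure_authenticated(run_identification_protocol) -> bool:
    """Re-run the identification protocol only when the session is invalid."""
    global authenticated
    if not authenticated:
        authenticated = run_identification_protocol()  # e.g., iris capture
    return authenticated

print(ensure_authenticated(lambda: True))  # identifies once -> True
on_event("device_removed")                 # user removes the AR device
print(authenticated)                       # False: next use must re-identify
```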
  • the AR device 62 may determine whether the identification protocol requires capture of biometric data. If the user protocol requires biometric data to be captured, the AR device may capture biometric data. Otherwise, the AR device 62 may proceed to identify the user through a non-biometric capture identification method.
  • biometric data may be captured.
  • the AR device 62 may track the user's eye movement through one or more eye tracking cameras. The captured movement may be correlated with the "password" or signature eye movement to determine if the user is verified.
  • if the user-identification protocol is iris capture, an image of the iris may be captured, and be correlated with the known image of the user.
  • an iris capture or an eye test may be performed to verify the identity of the user.
  • the biometric data may be eye-related in some embodiments, or may be other types of biometric data.
  • the biometric data may be voice, in one or more embodiments.
  • the biometric data may be eye lash related data, or eye shape data. Any type of biometric data that may be used to uniquely identify a user over other users may be used.
  • the biometric data may be compared to predetermined user identification data, to identify the user. Or, if the user identification doesn't require biometric data, the AR device 62 may determine, for example, that the user has not taken off the AR device 62, therefore indicating that the user is the same as the previously identified user. If the user is identified, the AR device 62 proceeds to 710 and transmits information to one or more financial institutions.
  • the AR device 62 may perform another user-identification protocol, or else block the user from making the transaction. If the user is identified, data regarding the desired item may be transmitted to the cloud, and to the financial institution, at 710. For example, following the example above, information about the desired shoes (e.g., product number, quantity desired, information about the user, shipping address, user account, etc.) may be communicated to the vendors and/or financial institution.
  • the AR system may receive confirmation from the financial institution that payment is complete and/or authorized.
  • a confirmation message may be displayed to the user to confirm that the purchase has been completed.
  • one approach to identify a user for validation purposes is by periodically administering a user identification test.
  • the user-identification method may utilize eye-related data to complete the user identification test. Because the AR device 62 is equipped with eye tracking cameras that continually track the user's eye movements, a known pattern of eye movements may be used as an eye test to recognize and/or identify a user. For example, while a password may be easily copied or stolen, it may be difficult to replicate eye movements or other physiological characteristics of other users, making it easier to identify non-authorized users of the AR device 62.
  • the system may, with input of the user, configure a known pattern of eye movement (i.e., akin to an eye-password) unique to the user.
  • This known pattern of eye movement may be stored and correlated every time the user-identification protocol is performed.
  • an example eye pattern 802 of a user's eyes 804 is provided.
  • the AR device 806 may track the user's eye movement through eye-tracking cameras (not shown), and correlate the pattern with the known pattern of the eye movement (i.e., eye password).
  • the AR device 806 may allow the user to conduct the transaction. As shown in Fig. 8A, the user may have moved his/her eye in the denoted pattern. For illustrative purposes, a line (802) is drawn to represent the eye pattern. Of course, in practice, there would be no line, but the eye tracking devices would simply track such a movement and convert it to a desired data format.
  • to compare the tracked eye pattern with the predetermined pattern, a grid 904 similar to that shown in Fig. 8B may be utilized. It should be appreciated that other such techniques may be used as well. By dividing an available space into discretized areas, through use of the grid 904, it may be easier to determine whether the tracked eye pattern 802 resembles the predetermined pattern 902 most closely. For example, as shown in Fig. 8B, the tracked eye pattern 802 more or less follows the predetermined pattern 902 (as denoted by the bold line connecting the centers of each grid square associated with the predetermined pattern). Although Fig. 8B represents a rather simplified version of the grid 904, it should be appreciated that the size of each grid square may be reduced for more accurate determinations.
  • the pattern may be recognized.
  • a majority of the grid squares may need to be hit before the user is deemed to have passed the user-identification test.
  • other such thresholds may be devised for various eye-movement tracking protocols.
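  • By way of a non-limiting illustration, the grid-coverage check described above might be sketched in Python as follows; the 8x8 grid, the enrolled squares, and the two-thirds pass threshold are assumptions chosen for illustration, not values prescribed by this disclosure:

        # Sketch of the "majority of grid squares hit" test: the tracked
        # pattern passes if it covers enough of the enrolled pattern's squares.
        PREDETERMINED = {(0, 0), (1, 1), (2, 2), (3, 2), (4, 3)}  # enrolled squares

        def hit_fraction(tracked_squares):
            """Fraction of the enrolled pattern covered by the tracked gaze."""
            return len(PREDETERMINED & set(tracked_squares)) / len(PREDETERMINED)

        tracked = [(0, 0), (1, 1), (2, 2), (3, 3), (4, 3)]  # one square missed
        print("pattern recognized" if hit_fraction(tracked) >= 2 / 3 else "identification failed")

    A set-coverage test of this kind tolerates small deviations in the path while still rejecting patterns that hit mostly wrong squares.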
  • a blink pattern may be similarly utilized.
  • the eye-password may be a series of blinks, or blinks combined with movement to track a signature of the user.
  • an eye-movement test may be initiated.
  • the AR user may indicate a desire to purchase an item, or the AR user may have put down the AR device 62, 806 and may resume wearing the AR device 62, 806.
  • the eye-movement test may be administered periodically for security purposes.
  • an eye-movement pattern may be tracked and received.
  • a virtual display screen may display instructions to "enter password," which may trigger the user to form the known pattern with his/her eyes.
  • the tracked eye-movement may be converted into a particular data format.
  • the data may indicate the coordinates of the grid squares that were hit by the eye movement.
  • Many other approaches may be similarly used.
  • the converted data may be compared to predetermined data representing a known signature eye-movement pattern.
  • the AR system may determine if the tracked eye-movement matches the predetermined pattern within a threshold.
  • if the tracked pattern does not match within the threshold, the user fails the test, and may be blocked from making the purchase, or may have to undergo the test again.
  • if the tracked pattern matches within the threshold, the user passes the test, and may be allowed to make the purchase, as sketched below.
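  • To make the test flow above concrete, a minimal sketch follows, assuming an 8x8 grid over the display, a SequenceMatcher similarity as the comparison, and a 0.8 pass threshold (all illustrative assumptions; the disclosure does not prescribe a particular data format or measure):

        # Convert tracked gaze samples to grid coordinates, then compare the
        # resulting cell sequence to the enrolled signature within a threshold.
        from difflib import SequenceMatcher

        GRID = 8  # divide the display into an 8x8 grid of squares

        def to_cells(samples, width, height):
            """Map (x, y) gaze samples to grid squares, dropping repeats."""
            cells = []
            for x, y in samples:
                cell = (int(x * GRID / width), int(y * GRID / height))
                if not cells or cells[-1] != cell:
                    cells.append(cell)
            return cells

        def passes_test(tracked_cells, enrolled_cells, threshold=0.8):
            return SequenceMatcher(None, tracked_cells, enrolled_cells).ratio() >= threshold

        enrolled = [(0, 0), (1, 1), (2, 2), (2, 3), (1, 4)]
        tracked = to_cells([(40, 30), (150, 140), (270, 190), (280, 260), (160, 340)],
                           width=800, height=600)
        print("pass" if passes_test(tracked, enrolled) else "fail")

    Unlike the coverage check sketched earlier, an ordered comparison of this kind also rejects patterns that hit the right squares in the wrong order.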
  • the AR system may periodically capture a picture of the AR user's eye, and perform an eye-identification by comparing the captured image of the user's eye with known information.
  • the AR device 62, 806 may request the user to stare at a particular virtual object presented to the user. This allows the user's eye to be still so that an image of the user's eye may be captured and compared. If the captured image of the eye correlates with a known picture of the user's eye, the AR user may be allowed to make the purchase.
  • a head mounted display (“HMD”) component features one or more cameras that are oriented to capture image information pertinent to the user's eyes.
  • each eye of the user may have a camera focused on it, along with three or more LEDs (in one embodiment directly below the eyes as shown) with known offset distances to the camera, to induce glints upon the surfaces of the eyes.
  • the system can deduce the curvature of the eye. With known 3D offset and orientation to the eye, the system can form exact (images) or abstract (gradients or other features) templates of the iris or retina for use in identifying the user. In other embodiments, other characteristics of the eye, such as the pattern of veins in and over the eye, may also be used (e.g., along with the iris or retinal templates) to identify the user.
  • an iris-image identification approach may be used.
  • the pattern of muscle fibers in the iris of an eye forms a stable unique pattern for each person, including freckles, furrows and rings.
  • Various iris features may be more readily captured using infrared or near-infrared imaging compared to visible light imaging.
  • the system can transform the captured iris features into an identification code in many different ways. The goal is to extract a sufficiently rich texture from the eye. With sufficient degrees of freedom in the collected data, the system can theoretically identify a unique user.
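  • As a hedged, non-limiting sketch of one classical encoding (binarization of iris texture compared by Hamming distance; the crude helper below and the 0.32 decision threshold follow common iris-recognition practice rather than anything specific to this disclosure):

        import numpy as np

        def iris_code(iris_texture):
            """Binarize a normalized (unrolled) iris texture into a bit code.
            A crude stand-in for Gabor-phase encoding: threshold local
            deviations from a smoothed version of the texture."""
            flat = iris_texture.ravel()
            smoothed = np.convolve(flat, np.ones(9) / 9, mode="same")
            return (flat > smoothed).astype(np.uint8)

        def hamming_distance(code_a, code_b):
            return float(np.mean(code_a != code_b))

        # Distances well below ~0.32 are conventionally treated as a
        # same-eye match in the iris-recognition literature.
        enrolled = iris_code(np.random.default_rng(0).random((64, 256)))
        probe = iris_code(np.random.default_rng(0).random((64, 256)))
        print("match" if hamming_distance(enrolled, probe) < 0.32 else "no match")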
  • the HMD comprises a diffraction display driven by a laser scanner steered by a steerable fiber optic cable.
  • This fiber optic cable can also be utilized to visualize the interior of the eye and image the retina, which has a unique pattern of visual receptors (rods and cones) and blood vessels. These rods and cones may also form a pattern unique to each individual, and can be used to uniquely identify each person.
  • the pattern of dark and light blood vessels of each person is unique and can be transformed into a "dark-light" code by standard techniques, such as applying gradient operators to the retinal image and counting high/low transitions in a standardized grid centered at the center of the retina.
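  • A minimal sketch of such a dark-light code follows, assuming a grayscale retinal image and a 16x16 standardized grid (the one-bit-per-cell encoding below is an illustrative simplification):

        import numpy as np

        def retinal_code(retina_img, grid=16):
            """Gradient magnitude, thresholded, then high/low transitions
            counted per grid cell and reduced to one bit each."""
            gy, gx = np.gradient(retina_img.astype(float))
            magnitude = np.hypot(gx, gy)
            edges = (magnitude > magnitude.mean()).astype(np.uint8)
            h, w = edges.shape
            code = []
            for i in range(grid):
                for j in range(grid):
                    cell = edges[i * h // grid:(i + 1) * h // grid,
                                 j * w // grid:(j + 1) * w // grid]
                    transitions = int(np.count_nonzero(np.diff(cell.ravel())))
                    code.append(transitions % 2)  # 1 illustrative bit per cell
            return np.array(code, dtype=np.uint8)

        code = retinal_code(np.random.default_rng(1).random((128, 128)))
        print(len(code), "bits")  # 256 bits for a 16x16 grid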
  • the subject systems may be utilized to identify the user with enhanced accuracy and precision by comparing user characteristics captured or detected by the system with known baseline user characteristics for an authorized user of the system.
  • a curvature/size of the eye may be similarly used. For example, this information may assist in identifying the user because eyes of different users are similar but not exactly the same.
  • temporal biometric information may be collected when the user is subjected to stress, and correlated to known data. For example, the user's heart rate, whether the user's eyes are producing a water film, whether the eyes verge and focus together, breathing patterns, blink rates, pulse rate, etc., may be monitored and used to confirm and/or invalidate the user's identity.
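  • One hypothetical way to fuse such temporal signals into a single identity confidence is a weighted deviation-from-baseline score; the signals, baselines, weights, and 0.8 threshold below are all illustrative assumptions:

        # Each signal contributes a weight scaled by how closely the observed
        # value matches the user's enrolled baseline (1.0 = perfect match).
        BASELINE = {"heart_rate": 62.0, "blink_rate": 17.0, "breath_rate": 14.0}
        WEIGHTS = {"heart_rate": 0.4, "blink_rate": 0.35, "breath_rate": 0.25}

        def identity_confidence(observed):
            score = 0.0
            for signal, weight in WEIGHTS.items():
                deviation = abs(observed[signal] - BASELINE[signal]) / BASELINE[signal]
                score += weight * max(0.0, 1.0 - deviation)
            return score

        observed = {"heart_rate": 64.0, "blink_rate": 18.5, "breath_rate": 13.0}
        print("confirmed" if identity_confidence(observed) > 0.8 else "re-test")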
  • the AR system may correlate information captured through the AR device (e.g., images of the surrounding environment captured through the field-of-view cameras of the AR device 62, 806) and determine whether the user is seeing the same scene that correlates to the location as derived from GPS and maps of the environment. For example, if the user is supposedly at home, the AR system may verify this by correlating known objects of the user's home with what is being seen through the user's field-of-view cameras.
  • the above-described AR/user identification system provides an extremely secure form of user identification.
  • the system may be utilized to determine who the user is with relatively high degrees of accuracy and precision. Since the system can be utilized to know who the user is with an unusually high degree of certainty, and on a persistent basis (e.g., using periodic monitoring), it can be utilized to enable various financial transactions without the need for separate logins.
  • One approach to ensure that the user identification system is highly accurate is through the use of neural networks, as is described in further detail in co-pending application 62/159,593 under Attorney Docket No. ML 30028.00.
  • Referring to Figs. 10A-10I, an example process flow 1000 of using biometric data for conducting transactions is illustrated. As shown in Fig. 10A, a user 1002 wearing an AR device 1004 walks into a store. While at the store, the user 1002 may see a pair of shoes 1006 he may be interested in purchasing.
  • Referring to Fig. 10B, an example view of the shoes, as seen by the user 1002 through the AR device 1004, is shown. Detecting that the user's gaze is focused on the pair of shoes 1006, the AR device 1004 may look up details about the pair of shoes 1006 (e.g., through a product catalog synched to the AR device 1004, etc.), and display the details as virtual content 1008. Referring now to Fig. 10C, the AR device 1004 may determine if the user wants to purchase the item by displaying virtual content 1010. The user 1002 may confirm or reject through any form of user input (e.g., gestures, voice, eye control, etc.).
  • the AR device 1004 may request the password through virtual content 1012.
  • the user 1002 may proceed to produce eye signature 1016.
  • a virtual grid 1014 may be presented to the user to aid in moving the eyes in a particular manner.
  • the inputted signature 1016 may be received by the AR system 1004, and compared to the predetermined signature to determine if the user is an authenticated user. If the user is authenticated, the AR device 1004, as shown in Fig. 10G, may transmit data regarding the desired product through a network 1018 to the AR server 1020, and on to the vendor 1024 and a financial institution 1022. Based on the confirmation received from the AR server 1020, the financial institution 1022 may transmit the appropriate monetary amount to the vendor 1024.
  • Referring to Fig. 10H, once the transaction has been confirmed, the AR device 1004 may display virtual content 1026 confirming purchase of the shoes 1006. Having received confirmation, the user 1002 may walk out of the store with the desired shoes 1006, as shown in Fig. 10I.
  • Figs. 10A-10I represent only an example embodiment that is presented here for illustrative purposes and should not be read as limiting. Numerous other embodiments may be similarly envisioned. For example, in one or more embodiments, rather than requesting a "password" (e.g., as in the figures above), the AR system may request the user to stare at a virtual dot on the screen, and capture an image of the user's eye (e.g., retina, iris, eye shape, etc.). This image may then be correlated to a known image of the user's eye, and the user's identity may be confirmed. Once the user's identity has been confirmed, the AR system may transmit information to the vendor 1024 and the financial institution 1022 as shown in Fig. 10G. Many other similar approaches may be used.
  • the subject system can pre-identify/pre-authenticate a user with a very high degree of certainty. Further, the system can maintain the identification of the user over time using periodic monitoring. Therefore, the identified user can have instant access to any site after a notice (that can be displayed as an overlaid user interface item to the user) about the terms of that site.
  • the system may create a set of standard terms predetermined by the user, so that the user instantly knows the conditions on that site. If a site does not adhere to this set of conditions (e.g., the standard terms), then the subject system may not automatically allow access or transactions therein.
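  • A minimal sketch of such a standard-terms gate follows, assuming terms are modeled as simple labels (the term names are hypothetical): access is granted automatically only when the site asks for nothing beyond the user's pre-approved conditions:

        # Auto-allow only if the site's terms are a subset of the user's
        # predetermined standard terms.
        USER_STANDARD_TERMS = {"no_data_resale", "micro_billing_capped", "opt_out_anytime"}

        def auto_allow(site_terms):
            return set(site_terms) <= USER_STANDARD_TERMS

        print(auto_allow({"no_data_resale", "opt_out_anytime"}))      # True
        print(auto_allow({"no_data_resale", "sell_aggregate_data"}))  # False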
  • the above-described AR/user identification systems can be used to facilitate "micro-transactions," which generate very small debits and credits to the user's financial account, typically on the order of a few cents or less than a cent.
  • the subject system may be configured to track not only that the user viewed or used some content, but also for how long (a quick browse might be free, but use beyond a certain amount of time would incur a charge).
  • a news article may cost 1/3 of a cent; a book may be charged at a penny a page; music at 10 cents a listen, and so on.
  • an advertiser may pay a user half a cent for selecting a banner ad or taking a survey.
  • the system may be configured to apportion a small percentage of the transaction fee to the service provider.
  • the system may be utilized to create a specific micro-transaction account, controllable by the user, in which funds related to micro-transactions are aggregated and distributed in predetermined meaningful amounts to/from the user's more traditional financial account (e.g., an online banking account).
  • the micro-transaction account may be cleared or funded at regular intervals (e.g., quarterly) or in response to certain triggers (e.g., when the user exceeds several dollars spent at a particular website).
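  • A minimal sketch of such an aggregating micro-transaction account follows; the $5.00 settlement trigger and the example prices echo the figures above, and the settle() side effect stands in for a debit against the user's traditional account:

        from dataclasses import dataclass, field

        @dataclass
        class MicroTransactionAccount:
            settle_threshold: float = 5.00   # dollars; illustrative trigger
            balance: float = 0.0             # pending micro-debits
            ledger: list = field(default_factory=list)

            def charge(self, description, amount):
                self.ledger.append((description, amount))
                self.balance += amount
                if self.balance >= self.settle_threshold:
                    self.settle()

            def settle(self):
                # In a real system this would debit the traditional account.
                print(f"settling ${self.balance:.2f} for {len(self.ledger)} items")
                self.balance, self.ledger = 0.0, []

        account = MicroTransactionAccount()
        account.charge("news article", 1 / 300)      # 1/3 of a cent
        account.charge("book, 42 pages", 0.01 * 42)  # a penny a page
        account.charge("song play", 0.10)            # 10 cents a listen
        print(f"pending: ${account.balance:.4f}")    # aggregated, not yet settled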
  • the subject system and functionality may be provided by a company focused on augmented reality; since the user's ID is very certainly and securely known, the user may be provided with instant access to their accounts, a 3D view of amounts, spending, rate of spending, and a graphical and/or geographical map of that spending. Such users may be allowed to instantly adjust spending access, including turning spending (e.g., micro-transactions) off and on.
  • the user may use the system to order perishable goods for delivery to their tracked location or to a user selected map location.
  • the system can also notify the user when deliveries arrive (e.g., by displaying video of a delivery being made in the AR system).
  • using AR telepresence, a user can be physically located in an office away from their house, but let a delivery person into their house, appear to the delivery person by avatar telepresence, watch the delivery person as they deliver the product, make sure the delivery person leaves, and then lock the door to their house by avatar.
  • the system may store user product preferences and alert the user to sales or other promotions related to the user's preferred products.
  • the user can see their account summary, all the statistics of their account and buying patterns, thereby facilitating comparison shopping before placing their order.
  • since the system may be utilized to track the eye, it can also enable "one glance" shopping. For instance, a user may look at an object (say, a robe in a hotel) and say, "I want that, when my account goes back over $3,000." The system would execute the purchase when the specified conditions (e.g., account balance greater than $3,000) are met, as sketched below.
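  • A hypothetical sketch of such a deferred, condition-gated purchase follows; the robe and the $3,000 balance condition come from the example above, while the price and data layout are assumptions:

        # Pending orders carry a predicate; each is executed once its
        # condition over the account becomes true.
        pending_orders = [
            {"item": "hotel robe", "price": 129.00,
             "condition": lambda account: account["balance"] > 3000.00},
        ]

        def process_pending(account):
            for order in list(pending_orders):
                if order["condition"](account):
                    account["balance"] -= order["price"]
                    pending_orders.remove(order)
                    print(f"purchased {order['item']} for ${order['price']:.2f}")

        account = {"balance": 2950.00}
        process_pending(account)      # condition not yet met; nothing happens
        account["balance"] += 200.00  # e.g., funds arrive
        process_pending(account)      # now executes the purchase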
  • iris and/or retinal signature data may be used to secure communications.
  • the subject system may be configured to allow text, images, and other content to be transmitted selectively to, and displayed only on, trusted secure hardware devices, which allow access only when the user can be authenticated based on one or more dynamically measured iris and/or retinal signatures.
  • because the AR system display device projects directly onto the user's retina, only the intended recipient (identified by iris and/or retinal signature) may be able to view the protected content; further, because the viewing device actively monitors the user's eye, the dynamically read iris and/or retinal signatures may be recorded as proof that the content was in fact presented to the user's eyes (e.g., as a form of digital receipt, possibly accompanied by a verification action such as executing a requested sequence of eye movements).
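  • A non-limiting sketch of such a digital receipt follows: an HMAC over the content hash, the dynamically read iris signature, and a timestamp. The per-device key and the field layout are assumptions for illustration:

        import hashlib, hmac, json, time

        DEVICE_KEY = b"per-device-secret-key"  # hypothetical provisioning secret

        def presentation_receipt(content, iris_signature):
            record = {
                "content_sha256": hashlib.sha256(content).hexdigest(),
                "iris_sig_sha256": hashlib.sha256(iris_signature).hexdigest(),
                "timestamp": int(time.time()),
            }
            payload = json.dumps(record, sort_keys=True).encode()
            record["mac"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
            return json.dumps(record)

        print(presentation_receipt(b"protected message", b"live-iris-code-bits"))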
  • Spoof detection may rule out attempts to use previous recordings of retinal images, static or 2D retinal images, generated images, etc., based on models of expected natural variation.
  • a unique fiducial/watermark may be generated and projected onto the retinas to generate a unique retinal signature for auditing.
  • the invention includes methods that may be performed using the subject devices.
  • the methods may comprise the act of providing such a suitable device. Such provision may be performed by the end user.
  • the "providing" act merely requires the end user obtain, access, approach, position, set-up, activate, power-up or otherwise act to provide the requisite device in the subject method.
  • Methods recited herein may be carried out in any order of the recited events which is logically possible, as well as in the recited order of events.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Finance (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Computer Graphics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)
  • Collating Specific Patterns (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Processing Or Creating Images (AREA)
PCT/US2016/032583 2015-05-14 2016-05-14 Augmented reality systems and methods for tracking biometric data WO2016183541A1 (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
EP16793671.5A EP3295347A4 (en) 2015-05-14 2016-05-14 Augmented reality systems and methods for tracking biometric data
CA2984455A CA2984455C (en) 2015-05-14 2016-05-14 Augmented reality systems and methods for tracking biometric data
JP2017558979A JP6863902B2 (ja) 2015-05-14 2016-05-14 バイオメトリックデータを追跡するための拡張現実システムおよび方法
CN201680027161.7A CN107533600A (zh) 2015-05-14 2016-05-14 用于跟踪生物特征数据的增强现实系统和方法
AU2016262579A AU2016262579B2 (en) 2015-05-14 2016-05-14 Augmented reality systems and methods for tracking biometric data
KR1020177035995A KR102393271B1 (ko) 2015-05-14 2016-05-14 생체 인증 데이터를 추적하기 위한 증강 현실 시스템들 및 방법들
NZ736861A NZ736861B2 (en) 2015-05-14 2016-05-14 Augmented reality systems and methods for tracking biometric data
IL255325A IL255325B (en) 2015-05-14 2017-10-30 Augmented reality systems and biometric data tracking methods

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562161588P 2015-05-14 2015-05-14
US62/161,588 2015-05-14

Publications (1)

Publication Number Publication Date
WO2016183541A1 true WO2016183541A1 (en) 2016-11-17

Family

ID=57249401

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/032583 WO2016183541A1 (en) 2015-05-14 2016-05-14 Augmented reality systems and methods for tracking biometric data

Country Status (8)

Country Link
EP (1) EP3295347A4 (ko)
JP (2) JP6863902B2 (ko)
KR (1) KR102393271B1 (ko)
CN (1) CN107533600A (ko)
AU (1) AU2016262579B2 (ko)
CA (1) CA2984455C (ko)
IL (1) IL255325B (ko)
WO (1) WO2016183541A1 (ko)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018102246A1 (en) 2016-11-29 2018-06-07 Alibaba Group Holding Limited Service control and user identity authentication based on virtual reality
WO2018102245A1 (en) * 2016-11-29 2018-06-07 Alibaba Group Holding Limited Virtual reality device using eye physiological characteristics for user identity authentication
CN108334185A (zh) * 2017-01-20 2018-07-27 深圳纬目信息技术有限公司 一种头戴显示设备的眼动数据反馈系统
WO2019031531A1 (ja) * 2017-08-10 2019-02-14 日本電気株式会社 情報取得システム、情報取得方法及び記憶媒体
KR20190113880A (ko) * 2017-02-23 2019-10-08 알리바바 그룹 홀딩 리미티드 가상현실 장면-기반 비즈니스 검증 방법 및 디바이스
US10846388B2 (en) 2017-03-15 2020-11-24 Advanced New Technologies Co., Ltd. Virtual reality environment-based identity authentication method and apparatus
US11050752B2 (en) 2018-06-07 2021-06-29 Ebay Inc. Virtual reality authentication

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3772699A1 (en) * 2019-08-09 2021-02-10 Siemens Aktiengesellschaft Method for user verification, communication device and computer program
KR102628102B1 (ko) * 2019-08-16 2024-01-23 엘지전자 주식회사 Xr 디바이스 및 그 제어 방법
CN111104927B (zh) * 2019-12-31 2024-03-22 维沃移动通信有限公司 一种目标人物的信息获取方法及电子设备
JP6839324B1 (ja) * 2020-08-06 2021-03-03 株式会社キューブシステム 入力システム、入力プログラムおよび入力方法
US12008605B2 (en) 2022-02-16 2024-06-11 International Business Machines Corporation Peer-to-peer donation management


Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003533801A (ja) * 2000-05-16 2003-11-11 スイスコム・モバイル・アクチエンゲゼルシヤフト 生物測定学的な身元確認方法及び認証方法
JP4765575B2 (ja) * 2005-11-18 2011-09-07 富士通株式会社 個人認証方法、個人認証プログラムおよび個人認証装置
JP5375481B2 (ja) * 2009-09-24 2013-12-25 ブラザー工業株式会社 ヘッドマウントディスプレイ
WO2011106798A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US8482859B2 (en) * 2010-02-28 2013-07-09 Osterhout Group, Inc. See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film
JP5548042B2 (ja) * 2010-06-23 2014-07-16 ソフトバンクモバイル株式会社 ユーザ端末装置及びショッピングシステム
US8988350B2 (en) * 2011-08-20 2015-03-24 Buckyball Mobile, Inc Method and system of user authentication with bioresponse data
US10223710B2 (en) * 2013-01-04 2019-03-05 Visa International Service Association Wearable intelligent vision device apparatuses, methods and systems
HK1160574A2 (en) * 2012-04-13 2012-07-13 King Hei Francis Kwong Secure electronic payment system and process
US8953850B2 (en) * 2012-08-15 2015-02-10 International Business Machines Corporation Ocular biometric authentication with system verification
US9164580B2 (en) * 2012-08-24 2015-10-20 Microsoft Technology Licensing, Llc Calibration of eye tracking system
JP2014092940A (ja) * 2012-11-02 2014-05-19 Sony Corp 画像表示装置及び画像表示方法、並びにコンピューター・プログラム
WO2015112108A1 (en) * 2012-11-28 2015-07-30 Visa International Service Association Multi disparate gesture actions and transactions apparatuses, methods and systems
US9979547B2 (en) * 2013-05-08 2018-05-22 Google Llc Password management
US9336781B2 (en) * 2013-10-17 2016-05-10 Sri International Content-aware speaker recognition

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090140839A1 (en) * 2001-07-10 2009-06-04 American Express Travel Related Services Company, Inc. Systems and methods for non-traditional payment using biometric data
US20120233072A1 (en) * 2011-03-08 2012-09-13 Bank Of America Corporation Conducting financial transactions based on identification of individuals in an augmented reality environment
US20130030966A1 (en) * 2011-07-28 2013-01-31 American Express Travel Related Services Company, Inc. Systems and methods for generating and using a digital pass
US20130267204A1 (en) * 2012-02-28 2013-10-10 Verizon Patent And Licensing Inc. Method and system for multi-factor biometric authentication based on different device capture modalities

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3295347A4 *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7065867B2 (ja) 2016-11-29 2022-05-12 アドバンスド ニュー テクノロジーズ カンパニー リミテッド ユーザ識別認証のために目の生理的特性を使用する仮想現実デバイス
WO2018102245A1 (en) * 2016-11-29 2018-06-07 Alibaba Group Holding Limited Virtual reality device using eye physiological characteristics for user identity authentication
US11783632B2 (en) 2016-11-29 2023-10-10 Advanced New Technologies Co., Ltd. Service control and user identity authentication based on virtual reality
WO2018102246A1 (en) 2016-11-29 2018-06-07 Alibaba Group Holding Limited Service control and user identity authentication based on virtual reality
US11348369B2 (en) 2016-11-29 2022-05-31 Advanced New Technologies Co., Ltd. Service control and user identity authentication based on virtual reality
EP3549126A4 (en) * 2016-11-29 2019-11-27 Alibaba Group Holding Limited ORDERING A SERVICE AND AUTHENTICATING THE IDENTITY OF A USER BASED ON A VIRTUAL REALITY
JP2020514897A (ja) * 2016-11-29 2020-05-21 アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited 仮想現実に基づいたサービス制御およびユーザ識別認証
JP2020515949A (ja) * 2016-11-29 2020-05-28 アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited ユーザ識別認証のために目の生理的特性を使用する仮想現実デバイス
CN108334185A (zh) * 2017-01-20 2018-07-27 深圳纬目信息技术有限公司 一种头戴显示设备的眼动数据反馈系统
KR20190113880A (ko) * 2017-02-23 2019-10-08 알리바바 그룹 홀딩 리미티드 가상현실 장면-기반 비즈니스 검증 방법 및 디바이스
KR102298793B1 (ko) * 2017-02-23 2021-09-07 어드밴스드 뉴 테크놀로지스 씨오., 엘티디. 가상현실 장면-기반 비즈니스 검증 방법 및 디바이스
US11170087B2 (en) 2017-02-23 2021-11-09 Advanced New Technologies Co., Ltd. Virtual reality scene-based business verification method and device
JP2020515945A (ja) * 2017-02-23 2020-05-28 アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited 仮想現実シーンベースのビジネス検証方法およびデバイス
EP3567535A4 (en) * 2017-02-23 2019-11-13 Alibaba Group Holding Limited BUSINESS VERIFICATION METHOD AND DEVICE BASED ON VIRTUAL REALITY SCENE
US10846388B2 (en) 2017-03-15 2020-11-24 Advanced New Technologies Co., Ltd. Virtual reality environment-based identity authentication method and apparatus
JPWO2019031531A1 (ja) * 2017-08-10 2020-05-28 日本電気株式会社 情報取得システム、情報取得方法及び記憶媒体
WO2019031531A1 (ja) * 2017-08-10 2019-02-14 日本電気株式会社 情報取得システム、情報取得方法及び記憶媒体
US11050752B2 (en) 2018-06-07 2021-06-29 Ebay Inc. Virtual reality authentication
US11736491B2 (en) 2018-06-07 2023-08-22 Ebay Inc. Virtual reality authentication

Also Published As

Publication number Publication date
IL255325B (en) 2021-04-29
KR102393271B1 (ko) 2022-04-29
JP7106706B2 (ja) 2022-07-26
NZ736861A (en) 2021-06-25
CA2984455A1 (en) 2016-11-17
AU2016262579B2 (en) 2021-06-03
JP2018526701A (ja) 2018-09-13
CA2984455C (en) 2022-02-08
JP6863902B2 (ja) 2021-04-21
CN107533600A (zh) 2018-01-02
IL255325A0 (en) 2017-12-31
AU2016262579A1 (en) 2017-11-23
JP2021121923A (ja) 2021-08-26
EP3295347A1 (en) 2018-03-21
EP3295347A4 (en) 2018-05-02
KR20180008632A (ko) 2018-01-24

Similar Documents

Publication Publication Date Title
US20160358181A1 (en) Augmented reality systems and methods for tracking biometric data
AU2016262579B2 (en) Augmented reality systems and methods for tracking biometric data
US11216965B2 (en) Devices, methods and systems for biometric user recognition utilizing neural networks
CN109154983B (zh) 被配置为交换生物测定信息的头戴式显示系统
JP2017527036A (ja) セキュアなモバイル通信で眼信号を用いるためのシステムおよび方法
WO2023164268A1 (en) Devices, methods, and graphical user interfaces for authorizing a secure operation
NZ736861B2 (en) Augmented reality systems and methods for tracking biometric data
US20230273985A1 (en) Devices, methods, and graphical user interfaces for authorizing a secure operation
NZ736574B2 (en) Methods for biometric user recognition

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16793671

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2984455

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 255325

Country of ref document: IL

ENP Entry into the national phase

Ref document number: 2017558979

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2016262579

Country of ref document: AU

Date of ref document: 20160514

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20177035995

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2016793671

Country of ref document: EP