CN117592027A - Techniques for providing user authentication for near-eye display devices


Info

Publication number
CN117592027A
Authority
CN
China
Prior art keywords
user
display device
eye display
biometric information
authentication
Prior art date
Legal status
Pending
Application number
CN202311033865.3A
Other languages
Chinese (zh)
Inventor
欣德·霍贝卡
王楠
布鲁斯·王
陈忠明
大卫·陶
Current Assignee
Meta Platforms Technologies LLC
Original Assignee
Meta Platforms Technologies LLC
Priority date
Filing date
Publication date
Priority claimed from U.S. patent application No. 18/113,919 (published as US20240061918A1)
Application filed by Meta Platforms Technologies LLC filed Critical Meta Platforms Technologies LLC
Publication of CN117592027A
Legal status: Pending

Classifications

    • G06F 21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints (G PHYSICS > G06 COMPUTING > G06F ELECTRIC DIGITAL DATA PROCESSING > G06F 21/00 Security arrangements > G06F 21/30 Authentication > G06F 21/31 User authentication)
    • G02B 27/017: Head-up displays, head mounted (G PHYSICS > G02 OPTICS > G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS > G02B 27/00 Optical systems or apparatus > G02B 27/01 Head-up displays)
    • G06F 1/163: Wearable computers, e.g. on a belt (G06F 1/16 Constructional details or arrangements > G06F 1/1613 for portable computers)
    • G06F 1/1684: Constructional details or arrangements related to integrated I/O peripherals
    • G06F 1/1686: Constructional details or arrangements in which the integrated I/O peripheral is a camera
    • G02B 2027/0178: Eyeglass type

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Collating Specific Patterns (AREA)

Abstract

Authentication techniques based on user biometric information are provided for a wearer of a near-eye display device who engages with multimedia content and/or accesses restricted data. Biometric information and/or behavioral biometric information associated with the user may be captured by sensors and similar devices that are integrated into, or communicatively coupled to, the near-eye display device. The biometric information may include data associated with a user's face, fingerprint, palm print, iris, retina, electrocardiogram, etc. The behavioral biometric information may include data associated with a user's movement, gait, one or more gestures, voice, and the like. An authentication technique may be selected automatically based on environmental conditions (e.g., noise level or light level). The near-eye display device may also detect continuous wear by the user after a period of non-use or between two different authentication sessions and refresh or continue the authentication.

Description

Techniques for providing user authentication for near-eye display devices
Cross Reference to Related Applications
This patent application claims priority from U.S. provisional patent application Ser. No. 63/398,415, entitled "TECHNIQUES TO PROVIDE USER AUTHENTICATION FOR A NEAR-EYE DISPLAY DEVICE," filed August 16, 2022.
Technical Field
The present application relates generally to near-eye display devices having augmented reality (AR)/virtual reality (VR) functionality, and in particular to techniques for authenticating a user of a near-eye display device for secure content delivery, data exchange, and/or communication via the near-eye display device.
Background
In recent years, advances in technology have greatly increased the popularity and reach of content creation and delivery. In particular, interactive content such as virtual reality (VR) content, augmented reality (AR) content, mixed reality (MR) content, and content within and associated with real and/or virtual environments (e.g., the "metaverse") has become attractive to consumers.
In addition to content delivery, near-eye display devices (e.g., head-mounted displays (HMDs)) may be used to facilitate audio/video communications, access to network-stored data (e.g., viewing files), or similar data exchange sessions, much like portable computing devices.
Disclosure of Invention
In some examples of the present disclosure, a wearer of a near-eye display device may be provided with various authentication techniques based on user biometric information. The near-eye display device may be used to engage with multimedia content (provided via streaming or similar delivery technologies) and/or to access restricted data. For example, restricted-access content (e.g., images, streaming video, and the like) may be presented to the user; users may participate in communication sessions (e.g., conferences), multi-party games, and similar data exchanges through near-eye display devices, and may access file directories and data stores, and view documents, banking information, and the like. For any of these and similar functions, the user (i.e., the current wearer of the near-eye display device) may be authenticated for access to particular functions.
Drawings
Features of the present disclosure are illustrated by way of example and not limited by the following figures, in which like references indicate similar elements. Those skilled in the art will readily recognize from the following that alternative examples of the structures and methods illustrated in the accompanying drawings may be employed without departing from the principles described herein.
Fig. 1 illustrates a perspective view of a near-eye display device in the form of a pair of Augmented Reality (AR) glasses according to an example.
Fig. 2A illustrates a perspective view of a near-eye display device that may be used to authenticate a user through fingerprint detection, according to an example.
Fig. 2B illustrates a perspective view of a near-eye display device that may be used to authenticate a user through gesture detection, according to an example.
Fig. 2C illustrates a perspective view of a near-eye display device that may be used to authenticate a user through speech recognition, according to an example.
Fig. 2D illustrates a perspective view of a near-eye display device that may be used to authenticate a user through iris scanning, according to an example.
Fig. 2E illustrates a perspective view of a near-eye display device that may be used to authenticate a user through electrocardiographic signal detection according to an example.
Fig. 2F illustrates a perspective view of a near-eye display device that may be used to authenticate a user through palm print detection, according to an example.
Fig. 2G illustrates a perspective view of a near-eye display device that may be used to authenticate a user through password entry, according to an example.
Fig. 2H illustrates a perspective view of a near-eye display device having one or more sensors to detect continuous wear of the near-eye display device, according to an example.
Fig. 3 illustrates a flow chart of a method for authenticating a user of a near-eye display device for secure data exchange or communication via the near-eye display device according to an example.
Detailed Description
For purposes of simplicity and illustration, the present application is described by referring primarily to examples of the present application. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. It will be apparent, however, that the present application may be practiced without limitation to these specific details. In other instances, some methods and structures that would be readily understood by one of ordinary skill have not been described in detail so as not to unnecessarily obscure the present application. As used herein, the terms "a" and "an" are intended to mean at least one of the specified elements, the term "comprising" means including but not limited to, and the term "based on" means based at least in part on.
As used herein, a "near-eye display device" may refer to any display device (e.g., an optical device) that may be in close proximity to the eyes of a user. As used herein, an "artificial reality" may refer to, among other things, aspects of the "metaverse" or an environment of real and virtual elements, and may include the use of technologies associated with virtual reality (VR), augmented reality (AR), and/or mixed reality (MR). As used herein, a "user" may refer to a user or wearer of a "near-eye display device."
Near-eye display devices may provide digital content (real-life and/or artificial images and video) from various sources (e.g., local data storage, networked streaming sources, etc.). In addition, near-eye display devices may also be used for communications such as video conferencing, multi-party gaming, multi-party video viewing, and the like. Without any security measures, any wearer of the near-eye display device may access content or communication sessions available through the near-eye display device.
In some examples of the present disclosure, a wearer of a near-eye display device may be provided with various authentication techniques based on user biometric information. The near-eye display device may be used to engage with multimedia content (provided via streaming or similar delivery technologies) and/or to access restricted data. For example, restricted-access content (e.g., images, streaming video, and the like) may be presented to the user; users may participate in communication sessions (e.g., conferences), multi-party games, and similar data exchanges through near-eye display devices, and may access file directories and data stores, and view documents, banking information, and the like. For any of these and similar functions, the user (i.e., the current wearer of the near-eye display device) may be authenticated for access to particular functions.
Example authentication techniques may include, but are not limited to, capturing or obtaining biological biometric information and/or behavioral biometric information associated with the user by sensors and similar devices that are integrated into, or communicatively coupled to, the near-eye display device. The biometric information may include data associated with a user's face, fingerprint, palm print, iris, retina, electrocardiogram, etc. Behavioral biometric information may include data associated with a user's movement, gait, one or more gestures, voice, and the like. In some examples, the near-eye display device may offer a selection among a variety of authentication techniques, or may select one automatically based on, for example, environmental conditions (e.g., noise level or light level). In other examples, the near-eye display device may detect continuous wear by the user after a period of non-use or between two different authentication sessions and refresh or continue the authentication. In some examples, a camera on the device may capture a code (e.g., a quick-response (QR) code) displayed on another device (mobile phone, computer, etc.) that is capable of performing authentication in other ways.
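As a concrete illustration of the automatic selection described above, the following sketch chooses among three of the techniques discussed herein based on ambient light and noise level. The function name, thresholds, and technique labels are illustrative assumptions, not details from this disclosure.

```python
# Hypothetical sketch of automatic selection among authentication techniques
# based on environmental conditions. Thresholds are illustrative assumptions.

def select_auth_technique(ambient_light_lux: float, noise_level_db: float) -> str:
    """Pick an authentication technique suited to the environment.

    ambient_light_lux: illuminance in lux; noise_level_db: ambient noise level.
    """
    if noise_level_db < 60.0:
        # Quiet enough for reliable voice capture.
        return "voice"
    if ambient_light_lux > 50.0:
        # Enough light for the external camera to image a fingerprint or palm.
        return "fingerprint"
    # Dark and noisy: fall back to an infrared iris scan, which works in the dark.
    return "iris"
```

In a real device the same decision could also weigh user preference and which sensors are currently reachable (e.g., whether a wrist-worn detector is paired).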
While some advantages and benefits of the present disclosure are apparent, other advantages and benefits may include providing data and user identity security, ease of authentication (without requiring a separate computing device), transitioning between two different authentication sessions, and the like.
Fig. 1 is a perspective view of a near-eye display device 102 in the form of a pair of glasses (or other similar eyewear), according to an example. In some examples, the near-eye display device 102 may be configured to operate as a Virtual Reality (VR) display, an Augmented Reality (AR) display, and/or a Mixed Reality (MR) display.
As shown in schematic 100, near-eye display device 102 may include a frame 105 and a display 110. In some examples, the display 110 may be configured to present media content or other content to a user. In some examples, display 110 may include display electronics and/or display optics. For example, the display 110 may include a liquid crystal display (LCD) panel, a light-emitting diode (LED) display panel, or an optical display panel (e.g., a waveguide display assembly). In some examples, display 110 may also include any number of optical components, such as waveguides, gratings, lenses, mirrors, and the like. In other examples, display 110 may include a projector, or, in lieu of display 110, near-eye display device 102 may include a projector. The projector may use a laser to form an image in the angular domain on an eyebox for direct viewing by a viewer's eye, and may include a vertical-cavity surface-emitting laser (VCSEL) that emits light at an off-normal angle and is integrated with a photonic integrated circuit (PIC) to achieve high efficiency and reduced power consumption.
In some examples, the near-eye display device 102 may also include various sensors 112A, 112B, 112C, 112D, and 112E on the frame 105 or within the frame 105. In some examples, as shown, the various sensors 112A-112E may include any number of depth sensors, motion sensors, position sensors, inertial sensors, and/or ambient light sensors. In some examples, the various sensors 112A-112E may include any number of image sensors configured to generate image data representing different fields of view in one or more different directions. In some examples, the various sensors 112A-112E may be used as input devices to control or affect the display content of the near-eye display device, and/or to provide an interactive Virtual Reality (VR), Augmented Reality (AR), and/or Mixed Reality (MR) experience to a user of the near-eye display device 102. In some examples, the various sensors 112A-112E may also be used for stereoscopic imaging or other similar applications.
In some examples, the near-eye display device 102 may also include one or more illuminators 108 to project light into the physical environment. The projected light may be associated with different frequency bands (e.g., visible light, infrared light, ultraviolet light, etc.), and may be used for various purposes. In some examples, one or more illuminators 108 may be used as locators.
In some examples, the near-eye display device 102 may also include a camera 104 or other image capturing unit. For example, the camera 104 may capture an image of a physical environment in the field of view. In some instances, the captured image may be processed, for example, by a virtual reality engine, to add virtual objects to the captured image or to modify physical objects in the captured image, and the processed image may be displayed to a user by display 110 for an Augmented Reality (AR) application and/or a Mixed Reality (MR) application.
In some examples, the near-eye display device 102 may include a local controller (also referred to as a controller) 115. Local controller 115 may manage the operation of electronic components and circuitry (e.g., display 110, camera 104, illuminator 108, and/or sensors 112A-112E) on near-eye display device 102. For example, the controller 115 may perform some or all of the various authentication operations associated with the various authentication techniques discussed herein. In other examples, the operation of the electronic components and circuitry on the near-eye display device 102 may be managed by a remote controller (not shown) that is communicatively coupled to the near-eye display device 102. In a further example, the remote controller and the local controller 115 may operate together, each performing some of a plurality of processes and tasks. The remote controller and the local controller 115 may be implemented as separate processors or distributed processors, graphics processing units (GPUs), digital signal processing (DSP) units, or similar circuits.
The various functions described herein may be distributed among the various components of the near-eye display device 102 in a different manner than described herein. Further, the near-eye display devices discussed herein may be implemented using more or fewer components than are shown in fig. 1. Although the near-eye display device 102 is shown and described in the form of eyeglasses, user authentication may be implemented in other forms of near-eye display devices or head-mounted display (HMD) devices (e.g., goggles, headsets, and the like). Further, user authentication may be implemented in other types of wearable devices (e.g., smart watches, smart wristbands, non-display wearable devices (e.g., game controllers), and similar wearable devices).
Fig. 2A illustrates a perspective view of a near-eye display device 202 according to an example; the near-eye display device 202 may be used to authenticate a user through fingerprint detection. Near-eye display device 202 may include some or all of the components of near-eye display device 102 in fig. 1 for performing functions associated with presenting content to a user, such as Augmented Reality (AR) content, Virtual Reality (VR) content, and/or other content. As mentioned herein, a user (the current wearer of near-eye display device 202) may be authenticated for secure communications and/or data exchange sessions. In one example implementation of the authentication function, an externally facing camera 214 on the frame 205 may be used to capture a user's fingerprint 218. The controller (not shown) may then compare the captured fingerprint 218 with stored fingerprints of the user and authenticate the user if the fingerprints match.
In some examples, fingerprint authentication may alternatively be performed by a fingerprint sensor 212 located on a temple 206 of the near-eye display device 202. The fingerprint sensor 212 may be located on a side surface, top surface, and/or bottom surface of the temple 206, among other locations. The fingerprint sensor 212 may be, for example, a high-density capacitive sensor that detects a fingerprint as the user slides their finger 216 over, or places their finger 216 on, the fingerprint sensor 212. The display 210 may be used to provide instructions to the user, for example, telling the user to place their finger in front of the camera 214 or slide their finger over the fingerprint sensor 212. If the fingerprint is not successfully detected, the user may be asked, via the display and/or an audio prompt, to repeat the attempt.
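The capture-compare-authenticate flow described above for fingerprints can be sketched as follows. Purely as an assumption, a fingerprint template is modeled here as a set of (x, y, angle) minutiae tuples; real matchers working from camera or capacitive-sensor images are far more sophisticated, and the tolerance and threshold values are illustrative.

```python
# Minimal sketch of fingerprint template matching: the score is the fraction
# of enrolled minutiae that have a close counterpart in the captured template.
# Minutiae representation, tolerances, and threshold are all illustrative.

def match_score(enrolled, captured, tol=2.0):
    """Fraction of enrolled (x, y, angle) minutiae matched within tolerance."""
    matched = 0
    for (x, y, a) in enrolled:
        if any(abs(x - cx) <= tol and abs(y - cy) <= tol and abs(a - ca) <= tol
               for (cx, cy, ca) in captured):
            matched += 1
    return matched / len(enrolled)

def authenticate_fingerprint(enrolled, captured, threshold=0.8):
    """Authenticate only if enough enrolled minutiae are found in the capture."""
    return match_score(enrolled, captured) >= threshold
```

A deployed matcher would also normalize for rotation and translation of the finger before comparing; that step is omitted here for brevity.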
Based on user preferences, any images (e.g., fingerprints) or information captured by the sensor, as well as authentication results, may or may not be stored at the near-eye display device 202 and/or any other external system (e.g., authentication server, content server, etc.). For personal information protection and privacy purposes, the default setting may be to not store information, which may be modified by the user at any time.
Fig. 2B illustrates a perspective view of a near-eye display device that may be used to authenticate a user through gesture detection, according to an example. As with the examples above, the near-eye display device 202 may include some or all of the components of the near-eye display device 102 in fig. 1 for performing functions associated with presenting content to a user, such as Augmented Reality (AR) content, Virtual Reality (VR) content, and/or other content. In another example implementation of the authentication function, a camera 214 or similar image capturing device on the frame 205, or a separate detector 224, may be used to capture a sequence of finger gestures of the user. A controller (not shown) may then analyze the captured gesture sequence and authenticate the user based on gesture recognition.
In some examples, the user may customize a particular finger gesture, e.g., a sequence of different finger positions, a combination of which may be used to authenticate the user. The gesture may be detected by the camera 214 or by the detector 224. The detector 224 may be a wrist-worn device capable of detecting muscle electrical signals and interpreting those signals as corresponding to a particular finger position or combination of finger positions. The detector 224 may be communicatively coupled to the near-eye display device 202 via wireless means, such as wireless local area network (WLAN) communication, short-range communication (e.g., Bluetooth), or near field communication (NFC).
In some examples, the inputs from camera 214 and detector 224 may be combined for stronger authentication or more reliable interpretation results. Illustrative examples of gesture combinations may include, but are not limited to: two index finger strokes and then two middle finger strokes, or up and then left. Display 210 may present instructions to a user. For example, the display 210 may instruct the user to place their hand in front of the camera 214 or activate the detector 224. Instructions may also be provided via audio cues. The detector 224 may be a smart watch, a game glove, or a similar wrist-worn or hand-worn device. In other examples, the near-eye display device may include one or more motion detectors to detect the user's gait, which may be used to authenticate the user.
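The sequence-matching idea above, including the fusion of camera and detector inputs for a more reliable interpretation, can be sketched as follows. The gesture labels and the simple "both sources must agree" fusion rule are illustrative assumptions, not details from this disclosure.

```python
# Sketch of sequence-based gesture authentication. The user enrolls an ordered
# sequence of gesture labels; a later capture authenticates only if the fused
# interpretation reproduces that sequence exactly.

def fuse_interpretations(camera_seq, detector_seq):
    """Combine the two sources; return None when they disagree."""
    return camera_seq if camera_seq == detector_seq else None

def authenticate_gesture(enrolled_seq, camera_seq, detector_seq):
    """Authenticate when both sources agree and match the enrolled sequence."""
    fused = fuse_interpretations(camera_seq, detector_seq)
    return fused is not None and fused == enrolled_seq
```

Requiring agreement between the camera and the muscle-signal detector trades some availability (a single noisy source blocks authentication) for a lower false-accept rate, which matches the "stronger authentication" goal stated above.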
Based on user preferences, any images (e.g., gestures) or information captured by the sensors (e.g., muscle electrical signals), as well as authentication results, may or may not be stored at the near-eye display device 202 and/or any other external system (e.g., authentication server, content server, etc.). For personal information protection and privacy purposes, the default setting may be to not store information, which may be modified by the user at any time.
Fig. 2C illustrates a perspective view of a near-eye display device that may be used to authenticate a user through speech recognition, according to an example. As with the examples above, the near-eye display device 202 may include some or all of the components of the near-eye display device 102 in fig. 1 for performing functions associated with presenting content to a user, such as Augmented Reality (AR) content, Virtual Reality (VR) content, and/or other content. In another example implementation of the authentication function, a microphone 232 or similar sound capture device on the frame 205 may be used to capture the user's voice. The controller (not shown) may then compare the captured speech to stored speech of the user, or use other speech recognition techniques, and authenticate the user based on speech recognition.
In some examples, display 210 may present instructions to a user. For example, the display 210 may present some predetermined words and instruct the user to speak the words aloud for speech recognition. Instructions may also be provided via audio cues. In other examples, any voice or utterance of the user may be used to detect a voice pattern and authenticate the user. Voice biometric techniques recognize a particular speaker based on the distinguishing features of each person's speech, which are determined by each person's anatomy and behavioral speech patterns. The shape and size of each person's mouth and throat, as well as each person's language, pitch, and manner of speaking (e.g., speaking quickly or slowly), shape that person's voice and can be used as recognition features. A speech recognition engine (e.g., executed by the local controller or a remote controller) may map the unique features of the user and then use the mapping for later recognition.
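One common way to realize the map-then-recognize step described above is to compare feature vectors ("voiceprints") with a similarity measure; the sketch below uses cosine similarity. The feature vectors and threshold are illustrative assumptions; a real engine would derive the vectors from pitch, timbre, and speaking-rate features of the captured audio.

```python
# Sketch of speaker verification: compare a voiceprint extracted from captured
# speech against the user's enrolled voiceprint using cosine similarity.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def verify_speaker(enrolled_vec, captured_vec, threshold=0.9):
    """Accept the speaker when the voiceprints are sufficiently similar."""
    return cosine_similarity(enrolled_vec, captured_vec) >= threshold
```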
Based on user preferences, any data (e.g., captured speech) or biometric information identified therefrom, as well as authentication results, may or may not be stored at the near-eye display device 202 and/or any other external system (e.g., authentication server, content server, etc.). For personal information protection and privacy purposes, the default setting may be to not store information, which may be modified by the user at any time.
Fig. 2D illustrates a perspective view of a near-eye display device that may be used to authenticate a user through iris scanning, according to an example. As with the examples above, the near-eye display device 202 may include some or all of the components of the near-eye display device 102 in fig. 1 for performing functions associated with presenting content to a user, such as Augmented Reality (AR) content, Virtual Reality (VR) content, and/or other content. In another example implementation of the authentication function, an inward-facing camera 236 and/or an inward-facing camera 238, or one or more similar image capturing devices on the frame 205, may be used to capture images of one or both irises of the user. The controller (not shown) may then compare the captured image, or the biometric information identified from it, with stored information associated with the user and authenticate the user based on iris recognition.
The iris is the colored, annular portion of the eye behind the cornea and surrounding the pupil. The iris pattern of a person is unique and remains unchanged throughout life. Furthermore, the iris is well protected from damage due to coverage by the cornea, making the iris a suitable body part for biometric authentication. Since the iris is different between the left eye and the right eye, recognition can be performed by each eye separately, allowing selection between quick authentication and strong authentication.
In some examples, the inward-facing camera 236 and/or the inward-facing camera 238 may be eye tracking cameras that track pupil movement of the user's eyes for purposes of Augmented Reality (AR) or Virtual Reality (VR) content presentation, and may also be used to capture one or more iris images. In other examples, the inward-facing camera 236 and/or the inward-facing camera 238 may be dedicated iris scan cameras. In still other examples, the inward-facing camera 236 and/or the inward-facing camera 238 may be infrared cameras capable of iris recognition even in the dark.
In some examples, the user may also be authenticated using the eye tracking features of an Augmented Reality (AR) or Virtual Reality (VR) near-eye display device. For example, the user may know, or be instructed by visual instructions displayed on the display 210 (or via audio cues), to move their pupil according to a predetermined pattern (e.g., up, left, down, right, or any other combination). While pupil movement combinations may not provide authentication as strong as iris scanning or some of the other techniques discussed herein, they may provide a fast form of authentication and may be used in conjunction with another fast authentication technique to provide stronger authentication.
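The pupil-movement pattern check and its combination with a second quick factor can be sketched as follows. The direction labels and the AND-combination rule are illustrative assumptions, not details from this disclosure.

```python
# Sketch of the pupil-movement pattern check: the eye-tracking camera reports
# a sequence of gaze directions, which must match the user's preset pattern.
# Because this factor alone is weak, it is combined with a second quick factor.

def check_pupil_pattern(preset, observed):
    """True when the observed gaze sequence exactly matches the preset one."""
    return preset == observed

def quick_two_factor(pupil_ok: bool, other_factor_ok: bool) -> bool:
    """Both quick factors must pass for the stronger combined authentication."""
    return pupil_ok and other_factor_ok
```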
In some examples, the user's retina may be scanned using an inwardly facing camera 236, an inwardly facing camera 238, and/or a separate camera. Biometric information may be identified in the captured image of the retina and used to authenticate the user. In an example embodiment, one or more of the plurality of illuminators may be used to provide backlight during scanning.
Based on user preferences, any images (e.g., iris or retina) or sensor captured information, as well as authentication results, may or may not be stored at the near-eye display device 202 and/or any other external system (e.g., authentication server, content server, etc.). For personal information protection and privacy purposes, the default setting may be to not store information, which may be modified by the user at any time.
Fig. 2E illustrates a perspective view of a near-eye display device that may be used to authenticate a user through electrocardiographic signal detection, according to an example. As with the examples above, the near-eye display device 202 may include some or all of the components of the near-eye display device 102 in fig. 1 for performing functions associated with presenting content to a user, such as Augmented Reality (AR) content, Virtual Reality (VR) content, and/or other content. In another example implementation of the authentication function, a plurality of electrodes 246 on the frame 205 and/or on the temple 206 may be used to capture an electrocardiogram (ECG) signal of the user. The plurality of electrodes 246 may also be located at other locations that the user's fingers touch, such as the top, bottom, and/or sides of the temple 206. The local controller or remote controller (not shown) may then identify biometric information from the captured ECG signal and use the biometric information to authenticate the user. Electronic components for capturing the ECG signal through the plurality of electrodes 246 may be located on a main printed circuit board (PCB) 248. The battery 242 may be used to provide power to the various electronic components. Power conditioning and distribution may be handled by components and circuitry on a battery printed circuit board (PCB) 244.
In some examples, a plurality of electrodes (at least two electrodes) 246 may be placed on the near-eye display device at locations that ensure contact with the user's body. For example, two high-sensitivity sensors may be placed on the inner surface of a nose pad to contact the user's nose. Additionally or alternatively, two other high-sensitivity sensors may be placed on the inner surface of the temple 206 to contact the user's head. In some examples, two sets of sensors (electrodes), or even more sets, may be used for reliable signal detection. ECG signals are typically characterized by the following parameters: the PR interval, the QRS complex, the QT interval, the ST segment, and the Heart Rate (HR). The combination of these parameters may provide a unique signature for each user to be authenticated.
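The parameter-based signature matching described above can be sketched as follows. This is an illustrative sketch only; the parameter names, example values, and relative tolerance are assumptions for this sketch, not values from this disclosure:

```python
# Hypothetical ECG feature vector built from the parameters named above:
# PR interval, QRS duration, QT interval, ST segment level, and heart rate.
FEATURES = ("pr_ms", "qrs_ms", "qt_ms", "st_mv", "hr_bpm")

def signature(sample: dict) -> tuple:
    """Order the measured parameters into a fixed-length feature vector."""
    return tuple(sample[f] for f in FEATURES)

def matches(enrolled: dict, captured: dict, tolerance: float = 0.08) -> bool:
    """Authenticate only if every captured parameter falls within a relative
    tolerance of the enrolled value (an illustrative matching rule)."""
    for e, c in zip(signature(enrolled), signature(captured)):
        if abs(c - e) > tolerance * abs(e):
            return False
    return True

# Enrollment values and a fresh capture (hypothetical numbers).
enrolled = {"pr_ms": 160, "qrs_ms": 90, "qt_ms": 400, "st_mv": 0.1, "hr_bpm": 68}
captured = {"pr_ms": 158, "qrs_ms": 92, "qt_ms": 405, "st_mv": 0.1, "hr_bpm": 70}
print(matches(enrolled, captured))  # True: every parameter is within tolerance
```

A practical implementation would extract these parameters from the raw electrode signal; the sketch starts from already-measured values to show only the signature comparison step.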
In some examples, capturing Electrocardiogram (ECG) signals before and after a physical activity (e.g., taking four rapid consecutive breaths) may provide a more reliable source of authentication, as differences between biometric information identified before and after the physical activity may make the signature of each user even more unique. The display 210 (and/or audio prompts) may be used to provide instructions to the user, such as telling the user to perform physical activities.
Based on user preferences, any data (e.g., electrocardiographic signals) or other sensor-captured information, as well as authentication results, may or may not be stored at the near-eye display device 202 and/or any other external system (e.g., authentication server, content server, etc.). For personal information protection and privacy purposes, the default setting may be to not store information, which may be modified by the user at any time.
Fig. 2F illustrates a perspective view of a near-eye display device that may be used to authenticate a user through palm print detection, according to an example. As with the examples above, the near-eye display device 202 may include some or all of the components of the near-eye display device 102 in fig. 1 for performing functions associated with presenting content to a user, such as Augmented Reality (AR) content, Virtual Reality (VR) content, and/or other content. In another example implementation of the authentication function, an externally facing camera 214 on the frame 205 may be used to capture a palm print on the user's hand 238. The local or remote controller (not shown) may then compare the captured palm print to the user's stored palm print and authenticate the user if the palm prints match. Alternatively, the controller may identify biometric information from the captured image of the palm print and compare that biometric information to stored biometric information associated with the user.
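The biometric comparison in the alternative described above can be sketched as a distance check between feature vectors. The vectors, the distance metric, and the threshold are assumptions for illustration; the disclosure does not specify how palm-print features are extracted or compared:

```python
import math

def embedding_distance(a, b):
    """Euclidean distance between two biometric feature vectors; the vectors
    stand in for features extracted from palm-print images."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def authenticate_palm(stored_features, captured_features, threshold=0.5):
    """Illustrative decision rule: accept if the captured features are close
    enough to the user's stored features. The threshold is a placeholder."""
    return embedding_distance(stored_features, captured_features) <= threshold

stored = [0.12, 0.80, 0.33, 0.56]    # hypothetical enrolled features
captured = [0.14, 0.78, 0.35, 0.55]  # hypothetical fresh capture
print(authenticate_palm(stored, captured))  # True: distance well under threshold
```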
The display 210 (and/or audio cues) may be used to provide instructions to the user, for example, telling the user to place their hand 238 in front of the camera 214 with their palm facing the camera. If the palmprint is not successfully detected, a repeat request may also be displayed to the user.
Based on user preferences, any images (e.g., palmprints) or sensor captured information, as well as authentication results, may or may not be stored at the near-eye display device 202 and/or any other external system (e.g., authentication server, content server, etc.). For personal information protection and privacy purposes, the default setting may be to not store information, which may be modified by the user at any time.
Fig. 2G illustrates a perspective view of a near-eye display device that may be used to authenticate a user through password entry, according to an example. As with the examples above, the near-eye display device 202 may include some or all of the components of the near-eye display device 102 in fig. 1 for performing functions associated with presenting content to a user, such as Augmented Reality (AR) content, Virtual Reality (VR) content, and/or other content. In another example implementation of the authentication function, a touch sensor located on an outer surface of one of the plurality of temples 206 may be used to capture the user's password input. The local controller or remote controller (not shown) may then compare the received password input to the user's stored password and authenticate the user if the passwords match.
Because of the limited surface area available on the near-eye display device 202 (specifically, on the frame or temple surfaces), the touch sensor 252 may be designed to detect coded input rather than conventional keyboard input. In some examples, different finger touches may be used to encode any number of alphanumeric characters. For example, numbers and/or letters may be encoded using a fingertip touch, a flat-finger touch, and a finger swipe, applied in parallel (e.g., on two or more finger touch areas of the touch sensor) or sequentially. A three-finger touch area configuration using the three touch actions described above (without a gap or "no touch" option) may provide 27 different input configurations. A two-finger touch area configuration with the same three touch actions may provide 8 different input configurations.
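One way to read the 27-configuration count above is as the set of all assignments of the three touch actions to three simultaneous touch areas, which can be enumerated directly (the action names are illustrative labels, not terms from this disclosure):

```python
from itertools import product

# Hypothetical labels for the three touch actions described above.
ACTIONS = ("fingertip", "flat_finger", "swipe")

def touch_codes(num_areas: int):
    """Enumerate every distinct input configuration for the given number of
    simultaneous finger-touch areas, one action per area."""
    return list(product(ACTIONS, repeat=num_areas))

codes = touch_codes(3)
print(len(codes))  # 27: three actions over three areas, enough to encode
                   # the letters A-Z plus one additional symbol
```

Each configuration could then be mapped to one alphanumeric character, which is what the reminder display of touch types and the virtual keyboard described below would show the user.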
In some examples, display 210 (and/or audio prompts) may be used to provide instructions to the user, for example, telling the user to enter their input or inputs via the touch sensor 252. The display 210 may also display (as a reminder) the different touch types and/or a virtual keyboard that shows the user which touch actions (finger touch inputs) or combinations correspond to which alphanumeric characters.
Based on user preferences, any data (e.g., finger touch input or password) or other sensor captured information, as well as authentication results, may or may not be stored at the near-eye display device 202 and/or any other external system (e.g., authentication server, content server, etc.). For personal information protection and privacy purposes, the default setting may be to not store information, which may be modified by the user at any time.
Fig. 2H illustrates a perspective view of a near-eye display device having one or more sensors to detect continuous wear of the near-eye display device, according to an example. In some cases, authentication of the user using the near-eye display device 202 may take the form of unlocking the near-eye display device for any operation (e.g., content viewing, communication session, data exchange session, etc.), and may be performed when the device is worn or activated. In other cases, authentication may be for different activities (e.g., those mentioned above) and may need to be performed at the beginning of each activity. To provide ease of use, in some examples, sensor 254 and/or sensor 256 (or other sensors) may be used to confirm that the user has continued to wear the near-eye display device 202, even during periods of inactivity or when switching from one activity to another. Upon confirming that the user has not removed the near-eye display device 202 (in other words, that it is still the same user), the controller of the near-eye display device 202 may automatically update the user's authentication without the user having to go through any of the various authentication techniques discussed herein.
Thus, authentication may be based on the biometric information discussed herein or on user action (i.e., continuous wear of the near-eye display device). In some examples, continuous authentication (or re-authentication) may be allowed even after a brief removal of the near-eye display device. If the near-eye display device is removed from the user's head for a period of time (e.g., 1 second to 3 seconds) that is sufficiently short to ensure that the near-eye display device is not being worn by another user, the controller may consider the device to still be worn by the same user and allow continuous authentication.
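The short-removal rule above can be sketched as a small state machine over wear sensor events. The 3-second grace window is taken from the 1-to-3-second range mentioned above; the event interface is an assumption for this sketch:

```python
class WearMonitor:
    """Sketch of continuous-wear re-authentication: authentication survives
    a removal only if the device is put back on within the grace window."""
    GRACE_SECONDS = 3.0  # assumed value from the 1-3 s range above

    def __init__(self):
        self.authenticated = False
        self.removed_at = None  # timestamp of the last take-off, if any

    def on_authenticated(self):
        self.authenticated = True

    def on_removed(self, now: float):
        self.removed_at = now

    def on_worn(self, now: float):
        if self.removed_at is not None and now - self.removed_at > self.GRACE_SECONDS:
            # Off the head too long: another user could be wearing it now.
            self.authenticated = False
        self.removed_at = None

monitor = WearMonitor()
monitor.on_authenticated()
monitor.on_removed(now=100.0)
monitor.on_worn(now=101.5)   # brief removal: authentication carries over
print(monitor.authenticated)  # True
monitor.on_removed(now=200.0)
monitor.on_worn(now=210.0)   # long removal: re-authentication required
print(monitor.authenticated)  # False
```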
In one illustrative example, users may authenticate themselves through fingerprint detection when starting an online gaming session with limited access. Subsequently, and without removing the near-eye display device, the user may leave the gaming session and join a video conference call. Even though the authentication techniques for the two activities may be different (e.g., password entry for the video conference call), near-eye display device 202 may automatically authenticate the user for the video conference call based on a determination that the user wore the near-eye display device continuously between the two activities. The sensors 254 and 256 may be touch sensors, motion sensors, light sensors, or the like to detect removal of the near-eye display device from the user's head. Other access-restricted activities for which a user may be authenticated may include, but are not limited to, online telephony (audio, video), messaging (text sharing or media sharing), posting to a social network or professional network, personal control panel access (e.g., calendar events), or mobile payment.
In some examples, more than one authentication technique (and associated hardware) may be available on the near-eye display device 202. In this case, more than one technique may be used in combination for strong (stronger) authentication. Alternatively, one technique may be suggested to the user instead of another based on specific detection conditions. For example, the noise level of the environment may be determined to be above a certain threshold and fingerprint detection or iris detection may be suggested instead of speech recognition. In another example, the user may wear gloves or their hands (fingers) may be dirty. Thus, speech recognition or iris detection may be suggested (or made available) rather than fingerprint detection or palm print detection. In other examples, additional authentication techniques may be suggested or made available upon failure to authenticate a user through one of the authentication techniques.
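The condition-based selection described above can be sketched as a simple decision rule. The noise threshold, the condition inputs, and the technique names are assumptions for this sketch, not values from this disclosure:

```python
def suggest_technique(noise_db: float, hands_unavailable: bool,
                      noise_threshold_db: float = 70.0) -> str:
    """Illustrative selection rule: prefer speech recognition unless the
    environment is noisy, and fall back to iris detection when fingerprint
    or palm-print capture is impractical (e.g., the user wears gloves)."""
    if noise_db <= noise_threshold_db:
        return "voice_recognition"
    if hands_unavailable:  # gloved or dirty hands: avoid fingerprint/palm print
        return "iris_detection"
    return "fingerprint_detection"

print(suggest_technique(noise_db=55.0, hands_unavailable=False))  # voice_recognition
print(suggest_technique(noise_db=85.0, hands_unavailable=True))   # iris_detection
```

A real device would derive the inputs from its own sensors (e.g., a microphone level for noise, camera input for gloved hands) and could also fall through to the next technique on an authentication failure, as noted above.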
In some examples, two or more authentication techniques may be combined for strong (stronger) authentication. For example, speech recognition may be sufficient to watch streaming video, while accessing certain data may require stronger authentication, and fingerprint detection may be used for this activity. Alternatively, the strong authentication technique may be used once and cover all activities, including those requiring weaker authentication.
Fig. 3 illustrates a flow chart of a method 300 for authenticating a user of a near-eye display device for secure data exchange or communication via the near-eye display device, according to an example. The method 300 is provided by way of example, as there are various ways in which the methods described herein may be performed. Although the method 300 is described primarily as being performed by various components of fig. 2A-2H, the method 300 may be performed by one or more processing components of another system or combination of systems or otherwise. Each block shown in fig. 3 may also represent one or more processes, one or more methods, or one or more subroutines, and one or more of these blocks may comprise machine-readable instructions stored on a non-transitory computer-readable medium and executed by a processor or other type of processing circuitry to perform one or more operations described herein.
At block 302, a remote controller or local controller of the near-eye display device 202 may determine which authentication techniques (and associated hardware) are available on the near-eye display device 202. At block 304, if more than one authentication technique is available, one may be selected. The selection may be based on requirements of a particular activity (e.g., viewing a video stream, participating in an online video conference, accessing a networked data store, participating in an online gaming session, accessing a banking record, etc.). The selection may also be based on certain conditions, such as the noise level of the environment, the illumination level of the environment, clear access to the user's biometric features (e.g., iris, fingerprint, palm print, etc.).
At block 306, user characteristics according to the selected authentication technique may be captured and authentication performed. If the user has been authenticated, the activity requiring authentication may begin. At block 308, it may be determined (based on sensor input) whether the near-eye display device has been removed from the user's head during periods of inactivity or between two different activities requiring authentication. At block 310, if it is determined that the near-eye display device has not been removed, the user's authentication may be automatically updated, and the user may be allowed to participate in the next activity requiring authentication.
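Blocks 302 through 310 can be sketched as a single control flow. The device interface below is entirely hypothetical (the disclosure does not define an API); it exists only to make the ordering of the blocks explicit:

```python
def method_300(device) -> bool:
    """Illustrative sketch of the flow of method 300 (blocks 302-310)."""
    # Block 302: determine which authentication techniques are available.
    available = device.available_techniques()
    if not available:
        return False
    # Block 304: select one based on the activity and ambient conditions.
    technique = device.select_technique(available)
    # Block 306: capture user characteristics and perform authentication.
    if not device.capture_and_authenticate(technique):
        return False
    # Blocks 308-310: if the device was never removed between activities,
    # automatically refresh the authentication for the next activity.
    if not device.was_removed():
        device.refresh_authentication()
    return True

class StubDevice:
    """Minimal stand-in so the flow can run; all behavior is assumed."""
    def available_techniques(self): return ["fingerprint", "iris"]
    def select_technique(self, options): return options[0]
    def capture_and_authenticate(self, technique): return True
    def was_removed(self): return False
    def refresh_authentication(self): self.refreshed = True

device = StubDevice()
print(method_300(device))  # True: authenticated, then refreshed automatically
```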
In addition to the near-eye display devices discussed herein, example user authentication techniques may also be implemented in other forms of head-mounted devices (e.g., goggles, headsets), as well as other types of wearable devices (e.g., smart watches, smart wristbands, smart glasses, non-display wearable devices such as game controllers), and similar devices.
According to an example, a method of enabling a near-eye display device to authenticate a user is described herein. A system for enabling a near-eye display device to authenticate a user is also described herein. A non-transitory computer-readable storage medium may have stored thereon executable instructions that, when executed, cause a processor to perform the methods described herein.
It should be noted that the functionality described herein may be constrained by one or more privacy policies enforced by the near-eye display device 102, or by a system that manages operation of the near-eye display device 102 via a communicative coupling, which may prohibit the use of images or other personal information for concept detection, recommendation, generation, and analysis, as described below.
In particular examples, one or more objects of the computing system (e.g., captured user features, content, or other types of objects) may be associated with one or more privacy settings. The one or more objects may be stored on or otherwise associated with any suitable computing system or application, such as, for example, a system, a client device (e.g., near-eye display device 102), a host system, an external system, a social networking application, a messaging application, a photo sharing application, or any other suitable computing system or application. Although the examples discussed herein are in the context of an online social network, these privacy settings may be applied to any other suitable computing system. The privacy settings (or "access settings") of the objects may be stored in any suitable manner, such as, for example, in association with the objects, in an index on an authorization server, in another suitable manner, or any suitable combination thereof. The privacy settings of an object may specify how the object (or particular information associated with the object) may be accessed, stored, or otherwise used (e.g., viewed, shared, modified, copied, executed, displayed, or identified) within the online social network. An object may be described as "visible" with respect to a particular user or other entity when the privacy setting of the object allows the object to be accessed by the user or other entity. By way of example and not limitation, a user of an online social network may specify privacy settings for a user profile page that identify a group of users that may access work experience information on the user profile page, thereby blocking other users from accessing the information.
In a particular example, the privacy settings of an object may specify a "blacklist" of users or other entities that should not be allowed access to certain information associated with the object. In a particular example, the blacklist may include third party entities. A blacklist may specify one or more users or entities for which the object is not visible. By way of example and not limitation, a user may designate a group of users or entities that may not have access to user features (e.g., fingerprints, palmprints, iris images, ECG signals, etc.) for authentication purposes, thereby blocking those users or entities from accessing the user features (while also potentially allowing certain users or entities that are not within the designated group of users or entities to access the user features). In particular examples, privacy settings may be associated with particular social-graph elements. The privacy settings of a social-graph element (e.g., node or edge) may specify how the social-graph element, information associated with the social-graph element, or objects associated with the social-graph element may be accessed using an online social network. By way of example and not limitation, a particular concept node corresponding to a particular user feature may have the following privacy settings: the privacy settings specify that the user features are only accessible by users listed in a particular allowed user list. In particular examples, privacy settings may allow users to opt-in or opt-out of their content, information, or actions stored/recorded by the system or shared with other systems (e.g., external systems). Although this disclosure describes using particular privacy settings in a particular manner, this disclosure contemplates using any suitable privacy settings in any suitable manner.
In particular examples, near-eye display device 102 may present a "privacy wizard" to the first user (e.g., within a web page, a module, one or more dialog boxes, or any other suitable interface) to help the first user specify one or more privacy settings. The privacy wizard may display instructions, suitable privacy-related information, current privacy settings, one or more input fields for accepting one or more inputs from the first user specifying a change or confirmation of the privacy settings, or any suitable combination thereof. In a particular example, the near-eye display device 102 may provide a "control panel" function to the first user that may display the first user's current privacy settings to the first user. The control panel function may be displayed to the first user at any suitable time (e.g., after input from the first user invoking the control panel function, after a particular event or trigger action occurs). The control panel function may allow the first user to modify one or more of the first user's current privacy settings at any time in any suitable manner (e.g., redirect the first user to the privacy wizard).
The privacy settings associated with the object may specify any suitable granularity of allowing access or denying access. By way of example and not limitation, access may be specified or denied for: a particular user (e.g., me only, my roommate, my boss), a user within a particular degree of separation (e.g., friends of friends), a group of users (e.g., game clubs, my family), a network of users (e.g., employees of a particular employer, students or alumni of a particular university), all users ("public"), no users ("private"), users of a third party system, a particular application (e.g., a third party application, an external website), other suitable entities, or any suitable combination thereof. Although this disclosure describes a particular granularity of allowing access or denying access, this disclosure contemplates any suitable granularity of allowing access or denying access.
In particular examples, different objects of the same type associated with a user may have different privacy settings. Different types of objects associated with a user may have different types of privacy settings. By way of example and not limitation, a first user may specify that the first user's voice is public, but any images captured by the near-eye display device 102 for fingerprint, palmprint, or iris scan are only accessible to the first user and to specified entities on the online social network. As another example and not by way of limitation, a user may specify different privacy settings for different types of entities, such as individual users, friends of friends, followers, groups of users, or corporate entities. In particular examples, different privacy settings may be provided for different groups of users/entities or groups of users/entities. By way of example and not limitation, a first user may specify that other users within his/her home or workgroup may access their authentication information, but that other users may not.
In a particular example, the near-eye display device 102 may provide one or more default privacy settings for each object of a particular object type. The privacy settings of an object set as default may be changed by a user associated with the object. By way of example and not limitation, a captured voice sample of the first user may have a default privacy setting of being accessible only to the first user's friends and colleagues, and the first user may change the privacy setting for that feature so that it is not accessible to anyone else.
In particular examples, the privacy settings may allow the first user to specify (e.g., by opting out, by not opting in) whether the near-eye display device 102 may receive, collect, record, or store particular objects or particular information associated with the user for any purpose. In particular examples, the privacy settings may allow the first user to specify whether a particular application or process may access, store, or use a particular object or particular information associated with the user. The privacy settings may allow the first user to opt in to, or opt out of, having objects or information accessed, stored, or used by a particular application or process. Near-eye display device 102 may access such information to provide a particular function or service to the first user, while near-eye display device 102 may not be able to access such information for any other purpose. Prior to accessing, storing, or using such objects or information, the near-eye display device 102 may prompt the user to provide privacy settings specifying which applications or processes, if any, may access, store, or use the object or information before allowing any such actions. By way of example and not limitation, a first user may send a message to a second user via an application (e.g., a messaging application) related to an online social network, and may specify privacy settings indicating that such messages should not be stored by the near-eye display device 102.
In particular examples, a user may specify whether a particular type of object or information associated with a first user may be accessed, stored, or used by near-eye display device 102. By way of example and not limitation, the first user may specify that the image sent to the authentication service by the near-eye display device 102 may not be stored by the near-eye display device 102. As another example and not by way of limitation, a first user may specify that messages sent from the first user to a particular second user may not be stored by the near-eye display device 102. As yet another example and not by way of limitation, a first user may specify that all objects sent via a particular application may be saved by the near-eye display device 102.
In particular examples, the privacy settings may allow the first user to specify whether particular objects or information associated with the first user may be accessed from the client device or an external system. The privacy settings may allow the first user to opt-in or opt-out of accessing objects or information from a particular device (e.g., a user's smartphone), from a particular application (e.g., an authentication application), or from a particular system (e.g., an authentication server). Near-eye display device 102 may provide default privacy settings for each device, system, or application and/or may prompt the first user to specify particular privacy settings for each context. By way of example and not limitation, the first user may utilize the location services features of the near-eye display device 102 associated with authentication of the user. The default privacy settings of the first user may specify that the near-eye display device 102 may provide location-based services using location information provided from one of the plurality of client devices of the first user, but may specify that the near-eye display device 102 may not store or provide location information of the first user to any external system. The first user may then update the privacy settings to allow the third party application to use the location information.
In particular examples, privacy settings may allow a user to specify whether current, past, or predicted mood, emotion, or sentiment information associated with the user may be determined, and whether a particular application or process may access, store, or use such information. The privacy settings may allow the user to opt in to, or opt out of, having mood, emotion, or sentiment information accessed, stored, or used by a particular application or process. For example, near-eye display device 102 may predict or determine a mood, emotion, or sentiment associated with a user based on, for example, inputs provided by the user and interactions with particular objects, in conjunction with ECG-based authentication. In a particular example, near-eye display device 102 may use the user's previous activities and calculated moods, emotions, or sentiments to determine a current mood, emotion, or sentiment. Users desiring to enable this functionality may indicate in their privacy settings that they opt in to near-eye display device 102 receiving the inputs necessary to determine mood, emotion, or sentiment. By way of example and not limitation, near-eye display device 102 may determine that the default privacy setting is not to receive any information necessary to determine mood, emotion, or sentiment until the user explicitly indicates that near-eye display device 102 may do so. Conversely, if the user does not opt in to near-eye display device 102 receiving these inputs (or affirmatively opts out), near-eye display device 102 may be prevented from receiving, collecting, recording, or storing these inputs or any information associated with them. In particular examples, near-eye display device 102 may use the predicted mood, emotion, or sentiment to provide recommendations or advertisements to the user.
In particular examples, if the user wishes to use this functionality for a particular purpose or application, additional privacy settings may be specified by the user to opt in to the use of mood, emotion, or sentiment information for that particular purpose or application. By way of example and not limitation, near-eye display device 102 may use a user's mood, emotion, or sentiment to provide news feed items, pages, friends, or advertisements to the user. The user may specify in their privacy settings that near-eye display device 102 may determine the user's mood, emotion, or sentiment. The user may then be asked to provide additional privacy settings indicating the purposes for which the user's mood, emotion, or sentiment may be used. The user may indicate that near-eye display device 102 may use his or her mood, emotion, or sentiment to provide news feed content and recommended pages, but not to recommend friends or advertisements. Near-eye display device 102 may then provide only news feed content or pages based on the user's mood, emotion, or sentiment, and may not use this information for any other purpose, even if not explicitly prohibited by the privacy settings.
In particular examples, privacy settings may allow users to participate in transient sharing of objects on an online social network. Transient sharing refers to sharing objects (e.g., posts, photos) or information for a limited period of time. Access to the object or information, or denial of access, may be specified by time or date. By way of example and not limitation, a user may specify that a particular image, video, or similar information uploaded by the user through the near-eye display device 102 is visible to the user's friends for the next week, after which the image may no longer be accessible to other users.
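The time-limited visibility check in the example above can be sketched as follows (the one-week window comes from the example; the function interface is an assumption for this sketch):

```python
from datetime import datetime, timedelta

def is_visible(shared_at: datetime, viewer_is_friend: bool,
               now: datetime, ttl: timedelta = timedelta(weeks=1)) -> bool:
    """Transient-sharing rule: friends may view the object only within the
    time window after sharing; everyone else is denied."""
    return viewer_is_friend and (now - shared_at) <= ttl

shared = datetime(2024, 1, 1)
print(is_visible(shared, True, datetime(2024, 1, 5)))   # True: within the week
print(is_visible(shared, True, datetime(2024, 1, 10)))  # False: window expired
```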
In particular examples, for particular objects or information having privacy settings that specify that they are ephemeral, near-eye display device 102 may be limited in its access, storage, or use of such objects or information. The near-eye display device 102 may temporarily access, store, or use these particular objects or information in order to facilitate particular actions by a user associated with the objects or information, and may subsequently delete the objects or information (as specified by the corresponding privacy settings). By way of example and not limitation, the first user may send a message to the second user, and near-eye display device 102 may temporarily store the message in the content data store until the second user has viewed or downloaded the message, at which point near-eye display device 102 may delete the message from the data store. As another example and not by way of limitation, continuing with the previous example, the message may be stored for a specified period of time (e.g., 2 weeks) after which time the near-eye display device 102 may delete the message from the content data store.
In particular examples, the privacy settings may allow a user to specify one or more geographic locations from which objects may be accessed. Access to the object, or denial of access, may depend on the geographic location of the user attempting to access the object. By way of example and not limitation, a user may share an object and specify that only users or entities in the same city may access or view the object. As another example and not by way of limitation, a first user may share an object and specify that the object is visible to a second user or entity only when the first user is in a particular location. If the first user leaves the particular location, the object may no longer be visible to the second user or entity. As another example and not by way of limitation, a first user may specify that an object is visible only to second users or entities within a threshold distance from the first user. If the first user subsequently changes location, then the second users or entities that originally had access to the object may lose access, and a new set of second users or entities may gain access when they come within the threshold distance of the first user.
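The threshold-distance rule above reduces to a geographic distance comparison; a standard way to compute that distance from latitude/longitude pairs is the haversine formula. The coordinates and the 10 km threshold below are illustrative assumptions:

```python
import math

def within_threshold(owner, viewer, threshold_km: float) -> bool:
    """Great-circle (haversine) distance check for the threshold-distance
    visibility rule; coordinates are (latitude, longitude) in degrees."""
    (lat1, lon1), (lat2, lon2) = owner, viewer
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    distance_km = 2 * 6371.0 * math.asin(math.sqrt(a))  # mean Earth radius
    return distance_km <= threshold_km

# Two points in the same city (hypothetical coordinates): visible at 10 km.
print(within_threshold((40.7128, -74.0060), (40.7306, -73.9352), 10.0))  # True
```

As the text notes, this check would be re-evaluated whenever the first user's location changes, so access can be gained or lost dynamically.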
In a particular example, the near-eye display device 102 may have functionality that uses personal information or biometric information of the user as input for user authentication or experience personalization purposes. Users may choose to take advantage of this functionality to enhance their experience on an online social network. By way of example and not limitation, a user may provide personal or biometric information to the near-eye display device 102. The user's privacy settings may specify that such information may only be used for a particular process (e.g., authentication), and also that such information may not be shared with any external system or used for other processes or applications associated with the near-eye display device 102. As another example and not by way of limitation, near-eye display device 102 may provide functionality for a user to provide voice-print recordings to an online social network. By way of example and not limitation, if a user wishes to utilize this functionality of an online social network, the user may provide a voice recording of his or her own voice to provide status updates on the online social network. The recording of the voice input may be compared to the user's voiceprint to determine what words the user uttered. The privacy settings of the user may specify that such voice recordings may be used for voice input purposes only (e.g., to authenticate the user, to send voice messages, to improve voice recognition in order to use voice-operated features of the online social network), and also that such voice recordings may not be shared with any external system or used by other processes or applications associated with the near-eye display device 102. As another example and not by way of limitation, near-eye display device 102 may provide the user with functionality to provide a reference image (e.g., facial profile, retinal scan) to an online social network.
The online social network may compare the reference image with later-received image inputs (e.g., to authenticate the user, to tag the user in a photograph). The user's privacy settings may specify that such reference images may be used only for limited purposes (e.g., authentication, tagging the user in a photograph), and further that such reference images may not be shared with any external system or used by other processes or applications associated with the near-eye display device 102.
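The purpose-limited use of biometric data described in the two paragraphs above (use allowed only for enumerated processes, never shared externally) can be modeled as a small policy check. A minimal sketch, assuming hypothetical `BiometricPrivacySetting` and `check_use` names:

```python
from dataclasses import dataclass, field

@dataclass
class BiometricPrivacySetting:
    """Per-datum privacy policy: which on-device purposes may use the datum."""
    allowed_purposes: set = field(default_factory=set)
    share_with_external_systems: bool = False  # sharing is off by default

def check_use(setting, purpose, requester_is_external):
    """Return True only if this use of the biometric datum is permitted."""
    # External systems are denied outright unless sharing was enabled.
    if requester_is_external and not setting.share_with_external_systems:
        return False
    # On-device requesters are limited to the enumerated purposes.
    return purpose in setting.allowed_purposes

voiceprint_policy = BiometricPrivacySetting(
    allowed_purposes={"authentication", "voice_input"})

print(check_use(voiceprint_policy, "authentication", requester_is_external=False))  # True
print(check_use(voiceprint_policy, "ad_targeting", requester_is_external=False))    # False
print(check_use(voiceprint_policy, "authentication", requester_is_external=True))   # False
```

A reference-image policy would be the same structure with purposes such as `{"authentication", "photo_tagging"}`.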
In a particular example, a change to the privacy settings may take effect retroactively, affecting the visibility of objects and content shared prior to the change. By way of example and not limitation, a first user may share a first image and specify that the first image is to be public to all other users. At a later time, the first user may specify that any images shared by the first user should be made visible only to a first user group. The near-eye display device 102 may determine that the privacy setting also applies to the first image and make the first image visible only to the first user group. In certain examples, the change in privacy settings may take effect only prospectively. Continuing the example above, if the first user changes the privacy settings and then shares a second image, the second image may be visible only to the first user group, but the first image may remain visible to all users. In particular examples, in response to a user action to change the privacy settings, the near-eye display device 102 may further prompt the user to indicate whether the user wants to apply the change retroactively. In a particular example, the user's change to the privacy settings may be a one-time change specific to one object. In a particular example, the user's change to the privacy settings may be a global change for all objects associated with the user.
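The retroactive-versus-prospective distinction above reduces to comparing each object's share time against the time of the settings change. A sketch under that assumption, with illustrative names:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SharedObject:
    shared_at: datetime
    audience: str  # e.g., "public" or "friends"

def apply_privacy_change(objects, new_audience, changed_at, retroactive):
    """Apply a privacy change to all objects (retroactive) or only to
    objects shared at or after the change (prospective)."""
    for obj in objects:
        if retroactive or obj.shared_at >= changed_at:
            obj.audience = new_audience

t0 = datetime(2023, 1, 1, tzinfo=timezone.utc)
t1 = datetime(2023, 6, 1, tzinfo=timezone.utc)
first = SharedObject(shared_at=t0, audience="public")
second = SharedObject(shared_at=t1, audience="public")

# Prospective change made between the two shares: only the newer image changes.
apply_privacy_change([first, second], "friends",
                     changed_at=datetime(2023, 3, 1, tzinfo=timezone.utc),
                     retroactive=False)
print(first.audience, second.audience)  # public friends
```

Passing `retroactive=True` instead would narrow the first image's audience as well, matching the first example in the paragraph.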
In a particular example, the near-eye display device 102 may determine that the first user may want to change one or more privacy settings in response to a trigger action associated with the first user. The trigger action may be any suitable action on the online social network. By way of example and not limitation, the trigger action may be a change in a relationship between a first user and a second user of the online social network (e.g., "un-friending" the user, a change in the relationship status between the users). In a particular example, upon determining that the trigger action has occurred, the near-eye display device 102 may prompt the first user to change the privacy settings regarding the visibility of objects associated with the first user. The prompt may redirect the first user to a workflow process for editing the privacy settings for one or more entities associated with the trigger action. The privacy settings associated with the first user may change only in response to explicit input from the first user, and may not change without the first user's approval. By way of example and not limitation, the workflow process may include presenting the first user with the current privacy settings for the second user or group of users (e.g., removing a tag of the first user or second user from a particular object, changing the visibility of the particular object relative to the second user or group of users), and receiving an indication from the first user to change the privacy settings or to maintain the existing privacy settings based on any of the various methods described herein.
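The trigger-action workflow above (prompt on un-friending, change only on explicit approval) can be sketched as follows; the dictionary shapes and the `prompt_fn` callback are illustrative assumptions:

```python
def on_trigger_action(action, user, prompt_fn):
    """On a trigger action (here, un-friending), prompt the user to review
    privacy settings for the affected entity. Settings change only on
    explicit confirmation; declining keeps the existing settings."""
    if action["type"] != "unfriend":
        return user["privacy"]
    other = action["other_user"]
    # Proposed change: hide the user's objects from the un-friended party.
    proposed = {**user["privacy"], other: "hidden"}
    # prompt_fn returns True only on explicit user approval.
    return proposed if prompt_fn(other, proposed) else user["privacy"]

user = {"privacy": {"bob": "visible"}}
# The user declines the prompt, so nothing changes without approval.
result = on_trigger_action({"type": "unfriend", "other_user": "bob"},
                           user, prompt_fn=lambda other, proposed: False)
print(result)  # {'bob': 'visible'}
```

The key property, per the text, is that the prompt alone never mutates anything; only an affirmative response does.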
In particular examples, a user may need to provide verification of privacy settings before the user is allowed to perform particular actions on the online social network, or may need to provide verification before changing particular privacy settings. When performing a particular action or changing a particular privacy setting, a prompt may be presented to the user to remind the user of his or her current privacy settings and to ask the user to verify the privacy settings for the particular action. Further, the user may need to provide confirmation, double confirmation, authentication, or another suitable type of verification before performing the particular action, and the action may not be completed until such verification is provided. By way of example and not limitation, a user's default privacy settings may indicate that a person's relationship status is visible (i.e., "public") to all users. However, if the user changes his or her relationship status, the near-eye display device 102 may determine that such an action may be sensitive and may prompt the user to confirm that his or her relationship status should remain public before proceeding. As another example and not by way of limitation, the user's privacy settings may specify that the user's posts are visible only to the user's friends. However, if the user changes the privacy settings for his or her posts to public, the near-eye display device 102 may present the user with a reminder that the current privacy settings make the user's posts visible only to friends, and a warning that the change will make all of the user's past posts visible to the public. The user may then need to provide a second verification, enter authentication credentials, or provide another type of verification before the change in privacy settings proceeds. In certain examples, the user may need to verify privacy settings on a regular basis.
A prompt or reminder may be sent to the user periodically, based on elapsed time or the number of user actions. By way of example and not limitation, the near-eye display device 102 may send the user a reminder to confirm his or her privacy settings every six months or every ten photo posts. In certain examples, the privacy settings may also allow the user to control access to objects or information on a per-request basis. By way of example and not limitation, whenever an external system attempts to access information associated with the user, the near-eye display device 102 may notify the user and require the user to verify that access is permitted before proceeding.
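The verification and periodic-reminder behavior described in the two paragraphs above can be sketched as two small checks. The six-month and ten-post thresholds come from the example in the text; everything else (function names, callback shapes) is assumed for illustration:

```python
def change_privacy_setting(current, proposed, is_sensitive_fn, verify_fn):
    """Require explicit verification before applying a sensitive change
    (e.g., making all past posts public); keep the current setting if
    verification is not provided."""
    if is_sensitive_fn(current, proposed) and not verify_fn():
        return current
    return proposed

def reminder_due(months_since_confirm, posts_since_confirm,
                 every_months=6, every_posts=10):
    """Periodic prompt: reconfirm privacy settings every six months
    or every ten photo posts, whichever comes first."""
    return (months_since_confirm >= every_months
            or posts_since_confirm >= every_posts)

# The user declines verification, so the sensitive change is not applied.
print(change_privacy_setting("friends", "public",
                             is_sensitive_fn=lambda cur, new: new == "public",
                             verify_fn=lambda: False))  # friends
print(reminder_due(7, 3))  # True
```

The same `verify_fn` hook could represent a second confirmation, re-entered credentials, or a per-request approval for an external system.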
In the above description, various inventive examples, including devices, systems, methods, and the like, have been described. For purposes of explanation, specific details are set forth in order to provide a thorough understanding of the examples of the present disclosure. It will be apparent, however, that various examples may be practiced without these specific details. For example, devices, systems, structures, components, methods, and other means may be shown in block diagram form in order to avoid obscuring the examples in unnecessary detail. In other instances, well-known devices, processes, systems, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the examples.
The drawings and description are not intended to be limiting. The terms and expressions employed in the present disclosure are used as terms of description and not of limitation, and there is no intention, in the use of such terms and expressions, of excluding any equivalents of the features shown and described or portions thereof. The word "example" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or design described herein as an "example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs.
While the methods and systems described herein may be primarily directed to digital content (e.g., video or interactive media), it should be understood that the methods and systems described herein may also be used with other types of content or scenes. Other applications or uses of the methods and systems described herein may also include social networking, marketing, content-based recommendation engines, and/or other types of knowledge or data driven systems.

Claims (20)

1. A near-eye display device, comprising:
a display element for presenting digital content to a user;
at least one sensor for receiving biometric information associated with the user; and
a controller communicatively coupled to the display element and the at least one sensor, the controller to authenticate the user for an access-restricted activity based at least in part on the received biometric information or an evaluation of the user's actions.
2. The near-eye display device of claim 1, wherein the access-restricted activity comprises engagement with multimedia content or restricted data.
3. The near-eye display device of claim 1, further comprising:
at least one additional sensor for detecting whether the near-eye display device is continuously worn by the user between a first access-restricted activity and a second access-restricted activity, wherein the user action is continuous wearing of the near-eye display device by the user.
4. The near-eye display device of claim 3, wherein the controller is further configured to:
update the authentication for the second access-restricted activity if the near-eye display device is continuously worn by the user or is re-worn by the user within a predetermined period of time.
5. The near-eye display device of claim 1, further comprising:
at least one additional sensor for detecting one of an environmental condition or a user condition, wherein the controller is further configured to:
determine, based on input from the at least one additional sensor, that the environmental condition or the user condition is incompatible with a first authentication technique; and
make a second authentication technique available.
6. The near-eye display device of claim 1, wherein the biometric information associated with the user comprises at least one of biometric characteristic information or behavioral biometric information.
7. The near-eye display device of claim 6, wherein:
the biometric characteristic information includes data associated with a face, fingerprint, palmprint, iris, retina, or electrocardiogram (ECG) signal of the user; and
the near-eye display device further comprises a camera for capturing an image of the user's finger or a capacitive sensor for scanning the user's finger.
8. The near-eye display device of claim 6, wherein the behavioral biometric information comprises data associated with movement, gesture, voice, or gait of the user.
9. The near-eye display device of claim 8, further comprising at least one of:
a motion detector for detecting gait of the user;
a touch pad integrated with the near-eye display device to receive a password from the user;
a microphone for capturing speech of the user;
a camera for capturing a finger gesture sequence; or
a muscular electrical signal detector for detecting a muscular electrical signal from the user's wrist and determining a finger gesture sequence based on the detected muscular electrical signal.
10. A method, comprising:
presenting digital content to a user through a display element of a near-eye display device;
receiving, by at least one sensor of the near-eye display device, biometric information associated with the user; and
authenticating, by a controller of the near-eye display device, the user for an access-restricted activity based at least in part on the received biometric information or an evaluation of the user's actions.
11. The method of claim 10, wherein presenting the digital content comprises:
enabling engagement with multimedia content or restricted data.
12. The method of claim 10, further comprising:
detecting whether the near-eye display device is continuously worn by the user between a first access-restricted activity and a second access-restricted activity, wherein the user action is continuous wearing of the near-eye display device by the user.
13. The method of claim 12, further comprising:
updating the authentication for the second access-restricted activity if the near-eye display device is continuously worn by the user or is re-worn by the user within a predetermined period of time.
14. The method of claim 10, further comprising:
detecting an environmental condition or a user condition;
determining that the environmental condition or the user condition is incompatible with a first authentication technique; and
making a second authentication technique available.
15. The method according to claim 10, wherein:
the biometric information associated with the user includes at least one of biometric characteristic information or behavioral biometric information; and
the behavioral biometric information includes data associated with movement, gestures, speech, or gait of the user.
16. A non-transitory computer readable storage medium having stored thereon an executable file that, when executed, instructs a processor on a near-eye display device to:
present digital content to a user through a display element of the near-eye display device;
receive, via at least one sensor of the near-eye display device, biometric information associated with the user, wherein the biometric information comprises biometric characteristic information or behavioral biometric information; and
authenticate the user for an access-restricted activity based at least in part on the received biometric information or an evaluation of the user's actions.
17. The computer-readable storage medium of claim 16, wherein the executable file further causes the processor to:
detect whether the near-eye display device is continuously worn by the user between a first access-restricted activity and a second access-restricted activity; and
update the authentication for the second access-restricted activity if the near-eye display device is continuously worn by the user or is re-worn by the user within a predetermined period of time.
18. The computer-readable storage medium of claim 16, wherein the executable file further causes the processor to:
detect an environmental condition or a user condition;
determine that the environmental condition or the user condition is incompatible with a first authentication technique; and
make a second authentication technique available.
19. The computer-readable storage medium of claim 16, wherein:
the biometric characteristic information includes data associated with a face, fingerprint, palmprint, iris, retina, or electrocardiogram (ECG) signal of the user; and
the behavioral biometric information includes data associated with movement, gestures, speech, or gait of the user.
20. The computer-readable storage medium of claim 16, wherein the access-restricted activity comprises engagement with multimedia content or restricted data.
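As a non-authoritative sketch of the logic recited in claims 3 to 5 (and their method and medium counterparts in claims 12 to 14 and 17 to 18): authentication persists during continuous wear or re-wear within a grace period, and a primary authentication technique that is incompatible with current conditions falls back to a second one. The `AuthSession` class name, the 30-second grace period, and the technique list are assumptions for illustration:

```python
class AuthSession:
    """Re-authentication sketch: authentication carries over while the
    device stays worn (or is re-worn within a grace period), and an
    unavailable primary technique falls back to a second one."""

    def __init__(self, grace_seconds=30.0):
        self.grace_seconds = grace_seconds
        self.authenticated = False
        self.removed_at = None

    def authenticate(self, techniques, env_ok):
        # Try each technique in order; skip ones incompatible with the
        # current environmental/user condition (e.g., an iris scan in the dark).
        for name, run in techniques:
            if env_ok(name) and run():
                self.authenticated = True
                return name
        self.authenticated = False
        return None

    def on_removed(self, now):
        self.removed_at = now

    def on_worn(self, now):
        # Keep the session only if the device is re-worn within the grace period.
        if self.removed_at is not None and now - self.removed_at > self.grace_seconds:
            self.authenticated = False
        self.removed_at = None

s = AuthSession(grace_seconds=30.0)
# Iris scan unavailable (dark room), so authentication falls back to voiceprint.
used = s.authenticate(
    [("iris", lambda: True), ("voiceprint", lambda: True)],
    env_ok=lambda name: name != "iris",
)
print(used)  # voiceprint
s.on_removed(now=100.0)
s.on_worn(now=110.0)    # re-worn within 30 s, so the session survives
print(s.authenticated)  # True
```

Removing the device for longer than the grace period would clear `authenticated`, forcing a fresh authentication before the next access-restricted activity.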
CN202311033865.3A 2022-08-16 2023-08-16 Techniques for providing user authentication for near-eye display devices Pending CN117592027A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US63/398,415 2022-08-16
US18/113,919 US20240061918A1 (en) 2022-08-16 2023-02-24 Techniques to provide user authentication for a near-eye display device
US18/113,919 2023-02-24

Publications (1)

Publication Number Publication Date
CN117592027A true CN117592027A (en) 2024-02-23

Family

ID=89920729

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311033865.3A Pending CN117592027A (en) 2022-08-16 2023-08-16 Techniques for providing user authentication for near-eye display devices

Country Status (1)

Country Link
CN (1) CN117592027A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination