WO2015194017A1 - Wearable device and associated authentication method - Google Patents


Info

Publication number
WO2015194017A1
Authority
WO
WIPO (PCT)
Prior art keywords
authentication
wearable device
unit
position guide
user
Prior art date
Application number
PCT/JP2014/066350
Other languages
English (en)
Japanese (ja)
Inventor
俊輝 中村
大内 敏
瀬尾 欣穂
川村 友人
佑哉 大木
Original Assignee
日立マクセル株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立マクセル株式会社 filed Critical 日立マクセル株式会社
Priority to PCT/JP2014/066350 priority Critical patent/WO2015194017A1/fr
Publication of WO2015194017A1 publication Critical patent/WO2015194017A1/fr

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W12/00: Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/30: Security of mobile devices; Security of mobile applications
    • H04W12/33: Security of mobile devices; Security of mobile applications using wearable devices, e.g. using a smartwatch or smart-glasses
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31: User authentication
    • G06F21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00: Network architectures or network communication protocols for network security
    • H04L63/08: Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0861: Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/64: Constructional details of receivers, e.g. cabinets or dust covers
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W12/00: Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/06: Authentication

Definitions

  • the present invention relates to a wearable device and an authentication method.
  • a so-called wearable information display device that displays predetermined information using a goggle-type or eyeglass-type head-mounted display has been commercialized.
  • User authentication is important from the viewpoint of security, but passwords are fragile from the viewpoint of security, and the input mechanism for entering passwords hinders downsizing of the apparatus.
  • Patent Document 1 discloses a technology of a head mounted display that images an iris and performs user authentication.
  • According to Patent Document 1, strong security based on biometric information, namely the iris, is possible on a head-mounted display.
  • However, since user authentication is performed by pattern matching against a pre-recorded iris, if the positional relationship between the camera and the iris is not kept constant, collation accuracy decreases and user authentication becomes unstable.
  • an object of the present invention is to improve biometric information collation accuracy and enable stable user authentication in a wearable device such as a head-mounted display.
  • A representative wearable device includes a video display unit that displays a position guide to a user wearing the wearable device, an imaging unit, and an authentication processing unit that authenticates the user based on biometric information captured by the imaging unit at a position corresponding to the position guide.
  • The present invention may also be understood as an authentication method.
  • biometric information collation accuracy is improved and stable user authentication is possible.
  • FIG. 1A is a diagram showing an example of the usage pattern of a wearable device; FIG. 1B shows an example with a rotatably mounted imaging unit; FIG. 1C shows an example with two imaging units; FIG. 2 shows an example configuration of the wearable device; FIG. 3 shows an example configuration in which biometric information is detected by the imaging unit; FIG. 4 shows an example of a position guide; and FIGS. 5A to 5C show other examples of position guides.
  • FIG. 1A is a diagram illustrating an example of a usage pattern of the wearable device 100.
  • the wearable device 100 is a see-through head mounted display that is mounted on a user's head 101 and has a video display unit 102 that displays a video in a partial area 103 in the user's field of view in a state where the outside world is visible.
  • the wearable device 100 includes an imaging unit 106 that captures an imaging range 105 that is a part of the visual field range of the user.
  • The size and direction of the area and the range may be arbitrary.
  • The video display unit 102 may have any configuration as long as video displayed on a liquid crystal panel or digital micromirror device mounted on the head-mounted display 100 is transmitted into the user's field of view by a predetermined optical mechanism using a lens, hologram, optical fiber, or the like, and is formed on the user's retina so that it is recognized as an image.
  • the imaging unit 106 may be a camera, for example.
  • The wearable device 100 is equipped with a touch sensor 107 that detects contact by a finger (not shown), as well as a speaker, a microphone, and the like.
  • the touch sensor 107 may be a sensor that detects a fingerprint.
  • FIG. 1B is a diagram illustrating an example of a usage pattern of the wearable device 100a in which the imaging unit 106 is rotatably mounted.
  • the imaging unit 106 can be rotated manually or by an electric signal to move the imaging range, and can switch between the imaging range 105 shown in FIG. 1A and the imaging range 108 shown in FIG. 1B.
  • the imaging unit 106 captures the user's eyes using the imaging range 108 illustrated in FIG. 1B.
  • FIG. 1C is a diagram illustrating an example of a usage pattern of the wearable device 100b in which the two imaging units 106 and 109 are mounted.
  • the imaging unit 109 images the user's eyes using the imaging range 108.
  • The wearable devices 100a and 100b may have a half mirror or the like in front of the user's eyes so that the imaging range 108 includes the half mirror, or the imaging units 106 and 109 may capture an image of the user's face.
  • FIG. 2 is a diagram showing an example of the configuration of the wearable device 100.
  • Hereinafter, the wearable devices 100a, 100b, and the like are representatively described as the wearable device 100.
  • the control unit 201 controls the wearable device 100 as a whole.
  • The video display unit 203 and the imaging unit 202 are part of the video display unit 102 and the imaging unit 106, respectively, described with reference to FIG. 1.
  • the voice input / output unit 210 is a microphone, an earphone, or the like, and the voice processing unit 211 processes a voice signal from the voice input / output unit 210.
  • the communication unit 208 connects to a network via the communication input / output unit 209 by wireless communication.
  • The wearable device 100 may acquire information by connecting directly to a base station and accessing the Internet or the like. Alternatively, as a head-mounted display including at least an imaging unit and a display control unit, the wearable device 100 may communicate by Bluetooth (registered trademark), WiFi (registered trademark), UHF, VHF, or other short-range or long-range wireless communication with information terminals in separate housings (smartphones, tablet terminals, PCs, etc.), and these information terminals may perform the main processing of the Internet connection and the like.
  • The sensing unit 207 performs various sensor processes via the sensor input/output unit 206: sensors that detect the user's posture, orientation, and movement, such as inclination and acceleration sensors; sensors that detect the user's physical state, such as line-of-sight and temperature sensors; and pressure-sensitive sensors such as the touch sensor 107 and capacitance sensors.
  • The wearable device 100 can be equipped with a plurality of such sensors, which can also be used as an input/output interface for detecting the user's instruction input.
  • the user's biometric information template collected and registered in advance is stored in the storage medium 205, and the authentication processing unit 204 compares the user template with the detection pattern to determine whether authentication is possible.
  • the security of wearable device 100 can be improved by performing user authentication using biometric authentication.
  • The biometric information used for biometric authentication may be biometric features such as palm prints, fingerprints, veins, irises, retinas, faces, palm shapes, voiceprints, brain waves, and pulse waves. Such biometric information is acquired by the biological information detection unit 291.
  • The biological information detection unit 291 may be, for example, a dedicated imaging unit provided separately from the imaging unit 202 that detects an iris or a retina, i.e., the imaging unit 109 illustrated in FIG. 1C; it may be a fingerprint detection unit using a capacitance sensor, or a detection unit that optically or electrically detects brain waves or pulse waves. However, the present invention is not limited to providing a dedicated biological information detection unit 291; the unit may be shared in whole or in part with other components, and the imaging unit 106 may be made switchable as shown in FIG. 1B.
  • When the imaging unit 202 that captures a part of the user's field of view detects biometric information such as palm prints, fingerprints, veins, and palm shapes, the biological information detection unit 291 and the imaging unit 202 are shared, and the wearable device 100 can be reduced in size and weight because there is no need to provide a detection unit dedicated to detecting fingerprints, veins, palms, and the like.
  • Similarly, when the voice input/output unit 210 detects a voiceprint, the biological information detection unit 291 and the voice input/output unit 210 are shared, and there is no need to provide a detection unit dedicated to voiceprint detection.
  • To suppress degradation of authentication accuracy due to displacement of the detection target at the time of authentication, the wearable device 100 may generate a position guide in the user's field of view by the auxiliary information generation unit 290 and display it as auxiliary information.
  • As auxiliary information for iris and retina detection, a position guide that specifies the direction of the line of sight may be displayed to suppress movement of the eyeball, and the display of the video display unit 203 may be darkened to dilate the pupil. For fingerprint detection, the scan start timing or the like may be displayed as auxiliary information.
  • As auxiliary information for voiceprint detection, a display instructing the start of utterance or a phrase to be spoken may be shown, and the timing for reading the phrase may be indicated by changing the color of the characters in order.
  • the wearable device 100 can improve the authentication accuracy by providing auxiliary information corresponding to the authentication method to the user.
  • Hereinafter, a configuration example will be described in more detail in which biometric information such as palm prints, fingerprints, and veins is detected by the imaging unit 202 that captures the user's field of view, so that the biological information detection unit 291 and the imaging unit 202 are shared, no dedicated biological information detection unit 291 is provided, and the wearable device 100 is reduced in size and weight.
  • FIG. 3 is a diagram illustrating an example of the configuration of the wearable device 100c when the biological information detection unit 291 and the imaging unit 202 are shared.
  • The wearable device 100c illustrated in FIG. 3 does not include the auxiliary information generation unit 290 and the biological information detection unit 291 illustrated in FIG. 2, but includes the position guide unit 212 and the position guide image generation unit 213; the rest of the configuration is as described with reference to FIG. 2.
  • the user holds, for example, a palm in the imaging range 105 of the imaging unit 202, and the wearable device 100c detects a palm print pattern as biometric information.
  • the authentication processing unit 204 determines whether or not authentication is possible by comparing and collating the user template recorded in the storage medium 205 with the detected pattern.
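  • The comparison-and-collation step can be illustrated with a minimal sketch. The similarity measure (cosine similarity over a feature vector) and the 0.9 threshold below are assumptions for illustration; the disclosure does not specify a particular matching algorithm.

```python
import math

def match_template(template, sample, threshold=0.9):
    """Compare a detected biometric feature vector against the stored
    template; authentication succeeds when the cosine similarity meets
    the threshold. (Illustrative only; the actual matching algorithm
    is not specified in the disclosure.)"""
    dot = sum(t * s for t, s in zip(template, sample))
    norm = (math.sqrt(sum(t * t for t in template))
            * math.sqrt(sum(s * s for s in sample)))
    return norm > 0 and dot / norm >= threshold
```

In practice the feature vectors would be derived from the palm-print image; here they stand in as plain lists of numbers.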
  • An illumination mechanism (not shown) may be mounted on the wearable device 100c to increase the amount of light on the inspection target.
  • A biometric authentication apparatus often seen as a commercial product has a mounting table for positioning the inspection target, so that the detection target can be repeatedly placed at the same position and displacement of the biometric detection target is suppressed; the pattern is then detected with the inspection target positioned on this mounting table, i.e., the positioning table.
  • In contrast, the position guide unit 212 allows the detection target to be placed at the correct position at the time of authentication without using a positioning table.
  • FIG. 4 is a diagram showing an example of position guidance.
  • the position guide unit 212 displays the position guide 110 in a partial region 103 within the user's field of view using the video display unit 203.
  • The image information of the position guide 110 is generated by the position guide image generation unit 213 and is, in this example, the contour of the palm at the time the palm print pattern was registered.
  • The video display unit 203 displays the palm contour position relative to the imaging unit 202 at the time of registration as a virtual image in the user's field of view; that is, it displays the position guide 110 as augmented reality (AR) superimposed on the real-image detection target 111.
  • The user looks at the position guide 110 displayed in AR and moves the detection target 111 (the palm in the example of FIG. 4) so that it overlaps the guide, whereby the detection target 111 can be placed at substantially the same position and inclination as at the time of registration.
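  • One possible way to decide whether the detection target overlaps the guide is to compare bounding boxes in the captured image; the intersection-over-union measure and the 0.8 threshold here are illustrative assumptions, not taken from the disclosure.

```python
def box_iou(a, b):
    """Intersection-over-union of two boxes given as (x0, y0, x1, y1)."""
    iw = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    ih = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = iw * ih
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def target_on_guide(guide_box, target_box, threshold=0.8):
    """True when the detected target's bounding box covers the position
    guide closely enough to proceed with collation."""
    return box_iou(guide_box, target_box) >= threshold
```

A real implementation might instead match the detected contour against the guide contour, but the box comparison conveys the position-judgment idea.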
  • In the above, the palm has been described as an example of the biometric detection target; however, the detection target is not limited to the palm, and the position guide is not limited to the contour of the palm.
  • 5A to 5C are diagrams showing another example of position guidance.
  • In FIG. 5A, palm prints or fingerprints are the detection target
  • the position guide 110a is a cross mark
  • the cross mark indicates the position of the center of the palm or a predetermined finger position.
  • a fingerprint is a detection target
  • the position guide 110b is a contour of a finger and indicates a position where the detection target finger is overlapped.
  • the palm pattern or the like may be a detection target
  • the position guide 110b may be the outline of a plurality of fingers, indicating the position where the plurality of detection target fingers are overlapped, and specifying the position of the palm.
  • palm prints, palm veins, and the like are detection targets
  • The position guide 110c is a representative pattern of palm prints, veins, or the like that is easy to view, rather than an outline, and indicates the position where the detection-target palm should overlap. The display format of the position guide is not limited to those described above, as long as the user can be notified of the position where the detection target should be placed.
  • Hereinafter, the position guides 110a, 110b, 110c, and the like are also representatively described as the position guide 110.
  • wearable device 100 may notify the user that the detection target has been placed at the correct position by displaying the position determination result in real time.
  • the position determination result may be displayed in characters, but by changing the display color of the position guide 110 or the like, it may be intuitively recognized by the user that the detection target is placed at the correct position.
  • Since the video display unit 203 displays the authentication result as AR, the user can check the result in real time without moving the line of sight, making effective use of the display of the wearable device 100c.
  • The authentication process is started only when necessary, triggered by a predetermined input signal of the sensor input/output unit 206 such as an acceleration sensor, proximity sensor, or touch sensor; otherwise the device remains in a standby state, so that power consumption is suppressed and long operation time can be realized in the wearable device 100c, which is battery-operated and has limited power capacity.
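  • The trigger-gated behavior could be modeled as a small state machine; the state names and the accepted trigger sources below are assumptions for illustration.

```python
class AuthTrigger:
    """Keep the authentication pipeline idle until a sensor trigger
    (e.g. acceleration, proximity, or touch) arrives, so the camera
    and matcher consume power only when actually needed."""
    TRIGGERS = {"acceleration", "proximity", "touch"}

    def __init__(self):
        self.state = "standby"

    def on_signal(self, source):
        # Only a recognized trigger source starts authentication.
        if self.state == "standby" and source in self.TRIGGERS:
            self.state = "authenticating"
        return self.state

    def on_auth_done(self, success):
        # Return to low-power standby after a failed attempt.
        self.state = "active" if success else "standby"
        return self.state
```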
  • FIG. 6 is a diagram illustrating an example of a flowchart of biometric authentication in the wearable device 100.
  • Wearable device 100 starts processing in step 301.
  • wearable apparatus 100 determines whether or not there is a trigger signal for starting authentication processing from sensor input / output unit 206.
  • the wearable device 100 detects a trigger signal in step 302
  • the wearable device 100 starts an authentication process in step 380.
  • the wearable device 100 displays auxiliary information in step 381.
  • The wearable device 100 compares the template data registered in the storage medium 205 with the detected biometric information pattern in step 307 and determines whether authentication succeeds. If the collation succeeds, the wearable device 100 displays an authentication result indicating success in step 308 and displays detailed information. If the authentication fails in step 307, the wearable device 100 displays a result indicating failure to the user in step 309 and ends the display of related information.
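  • The flow of steps 302–309 can be sketched as a function that takes the device operations as callables; the function and parameter names are hypothetical.

```python
def biometric_auth_flow(has_trigger, show_aux_info, detect_pattern, collate):
    """Sketch of the Fig. 6 flow: step 302 checks the trigger signal,
    step 381 shows auxiliary information, then step 307 collates the
    detected pattern against the registered template and branches to
    step 308 (success) or step 309 (failure)."""
    if not has_trigger():
        return "standby"            # no trigger: remain idle
    show_aux_info()                 # step 381
    pattern = detect_pattern()      # capture biometric information
    if collate(pattern):            # step 307
        return "authenticated"      # step 308: show detailed info
    return "failed"                 # step 309: notify and end display
```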
  • FIG. 7 shows an example of a flowchart of biometric authentication particularly when biometric information is detected using the imaging unit 202 and the position guide 110 is displayed as auxiliary information.
  • the wearable device 100c starts processing in step 301.
  • wearable device 100c determines the presence / absence of a trigger signal for starting authentication processing from sensor input / output unit 206.
  • In step 303, the wearable device 100c causes the imaging unit 202 to start capturing in order to begin the authentication process.
  • the wearable device 100c causes the video display unit 203 to display the position guide 110 as an AR in a partial area 103 within the user's visual field.
  • the wearable device 100c determines in step 305 whether the user has placed the detection target 111 so as to overlap the position guide 110 based on the captured image of the imaging unit 202. If the position has been correctly arranged, the wearable device 100c performs a display informing the user of the position determination result in step 306. The wearable device 100c compares the template data registered in the storage medium 205 with the biometric information pattern detected by the imaging unit 202 in step 307 and determines whether or not authentication is possible. When the authentication is successful as a result of the collation, the wearable device 100c displays an authentication result indicating that the authentication is successful in step 308, and displays detailed information. If the authentication fails in step 307, the wearable device 100c displays a result notifying the user of the failure in step 309, and ends the display of the related information.
  • The wearable device 100 includes the video display unit 203 that displays predetermined information in the user's visual field direction in a state where the outside world is visible, the storage medium 205 storing user information, the imaging unit 202, and the authentication processing unit 204 that performs user authentication by detecting the user's biometric information from an inspection target placed in the area where the display range of the video display unit 203 and the imaging range of the imaging unit 202 overlap and collating that information with a template stored in the storage medium 205. With this simple configuration, users can be authenticated with high accuracy in a small, lightweight wearable information display terminal.
  • FIG. 8 is a diagram illustrating an example of communication connection cooperation between the wearable device 100 and an external information processing device (external device).
  • the wearable device 100 communicates with external devices such as the PC 151, the smartphone 152, the server 153, and other wearable devices 154 via the Internet line 150, and information in the external device is displayed on the video display unit 203 of the wearable device 100. Can be displayed in the user's field of view.
  • communication may be performed by short-distance / far-distance wireless communication such as Bluetooth (registered trademark), WiFi (registered trademark), UHF, or VHF.
  • The user authentication in the authentication processing unit 204 is used not only for controlling availability of the wearable device 100 but also for user authentication when establishing a communication connection with an external device. This improves the convenience of the user when linking an external device with the wearable device 100, and the confidentiality of information.
  • the authentication in the communication connection cooperation shown in FIG. 8 is classified into a host authentication method and a self-authentication method depending on on which device the authentication determination position is located.
  • The host authentication method is a method in which the authentication determination takes place on the server 153 via the Internet line 150; the wearable device 100 transmits imaging data from the imaging unit 202, and collation and authentication are performed on the server 153 side.
  • The self-authentication method is a method in which the authentication determination takes place in the same housing as the imaging unit 202, i.e., within the wearable device 100, and authentication on behalf of the server 153 side is also performed in the wearable device 100.
  • the host authentication method only needs to communicate photographing data, but requires a network system using a dedicated server, and the system becomes complicated.
  • the self-authentication method does not require a network system using a dedicated server, it can be a small-scale system.
  • FIG. 9 is a diagram illustrating an example of a configuration related to authentication of the wearable device 100d and the external device 250.
  • the wearable device 100d includes the configuration of the wearable device 100c described with reference to FIG.
  • the external device 250 is the PC 151, the smart phone 152, the server 153, another wearable device 154, or the like shown in FIG.
  • the wearable device 100d performs authentication by comparing the biometric information detected by the imaging unit 202 with the registration data by the authentication processing unit 204. If the authentication is successful, the detailed functions and information of the wearable device 100d can be displayed.
  • The wearable device 100d communicates with the external device 250 via the Internet line 150 and receives an authentication request from the authentication request unit 251 of the external device 250. Upon receiving the request, the wearable device 100d notifies the external device 250 of the user authentication result. Thereby, the external device 250 can confirm the execution result of the biometric authentication in the remote wearable device 100d.
  • communication between the wearable device 100d and the external device 250 can be a system that ensures confidentiality by using, for example, public key cryptography.
  • In the wearable device 100d, the signature unit 214 signs authentication information, such as the collation result, user information, and the identification information of the authentication request from the server, with the (secret) private key, and the wearable device 100d transmits the signed authentication information to the external device 250.
  • The external device 250 verifies the validity of the user using the (public) key in the authentication unit 252.
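  • The sign-then-verify exchange can be sketched with textbook RSA. The tiny key below (n = 3233) and the hashing step are for illustration only; a real device would use a vetted cryptographic library with proper key sizes.

```python
import hashlib

# Textbook RSA toy parameters: n = 61 * 53, e * d = 1 (mod phi(n)).
# Illustration only -- never use keys this small in practice.
N, E, D = 3233, 17, 2753

def sign_auth_info(info: bytes) -> int:
    """Wearable side (signature unit 214): hash the authentication
    information and sign the digest with the private exponent d."""
    digest = int.from_bytes(hashlib.sha256(info).digest(), "big") % N
    return pow(digest, D, N)

def verify_auth_info(info: bytes, signature: int) -> bool:
    """External-device side (authentication unit 252): recover the
    digest with the public exponent e and compare."""
    digest = int.from_bytes(hashlib.sha256(info).digest(), "big") % N
    return pow(signature, E, N) == digest
```

A tampered signature fails verification because RSA exponentiation modulo n is a bijection on the residues.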
  • FIG. 10 is a diagram showing an example of a flowchart of authentication connection between the wearable device 100d and the external device 250.
  • the biometric authentication process in step 310 is steps 302 to 307 described with reference to FIG.
  • the wearable device 100d transmits a connection request to the external device 250 in step 311.
  • Wearable device 100d receives an authentication request from external device 250 in step 312.
  • In step 313, based on the collation result, the wearable device 100d signs authentication information, such as the collation result, user information, and the identification information of the authentication request from the server, with the (secret) private key.
  • In step 314, the wearable device 100d transmits the signed authentication information to the external device 250.
  • In step 315, the external device 250 verifies, using the (public) key, whether the received authentication information corresponds to the previously issued authentication request and whether the user is valid. If validity is confirmed in step 316, the external device 250 can display its internal detailed information; if not, the external device 250 ends the display of the related information in step 317.
  • FIG. 11 is a diagram showing an example of a flowchart of the template registration procedure.
  • The wearable device 100 is instructed to register biometric information via the sensing unit 207 or the voice processing unit 211; specifically, for example, the instruction may be given by a predetermined operation on the touch sensor 107 or by voice.
  • The wearable device 100 displays the position guide indicating where the detection target should be placed as AR in the user's field of view on the video display unit 203.
  • the wearable device 100 detects biometric information by the imaging unit 202.
  • The wearable device 100 registers the biometric information pattern detected in step 333 in the storage medium 205.
  • the wearable device 100 generates position guide data by the position guide image generation unit 213, registers the position guide data in the storage medium 205, and displays the position guide data as the position guide 110 at the time of user authentication.
  • The wearable device 100 determines in step 335 whether self-authentication or host authentication is used. In the case of host authentication, the wearable device 100 transmits the pattern data in step 337 to the external device that performs the authentication determination, and the biometric information is registered as the user template in association with the server user account.
  • Hereinafter, the position guide data generated in the position guide image generation unit 213 will be described in detail.
  • the contour is extracted from the palm-captured image acquired by the imaging unit 202.
  • Since the installation position of the imaging unit 202 is shifted from the position of the user's eyes, a difference occurs between the field of view of the imaging unit 202 and that of the user.
  • If the contour image acquired by the imaging unit 202 were displayed as-is in the user's field of view by the video display unit 203, it would deviate from the correct contour position, and guidance to the correct position could not be achieved.
  • When the contour is displayed on the video display unit 203, it must therefore be displayed at a scale adjusted to the size of the detection target.
  • The position guide image generation unit 213 geometrically corrects the difference in field of view and the difference in display magnification and registers the result in the storage medium 205, so that position guidance can be performed more accurately at the time of authentication and authentication accuracy can be improved.
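  • The geometric correction can be illustrated as a scale-and-offset mapping from the camera frame to the display frame. The affine model and the calibration constants are assumptions; the disclosure only states that the field-of-view difference and display magnification are corrected.

```python
def camera_to_display(points, scale, offset):
    """Map contour points from the imaging unit's coordinate frame into
    the video display unit's frame: 'scale' compensates the display
    magnification and 'offset' the parallax between the camera mounting
    position and the user's eye (both obtained by calibration)."""
    ox, oy = offset
    return [(x * scale + ox, y * scale + oy) for x, y in points]
```

A fuller implementation might use a homography for perspective differences; the linear map above captures the idea of registering corrected guide coordinates rather than raw camera coordinates.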
  • When the position guide image generation unit 213 is not used, the same fixed position guide mark may be displayed both at registration and at authentication, and the user positions the detection target relative to the displayed fixed mark. Since positional deviation between registration and authentication can still be suppressed, and the position guide image generation unit 213 need not be provided, the registration process can be simplified.
  • FIG. 12 is a diagram illustrating an example of a flowchart of a template change procedure.
  • The wearable device 100 is instructed to change the biometric information template via the sensing unit 207 or the voice processing unit 211; specifically, for example, the instruction may be given by a predetermined operation on the touch sensor 107 or by voice.
  • Next, in step 341, the wearable device 100 executes the authentication process once against the existing registered template.
  • The biometric authentication process in step 341 corresponds to steps 302 to 307 described with reference to FIG.
  • Then the registration change process starts in step 343, and the position guide indicating where the detection target should be placed is displayed as AR over the user's field of view by the video display unit 203.
  • Next, the wearable device 100 detects biometric information with the imaging unit 202.
  • In step 345, the wearable device 100 registers the detected pattern in the storage medium 205.
  • Further, the wearable device 100 generates and registers the position guide data for the position guide unit 212.
  • In step 347, the wearable device 100 determines whether self-authentication or host authentication is used. In the case of host authentication, it transmits the changed template data in step 348 to the external device that performs the authentication determination, where the user's biometric template is changed and registered. In the case of self-authentication, the wearable device 100 ends the template change process in step 349.
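The template change flow of FIG. 12 can be outlined as follows. Every callable in this Python sketch is a hypothetical stand-in for the corresponding unit (authentication processing, position guide display, imaging, storage, host communication); it shows only the control flow, not the device's actual implementation.

```python
# Outline of the FIG. 12 template change procedure. All callables are
# hypothetical stand-ins for the units described in the text.

def change_template(authenticate, show_position_guide, capture_biometric,
                    register_template, make_position_guide_data,
                    host_auth=False, send_to_host=None):
    # Step 341: authenticate once against the existing registered template.
    if not authenticate():
        return "auth_failed"
    # Step 343: display the AR position guide for the detection target.
    show_position_guide()
    # Detect biometric information and, in step 345, register the pattern.
    pattern = capture_biometric()
    register_template(pattern)
    # Generate and register position guide data for the position guide unit.
    make_position_guide_data(pattern)
    # Steps 347-348: for host authentication, send the changed template
    # data to the external device that performs the authentication decision.
    if host_auth and send_to_host is not None:
        send_to_host(pattern)
    # Step 349: end the template change process.
    return "changed"
```

Re-authenticating before any change, as the first step does, is what prevents an unenrolled wearer from overwriting the registered template.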
  • Alternatively, the biometric information detection unit 291 may be configured using a laser scanning unit.
  • FIG. 13 shows an example of the configuration of the laser scanning unit 500.
  • the laser scanning unit 500 includes a laser light source 501, a scanning mechanism 502, and a detection unit 503.
  • The scanning mechanism 502 scans the light from the laser light source 501 and irradiates a predetermined region of the detection target 504.
  • By detecting the reflected light with the detection unit 503, the three-dimensional shape and shading within the predetermined region can be measured.
  • For distance measurement, a time-of-flight method, a phase-shift method, or the like may be used.
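For illustration, the time-of-flight method infers distance from the round-trip travel time of the laser pulse: the one-way distance is the speed of light times the round-trip time, divided by two. The sketch below shows only this arithmetic, not the laser scanning unit 500 itself.

```python
# Time-of-flight ranging: distance = c * round_trip_time / 2, since the
# light travels to the target and back before reaching the detector.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds):
    """One-way distance to the target for a measured round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A 10-nanosecond round trip corresponds to roughly 1.5 metres.
distance_m = tof_distance(10e-9)
```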
  • When detection is performed by the imaging unit 202, the detection target must be illuminated.
  • In contrast, when the laser scanning unit 500 performs detection, the illumination and detection functions can be realized in a compact device. Moreover, by using an infrared wavelength as the light source, internal structures of the living body, such as veins, may also be acquired. In this way, detecting the living body with the laser scanning unit 500 allows a single laser scanning unit 500 to detect multiple types of biometric information, so authentication performance can be improved through device miniaturization and multimodal operation.
  • As described above, biometric information matching accuracy is improved and stable user authentication becomes possible. This stable user authentication can also be used in cooperative communication.
  • The two or more types of authentication content are, for example, simple authentication content and detailed authentication content: the simple authentication content improves user convenience, while the detailed authentication content ensures strong security.
  • The configuration of the wearable device 100 has already been described with reference to FIG. 2; in the second embodiment, however, the authentication processing unit 204 uses a plurality of authentication contents, so that authentication by two or more authentication contents is possible.
  • The first authentication content is simple authentication that makes a restricted subset of the functions of the wearable device 100 available, while the second authentication content is detailed authentication that enables all functions of the wearable device 100 and cooperation with external devices.
  • For authentication content such as palm prints, the imaging unit 202 must be activated and the palm aligned, which makes the authentication operation cumbersome. Therefore, simple authentication is performed using the first authentication content, which can be completed with a simpler operation, making part of the functions of the wearable device 100 available.
  • The first authentication content may be a voiceprint, a gesture, a touch operation on the touch sensor 107, or the like. That is, the wearable device 100 may perform, for example, voiceprint authentication using a short phrase, touch authentication in which predetermined areas of the touch sensor 107 are touched in a predetermined order, or gesture detection. At this time, auxiliary information is displayed to the user on the video display unit 203.
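A touch-order check of the kind just described can be sketched as comparing the sequence of touched regions against an enrolled sequence. The region labels below are hypothetical; a real device would also handle touch timing and positional tolerance.

```python
# Sketch: simple authentication by touching predetermined areas of the
# touch sensor in a predetermined order. Region names are hypothetical.

REGISTERED_SEQUENCE = ["front", "top", "rear"]  # enrolled touch order

def simple_touch_auth(observed_sequence, registered=REGISTERED_SEQUENCE):
    """Succeed only if the touched regions match the enrolled order exactly."""
    return list(observed_sequence) == list(registered)

ok = simple_touch_auth(["front", "top", "rear"])   # correct order
bad = simple_touch_auth(["top", "front", "rear"])  # wrong order
```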
  • Because a voiceprint is biometric information that can be acquired by relatively simple means such as recording, the second authentication content should, to ensure strong security, be authentication content with high authentication accuracy that uses traits difficult to impersonate, such as palm prints and veins, as described above.
  • The wearable device 100 starts processing in step 401 and determines in step 402 whether a trigger signal for starting the authentication process has been received from the sensor input/output unit 206.
  • If the wearable device 100 detects a trigger signal in step 402, it determines in step 403 whether the user's selection instruction specifies simple authentication or detailed authentication.
  • If simple authentication is selected, the wearable device 100 starts the simple authentication process in step 404.
  • The wearable device 100 displays auxiliary information for simple authentication in step 405 and, in step 406, compares the template data registered in the storage medium 205 with the detected information pattern to determine whether authentication succeeds. If the collation in step 406 succeeds, the wearable device 100 displays an authentication result indicating success in step 407 and displays the simple information. If the collation in step 406 fails, the wearable device 100 displays a result indicating the failure in step 408 and ends the display of the related information.
  • If it is determined in step 403 that detailed authentication is to be performed, the wearable device 100 starts the detailed authentication process in step 409.
  • The wearable device 100 displays auxiliary information for detailed authentication, such as the position guide, in step 410 and, in step 411, compares the template data registered in the storage medium 205 with the detected biometric information pattern to determine whether authentication succeeds.
  • If the authentication succeeds, the wearable device 100 displays an authentication result indicating success in step 412 and displays the detailed information.
  • If the authentication fails, the wearable device 100 displays a result indicating the failure in step 413 and ends the display of the related information.
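The branch at step 403 between simple and detailed authentication can be sketched as a dispatcher that selects the matcher and the information level unlocked on success. All names here are hypothetical stand-ins; the sketch mirrors steps 403 to 413 only in outline.

```python
# Sketch of the simple/detailed authentication branch (steps 403-413).
# The matcher callables and the info labels are hypothetical stand-ins.

def authenticate_user(mode, simple_match, detailed_match):
    """Return the information level unlocked, or None on failure."""
    if mode == "simple":
        # Steps 404-408: simple matching unlocks only simple information.
        return "simple_info" if simple_match() else None
    elif mode == "detailed":
        # Steps 409-413: detailed matching unlocks detailed information
        # and cooperation with external devices.
        return "detailed_info" if detailed_match() else None
    raise ValueError("unknown authentication mode: %s" % mode)

granted = authenticate_user("detailed",
                            simple_match=lambda: True,
                            detailed_match=lambda: True)
```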
  • The wearable device 100 may also enable cooperation with external devices even under simple authentication.
  • The information provided to the wearable device 100 based on the authentication result may be switched between information for simple authentication and information for detailed authentication, and simple and detailed authentication may be used selectively according to the security level of the connection destination.
  • As described above, the wearable device switches the functions it provides according to two or more types of authentication content, such as simple authentication content and detailed authentication content, thereby improving user convenience while ensuring strong security.
  • Note that the embodiments are not limited to the above description and include various modifications.
  • In the above embodiments, a head-mounted display was described as an example of the wearable device 100.
  • However, the wearable device 100 is not necessarily limited to a head-mounted display; as long as the auxiliary information can be displayed by AR, user authentication is also possible on wearable devices other than head-mounted displays.
  • The above embodiments are described in detail for ease of understanding and are not necessarily limited to configurations having all of the elements described.
  • Each of the functions and processing units described above may be realized partly or wholly in dedicated hardware, or in software, with a processor interpreting and executing a program that implements each function or processing unit.
  • The processing procedures shown in the flowcharts are examples, and the processing is not limited to these procedures.

Abstract

A wearable device comprising: a video display unit that displays a position guide to a user wearing the wearable device; an imaging unit; and an authentication processing unit that authenticates the user on the basis of biometric information captured by the imaging unit at a position corresponding to the position guide.
PCT/JP2014/066350 2014-06-19 2014-06-19 Dispositif portable sur soi et procédé d'authentification associé WO2015194017A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/066350 WO2015194017A1 (fr) 2014-06-19 2014-06-19 Dispositif portable sur soi et procédé d'authentification associé


Publications (1)

Publication Number Publication Date
WO2015194017A1 true WO2015194017A1 (fr) 2015-12-23

Family

ID=54935046

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/066350 WO2015194017A1 (fr) 2014-06-19 2014-06-19 Dispositif portable sur soi et procédé d'authentification associé

Country Status (1)

Country Link
WO (1) WO2015194017A1 (fr)


Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001273498A (ja) * 2000-03-24 2001-10-05 Matsushita Electric Ind Co Ltd バイオメトリックに基づく本人認証装置、本人認証システム、本人認証用カード及び本人認証方法
JP2003108983A (ja) * 2001-09-28 2003-04-11 Matsushita Electric Ind Co Ltd 目画像撮像装置及び虹彩認証装置並びに虹彩認証機能付き携帯端末装置
JP2005071118A (ja) * 2003-08-26 2005-03-17 Hitachi Ltd 個人認証装置及び方法
JP2008123207A (ja) * 2006-11-10 2008-05-29 Sony Corp 登録装置、照合装置、登録方法、照合方法及びプログラム
JP2009230692A (ja) * 2008-03-25 2009-10-08 Fujitsu Ltd 生体認証装置、生体情報登録装置および生体認証方法
JP2010146158A (ja) * 2008-12-17 2010-07-01 Fujitsu Ltd 生体認証装置、生体認証方法、及びコンピュータプログラム
JP2011013710A (ja) * 2009-06-30 2011-01-20 Nec Corp 生体パターン撮像装置
JP2012053629A (ja) * 2010-08-31 2012-03-15 Canon Inc 画像生成装置、画像生成方法、及びプログラム
JP2013521576A (ja) * 2010-02-28 2013-06-10 オスターハウト グループ インコーポレイテッド 対話式ヘッド取付け型アイピース上での地域広告コンテンツ
JP2014044654A (ja) * 2012-08-28 2014-03-13 Nikon Corp 情報入出力装置


Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019511021A (ja) * 2016-02-10 2019-04-18 メフォン ベンチャーズ インク.Mefon Ventures Inc. バイオメトリクスによるウェアラブルデバイスのユーザ認証又は登録方法
US11227038B2 (en) 2016-10-13 2022-01-18 Advanced New Technologies Co., Ltd. User identity authentication using virtual reality
JP2020503578A (ja) * 2016-10-13 2020-01-30 アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited 仮想現実を使用したユーザ識別認証
JP2020504348A (ja) * 2016-10-13 2020-02-06 アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited 仮想現実に基づくサービス制御およびユーザ識別認証
JP2018156479A (ja) * 2017-03-17 2018-10-04 日本電信電話株式会社 認証システム、認証装置および認証方法
JP2018180952A (ja) * 2017-04-13 2018-11-15 富士通株式会社 情報処理装置、機能制限管理方法及び機能制限管理プログラム
JP2019535025A (ja) * 2017-09-11 2019-12-05 ピン・アン・テクノロジー(シェンゼン)カンパニー リミテッドPing An Technology (Shenzhen) Co., Ltd. 声紋識別によるエージェントログイン方法、電子装置及び記憶媒体
JP2019121080A (ja) * 2017-12-28 2019-07-22 シヤチハタ株式会社 認証用画像の取得装置、認証用画像の取得システムおよび認証用画像の取得方法
JP7098131B2 (ja) 2017-12-28 2022-07-11 シヤチハタ株式会社 認証用画像の取得装置、認証用画像の取得システムおよび認証用画像の取得方法
JPWO2020105578A1 (ja) * 2018-11-21 2021-09-27 日本電気株式会社 撮像装置および撮像方法
WO2020105578A1 (fr) * 2018-11-21 2020-05-28 日本電気株式会社 Dispositif d'imagerie et procédé d'imagerie
JP7259866B2 (ja) 2018-11-21 2023-04-18 日本電気株式会社 撮像装置および撮像方法
US11699304B2 (en) 2018-11-21 2023-07-11 Nec Corporation Imaging device and imaging method
WO2021245886A1 (fr) * 2020-06-04 2021-12-09 三菱電機株式会社 Système de gestion de circulation dans un bâtiment
JP7294538B2 (ja) 2020-06-04 2023-06-20 三菱電機株式会社 建物の交通管理システム
JP7339569B2 (ja) 2020-10-30 2023-09-06 株式会社Mixi 頭部装着ディスプレイ装置、認証方法、及び認証プログラム

Similar Documents

Publication Publication Date Title
WO2015194017A1 (fr) Dispositif portable sur soi et procédé d'authentification associé
US11892710B2 (en) Eyewear device with fingerprint sensor for user input
US10341113B2 (en) Password management
US10205883B2 (en) Display control method, terminal device, and storage medium
US9503800B2 (en) Glass-type terminal and method of controlling the same
US10102676B2 (en) Information processing apparatus, display apparatus, information processing method, and program
KR101688168B1 (ko) 이동 단말기 및 그 제어방법
JP7070588B2 (ja) 生体認証装置、システム、方法およびプログラム
US20140126782A1 (en) Image display apparatus, image display method, and computer program
KR20160136013A (ko) 이동 단말기 및 그 제어 방법
JP6701631B2 (ja) 表示装置、表示装置の制御方法、表示システム、及び、プログラム
JP6789170B2 (ja) ディスプレイ装置、認証方法、及び認証プログラム
JP2018032222A (ja) 携帯機器、認証方法および認証プログラム
US20160116740A1 (en) Display device, control method for display device, display system, and computer program
JP2007159762A (ja) 生体認証装置用距離測定装置および生体認証装置
US20230308873A1 (en) Systems and methods for user authenticated devices
JP5730086B2 (ja) 入力装置および入力方法
JP6539981B2 (ja) 表示装置、表示装置の制御方法、表示システム、及び、プログラム
KR101629758B1 (ko) 글라스형 웨어러블 디바이스의 잠금해제 방법 및 프로그램
JP7045633B2 (ja) 頭部装着ディスプレイ装置、認証方法、及び認証プログラム
KR20180031240A (ko) 이동 단말기 및 그 이동 단말기의 제어 방법
KR20240014988A (ko) 생체 정보에 기반한 외부 표시 장치와의 연결 방법 및 전자 장치
KR101661556B1 (ko) 글라스형 웨어러블 디바이스를 이용한 신원 확인 방법 및 프로그램
KR20240014415A (ko) 외부 장치를 이용한 지문 탈취 방지 방법 및 전자 장치
JP2016181864A (ja) 表示装置、及び、表示装置の制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14895187

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 14895187

Country of ref document: EP

Kind code of ref document: A1