WO2016132617A1 - Information processing apparatus, information processing method, and program - Google Patents
Information processing apparatus, information processing method, and program
- Publication number
- WO2016132617A1 (PCT/JP2015/082623)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- line-of-sight
- unit
- movement
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
- G06V40/113—Recognition of static hand signs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/30—Individual registration on entry or exit not involving the use of a pass
- G07C9/32—Individual registration on entry or exit not involving the use of a pass in combination with an identity check
- G07C9/37—Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/20—Individual registration on entry or exit involving the use of a pass
- G07C9/22—Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
- G07C9/25—Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition
- G07C9/257—Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition electronically
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a program.
- Patent Document 1 discloses a technique in which an operation pattern, including movement of a user's gaze position or gaze point with respect to a plurality of images displayed on a display panel, is specified, and the user is authenticated based on the degree of coincidence between the specified operation pattern and a determination pattern registered in advance.
- However, the technique of Patent Document 1 offers low authentication security.
- For example, because the number of images that can be displayed on the display panel is limited, the number of determination patterns that can be registered is also limited.
- Moreover, the determination pattern may be identified by a third party, for example, when the movement of the user's line of sight is observed by the third party.
- the present disclosure proposes a new and improved information processing apparatus, information processing method, and program capable of realizing highly secure authentication using line-of-sight movement.
- According to the present disclosure, there is provided an information processing apparatus comprising: a line-of-sight movement specifying unit that specifies movement of the user's line of sight relative to the user's body, based on an image of the user's eyes and an image of the user's body located in the direction of the user's line of sight captured by an imaging unit; and a user authentication unit that authenticates the user based on the specified line-of-sight movement.
- According to the present disclosure, there is also provided an information processing method including: identifying movement of the user's line of sight relative to the user's body, based on an image of the user's eyes and an image of the user's body located in the direction of the user's line of sight captured by an imaging unit; and authenticating the user based on the identified movement of the line of sight.
- Further, according to the present disclosure, there is provided a program for causing a computer to function as: a line-of-sight movement specifying unit that specifies movement of the user's line of sight relative to the user's body, based on an image of the user's eyes captured by an imaging unit and an image of the user's body located in the direction of the user's line of sight; and a user authentication unit that authenticates the user based on the specified movement of the line of sight.
- FIG. 1 is an external view of an information processing apparatus 10-1 according to a first embodiment of the present disclosure. FIG. 2 is a functional block diagram showing an example configuration of the information processing apparatus 10-1 according to the embodiment. FIG. 3 is an explanatory diagram showing an example of an image of the user's eyeball according to the embodiment. FIG. 4 is an explanatory diagram showing an example of an image of the user's right hand according to the embodiment. FIG. 5 is an explanatory diagram showing an example of a series of line-of-sight movements according to the embodiment.
- A further explanatory diagram illustrates a hardware configuration common to the information processing apparatus 10-1, the information processing apparatus 10-2, and the information processing apparatus 30.
- In the present specification and drawings, a plurality of constituent elements having substantially the same functional configuration may be distinguished by appending different letters to the same reference numeral.
- For example, a plurality of configurations having substantially the same functional configuration are distinguished as necessary, such as an eyeball image 20a and an eyeball image 20b.
- However, when it is not necessary to distinguish such elements, only the same reference numeral is given.
- For example, when there is no particular need to distinguish the eyeball image 20a from the eyeball image 20b, they are simply referred to as the eyeball image 20.
- FIG. 1 is an explanatory diagram showing the appearance of the information processing apparatus 10-1.
- the information processing apparatus 10-1 is, for example, a glasses-type apparatus that can be used by being worn on the head of a user.
- the information processing apparatus 10-1 includes a first imaging unit 120, a second imaging unit 122, and a display unit 124 that displays a display screen.
- the first imaging unit 120 is an imaging unit directed to the inside of the information processing apparatus 10-1.
- the first imaging unit 120 can capture an image of one eyeball of the user as, for example, a moving image in a state where the user wears the information processing apparatus 10-1.
- the second imaging unit 122 is an imaging unit that is directed to the outside of the information processing apparatus 10-1.
- the second imaging unit 122 can capture an image of a scene in front of the user, for example, as a moving image.
- The display unit 124 is configured as a see-through display, for example. As a result, while wearing the information processing apparatus 10-1 on the head, the user can see the scene in front through the display while viewing the display screen displayed on the display unit 124.
- FIG. 1 illustrates an example in which the first imaging unit 120, the second imaging unit 122, and the display unit 124 are provided in the right lens, but the configuration is not limited to this example; they may be provided in the left lens instead.
- the information processing apparatus 10-1 according to the first embodiment has been created by focusing on the above circumstances.
- the information processing apparatus 10-1 according to the first embodiment can achieve highly secure authentication by using a line-of-sight pattern with respect to the user's body.
- FIG. 2 is a functional block diagram showing the configuration of the information processing apparatus 10-1.
- the information processing apparatus 10-1 includes a control unit 100-1, a first imaging unit 120, a second imaging unit 122, a display unit 124, a communication unit 126, and a storage unit 128.
- The control unit 100-1 controls the overall operation of the information processing apparatus 10-1 using hardware built into the apparatus, such as a CPU (Central Processing Unit) 150 and a RAM (Random Access Memory) 154, which will be described later. Further, as illustrated in FIG. 2, the control unit 100-1 includes a line-of-sight detection unit 102, a line-of-sight movement specifying unit 104, a line-of-sight pattern comparison unit 106, a user authentication unit 108, and a display control unit 110.
- The line-of-sight detection unit 102 detects the user's line-of-sight direction based on the image of the user's eyeball captured by the first imaging unit 120. For example, the line-of-sight detection unit 102 first specifies the position of the iris of the eyeball included in the image of the user's eyeball captured by the first imaging unit 120, and then detects the direction of the user's line of sight based on the specified position. More specifically, the line-of-sight detection unit 102 specifies the user's line-of-sight direction, for example, by pattern matching between eyeball-image learning data for each line-of-sight direction stored in the storage unit 128 and the eyeball image captured by the first imaging unit 120.
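As an illustration of this step, the following is a minimal sketch of direction classification by pattern matching, assuming OpenCV-style template matching and a per-direction dictionary of reference images; the function and variable names are illustrative, not from the patent.

```python
import cv2

def detect_gaze_direction(eyeball_image, learning_data):
    """Return the gaze direction whose reference (learning) image best
    matches the captured eyeball image, using normalized cross-correlation."""
    best_direction, best_score = None, -1.0
    for direction, reference in learning_data.items():  # e.g. "left", "right"
        result = cv2.matchTemplate(eyeball_image, reference, cv2.TM_CCOEFF_NORMED)
        score = float(result.max())
        if score > best_score:
            best_direction, best_score = direction, score
    return best_direction, best_score
```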
- FIG. 3 is an explanatory diagram illustrating an example of an eyeball image (eyeball image 20) captured by the first imaging unit 120.
- the line-of-sight detection unit 102 identifies the user's line-of-sight direction in the eyeball image 20a as “left” based on the eyeball image 20a and the learning data illustrated in FIG.
- Similarly, the line-of-sight detection unit 102 identifies the user's line-of-sight direction in the eyeball image 20b illustrated in FIG. 3 in the same manner.
- The line-of-sight movement specifying unit 104 specifies movement of the user's line of sight relative to the user's body, based on the user's line-of-sight direction detected by the line-of-sight detection unit 102 and the image of the user's body captured by the second imaging unit 122. For example, the line-of-sight movement specifying unit 104 first specifies the part the user is looking at by calculating the intersection (hereinafter also referred to as the viewing position) of the user's body with a half-line extending from the position of the user's eyes in the line-of-sight direction detected by the line-of-sight detection unit 102. Then, the line-of-sight movement specifying unit 104 specifies the user's line-of-sight movement with respect to the specified part.
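The viewing-position calculation can be sketched as a ray-surface intersection. Below, the body part is approximated by a plane, which is an assumption made here for illustration; the patent does not prescribe a particular surface model.

```python
import numpy as np

def viewing_position(eye_pos, gaze_dir, plane_point, plane_normal):
    """Intersect the half-line from the eye along the gaze direction with
    a plane approximating the body surface (e.g. the palm). Returns the
    3D viewing position, or None if the half-line misses the surface."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    denom = float(np.dot(plane_normal, gaze_dir))
    if abs(denom) < 1e-9:
        return None  # gaze is parallel to the surface plane
    t = float(np.dot(plane_normal, plane_point - eye_pos)) / denom
    if t < 0:
        return None  # intersection lies behind the eye, not on the half-line
    return eye_pos + t * gaze_dir
```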
- the user's line-of-sight movement includes, for example, gaze at the user's body part or line-of-sight trace.
- A line-of-sight trace is a locus drawn by the user's line of sight on a body part, and may have various shapes such as a straight line, a curve, or a circle.
- For example, when the viewing position remains at substantially the same position for a predetermined time or longer, the line-of-sight movement specifying unit 104 specifies that the user is gazing at the corresponding position.
- When the viewing position moves continuously over a body part, the line-of-sight movement specifying unit 104 specifies that the user is tracing the corresponding part from the starting point with the line of sight.
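The gaze-versus-trace distinction can be sketched as a simple dwell test over a run of timestamped viewing positions. The thresholds below are illustrative stand-ins for the "predetermined time" in the description.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    t: float  # timestamp in seconds
    x: float  # viewing position on the body (e.g. image coordinates)
    y: float

GAZE_RADIUS = 15.0  # max wander still counted as a gaze (illustrative)
GAZE_MIN_SEC = 1.0  # dwell time required to register a gaze (illustrative)

def classify_movement(samples):
    """Classify a run of viewing-position samples as a gaze (dwell within
    a small radius for a minimum time) or a trace (continuous movement)."""
    if not samples:
        return None
    x0, y0 = samples[0].x, samples[0].y
    dwells = all((s.x - x0) ** 2 + (s.y - y0) ** 2 <= GAZE_RADIUS ** 2
                 for s in samples)
    duration = samples[-1].t - samples[0].t
    if dwells and duration >= GAZE_MIN_SEC:
        return ("gaze", (x0, y0))
    return ("trace", [(s.x, s.y) for s in samples])
```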
- FIG. 4 is an explanatory diagram showing an example (right hand image 22) of the user's right hand image captured by the second imaging unit 122.
- FIG. 4 it is assumed that the user is gazing at the tip 220a of the little finger.
- the line-of-sight movement specifying unit 104 specifies that the part viewed by the user is the tip 220a of the little finger by calculating the visual position based on the line-of-sight direction of the user detected by the line-of-sight detection unit 102.
- the line-of-sight movement specifying unit 104 specifies that the user is gazing at the tip 220a of the little finger.
- FIG. 5 is an explanatory diagram showing a series of line-of-sight movements performed by the user in the posture shown in FIG. 4. Specifically, FIG. 5 shows an example in which, as indicated by the dashed arrow, the user first gazes at the tip 220a of the little finger, then traces the life line of the palm with the line of sight, and finally traces from the tip of the middle finger to its base.
- In this case, if it is detected that the viewing position moves continuously along the life line, the line-of-sight movement specifying unit 104 specifies that the user is tracing the life line. Further, if it is detected that the line-of-sight direction moves discontinuously from the upper end of the life line to the tip of the middle finger and that the viewing position then moves continuously within a predetermined time from the point after the movement, the line-of-sight movement specifying unit 104 specifies that the user is tracing the middle finger.
- In addition, when a plurality of line-of-sight movements are performed by the user, the line-of-sight movement specifying unit 104 specifies the individual line-of-sight movements in association with their order. For example, in the example illustrated in FIG. 5, the line-of-sight movement specifying unit 104 specifies the gaze at the tip 220a of the little finger as the first line-of-sight movement, the trace from the lower end to the upper end of the life line as the second line-of-sight movement, and the trace from the tip of the middle finger to its base as the third line-of-sight movement.
- the line-of-sight movement specifying unit 104 can also specify the user's line-of-sight movement with respect to the body in conjunction with a change in the posture of the user's body.
- the line-of-sight movement specifying unit 104 specifies the user's line-of-sight movement with respect to the changed posture every time the user's body posture changes.
- the posture change may be a gesture, for example.
- FIG. 6 is an explanatory diagram showing an example (gesture image 24) of the gesture image of the right hand of the user imaged by the second imaging unit 122.
- FIG. 6 shows an example in which the gesture image 24a, the gesture image 24b, and the gesture image 24c are captured in that order, that is, the user changes the shape of the right hand in the order of rock, scissors, and paper.
- In this case, the line-of-sight movement specifying unit 104 first specifies, based on the gesture image 24a, that the shape of the right hand is rock. Then, the line-of-sight movement specifying unit 104 specifies that the user is gazing at the tip 240a of the thumb while the shape of the right hand is rock.
- Next, the line-of-sight movement specifying unit 104 specifies, based on the gesture image 24b, that the shape of the right hand has changed from rock to scissors. Then, the line-of-sight movement specifying unit 104 specifies that the user is gazing at the tip 240b of the index finger while the shape of the right hand is scissors. Similarly, the line-of-sight movement specifying unit 104 specifies, based on the gesture image 24c, that the shape of the right hand has changed from scissors to paper, and specifies that the user is gazing at the tip 240c of the little finger while the shape of the right hand is paper.
- In this way, individual line-of-sight movements combined with the user's gestures can be specified.
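For illustration, the rock-scissors-paper example of FIG. 6 could be represented as an ordered list of (hand shape, gaze target) pairs; the field names below are hypothetical.

```python
# Ordered gesture-linked line-of-sight movements for the FIG. 6 example.
gesture_linked_pattern = [
    {"hand_shape": "rock", "gaze": "thumb_tip"},             # gesture image 24a
    {"hand_shape": "scissors", "gaze": "index_finger_tip"},  # gesture image 24b
    {"hand_shape": "paper", "gaze": "little_finger_tip"},    # gesture image 24c
]
```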
- The line-of-sight pattern comparison unit 106 compares the line-of-sight pattern registered in the user information DB 130 in association with the user with the line-of-sight pattern based on the movement of the user's line of sight specified by the line-of-sight movement specifying unit 104. More specifically, the line-of-sight pattern comparison unit 106 first specifies, as the line-of-sight pattern, the sequence, in movement order, of the line-of-sight movements for the parts of the user's body specified by the line-of-sight movement specifying unit 104. Then, the line-of-sight pattern comparison unit 106 compares the specified line-of-sight pattern with the line-of-sight pattern registered in the user information DB 130 to determine the degree of coincidence.
- For example, in the example illustrated in FIG. 5, the line-of-sight pattern comparison unit 106 specifies the sequence of three line-of-sight movements, that is, the gaze at the tip of the little finger, the trace from the lower end to the upper end of the life line, and the trace from the tip of the middle finger to its base, as the user's line-of-sight pattern. Then, the line-of-sight pattern comparison unit 106 determines the degree of coincidence of the line-of-sight patterns by comparing, for each order of movement, each line-of-sight movement registered in the user information DB 130 in association with the user with each line-of-sight movement included in the line-of-sight pattern specified by the line-of-sight movement specifying unit 104.
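The description does not fix a particular similarity measure or threshold, so the following is a minimal sketch in which the degree of coincidence is the fraction of positions, by movement order, where the observed and registered movements agree; MATCH_THRESHOLD stands in for the "predetermined threshold".

```python
# Each line-of-sight movement is represented here as a dict, e.g.
# {"kind": "gaze", "part": "little_finger_tip"} or
# {"kind": "trace", "part": "life_line", "from": "lower_end", "to": "upper_end"}.

MATCH_THRESHOLD = 0.8  # illustrative value for the "predetermined threshold"

def degree_of_coincidence(observed, registered):
    """Order-wise agreement between two line-of-sight patterns,
    as a fraction of the longer pattern's length."""
    if not observed or not registered:
        return 0.0
    n = max(len(observed), len(registered))
    hits = sum(1 for o, r in zip(observed, registered) if o == r)
    return hits / n

def is_valid_user(observed, registered):
    return degree_of_coincidence(observed, registered) >= MATCH_THRESHOLD
```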
- the user information DB 130 is a database in which line-of-sight patterns are recorded in association with users.
- a configuration example of the user information DB 130 will be described with reference to FIG.
- In the user information DB 130, a user ID 1300, a target part / posture 1302, and a line-of-sight pattern 1304 are associated with each other.
- Here, the line-of-sight pattern 1304 includes line-of-sight movements 1306 in each order, such as a first line-of-sight movement 1306a and a second line-of-sight movement 1306b.
- In the user ID 1300, a user ID issued in advance for each user is recorded.
- In the target part / posture 1302, the part, and the posture of the part, for which the line-of-sight pattern is registered in association with the user of the corresponding user ID are recorded.
- In the line-of-sight pattern 1304, the line-of-sight pattern registered in association with the user of the corresponding user ID and the corresponding part is recorded.
- In the line-of-sight movements 1306, the contents of each order of line-of-sight movement included in the registered line-of-sight pattern are recorded.
- For example, the first record shown in FIG. 7 shows a registered line-of-sight pattern for the right hand, in a state where the right hand is open, for the user whose user ID is "0001".
- The first line-of-sight movement included in this line-of-sight pattern is "gaze at the tip of the little finger",
- the second line-of-sight movement is "trace from the lower end to the upper end of the life line",
- and the third line-of-sight movement is "trace from the tip of the middle finger to its base".
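A record with this structure might be represented as follows; the field names mirror the reference numerals of FIG. 7 and are otherwise hypothetical.

```python
# One record of the user information DB 130, following the FIG. 7 example.
user_info_db = [
    {
        "user_id": "0001",                          # user ID 1300
        "target_part_posture": "right hand, open",  # target part / posture 1302
        "line_of_sight_pattern": [                  # line-of-sight pattern 1304
            {"kind": "gaze", "part": "little_finger_tip"},             # 1306a
            {"kind": "trace", "part": "life_line",
             "from": "lower_end", "to": "upper_end"},                  # 1306b
            {"kind": "trace", "part": "middle_finger",
             "from": "tip", "to": "base"},                             # third
        ],
    },
]
```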
- The user authentication unit 108 authenticates the user based on the comparison result from the line-of-sight pattern comparison unit 106. More specifically, the user authentication unit 108 determines that the user is a valid user when the degree of coincidence determined by the line-of-sight pattern comparison unit 106 is equal to or greater than a predetermined threshold, and determines that the user is not a valid user when the determined degree of coincidence is less than the predetermined threshold.
- The user authentication unit 108 can also cause the communication unit 126, described later, to transmit the authentication result to another device.
- Each time the movement of the line of sight is identified by the line-of-sight movement specifying unit 104, the display control unit 110 sequentially displays a representation of the identified movement on the display unit 124, described later. For example, in the example illustrated in FIG. 5, when the line-of-sight movement specifying unit 104 confirms that the user is gazing at the tip 220a of the little finger, the display control unit 110 superimposes a display indicating the gaze position on the display unit 124 at the tip 220a of the little finger. Similarly, each time the line-of-sight movement specifying unit 104 confirms that the user is tracing from the lower end of the life line, the display control unit 110 additionally superimposes the newly confirmed portion of the line on the display unit 124 (for example, as shown by the trace 220b).
- the user can move the line of sight while sequentially confirming the line-of-sight movement recognized by the information processing apparatus 10-1 at the time of authentication. For this reason, the user can perform the line-of-sight movement with respect to the body more easily.
- the first imaging unit 120 is an example of a first imaging unit in the present disclosure.
- the first imaging unit 120 captures an image of the user's eyeball.
- the second imaging unit 122 is an example of a second imaging unit in the present disclosure.
- the second imaging unit 122 captures an image of the user's body located in the user's line-of-sight direction.
- the display unit 124 displays various display screens under the control of the display control unit 110.
- the communication unit 126 transmits and receives information to and from another device (not shown) that can communicate with the information processing device 10-1 by wireless communication, for example.
- the communication unit 126 receives the user ID of the user wearing the information processing apparatus 10-1 from another apparatus.
- the storage unit 128 can store, for example, various data and software such as the user information DB 130 and the user ID.
- the configuration of the information processing apparatus 10-1 according to the first embodiment is not limited to the configuration described above.
- the user information DB 130 may be stored in another device that can communicate with the information processing device 10-1.
- the information processing apparatus 10-1 may further include an input unit for the user to input various information such as a user ID to the information processing apparatus 10-1.
- the input unit may be capable of receiving, for example, an input based on the line of sight, that is, an input based on the line-of-sight direction detected by the line-of-sight detection unit 102.
- FIG. 8 shows an image of the user's left arm and left hand (left arm image 26) captured by the second imaging unit 122.
- FIG. 9 is an explanatory diagram showing a series of line-of-sight movements performed by the user in the posture shown in FIG. 8. Specifically, FIG. 9 shows an example in which, as indicated by the dashed arrow, the user first gazes at the elbow 262a of the left arm, then gazes at the second joint 262b of the index finger, and finally gazes at a mole 262c on the forearm.
- In this case, the line-of-sight movement specifying unit 104 first specifies the gaze at the elbow 262a of the left arm as the first line-of-sight movement, based on the user's line-of-sight direction detected by the line-of-sight detection unit 102 and the image of the user's body captured by the second imaging unit 122.
- Next, if it is detected that the line-of-sight direction moves discontinuously from the elbow 262a to the second joint 262b of the index finger and that the user looks at the position after the movement for a predetermined time or longer, the line-of-sight movement specifying unit 104 specifies the gaze at the second joint 262b of the index finger as the second line-of-sight movement.
- Furthermore, if it is detected that the line-of-sight direction moves discontinuously from the second joint 262b of the index finger to the mole 262c on the forearm and that the user looks at the position after the movement for a predetermined time or longer, the line-of-sight movement specifying unit 104 specifies the gaze at the mole 262c on the forearm as the third line-of-sight movement.
- Thereafter, the line-of-sight pattern comparison unit 106 specifies the sequence of the line-of-sight movements specified by the line-of-sight movement specifying unit 104 as the user's line-of-sight pattern. Then, the line-of-sight pattern comparison unit 106 determines the degree of coincidence of the line-of-sight patterns by comparing, for each order of movement, each line-of-sight movement included in the line-of-sight pattern registered in the user information DB 130 in association with the user with each line-of-sight movement included in the specified line-of-sight pattern.
- the user authentication unit 108 determines that the user is a valid user when the degree of coincidence determined by the line-of-sight pattern comparison unit 106 is equal to or greater than a predetermined threshold. Further, the user authentication unit 108 determines that the user is not a valid user when the determined degree of matching is less than a predetermined threshold.
- FIG. 10 is a flowchart showing a part of the operation according to the first embodiment.
- the communication unit 126 of the information processing apparatus 10-1 receives a user ID from another apparatus (S101).
- the control unit 100-1 may extract the user ID from the storage unit 128.
- the line-of-sight movement specifying unit 104 extracts the line-of-sight pattern registered in the user information DB 130 in association with the user ID acquired in S101 from the user information DB 130 (S103).
- the line-of-sight movement specifying unit 104 sets “1” to the variable i indicating the order of the line-of-sight movement (S105). Then, the line-of-sight movement specifying unit 104 sets the number of line-of-sight movements included in the line-of-sight pattern extracted in S103 as a constant N indicating the total number of line-of-sight movements (S107).
- the line-of-sight movement identifying unit 104 determines whether or not the value of i is N or less (S109). When i is N or less (S109: Yes), the line-of-sight movement identifying unit 104 determines whether or not a change in the user's line-of-sight direction is detected by the line-of-sight detection unit 102 (S111). When a change in the user's line-of-sight direction has not been detected (S111: No), the line-of-sight movement identifying unit 104 repeats the operation of S111.
- When a change in the user's line-of-sight direction is detected (S111: Yes), the line-of-sight movement specifying unit 104 specifies the body part the user is looking at, based on the detected line-of-sight direction after the movement (S113).
- the line-of-sight movement specifying unit 104 specifies the user's line-of-sight movement with respect to the body part specified in S113 (S115).
- the line-of-sight movement identification unit 104 records the line-of-sight movement identified in S115 in the storage unit 128 as the i-th line-of-sight movement (S117).
- the line-of-sight movement identifying unit 104 adds “1” to i (S119). Thereafter, the line-of-sight movement identifying unit 104 repeats the operation of S109 again.
- On the other hand, when i exceeds N (S109: No), the line-of-sight pattern comparison unit 106 specifies the sequence of line-of-sight movements recorded in the storage unit 128 in S117 as the line-of-sight pattern (S121).
- Next, the line-of-sight pattern comparison unit 106 compares the line-of-sight pattern specified in S121 with the line-of-sight pattern extracted in S103 and determines the degree of coincidence (S123). If the determined degree of coincidence is equal to or greater than a predetermined threshold (S123: Yes), the user authentication unit 108 determines that the user is a valid user (S125). On the other hand, if the determined degree of coincidence is less than the predetermined threshold (S123: No), the user authentication unit 108 determines that the user is not a valid user (S127).
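Putting S101-S127 together, the flow might be sketched as below, reusing degree_of_coincidence and MATCH_THRESHOLD from the earlier sketch; observed_movements stands in for the live loop of S111-S115 (detection, part specification, and movement specification), so that parameter and the record lookup are assumptions made for illustration.

```python
def run_authentication(user_id, observed_movements, user_info_db):
    """Compact sketch of the S101-S127 flow of FIGS. 10 and 11."""
    registered = next(rec["line_of_sight_pattern"]
                      for rec in user_info_db
                      if rec["user_id"] == user_id)             # S101/S103
    n = len(registered)                                         # S107
    observed = []
    for i, movement in enumerate(observed_movements, start=1):  # S105/S119
        if i > n:                                               # S109: No
            break
        observed.append(movement)                               # S111-S117
    pattern = observed                                          # S121
    coincidence = degree_of_coincidence(pattern, registered)    # S123
    return coincidence >= MATCH_THRESHOLD                       # S125/S127
```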
- As described above with reference to, for example, FIGS. 2, 10, and 11, the information processing apparatus 10-1 according to the first embodiment identifies movement of the user's line of sight relative to the user's body based on the image of the user's eyeball captured by the first imaging unit 120 and the image of the user's body located in the user's line-of-sight direction captured by the second imaging unit 122, and authenticates the user based on the identified movement of the line of sight.
- Therefore, the information processing apparatus 10-1 can realize highly secure authentication using line-of-sight movement. For example, even if a third party observes the movement of the user's line of sight during authentication, it is difficult for the third party to completely identify the line-of-sight pattern. Accordingly, since the line-of-sight pattern is not easily stolen by another person, highly secure and robust user authentication is possible.
- the user can freely select a target part of the line-of-sight motion from a very large number of body parts. Also, the user can arbitrarily change the line-of-sight pattern as desired.
- << 2. Second Embodiment >> The first embodiment has been described above. Next, a second embodiment will be described. First, the background that led to the creation of the second embodiment will be described.
- Physical characteristics such as the lengths of the arms and fingers, or the positions and shapes of moles, wrinkles, and blood vessel patterns, are unique to each individual. For this reason, physical features are easy for the user himself or herself to identify and difficult to forget, whereas for others they are difficult to identify and memorize.
- Therefore, if a line-of-sight pattern based on such physical features is used, the risk that the pattern is identified by a third party is very small.
- In other words, such a gaze pattern is even more difficult for others to steal.
- The basic configuration of the information processing apparatus 10-2 according to the second embodiment is the same as that of the first embodiment shown in FIG. 1. FIG. 12 is a functional block diagram showing the configuration of the information processing apparatus 10-2 according to the second embodiment.
- the information processing apparatus 10-2 includes a control unit 100-2 instead of the control unit 100-1 as compared with the information processing apparatus 10-1 according to the first embodiment.
- description of functions that are the same as those in the first embodiment will be omitted.
- The control unit 100-2 further includes a physical feature detection unit 112, as compared with the control unit 100-1.
- the physical feature detection unit 112 detects a physical feature related to a part of the user's body based on the image of the user's body located in the user's line of sight imaged by the second imaging unit 122.
- For example, the physical feature detection unit 112 first specifies the distance from the position of the user's eyes to the body part the user is looking at, based on the gaze direction detected by the line-of-sight detection unit 102. More specifically, the physical feature detection unit 112 specifies the distance to the part the user is looking at, based on, for example, a measurement result from a distance sensor (not shown) included in the information processing apparatus 10-2 or a captured image from the second imaging unit 122.
- Alternatively, when the user stands at a pre-registered position, for example by taking a pre-registered posture within a predetermined dedicated apparatus, the physical feature detection unit 112 specifies a distance measured in advance (a fixed distance) as the distance to the part the user is looking at.
- Next, the physical feature detection unit 112 detects the relative length of the part and relative positions within the part, based on the image of the part of the user's body captured by the second imaging unit 122 and the specified distance to that part. More specifically, the physical feature detection unit 112 first calculates a provisional length of the part, and provisional positions of other locations in the part relative to a specific starting location, based on the captured image of the part. Then, the physical feature detection unit 112 calculates the relative length of the part and the relative positions of the other locations by normalizing the calculated provisional length and provisional positions, based on the ratio between the specified distance to the body part and the distance from the position of the user's eyes to that body part at the time the line-of-sight pattern was registered.
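The normalization step can be sketched as a single scale factor, assuming the apparent (pixel) size of a body part scales inversely with its distance from the eyes; the names and units below are illustrative.

```python
def normalize_measurement(provisional_px, current_distance_m, registered_distance_m):
    """Rescale a provisional pixel measurement so it is comparable with
    measurements taken at registration time: a part twice as far away
    appears half as large, so multiply by the distance ratio."""
    return provisional_px * (current_distance_m / registered_distance_m)

# Example: a finger measured as 80 px at 0.40 m, registered at 0.30 m,
# normalizes to 80 * (0.40 / 0.30) = 106.7 px in registration-time units.
normalized_length = normalize_measurement(80.0, 0.40, 0.30)
```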
- For example, the physical feature detection unit 112 detects the shape, the relative length, or the relative position, relative to a specific location, of a feature such as a mole, a wrinkle, or a blood vessel pattern included in the part.
- The specific location may be determined, for example, as the location that the user first looked at on the part.
- In the example shown in FIG. 4, the physical feature detection unit 112 first specifies the distance from the position of the user's eyes to the right hand. Then, based on the right hand image 22 and the specified distance, the physical feature detection unit 112 detects features relating to the palm lines of the right hand and to each finger of the right hand. More specifically, the physical feature detection unit 112 detects, for example, the shape, relative length, or relative position of each palm line, such as the heart line, the head line, and the life line. The physical feature detection unit 112 also detects the relative length of each finger and the relative position of each finger with respect to a specific location.
- The line-of-sight movement specifying unit 104 according to the second embodiment specifies movement of the user's line of sight relative to the user's body, based on the user's line-of-sight direction detected by the line-of-sight detection unit 102, the image of the user's body captured by the second imaging unit 122, and the physical features detected by the physical feature detection unit 112.
- For example, in the example shown in FIG. 5, the line-of-sight movement specifying unit 104 specifies, as the second line-of-sight movement, information including the relative position, relative length, or shape of the line traced by the user's line of sight on the palm (that is, the life line), based on the features of each palm line detected by the physical feature detection unit 112.
- Further, the line-of-sight movement specifying unit 104 specifies, as the third line-of-sight movement, information including the relative position and relative length of the line traced by the user's line of sight on the middle finger (that is, from the tip of the middle finger to its base), based on the features of the middle finger detected by the physical feature detection unit 112.
- In the example shown in FIG. 6, the line-of-sight movement specifying unit 104 specifies, as the line-of-sight movement of each order, information including the relative position, with respect to a specific location, of the place 240 at which the user gazes for each shape of the right hand, based on the physical features of the right hand detected by the physical feature detection unit 112.
- Here, the specific location may be the gaze position of the first line-of-sight movement (that is, the tip 240a of the thumb when the shape of the right hand is rock).
- In this case, the line-of-sight movement specifying unit 104 specifies, for example, the user's gaze position when the shape of the right hand is scissors as a position relative to the user's gaze position when the shape of the right hand is rock.
- In this application example, the physical feature detection unit 112 first specifies the distance from the position of the user's eyes to the left arm or the left hand that the user is looking at, based on the gaze direction detected by the line-of-sight detection unit 102.
- Next, the physical feature detection unit 112 detects physical features relating to the left arm or the left hand, based on the left arm image 26 shown in FIG. 8 and the specified distance to the left arm or the left hand. For example, the physical feature detection unit 112 specifies the position of the mole included in the forearm.
- Then, based on the user's line-of-sight direction detected by the line-of-sight detection unit 102 and the physical features detected by the physical feature detection unit 112, the line-of-sight movement specifying unit 104 specifies the relative position, with respect to a specific location, of the part the user gazes at first (the elbow 262a of the left arm), and specifies information including the specified position as the first line-of-sight movement.
- Next, when it is detected that the line-of-sight direction moves discontinuously from the elbow 262a of the left arm to the second joint 262b of the index finger and that the user looks at the position after the movement for a predetermined time or longer, the line-of-sight movement specifying unit 104 specifies the relative position, with respect to a specific location, of the part the user is currently looking at (that is, the second joint 262b of the index finger), and specifies information including the specified position as the second line-of-sight movement.
- Note that the line-of-sight movement specifying unit 104 may specify the relative position of the second joint 262b of the index finger based on, for example, the relative distance to the second joint 262b from the gaze position of the first line-of-sight movement (the elbow 262a of the left arm) and the relative distance to the second joint 262b from the position of the mole.
- Similarly, the line-of-sight movement specifying unit 104 specifies information including the relative position of the mole 262c on the forearm as the third line-of-sight movement.
- Thereafter, the line-of-sight pattern comparison unit 106 specifies the sequence of the line-of-sight movements specified by the line-of-sight movement specifying unit 104 as the user's line-of-sight pattern. Then, the line-of-sight pattern comparison unit 106 determines the degree of coincidence of the line-of-sight patterns by comparing, for each order of movement, each line-of-sight movement included in the line-of-sight pattern registered in the user information DB 130 in association with the user with each line-of-sight movement included in the specified line-of-sight pattern.
- the user authentication unit 108 determines that the user is a valid user when the degree of coincidence determined by the line-of-sight pattern comparison unit 106 is equal to or greater than a predetermined threshold. Further, the user authentication unit 108 determines that the user is not a valid user when the determined degree of matching is less than a predetermined threshold.
- In the operation according to the second embodiment, the physical feature detection unit 112 detects physical features relating to the body part the user is looking at, based on the line-of-sight direction after the movement detected by the line-of-sight detection unit 102 and the image of the user's body captured by the second imaging unit 122.
- Then, the line-of-sight movement specifying unit 104 specifies the user's line-of-sight movement with respect to the body part specified in S113, based on the user's line-of-sight direction detected by the line-of-sight detection unit 102 and the physical features detected in S113.
- As described above, the information processing apparatus 10-2 according to the second embodiment detects physical features relating to a part of the user's body based on the image of the user's body located in the user's line-of-sight direction captured by the second imaging unit 122, identifies movement of the user's line of sight relative to the user's body based on the image of the user's eyes captured by the first imaging unit 120 and the detected physical features, and authenticates the user based on the identified line-of-sight movement.
- Therefore, the information processing apparatus 10-2 can realize even more secure authentication by using a line-of-sight pattern based on physical features. For example, even if a third party observes the movement of the user's line of sight during authentication, the physical features are unique to the individual and difficult for the third party to identify, so it is extremely difficult for the third party to completely identify the line-of-sight pattern. Accordingly, a line-of-sight pattern based on physical features is even more difficult for another person to steal, enabling even more secure and robust user authentication.
- In addition, with the authentication method according to the second embodiment, the line-of-sight pattern can be easily changed. Therefore, even if the line-of-sight pattern leaks, there is the advantage that the damage can be kept small.
- The information processing apparatus 10-1 according to the first embodiment and the information processing apparatus 10-2 according to the second embodiment are assumed to include two types of imaging units: the first imaging unit 120, which captures an image of the user's eyeball, and the second imaging unit 122, which captures an image of the user's body.
- In contrast, according to the third embodiment, it is possible to identify movement of the user's line of sight relative to the body and to authenticate the user with an apparatus having only one type of imaging unit.
- FIG. 13 is an explanatory diagram showing the appearance of the information processing apparatus 30.
- the information processing apparatus 30 includes an imaging unit 320 that can image the eyeball of the user 2 and a part of the body of the user 2 such as an arm, for example.
- This information processing apparatus 30 may be installed, for example, at the entrance of a house.
- the information processing apparatus 30 may be configured as a part of an authentication system that performs user authentication for the purpose of unlocking the front door.
- FIG. 14 is a functional block diagram showing the configuration of the information processing apparatus 30 according to the third embodiment.
- the information processing apparatus 30 includes a control unit 300, an imaging unit 320, an output unit 322, a communication unit 324, and a storage unit 326.
- description of functions overlapping with those of the second embodiment is omitted.
- the control unit 300 generally controls the operation of the information processing apparatus 30. As illustrated in FIG. 14, the control unit 300 includes a line-of-sight detection unit 302, a line-of-sight movement specifying unit 304, a line-of-sight pattern comparison unit 306, a user authentication unit 308, an output control unit 310, and a physical feature detection unit 312.
- the line-of-sight detection unit 302 detects the user's line-of-sight direction based on the image of the user's eyeball imaged by the imaging unit 320 described later.
- the specific functions are substantially the same as those of the line-of-sight detection unit 102 according to the second embodiment.
- the physical feature detection unit 312 detects a physical feature related to a part of the user's body based on the image of the user's body located in the user's line of sight imaged by the imaging unit 320.
- the specific function is substantially the same as that of the physical feature detection unit 112 according to the second embodiment.
- The line-of-sight movement specifying unit 304 specifies movement of the user's line of sight relative to the user's body, based on the user's line-of-sight direction detected by the line-of-sight detection unit 302, the image of the user's body captured by the imaging unit 320, and the physical features detected by the physical feature detection unit 312.
- the specific functions are substantially the same as those of the line-of-sight movement specifying unit 104 according to the second embodiment.
- the output control unit 310 causes the output unit 322 to be described later to output the authentication result by the user authentication unit 308. For example, the output control unit 310 causes the output unit 322 to output a sound notifying the authentication result.
- The imaging unit 320, for example, simultaneously captures an image of the user's eyeball and an image of the user's body located in the user's line-of-sight direction.
- the output unit 322 outputs sound or video according to the control of the output control unit 310.
- The line-of-sight pattern comparison unit 306, the user authentication unit 308, the communication unit 324, and the storage unit 326 are the same as the line-of-sight pattern comparison unit 106, the user authentication unit 108, the communication unit 126, and the storage unit 128 according to the second embodiment, respectively.
- the configuration of the information processing apparatus 30 according to the third embodiment has been described above, but is not limited to the configuration described above.
- the information processing apparatus 30 may not include one or more of the output control unit 310, the output unit 322, or the communication unit 324.
- As described above, the information processing apparatus 30 according to the third embodiment detects physical features relating to a part of the user's body based on the image of the user's body located in the user's line-of-sight direction captured by the imaging unit 320, identifies movement of the user's line of sight relative to the user's body based on the image of the user's eyeball captured by the imaging unit 320 and the detected physical features, and authenticates the user based on the identified line-of-sight movement. For this reason, substantially the same functions and effects as in the second embodiment can be realized with a single imaging unit.
- The information processing apparatus 10-1 (or the information processing apparatus 10-2 or the information processing apparatus 30) includes a CPU 150, a ROM (Read Only Memory) 152, a RAM 154, an internal bus 156, an interface 158, an output device 160, a camera 162, a storage device 164, and a communication device 166.
- the CPU 150 functions as an arithmetic processing device and a control device.
- the CPU 150 realizes the function of the control unit 100-1 (or the control unit 100-2 or the control unit 300).
- the CPU 150 is configured by a processor such as a microprocessor.
- the ROM 152 stores programs used by the CPU 150, control data such as calculation parameters, and the like.
- the RAM 154 temporarily stores a program executed by the CPU 150, for example.
- the internal bus 156 is composed of a CPU bus and the like.
- the internal bus 156 connects the CPU 150, the ROM 152, and the RAM 154 to each other.
- the interface 158 connects the output device 160, the camera 162, the storage device 164, and the communication device 166 with the internal bus 156.
- the storage device 164 exchanges data with the CPU 150 via the interface 158 and the internal bus 156.
- the output device 160 includes a display device such as a liquid crystal display (LCD) device, an organic light emitting diode (OLED) device, and a lamp. This display device displays an image generated by the CPU 150 and the like.
- the output device 160 includes an audio output device such as a speaker. This audio output device converts audio data or the like into sound and outputs the sound.
- the output device 160 functions as the display unit 124 or the output unit 322.
- the camera 162 has a function of imaging an external image through a lens on an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor), and taking a still image or a moving image.
- the camera 162 functions as the first imaging unit 120, the second imaging unit 122, or the imaging unit 320.
- the storage device 164 is a data storage device that functions as the storage unit 128 or the storage unit 326.
- the storage device 164 includes, for example, a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, or a deletion device that deletes data recorded on the storage medium.
- the communication device 166 is a communication interface including a communication device for connecting to a communication network such as a public network or the Internet.
- the communication device 166 may be a wireless LAN compatible communication device, an LTE (Long Term Evolution) compatible communication device, or a wire communication device that performs wired communication.
- the communication device 166 functions as the communication unit 126 or the communication unit 324.
- As a modification, the information processing apparatus 30 may compare the line-of-sight pattern based on the movement of the user's line of sight specified by the line-of-sight movement specifying unit 304 with the line-of-sight pattern of each user belonging to a specific group (or of every user) registered in the user information DB 130. The information processing apparatus 30 can then identify the user with the highest determined degree of coincidence as the target person.
- According to this modification, it becomes possible, for example, to specify which member of the family the imaged person is, even if no user ID is input to the information processing apparatus 30.
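This one-to-many identification could be sketched as a best-match search over the registered records, reusing degree_of_coincidence and MATCH_THRESHOLD from the earlier sketch; returning None when no record matches closely enough is an assumption made here.

```python
def identify_user(observed_pattern, group_records):
    """Return the user_id of the registered record whose line-of-sight
    pattern best matches the observed one, or None if no record reaches
    the threshold."""
    best_id, best_score = None, 0.0
    for rec in group_records:
        score = degree_of_coincidence(observed_pattern,
                                      rec["line_of_sight_pattern"])
        if score > best_score:
            best_id, best_score = rec["user_id"], score
    return best_id if best_score >= MATCH_THRESHOLD else None
```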
- It is also possible to provide a computer program for causing hardware such as the CPU 150, the ROM 152, and the RAM 154 to exhibit functions equivalent to those of the information processing apparatus 10-1, the information processing apparatus 10-2, or the information processing apparatus 30 described above. A recording medium on which the computer program is recorded is also provided.
- (1) An information processing apparatus including: a line-of-sight movement identifying unit that identifies movement of the user's line of sight relative to the user's body based on an image of the user's eyes and an image of the user's body located in the direction of the user's line of sight captured by an imaging unit; and a user authentication unit that authenticates the user based on the identified movement of the line of sight.
- (2) The information processing apparatus according to (1), further including a line-of-sight pattern comparison unit that compares a line-of-sight pattern registered in a database in association with the user with a line-of-sight pattern based on the line-of-sight movement identified by the line-of-sight movement identifying unit, wherein the user authentication unit authenticates the user based on a comparison result from the line-of-sight pattern comparison unit.
- The information processing apparatus according to (2) or (3), further including a physical feature detection unit that detects a physical feature relating to a part of the user's body based on the image of the user's body, wherein the line-of-sight movement identifying unit identifies the movement of the user's line of sight relative to the user's body based on the image of the user's eyes and the detected physical feature.
- In the information processing apparatus, the line-of-sight pattern registered in the database is a line-of-sight movement pattern for one or more parts of the user's body.
- the line-of-sight pattern registered in the database is a line-of-sight movement pattern for each of one or more parts of the user's body associated with the order of movement of the line of sight. .
- the line-of-sight pattern registered in the database is a line-of-sight movement pattern for each of the one or more parts of the user's body associated with the order of change in the posture of the user's body.
- In the information processing apparatus, the movement of the line of sight includes gaze at a body part of the user or a line-of-sight trace.
- The information processing apparatus according to any one of (4) to (8), wherein the physical feature includes the position or shape of a mole, a wrinkle, or a blood vessel pattern.
- The information processing apparatus according to any one of (1) to (9), further including a display control unit that sequentially displays on the display unit a display indicating the identified movement of the line of sight each time the movement of the line of sight is identified by the line-of-sight movement identifying unit.
- (11) The information processing apparatus according to (10), further including the display unit.
- The information processing apparatus according to any one of (1) to (11), wherein the imaging unit includes a first imaging unit that captures an image of the user's eyes and a second imaging unit that captures an image of the user's body located in the user's line-of-sight direction.
- (13) The information processing apparatus according to any one of (1) to (12), wherein the information processing apparatus is a glasses-type apparatus.
- (14) An information processing method including: identifying movement of the user's line of sight relative to the user's body based on an image of the user's eyes and an image of the user's body located in the direction of the user's line of sight captured by an imaging unit; and authenticating the user based on the identified movement of the line of sight.
- (15) A program for causing a computer to function as: a line-of-sight movement identifying unit that identifies movement of the user's line of sight relative to the user's body based on an image of the user's eyes and an image of the user's body located in the direction of the user's line of sight captured by an imaging unit; and a user authentication unit that authenticates the user based on the identified movement of the line of sight.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Multimedia (AREA)
- Computer Security & Cryptography (AREA)
- Software Systems (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
- Collating Specific Patterns (AREA)
Description
1. First Embodiment
2. Second Embodiment
3. Third Embodiment
4. Hardware Configuration
5. Modifications
<1-1. Basic Configuration of the Information Processing Apparatus 10-1>
[1-1-1. Basic Configuration]
まず、第1の実施形態について説明する。最初に、第1の実施形態による情報処理装置10‐1の基本構成について、図1を参照して説明する。
以上、第1の実施形態による情報処理装置10‐1の基本構成について説明した。ところで、従来、表示画面に表示された複数の画像に対するユーザの視線の変化の検出結果を用いてユーザを認証する技術が提案されている。しかしながら、公知の技術では、認証の安全性が低い。例えば、上記の技術では、表示画面に表示可能な画像数が制約されるので、判定パターンの登録可能な数が制約される。このため、ユーザの認証時に、例えば第三者にユーザの視線の動きを盗み見されることにより、判定パターンが第三者に特定される恐れがある。
次に、情報処理装置10‐1の構成について詳細に説明する。図2は、情報処理装置10‐1の構成を示した機能ブロック図である。図2に示したように、情報処理装置10‐1は、制御部100‐1、第1撮像部120、第2撮像部122、表示部124、通信部126、および記憶部128を有する。
制御部100‐1は、情報処理装置10‐1に内蔵される、後述するCPU(Central Processing Unit)150、RAM(Random Access Memory)154などのハードウェアを用いて、情報処理装置10‐1の動作を全般的に制御する。また、図2に示したように、制御部100‐1は、視線検出部102、視線移動特定部104、視線パターン比較部106、ユーザ認証部108、および、表示制御部110を有する。
視線検出部102は、第1撮像部120により撮像されたユーザの眼球の画像に基づいて、ユーザの視線方向を検出する。例えば、視線検出部102は、まず、第1撮像部120により撮像されたユーザの眼球の画像に含まれる眼球の黒目の位置を特定し、そして、特定した位置に基づいて、ユーザの視線方向を検出する。より具体的には、視線検出部102は、例えば記憶部128に記憶されている、視線方向ごとの眼球の画像の学習データと、第1撮像部120により撮像された眼球の画像とをパターンマッチングすることにより、ユーザの視線方向を特定する。
(1-2-3-1. Identification example 1)
The line-of-sight movement identifying unit 104 identifies the movement of the user's line of sight relative to the user's body, based on the line-of-sight direction detected by the line-of-sight detection unit 102 and the image of the user's body captured by the second imaging unit 122. For example, the line-of-sight movement identifying unit 104 first identifies the body part the user is looking at by calculating the intersection (hereinafter also referred to as the viewing position) between a half-line extended from the position of the user's eye in the detected line-of-sight direction and a part of the user's body. The line-of-sight movement identifying unit 104 then identifies the user's line-of-sight action with respect to the identified part.
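The viewing-position calculation can be pictured as a ray intersection. The minimal sketch below locally approximates the viewed body part by a plane in 3D camera coordinates; the plane approximation and the coordinate conventions are assumptions made here, not details fixed by this disclosure.

```python
import numpy as np

def viewing_position(eye_pos, gaze_dir, plane_point, plane_normal):
    """Intersect the half-line from the eye with a plane that locally
    approximates a body part; return the 3D viewing position or None."""
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    denom = float(np.dot(plane_normal, gaze_dir))
    if abs(denom) < 1e-9:
        return None  # line of sight runs parallel to the body surface
    t = float(np.dot(plane_normal,
                     np.asarray(plane_point, dtype=float)
                     - np.asarray(eye_pos, dtype=float))) / denom
    if t < 0:
        return None  # intersection lies behind the eye, not on the half-line
    return np.asarray(eye_pos, dtype=float) + t * gaze_dir
```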
When the user performs a plurality of line-of-sight actions, the line-of-sight movement identifying unit 104 identifies each individual action in association with its order. For example, in the example shown in FIG. 5, the line-of-sight movement identifying unit 104 identifies gazing at the tip 220a of the little finger as the first line-of-sight action, tracing the life line from its lower end to its upper end as the second, and tracing the middle finger from its tip to its base as the third.
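One natural way to hold such an ordered sequence is a list of (order, body part, action) records. The sketch below encodes the FIG. 5 sequence this way; the part labels and field names are illustrative only.

```python
from dataclasses import dataclass
from typing import Literal

@dataclass(frozen=True)
class GazeAction:
    order: int                        # position in the sequence
    body_part: str                    # e.g. "little_finger_tip" (label assumed)
    kind: Literal["gaze", "trace"]    # gazing at a part vs. tracing along it
    detail: str = ""                  # e.g. trace endpoints

# The FIG. 5 sequence encoded as ordered line-of-sight actions.
FIG5_PATTERN = [
    GazeAction(1, "little_finger_tip", "gaze"),
    GazeAction(2, "life_line", "trace", "lower_end->upper_end"),
    GazeAction(3, "middle_finger", "trace", "tip->base"),
]
```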
As a modification, the line-of-sight movement identifying unit 104 can also identify the user's line-of-sight actions on the body in conjunction with changes in the posture of the user's body. For example, every time the posture of the user's body changes, the line-of-sight movement identifying unit 104 identifies the user's line-of-sight action with respect to the post-change posture. The posture change may be, for example, a gesture.
The line-of-sight pattern comparison unit 106 compares the line-of-sight pattern registered in the user information DB 130 in association with the user with a line-of-sight pattern based on the movement of the user's line of sight identified by the line-of-sight movement identifying unit 104. More specifically, the line-of-sight pattern comparison unit 106 first identifies, as a line-of-sight pattern, the sequence of line-of-sight actions on each of the user's body parts identified by the line-of-sight movement identifying unit 104, ordered by the movement of the line of sight. The line-of-sight pattern comparison unit 106 then compares the identified pattern with the pattern registered in the user information DB 130 and judges their degree of match.
The user information DB 130 is a database in which line-of-sight patterns are recorded in association with users. A configuration example of the user information DB 130 will be described with reference to FIG. 7.
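As a stand-in for the user information DB 130, a simple mapping from user ID to registered pattern suffices for illustration; FIG. 7 shows the actual table layout, and the key format here is an assumption. `FIG5_PATTERN` is the sequence from the previous sketch.

```python
# Minimal stand-in for the user information DB 130.
USER_INFO_DB = {
    "user_0001": FIG5_PATTERN,  # the pattern registered for this user
}
```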
The user authentication unit 108 authenticates the user based on the result of the comparison by the line-of-sight pattern comparison unit 106. More specifically, the user authentication unit 108 judges that the user is a legitimate user when the degree of match determined by the line-of-sight pattern comparison unit 106 is equal to or greater than a predetermined threshold, and judges that the user is not a legitimate user when the degree of match is below the threshold.
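Putting the comparison and the threshold rule together, a deliberately simple sketch could look as follows. The disclosure says only that a degree of match is judged against a predetermined threshold; the position-wise similarity measure and the value 0.9 are assumptions made here.

```python
def match_degree(observed, registered):
    """Fraction of sequence positions on which the two patterns agree."""
    if not observed or not registered:
        return 0.0
    hits = sum(o == r for o, r in zip(observed, registered))
    return hits / max(len(observed), len(registered))

MATCH_THRESHOLD = 0.9  # illustrative value for the "predetermined threshold"

def authenticate(observed, registered):
    """Rule of unit 108: legitimate iff the match degree reaches the threshold."""
    return match_degree(observed, registered) >= MATCH_THRESHOLD
```

For example, `authenticate(FIG5_PATTERN, USER_INFO_DB["user_0001"])` evaluates to True, while reordering the three actions drops the match degree well below the threshold.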
The display control unit 110 causes the display unit 124, described later, to sequentially display an indication of the identified line-of-sight movement each time a movement is identified by the line-of-sight movement identifying unit 104. For example, in the example shown in FIG. 5, when the line-of-sight movement identifying unit 104 confirms that the user is gazing at the tip 220a of the little finger, the display control unit 110 causes the display unit 124 to superimpose an indication of the gaze position on the tip 220a. Similarly, each time the line-of-sight movement identifying unit 104 confirms that the user is tracing from the lower end of the life line, the display control unit 110 causes the display unit 124 to additionally superimpose the newly confirmed portion of the line (for example, as shown by trace 220b).
The first imaging unit 120 is an example of the first imaging unit according to the present disclosure. The first imaging unit 120 captures an image of the user's eye.
The second imaging unit 122 is an example of the second imaging unit according to the present disclosure. The second imaging unit 122 captures an image of the user's body located in the user's line-of-sight direction.
The display unit 124 displays various display screens under the control of the display control unit 110.
The communication unit 126 transmits and receives information, for example by wireless communication, to and from other devices (not shown) capable of communicating with the information processing apparatus 10-1. For example, the communication unit 126 receives, from another device, the user ID of the user wearing the information processing apparatus 10-1.
The storage unit 128 can store various data and software, such as the user information DB 130 and user IDs.
The configuration according to the first embodiment has been described above. Next, an application example of the first embodiment will be described with reference to FIGS. 8 and 9. This application example assumes a scene in which user authentication is performed by movement of the line of sight over the left arm and left hand.
The application example of the first embodiment has been described above. Next, the operation according to the first embodiment will be described with reference to FIGS. 10 and 11.
As described above with reference to, for example, FIGS. 2, 10, and 11, the information processing apparatus 10-1 according to the first embodiment identifies the movement of the user's line of sight relative to the user's body based on the image of the user's eye captured by the first imaging unit 120 and the image of the user's body located in the user's line-of-sight direction captured by the second imaging unit 122, and then authenticates the user based on the identified movement of the line of sight.
The first embodiment has been described above. Next, the second embodiment will be described, beginning with the background that led to its creation.
The basic configuration of the information processing apparatus 10-2 according to the second embodiment is the same as that of the first embodiment shown in FIG. 1.
Next, the configuration of the information processing apparatus 10-2 will be described in detail. FIG. 12 is a functional block diagram showing the configuration of the information processing apparatus 10-2 according to the second embodiment. As shown in FIG. 12, the information processing apparatus 10-2 has a control unit 100-2 in place of the control unit 100-1 of the information processing apparatus 10-1 according to the first embodiment. Descriptions of functions that overlap with the first embodiment are omitted below.
Compared with the control unit 100-1, the control unit 100-2 further includes a physical feature detection unit 112.
The physical feature detection unit 112 detects a physical feature of part of the user's body based on the image, captured by the second imaging unit 122, of the user's body located in the user's line-of-sight direction.
For example, the physical feature detection unit 112 first identifies the distance from the position of the user's eye to the body part the user is looking at, based on the line-of-sight direction detected by the line-of-sight detection unit 102. More specifically, the physical feature detection unit 112 identifies the distance to the viewed part based on, for example, a measurement by a distance sensor (not shown) of the information processing apparatus 10-2 or on the image captured by the second imaging unit 122. Alternatively, when the user is located at a pre-registered standing position, for example by having the user assume a posture registered in advance within a predetermined dedicated apparatus, the physical feature detection unit 112 identifies a distance measured in advance (a fixed distance) as the distance to the viewed part.
Next, the physical feature detection unit 112 detects the relative length of the body part and relative positions within it, based on the image of the part captured by the second imaging unit 122 and the identified distance to the part. More specifically, the physical feature detection unit 112 first calculates a provisional length of the part, and provisional positions of other locations on the part relative to a specific starting location, based on the captured image. The physical feature detection unit 112 then calculates the relative length of the part and the relative positions of the other locations by normalizing the provisional values based on the ratio between the identified distance to the part and the distance from the user's eye to the part at the time the line-of-sight pattern was registered.
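The normalization step amounts to scaling measured image lengths by the ratio of viewing distances. A minimal sketch, under a pinhole-camera assumption (apparent size inversely proportional to distance) that the disclosure itself does not state:

```python
def normalize_length(provisional_length, current_distance, registered_distance):
    """Rescale a length measured at the current viewing distance so it is
    directly comparable with lengths measured at registration time."""
    return provisional_length * (current_distance / registered_distance)

# e.g. a palm line spanning 80 px when viewed from 0.40 m corresponds to
# 80 * (0.40 / 0.30) ~= 106.7 px at a 0.30 m registration distance.
```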
The line-of-sight movement identifying unit 104 according to the second embodiment identifies the movement of the user's line of sight relative to the user's body based on the line-of-sight direction detected by the line-of-sight detection unit 102, the image of the user's body captured by the second imaging unit 122, and the physical features detected by the physical feature detection unit 112. For example, in the example shown in FIG. 5, the line-of-sight movement identifying unit 104 identifies, as the second line-of-sight action, information including the relative position, relative length, or shape of the line the user traced on the palm with the line of sight (that is, the life line), based on the features of each palm line detected by the physical feature detection unit 112. The line-of-sight movement identifying unit 104 also identifies, as the third line-of-sight action, information including the relative position and relative length of the line the user traced on the middle finger (that is, from its tip to its base), based on the features of the middle finger detected by the physical feature detection unit 112.
The configuration according to the second embodiment has been described above. Next, an application example of the second embodiment will be described with reference to FIGS. 8 and 9.
The application example of the second embodiment has been described above. Next, the operation according to the second embodiment will be described. The operation is roughly the same as that of the first embodiment shown in FIGS. 10 and 11; only the steps whose content differs from the first embodiment are described below.
As described above with reference to, for example, FIG. 12, the information processing apparatus 10-2 according to the second embodiment detects a physical feature of part of the user's body based on the image, captured by the second imaging unit 122, of the user's body located in the user's line-of-sight direction; identifies the movement of the user's line of sight relative to the user's body based on the eye image captured by the first imaging unit 120 and the detected physical feature; and authenticates the user based on the identified movement of the line of sight.
The second embodiment has been described above. Next, the third embodiment will be described.
First, the basic configuration of the information processing apparatus 30 according to the third embodiment will be described with reference to FIG. 13.
Next, the configuration of the information processing apparatus 30 will be described in detail. FIG. 14 is a functional block diagram showing the configuration of the information processing apparatus 30 according to the third embodiment. As shown in FIG. 14, the information processing apparatus 30 includes a control unit 300, an imaging unit 320, an output unit 322, a communication unit 324, and a storage unit 326. Descriptions of functions that overlap with the second embodiment are omitted below.
The control unit 300 generally controls the operation of the information processing apparatus 30. As shown in FIG. 14, the control unit 300 includes a line-of-sight detection unit 302, a line-of-sight movement identifying unit 304, a line-of-sight pattern comparison unit 306, a user authentication unit 308, an output control unit 310, and a physical feature detection unit 312.
The line-of-sight detection unit 302 detects the direction of the user's line of sight based on the image of the user's eye captured by the imaging unit 320, described later. Its specific functions are roughly the same as those of the line-of-sight detection unit 102 according to the second embodiment.
The physical feature detection unit 312 detects a physical feature of part of the user's body based on the image, captured by the imaging unit 320, of the user's body located in the user's line-of-sight direction. Its specific functions are roughly the same as those of the physical feature detection unit 112 according to the second embodiment.
The line-of-sight movement identifying unit 304 identifies the movement of the user's line of sight relative to the user's body based on the line-of-sight direction detected by the line-of-sight detection unit 302, the image of the user's body captured by the imaging unit 320, and the physical features detected by the physical feature detection unit 312. Its specific functions are roughly the same as those of the line-of-sight movement identifying unit 104 according to the second embodiment.
The output control unit 310 causes the output unit 322, described later, to output the result of authentication by the user authentication unit 308. For example, the output control unit 310 causes the output unit 322 to output a sound announcing the authentication result.
The imaging unit 320 captures the image of the user's eye and the image of the user's body located in the user's line-of-sight direction, for example simultaneously.
The output unit 322 outputs sound or video under the control of the output control unit 310.
The operation according to the third embodiment is roughly the same as the operation according to the second embodiment described in Section 2-4.
As described above with reference to, for example, FIGS. 13 and 14, the information processing apparatus 30 according to the third embodiment detects a physical feature of part of the user's body based on the image, captured by the imaging unit 320, of the user's body located in the user's line-of-sight direction; identifies the movement of the user's line of sight relative to the user's body based on the eye image captured by the imaging unit 320 and the detected physical feature; and authenticates the user based on the identified movement of the line of sight. Because this realizes roughly the same functions as the second embodiment, the same effects as the second embodiment are obtained.
Next, the hardware configuration of the information processing apparatus 10-1, the information processing apparatus 10-2, and the information processing apparatus 30 will be described with reference to FIG. 15. As shown in FIG. 15, the information processing apparatus 10-1 (or the information processing apparatus 10-2 or the information processing apparatus 30) includes a CPU 150, a ROM (Read Only Memory) 152, a RAM 154, an internal bus 156, an interface 158, an output device 160, a camera 162, a storage device 164, and a communication device 166.
The CPU 150 functions as an arithmetic processing device and a control device. The CPU 150 realizes the functions of the control unit 100-1 (or the control unit 100-2 or the control unit 300). The CPU 150 is constituted by a processor such as a microprocessor.
The ROM 152 stores programs used by the CPU 150 and control data such as operation parameters.
The RAM 154 temporarily stores, for example, programs executed by the CPU 150.
The internal bus 156 is constituted by a CPU bus or the like. The internal bus 156 interconnects the CPU 150, the ROM 152, and the RAM 154.
The interface 158 connects the output device 160, the camera 162, the storage device 164, and the communication device 166 to the internal bus 156. For example, the storage device 164 exchanges data with the CPU 150 via the interface 158 and the internal bus 156.
The output device 160 includes, for example, a display device such as a liquid crystal display (LCD) device or an OLED (Organic Light Emitting Diode) device, and a lamp. The display device displays images and the like generated by the CPU 150.
The camera 162 has the function of forming an image of the outside through a lens onto an imaging element such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, thereby capturing still or moving images. The camera 162 functions as the first imaging unit 120, the second imaging unit 122, or the imaging unit 320.
The storage device 164 is a device for storing data that functions as the storage unit 128 or the storage unit 326. The storage device 164 includes, for example, a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, and a deletion device that deletes data recorded on the storage medium.
The communication device 166 is a communication interface constituted by, for example, a communication device for connecting to a communication network such as a public network or the Internet. The communication device 166 may be a wireless-LAN-compatible communication device, an LTE (Long Term Evolution)-compatible communication device, or a wire communication device that performs wired communication. The communication device 166 functions as the communication unit 126 or the communication unit 324.
The preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the present disclosure is not limited to these examples. It is obvious that a person with ordinary knowledge in the technical field to which the present disclosure belongs can conceive various changes or modifications within the scope of the technical ideas described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.
For example, in each of the embodiments described above, user authentication is performed on the premise that a user ID has been received or input, that is, that the user has already been specified; however, the present disclosure is not limited to such examples. For example, the information processing apparatus 30 (or the information processing apparatus 10-1 or 10-2) can also identify which user the photographed person is, based on the line-of-sight patterns of all users recorded in the user information DB 130.
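Continuing the earlier sketches, this one-to-many identification could be a best-match search over all registered patterns; the rejection policy for sub-threshold matches is an assumption made here, and `match_degree` is the illustrative similarity defined above.

```python
def identify_user(observed, user_info_db, threshold=0.9):
    """Return the ID of the registered user whose pattern best matches
    the observed sequence, or None if no match reaches the threshold."""
    best_id, best_score = None, 0.0
    for user_id, registered in user_info_db.items():
        score = match_degree(observed, registered)
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id if best_score >= threshold else None
```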
According to each embodiment, it is also possible to provide a computer program for causing hardware such as the CPU 150, the ROM 152, and the RAM 154 to exhibit functions equivalent to those of each configuration of the information processing apparatus 10-1, 10-2, or 30 described above. A recording medium on which the computer program is recorded is also provided.
(1)
An information processing apparatus comprising:
a line-of-sight movement identifying unit that identifies movement of a user's line of sight relative to the user's body, based on an image of the user's eyes and an image of the user's body located in the user's line-of-sight direction, captured by an imaging unit; and
a user authentication unit that authenticates the user based on the identified movement of the line of sight.
(2)
The information processing apparatus according to (1), further comprising a line-of-sight pattern comparison unit that compares a line-of-sight pattern registered in a database in association with the user with a line-of-sight pattern based on the movement of the user's line of sight identified by the line-of-sight movement identifying unit,
wherein the user authentication unit authenticates the user based on a result of the comparison by the line-of-sight pattern comparison unit.
(3)
The information processing apparatus according to (2), wherein the user authentication unit judges that the user is a legitimate user when the degree of match between the line-of-sight patterns determined by the line-of-sight pattern comparison unit is equal to or greater than a predetermined threshold.
(4)
The information processing apparatus according to (2) or (3), further comprising a physical feature detection unit that detects a physical feature of part of the user's body based on the image of the user's body,
wherein the line-of-sight movement identifying unit identifies the movement of the user's line of sight relative to the user's body based on the image of the user's eyes and the detected physical feature.
(5)
The information processing apparatus according to any one of (2) to (4), wherein the line-of-sight pattern is a pattern of line-of-sight actions on one or more parts of the user's body.
(6)
The information processing apparatus according to (5), wherein the line-of-sight pattern registered in the database is a pattern of line-of-sight actions on each of the one or more parts of the user's body associated with an order of movement of the line of sight.
(7)
The information processing apparatus according to (5), wherein the line-of-sight pattern registered in the database is a pattern of line-of-sight actions on each of the one or more parts of the user's body associated with an order of changes in the posture of the user's body.
(8)
The information processing apparatus according to any one of (5) to (7), wherein the line-of-sight action includes gazing at a part of the user's body or tracing with the line of sight.
(9)
The information processing apparatus according to any one of (4) to (8), wherein the physical feature includes the position or shape of a mole, wrinkle, or blood vessel pattern.
(10)
The information processing apparatus according to any one of (1) to (9), further comprising a display control unit that causes a display unit to sequentially display an indication of the identified movement of the line of sight each time a movement of the line of sight is identified by the line-of-sight movement identifying unit.
(11)
The information processing apparatus according to (10), further comprising the display unit.
(12)
The information processing apparatus according to any one of (1) to (11), wherein the imaging unit includes a first imaging unit that captures the image of the user's eyes and a second imaging unit that captures the image of the user's body located in the user's line-of-sight direction.
(13)
The information processing apparatus according to any one of (1) to (12), wherein the information processing apparatus is a glasses-type apparatus.
(14)
An information processing method comprising:
identifying movement of a user's line of sight relative to the user's body, based on an image of the user's eyes and an image of the user's body located in the user's line-of-sight direction, captured by an imaging unit; and
authenticating the user based on the identified movement of the line of sight.
(15)
A program for causing a computer to function as:
a line-of-sight movement identifying unit that identifies movement of a user's line of sight relative to the user's body, based on an image of the user's eyes and an image of the user's body located in the user's line-of-sight direction, captured by an imaging unit; and
a user authentication unit that authenticates the user based on the identified movement of the line of sight.
100-1, 100-2, 300 Control unit
102, 302 Line-of-sight detection unit
104, 304 Line-of-sight movement identifying unit
106, 306 Line-of-sight pattern comparison unit
108, 308 User authentication unit
110 Display control unit
112, 312 Physical feature detection unit
120 First imaging unit
122 Second imaging unit
124 Display unit
126, 324 Communication unit
128, 326 Storage unit
130 User information DB
310 Output control unit
320 Imaging unit
322 Output unit
Claims (15)
- An information processing apparatus comprising:
a line-of-sight movement identifying unit that identifies movement of a user's line of sight relative to the user's body, based on an image of the user's eyes and an image of the user's body located in the user's line-of-sight direction, captured by an imaging unit; and
a user authentication unit that authenticates the user based on the identified movement of the line of sight. - The information processing apparatus according to claim 1, further comprising a line-of-sight pattern comparison unit that compares a line-of-sight pattern registered in a database in association with the user with a line-of-sight pattern based on the movement of the user's line of sight identified by the line-of-sight movement identifying unit,
wherein the user authentication unit authenticates the user based on a result of the comparison by the line-of-sight pattern comparison unit. - The information processing apparatus according to claim 2, wherein the user authentication unit judges that the user is a legitimate user when the degree of match between the line-of-sight patterns determined by the line-of-sight pattern comparison unit is equal to or greater than a predetermined threshold.
- The information processing apparatus according to claim 2, further comprising a physical feature detection unit that detects a physical feature of part of the user's body based on the image of the user's body,
wherein the line-of-sight movement identifying unit identifies the movement of the user's line of sight relative to the user's body based on the image of the user's eyes and the detected physical feature. - The information processing apparatus according to claim 2, wherein the line-of-sight pattern is a pattern of line-of-sight actions on one or more parts of the user's body.
- The information processing apparatus according to claim 5, wherein the line-of-sight pattern registered in the database is a pattern of line-of-sight actions on each of the one or more parts of the user's body associated with an order of movement of the line of sight.
- The information processing apparatus according to claim 5, wherein the line-of-sight pattern registered in the database is a pattern of line-of-sight actions on each of the one or more parts of the user's body associated with an order of changes in the posture of the user's body.
- The information processing apparatus according to claim 5, wherein the line-of-sight action includes gazing at a part of the user's body or tracing with the line of sight.
- The information processing apparatus according to claim 4, wherein the physical feature includes the position or shape of a mole, wrinkle, or blood vessel pattern.
- The information processing apparatus according to claim 1, further comprising a display control unit that causes a display unit to sequentially display an indication of the identified movement of the line of sight each time a movement of the line of sight is identified by the line-of-sight movement identifying unit.
- The information processing apparatus according to claim 10, further comprising the display unit.
- The information processing apparatus according to claim 1, wherein the imaging unit includes a first imaging unit that captures the image of the user's eyes and a second imaging unit that captures the image of the user's body located in the user's line-of-sight direction.
- The information processing apparatus according to claim 1, wherein the information processing apparatus is a glasses-type apparatus.
- An information processing method comprising:
identifying movement of a user's line of sight relative to the user's body, based on an image of the user's eyes and an image of the user's body located in the user's line-of-sight direction, captured by an imaging unit; and
authenticating the user based on the identified movement of the line of sight. - A program for causing a computer to function as:
a line-of-sight movement identifying unit that identifies movement of a user's line of sight relative to the user's body, based on an image of the user's eyes and an image of the user's body located in the user's line-of-sight direction, captured by an imaging unit; and
a user authentication unit that authenticates the user based on the identified movement of the line of sight.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15882718.8A EP3261055A4 (en) | 2015-02-20 | 2015-11-19 | Information processing device, information processing method, and program |
US15/529,128 US10180717B2 (en) | 2015-02-20 | 2015-11-19 | Information processing device, information processing method, and program |
JP2017500286A JP6648751B2 (ja) | 2015-02-20 | 2015-11-19 | Information processing device, information processing method, and program |
CN201580075931.0A CN107209936B (zh) | 2015-02-20 | 2015-11-19 | Information processing device, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015031637 | 2015-02-20 | ||
JP2015-031637 | 2015-02-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016132617A1 true WO2016132617A1 (ja) | 2016-08-25 |
Family
ID=56688895
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/082623 WO2016132617A1 (ja) | Information processing device, information processing method, and program | 2015-02-20 | 2015-11-19 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10180717B2 (ja) |
EP (1) | EP3261055A4 (ja) |
JP (1) | JP6648751B2 (ja) |
CN (1) | CN107209936B (ja) |
WO (1) | WO2016132617A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10521946B1 (en) | 2017-11-21 | 2019-12-31 | Amazon Technologies, Inc. | Processing speech to drive animations on avatars |
US11232645B1 (en) | 2017-11-21 | 2022-01-25 | Amazon Technologies, Inc. | Virtual spaces as a platform |
US10732708B1 (en) * | 2017-11-21 | 2020-08-04 | Amazon Technologies, Inc. | Disambiguation of virtual reality information using multi-modal data including speech |
JP7327368B2 (ja) * | 2020-12-02 | 2023-08-16 | Yokogawa Electric Corporation | Apparatus, method, and program |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007141002A * | 2005-11-18 | 2007-06-07 | Fujitsu Ltd | Personal authentication method, personal authentication program, and personal authentication device |
JP2014092940A * | 2012-11-02 | 2014-05-19 | Sony Corp | Image display device, image display method, and computer program |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6353422B1 (en) * | 2000-03-31 | 2002-03-05 | Stephen G. Perlman | Virtual display system and method |
US8965460B1 (en) * | 2004-01-30 | 2015-02-24 | Ip Holdings, Inc. | Image and augmented reality based networks using mobile devices and intelligent electronic glasses |
US9229227B2 (en) | 2010-02-28 | 2016-01-05 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a light transmissive wedge shaped illumination system |
KR20130000401A (ko) * | 2010-02-28 | 2013-01-02 | Osterhout Group, Inc. | Local advertising content on an interactive head-mounted eyepiece |
WO2011158511A1 (ja) * | 2010-06-17 | 2011-12-22 | Panasonic Corporation | Instruction input device, instruction input method, program, recording medium, and integrated circuit |
US20120257035A1 (en) * | 2011-04-08 | 2012-10-11 | Sony Computer Entertainment Inc. | Systems and methods for providing feedback by tracking user gaze and gestures |
US8235529B1 (en) * | 2011-11-30 | 2012-08-07 | Google Inc. | Unlocking a screen using eye tracking information |
CN102547123B (zh) * | 2012-01-05 | 2014-02-26 | Tianjin Normal University | Adaptive gaze tracking system based on face recognition technology and tracking method thereof |
US9092600B2 (en) * | 2012-11-05 | 2015-07-28 | Microsoft Technology Licensing, Llc | User authentication on augmented reality display device |
CN104077517A (zh) * | 2014-06-30 | 2014-10-01 | Huizhou TCL Mobile Communication Co., Ltd. | Method and system for starting a mobile terminal user mode based on iris recognition |
2015
- 2015-11-19 JP JP2017500286A patent/JP6648751B2/ja active Active
- 2015-11-19 US US15/529,128 patent/US10180717B2/en active Active
- 2015-11-19 WO PCT/JP2015/082623 patent/WO2016132617A1/ja active Application Filing
- 2015-11-19 EP EP15882718.8A patent/EP3261055A4/en active Pending
- 2015-11-19 CN CN201580075931.0A patent/CN107209936B/zh active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007141002A * | 2005-11-18 | 2007-06-07 | Fujitsu Ltd | Personal authentication method, personal authentication program, and personal authentication device |
JP2014092940A * | 2012-11-02 | 2014-05-19 | Sony Corp | Image display device, image display method, and computer program |
Non-Patent Citations (1)
Title |
---|
See also references of EP3261055A4 * |
Also Published As
Publication number | Publication date |
---|---|
US20170262055A1 (en) | 2017-09-14 |
EP3261055A4 (en) | 2018-10-24 |
EP3261055A1 (en) | 2017-12-27 |
CN107209936B (zh) | 2021-08-27 |
CN107209936A (zh) | 2017-09-26 |
JP6648751B2 (ja) | 2020-02-14 |
JPWO2016132617A1 (ja) | 2017-11-24 |
US10180717B2 (en) | 2019-01-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6762380B2 (ja) | Identity authentication method and apparatus | |
JP6722272B2 (ja) | Identification and/or authentication of a user using gaze information | |
TWI751161B (zh) | Terminal device, smartphone, and face-recognition-based authentication method and system | |
US10884577B2 (en) | Identification of dynamic icons based on eye movement | |
JP6197345B2 (ja) | Biometric authentication device, biometric authentication system, and biometric authentication method | |
CN104933344A (zh) | Mobile terminal user identity authentication device and method based on multiple biometric modalities | |
US20150186708A1 (en) | Biometric identification system | |
WO2016132617A1 (ja) | Information processing device, information processing method, and program | |
JP6849200B2 (ja) | Contactless multi-biometric recognition method and apparatus using multi-biometric data | |
CN104914989B (zh) | Gesture recognition device and control method for gesture recognition device | |
WO2016089529A1 (en) | Technologies for learning body part geometry for use in biometric authentication | |
CN204791017U (zh) | Mobile terminal user identity authentication device based on multiple biometric modalities | |
CN205485072U (zh) | Head-mounted display device | |
US9880634B2 (en) | Gesture input apparatus, gesture input method, and program for wearable terminal | |
CN110188658A (zh) | Identity recognition method and apparatus, electronic device, and storage medium | |
CN110568930B (zh) | Gaze point calibration method and related device | |
TWI557601B (zh) | Pupil tracking system and method, computer program product, and computer-readable recording medium | |
JPWO2016088415A1 (ja) | Information processing device, information processing method, and program | |
CN106255973A (zh) | Electronic apparatus and method for controlling access rights thereto | |
CN105825102A (zh) | Terminal unlocking method and apparatus based on eye-print recognition | |
KR102151474B1 (ko) | Contactless fingerprint authentication method using a smart terminal | |
Heravian et al. | Implementation of eye tracking in an IoT-based smart home for spinal cord injury patients |
WO2022180890A1 (ja) | Biometric authentication system, authentication terminal, and authentication method | |
KR101286750B1 (ko) | Password determination system using gestures | |
JP2013073434A (ja) | Information input device and information input method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15882718 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2017500286 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 15529128 Country of ref document: US |
REEP | Request for entry into the european phase |
Ref document number: 2015882718 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |