JP5794310B2 - Information processing device

Information processing device

Info

Publication number
JP5794310B2
JP5794310B2 (granted from application JP2013545731A)
Authority
JP
Japan
Prior art keywords
information
finger
biometric feature
unit
biometric
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2013545731A
Other languages
Japanese (ja)
Other versions
JPWO2013076858A1 (en)
Inventor
岡崎 健
Original Assignee
富士通株式会社 (Fujitsu Limited)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士通株式会社 (Fujitsu Limited)
Priority to PCT/JP2011/077119 (published as WO2013076858A1)
Publication of JPWO2013076858A1
Application granted
Publication of JP5794310B2
Application status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/00885 Biometric patterns not provided for under G06K 9/00006, G06K 9/00154, G06K 9/00335, G06K 9/00362, G06K 9/00597; biometric-specific functions not specific to the kind of biometric
    • G06K 9/00919 Static means for assisting the user in correctly positioning the object of interest

Description

The present invention relates to an information processing apparatus.

Today, security technologies are used in various fields, such as crime prevention for ordinary homes, protection of personal information, prevention of car theft, and counter-terrorism. For example, biometric authentication, which can identify an individual with high accuracy using human biological features such as fingerprints, irises, and veins, is widely used as a means of identity verification.

JP 2009-251988 A
JP 2008-250601 A
JP 9-102046 A

When a sensor for reading biometric information is provided in an information processing apparatus such as a personal computer, the operator may be forced into an unnatural posture when holding a hand over the sensor, depending on how the sensor is arranged. Moreover, as the operator's posture becomes more unnatural, the hand is less likely to be positioned appropriately relative to the sensor, which may degrade the quality of the registered biometric information and the accuracy of authentication.

The present invention has been made in view of the above, and an object thereof is to provide an information processing apparatus that allows an operator to have hand information read while in a natural posture.

In order to solve the above problem, an information processing apparatus is provided that includes a sensor that scans a hand to read hand information and that has a plurality of scanning directions at different angles. In this information processing apparatus, the sensor is fixedly arranged so that the main scanning direction among the plurality of scanning directions intersects the direction in which the edge of the housing of the information processing apparatus facing the operator extends.

In addition, in order to solve the above problem, an information processing apparatus is provided that includes a plurality of operation keys arranged in a predetermined direction and a sensor that scans a hand to read hand information and that has a plurality of scanning directions at different angles. In this information processing apparatus, the sensor is fixedly arranged so that the main scanning direction among the plurality of scanning directions intersects the arrangement direction of the operation keys.

Furthermore, in order to solve the above problem, an information processing apparatus is provided that includes a display device and a sensor that scans a hand to read hand information and that has a plurality of scanning directions at different angles. In this information processing apparatus, the sensor is fixedly arranged so that the main scanning direction among the plurality of scanning directions intersects the scanning direction of the display device.

According to the disclosed information processing apparatus, the operator can have hand information read while in a natural posture.
These and other objects, features, and advantages of the present invention will become apparent from the following description taken in conjunction with the accompanying drawings, which illustrate preferred embodiments of the present invention by way of example.

FIG. 1 is a diagram illustrating the appearance of an information processing apparatus according to a first embodiment and the posture of an operator.
FIG. 2 is a diagram illustrating the appearance of an information processing apparatus according to a second embodiment.
FIG. 3 is a diagram illustrating a reading unit of the second embodiment.
FIG. 4 is a diagram illustrating a hardware configuration of the information processing apparatus of the second embodiment.
FIG. 5 is a block diagram illustrating the information processing apparatus of the second embodiment.
FIG. 6 is a diagram illustrating a biometric feature table of the second embodiment.
FIG. 7 is a diagram illustrating the state at the time of reading the veins of the palm of the right hand in the second embodiment.
FIG. 8 is a diagram illustrating the state at the time of reading the veins of the palm of the right hand in the second embodiment.
FIG. 9 is a diagram illustrating detection of the direction feature portion of the hand in the second embodiment.
FIG. 10 is a flowchart illustrating the biometric feature information acquisition/registration process of the second embodiment.
FIG. 11 is a flowchart illustrating the biometric feature information authentication process of the second embodiment.
FIG. 12 is a flowchart illustrating the biometric feature information matching process of the second embodiment.
FIG. 13 is a diagram illustrating a message window at the time of registration in the second embodiment.
FIG. 14 is a block diagram illustrating an information processing apparatus according to a first modification of the second embodiment.
FIG. 15 is a flowchart illustrating the biometric feature information acquisition/registration process of a second modification of the second embodiment.
FIG. 16 is a flowchart illustrating the biometric feature information authentication process of the second modification of the second embodiment.
FIG. 17 is a flowchart illustrating the biometric feature information matching process of the second modification of the second embodiment.
FIG. 18 is a flowchart illustrating the biometric feature information acquisition/registration process of a third modification of the second embodiment.
FIG. 19 is a flowchart illustrating the biometric feature information authentication process of the third modification of the second embodiment.
FIG. 20 is a flowchart illustrating the biometric feature information matching process of the third modification of the second embodiment.
FIG. 21 is a diagram illustrating a biometric feature table of a fourth modification of the second embodiment.
FIG. 22 is a flowchart illustrating the biometric feature information acquisition/registration process of the fourth modification of the second embodiment.
FIG. 23 is a flowchart illustrating the biometric feature information authentication process of the fourth modification of the second embodiment.
FIG. 24 is a flowchart illustrating the biometric feature information matching process of the fourth modification of the second embodiment.
FIG. 25 is a diagram illustrating a biometric feature table of a fifth modification of the second embodiment.
FIG. 26 is a flowchart illustrating the biometric feature information acquisition/registration process of the fifth modification of the second embodiment.
FIG. 27 is a flowchart illustrating the biometric feature information authentication process of the fifth modification of the second embodiment.
FIG. 28 is a flowchart illustrating the biometric feature information matching process of the fifth modification of the second embodiment.
FIG. 29 is a flowchart illustrating the biometric feature information acquisition/registration process of a sixth modification of the second embodiment.
FIG. 30 is a flowchart illustrating the biometric feature information authentication process of the sixth modification of the second embodiment.
FIG. 31 is a flowchart illustrating the biometric feature information matching process of the sixth modification of the second embodiment.
FIG. 32 is a diagram illustrating the appearance of an information processing apparatus according to a seventh modification of the second embodiment.
FIG. 33 is a diagram illustrating a biometric feature table of the seventh modification of the second embodiment.
FIG. 34 is a diagram illustrating the state at the time of reading the veins of the palm of the right hand in the seventh modification of the second embodiment.
FIG. 35 is a diagram illustrating detection of the direction feature portion of the hand in the seventh modification of the second embodiment.
FIG. 36 is a diagram illustrating the appearance of an information processing apparatus according to an eighth modification of the second embodiment.
FIG. 37 is a diagram illustrating the state at the time of reading the veins of the palm of the right hand in the eighth modification of the second embodiment.
FIG. 38 is a diagram illustrating the state at the time of reading the veins of the palm of the left hand in the eighth modification of the second embodiment.
FIG. 39 is a block diagram illustrating an information processing apparatus according to a third embodiment.
FIG. 40 is a diagram illustrating a reading unit of the third embodiment.
FIG. 41 is a diagram illustrating a biometric feature table of the third embodiment.
FIG. 42 is a diagram illustrating the state at the time of reading the veins of the index finger of the right hand in the third embodiment.
FIG. 43 is a diagram illustrating detection of the direction feature portion of the finger in the third embodiment.
FIG. 44 is a flowchart illustrating the biometric feature information acquisition/registration process of the third embodiment.
FIG. 45 is a flowchart illustrating the biometric feature information authentication process of the third embodiment.
FIG. 46 is a flowchart illustrating the biometric feature information matching process of the third embodiment.
FIG. 47 is a diagram illustrating a message window at the time of registration in the third embodiment.
FIG. 48 is a block diagram illustrating an information processing apparatus according to a fourth embodiment.
FIG. 49 is a diagram illustrating a reading unit of the fourth embodiment.
FIG. 50 is a diagram illustrating a biometric feature table of the fourth embodiment.
FIG. 51 is a diagram illustrating the state at the time of reading the veins of the index finger of the right hand in the fourth embodiment.
FIG. 52 is a diagram illustrating a method of determining the direction of the finger in the fourth embodiment.
FIG. 53 is a diagram illustrating a method of determining the direction of the finger in the fourth embodiment.
FIG. 54 is a diagram illustrating another method of determining the direction of the finger in the fourth embodiment.
FIG. 55 is a flowchart illustrating the biometric feature information acquisition/registration process of the fourth embodiment.
FIG. 56 is a flowchart illustrating the biometric feature information authentication process of the fourth embodiment.
FIG. 57 is a flowchart illustrating the biometric feature information matching process of the fourth embodiment.
FIG. 58 is a block diagram illustrating an information processing apparatus according to a fifth embodiment.
FIG. 59 is a diagram illustrating a reading unit of the fifth embodiment.
FIG. 60 is a diagram illustrating a biometric feature table of the fifth embodiment.
FIG. 61 is a diagram illustrating the state at the time of reading the veins of the palm of the right hand in the fifth embodiment.
FIG. 62 is a diagram illustrating the state at the time of reading the veins of the index finger of the right hand in the fifth embodiment.
FIG. 63 is a diagram illustrating detection of the direction feature portion in the fifth embodiment.
FIG. 64 is a diagram illustrating detection of the direction feature portion in the fifth embodiment.
FIG. 65 is a flowchart illustrating the biometric feature information acquisition/registration process of the fifth embodiment.
FIG. 66 is a flowchart illustrating the biometric feature information authentication process of the fifth embodiment.
FIG. 67 is a flowchart illustrating the biometric feature information matching process of the fifth embodiment.
FIG. 68 is a block diagram illustrating an information processing apparatus according to a sixth embodiment.
FIG. 69 is a diagram illustrating a biometric feature table of the sixth embodiment.
FIG. 70 is a diagram illustrating the state at the time of reading the veins of the palm and middle finger of the right hand in the sixth embodiment.
FIG. 71 is a flowchart illustrating the biometric feature information acquisition/registration process of the sixth embodiment.
FIG. 72 is a flowchart illustrating the biometric feature information authentication process of the sixth embodiment.
FIG. 73 is a flowchart illustrating the biometric feature information matching process of the sixth embodiment.
FIG. 74 is a block diagram illustrating an information processing apparatus according to a seventh embodiment.
FIG. 75 is a diagram illustrating a biometric feature table of the seventh embodiment.
FIG. 76 is a diagram illustrating the state at the time of reading the veins of the palm and fingers of the right hand in the seventh embodiment.
FIG. 77 is a diagram illustrating the appearance of an automatic transaction apparatus according to an eighth embodiment.

Hereinafter, the present embodiment will be described with reference to the drawings.
[First Embodiment]
FIG. 1 is a diagram illustrating an appearance of an information processing apparatus according to the first embodiment and a state of an operator.

  The information processing apparatus 10 in FIG. 1 is an apparatus that can receive an operation input from the operator 20 and execute processing. The information processing apparatus 10 includes a sensor 11 that scans the hand of the operator 20 and reads hand information. In addition, the information processing apparatus 10 includes, for example, a key input unit 12 in which a plurality of operation keys are arranged, and a display device 13 that displays an image. FIG. 1 shows a state in which the information processing apparatus 10 is viewed from the vertically upward direction with respect to the operation surface 10a on which the key input unit 12 is arranged. As an example, the display device 13 is provided so as to be rotatable with respect to the operation surface 10a.

The sensor 11 reads, for example, fingerprints, finger veins, palm veins, and the like as information on the hand of the operator 20. The sensor 11 has a plurality of scanning directions, and the main scanning direction among them is the D1 direction in FIG. 1. In the example of FIG. 1, the sensor 11 is assumed to be rectangular, and the main scanning direction of the rectangular sensor 11 is a direction along one of its sides. Note that the sensor 11 is not necessarily rectangular; even when it is not rectangular, the sensor 11 reads hand information by scanning with the D1 direction as the main scanning direction.

The sensor 11 is arranged so that the main scanning direction D1 intersects the D2 direction, in which the edge 10b of the housing of the information processing apparatus 10 on the side facing the operator 20 extends. In the example of FIG. 1, the sensor 11 is arranged so that the main scanning direction D1 forms an oblique angle with the D2 direction. As a result, the operator 20 can hold a hand over the sensor 11 in a natural posture and have the sensor 11 read the hand information.

  For example, when the operator 20 operates the information processing apparatus 10, the upper body of the operator 20 is usually substantially parallel to the direction D2 in which the end 10b of the housing extends. In this state, the operator 20 performs a key input operation by extending both arms forward and placing the left and right hands on the upper surface of the key input unit 12. FIG. 1 shows a state where the right hand 21 is arranged on the upper surface of the key input unit 12 as an example.

For example, when the sensor 11 reads information on the right hand 21 in such a normal state, the operator 20 rotates the right hand 21 around the right elbow 22 in the counterclockwise direction in FIG. 1 and places the right hand 21 above the sensor 11. In this way, simply by rotating the right hand 21 around the right elbow 22, the operator can hold the right hand 21 over the sensor 11 with the direction of the hand perpendicular to the main scanning direction D1 of the sensor 11 (that is, aligned with the sub-scanning direction of the sensor 11) while maintaining a natural posture. The operator 20 does not have to bend the wrist forcibly to align the direction of the hand with the sub-scanning direction of the sensor 11, nor is it necessary to shift the position of the trunk of the operator 20 for that purpose.

Since the operator 20 can align the hand with the sub-scanning direction of the sensor 11 in a natural posture, the sensor 11 can read the hand information accurately. It is therefore possible to improve both the quality of the hand information registered as biometric information for verification and the accuracy of processing during verification.

In the above description, the main scanning direction D1 of the sensor 11 is oblique with respect to the direction D2, but the sensor 11 may instead be arranged so that the main scanning direction D1 is perpendicular to the direction D2 (that is, so that the sub-scanning direction of the sensor 11 is parallel to the direction D2). In this case, the operator 20 can place the right hand 21 along the sub-scanning direction of the sensor 11 by putting the elbow 22 slightly forward and rotating the right hand 21 around the elbow 22 so that the forearm 23 is parallel to the direction D2. With either the right hand or the left hand, the hand can be held over the sensor 11 in a natural posture as described above.

As another example, the sensor 11 may be arranged so that the main scanning direction D1 intersects the arrangement direction D3 of the operation keys in the key input unit 12 (the longitudinal direction of the region of the key input unit 12) or the main scanning direction D4 of the display device 13. In either of these cases, as described above, the hand can be held over the sensor 11 in a natural posture, and the hand information can be read accurately by the sensor 11.
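
The geometric condition above reduces to the two directions being non-parallel. The following sketch is illustrative rather than part of the patent (the 45° tilt and the vector encoding of D1 and D2 are assumptions); it shows how the angle between the main scanning direction and the housing edge could be checked:

```python
import math

def angle_between(v1, v2):
    """Angle in degrees between two 2-D direction vectors."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# D2: direction of the housing edge facing the operator (taken as the x axis).
d2 = (1.0, 0.0)
# D1: a main scanning direction tilted 45 degrees from D2 (assumed value).
d1 = (math.cos(math.radians(45)), math.sin(math.radians(45)))

theta = angle_between(d1, d2)
# D1 "intersects" D2 whenever the two directions are not parallel.
print(f"angle between D1 and D2: {theta:.1f} deg; intersects: {0.0 < theta < 180.0}")
```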

[Second Embodiment]
FIG. 2 is a diagram illustrating the appearance of the information processing apparatus according to the second embodiment. The information processing apparatus 100 shown in FIG. 2 is a notebook (laptop) personal computer to which a security function based on biometric authentication using palm veins has been added. The information processing apparatus 100 includes a display unit 120 having an LCD (Liquid Crystal Display) 121, and a main body unit 130 having a keyboard 131 and a reading unit 142.

Each of the display unit 120 and the main body unit 130 has a substantially rectangular parallelepiped housing with a front surface, a rear surface facing the front surface, and two side surfaces connecting them. The display unit 120 and the main body unit 130 are connected to each other near the rear surface of the main body unit 130 by a hinge (not shown) so that they can be opened and closed. When the display unit 120 and the main body unit 130 are in the closed state, the information processing apparatus 100 as a whole has a substantially rectangular parallelepiped appearance.

  The LCD 121 is a display device having a display screen for displaying characters or images. In addition to the LCD, other thin display devices such as an organic EL (Electroluminescence) display may be used as the display device. The keyboard 131 is an input device for inputting characters and performing other operations.

The reading unit 142 is an input device that inputs biometric information by reading the veins of the palm held over it. The reading unit 142 includes a square vein sensor that acquires a biological image of the palm veins by reading the veins of the user's palm. The vein sensor is arranged so that each of its sides is parallel to a side of the reading unit 142. The reading unit 142 is located on the same top surface of the main body 130 as the keyboard 131, at the front center of the keyboard 131, and is arranged so that each side of the square vein sensor forms an angle of 45° with the front surface and the side surfaces of the information processing apparatus 100.

The vein sensor reads vein information by scanning an object to be read. The main scanning direction of the vein sensor is parallel to one side of the square vein sensor, and is therefore parallel to one side of the reading unit 142. In the present embodiment, as an example, the main scanning direction of the vein sensor is the D11 direction in FIG. 2. The vein sensor is arranged so that the angle formed by the main scanning direction D11 and the direction D12, in which the front surface 130a of the main body 130 extends, is 45°.

The vein sensor may be arranged, for example, so that the angle formed by the main scanning direction D11 and any one of the direction D12 of the front surface 130a of the main body 130, the arrangement direction D13 of the operation keys on the keyboard 131 (the longitudinal direction of the keyboard 131), and the main scanning direction D14 of the LCD 121 is 45°.

Although a notebook personal computer has been described as the information processing apparatus 100 of the present embodiment, this is only one example of an information processing apparatus. The user authentication function of the present embodiment can be applied to any information processing apparatus that performs user authentication, such as mobile communication terminal devices including mobile phones and PDAs (Personal Digital Assistants), desktop personal computers, automated teller machines (ATMs) that accept deposits and withdrawals at banks, and terminal devices of information processing systems.

FIG. 3 is a diagram illustrating a reading unit according to the second embodiment. A reading unit 142 illustrated in FIG. 3 is an input device that allows a user to read a vein of a palm and input biometric information.
The reading unit 142 includes a vein sensor that acquires biological information of the palm veins by reading the veins of the user's palm. Palm veins provide high discrimination capability because they carry a larger amount of information than other veins, and because the veins themselves are thick, stable authentication that is not easily affected by temperature is possible. In addition, since vein patterns are in-vivo information, they are difficult to counterfeit, and since they are not affected by body-surface conditions such as rough, dry, or wet hands, the rate of applicable users is high. Furthermore, since reading is contactless, hygienic and natural operability is achieved, user resistance is low, and high-speed authentication is possible. Note that the reading unit 142 may instead read the palm print of the palm, or a finger vein or a fingerprint.

  FIG. 4 is a diagram illustrating a hardware configuration of the information processing apparatus according to the second embodiment. The information processing apparatus 100 shown in FIG. 4 is a notebook type personal computer as described above, and the entire apparatus is controlled by a CPU (Central Processing Unit) 101. A random access memory (RAM) 102, a hard disk drive (HDD) 103, a graphic processing device 104, an input interface 105, and a communication interface 106 are connected to the CPU 101 via a bus 107.

  The RAM 102 temporarily stores at least part of an OS (Operating System) program and application programs to be executed by the CPU 101. The RAM 102 stores various data necessary for processing by the CPU 101. The HDD 103 stores an OS and application programs.

A display device such as the LCD 121 is connected to the graphic processing device 104. The graphic processing device 104 displays an image on the display screen of the display device such as the LCD 121 in accordance with a command from the CPU 101. The graphic processing device 104 and the LCD 121 are connected by, for example, a serial communication cable, through which control signals and image signals are transmitted and received between them.

  Input devices such as a keyboard 131 and a mouse 132 are connected to the input interface 105. The input interface 105 outputs a signal sent from an input device such as a keyboard 131 to the CPU 101 via the bus 107. An authentication unit 141 is connected to the input interface 105.

  The communication interface 106 can be connected to a communication line such as a LAN (Local Area Network). The communication interface 106 can send and receive data to and from other computers via a communication line.

The authentication unit 141 accepts input of biometric information acquired from the veins of the user's palm and generates biometric feature information indicating the features of that biometric information. It also performs authentication based on the accepted biometric information. When the authentication unit 141 succeeds in authentication, the information processing apparatus 100 executes predetermined processing, such as allowing the information processing apparatus 100 to be activated. The authentication unit 141 includes an authentication control unit 141a and a biometric feature information storage unit 141b.

The authentication control unit 141a controls authentication using the biometric feature information of the palm veins. The biometric feature information storage unit 141b stores the biometric feature information used for authentication performed by the authentication unit 141. The biometric feature information storage unit 141b includes an HDD, and may also include an EEPROM (Electronically Erasable and Programmable Read Only Memory). The authentication unit 141 can store biometric feature information and identification/authentication information used for authentication in the HDD of the biometric feature information storage unit 141b, and performs authentication based on the biometric feature information stored in the HDD and the biometric feature information acquired by the reading unit 142.

  The reading unit 142 is an input device that inputs biometric information by reading the veins of the palm of the hand when the user holds the palm up. The reading unit 142 includes a living body detection unit 142a, an imaging unit 142b that acquires a vein image of the palm, and a light source unit 142c that emits near-infrared light when the vein is imaged.

  The living body detection unit 142a detects the base of the finger of the palm and also detects the height of the palm from the upper surface of the reading unit 142. Note that “the base of the finger of the palm” refers to a region including a valley between adjacent fingers. The living body detection unit 142a includes, for example, an image sensor as a mechanism for detecting the base of the finger of the palm. The detection result by the image sensor is used to determine the direction of the hand with respect to the reading unit 142. Note that this image sensor may also be used as the imaging unit 142b. In addition, the living body detection unit 142a includes a distance sensor as a mechanism for detecting the height of the palm, for example.

The imaging unit 142b is a vein sensor that images the veins of a living body. The light source unit 142c is a light source that emits near-infrared light. When a palm is detected by the living body detection unit 142a, the palm is irradiated with near-infrared light from the light source unit 142c, and the palm is imaged by the imaging unit 142b. Since the reduced hemoglobin in the veins in the subcutaneous tissue of the palm absorbs near-infrared light, the veins appear black, and a mesh-like biological image is acquired.
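
Because veins appear as dark pixels against brighter tissue in the near-infrared image, a simple intensity threshold already separates a rough vein mesh. The sketch below only illustrates that principle (the percentile threshold and the synthetic frame are assumptions; a real sensor pipeline would use more robust filtering):

```python
import numpy as np

def extract_vein_mask(nir_image: np.ndarray, percentile: float = 20.0) -> np.ndarray:
    """Return a boolean mask of vein-like pixels in a grayscale NIR image.

    Veins absorb near-infrared light and appear darker than the surrounding
    tissue, so pixels below a low-intensity percentile are kept.
    """
    threshold = np.percentile(nir_image, percentile)
    return nir_image <= threshold

# Synthetic 8-bit frame standing in for the output of the imaging unit 142b.
rng = np.random.default_rng(0)
frame = rng.integers(120, 255, size=(64, 64), dtype=np.uint8)
frame[30:34, :] = 40  # a dark horizontal band playing the role of a vein
mask = extract_vein_mask(frame)
print("vein-like pixels:", int(mask.sum()))
```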

When having the reading unit 142 read the veins of the palm, the user holds the palm over the reading unit 142. The reading unit 142 can thereby read the veins of the user's palm.

  Note that the reading unit 142 may be connected to the outside of the information processing apparatus 100, for example. In this case, the living body detection unit 142a may have a function of determining the direction of the hand with respect to the reading unit 142.

With the hardware configuration as described above, the processing functions of the present embodiment can be realized.
FIG. 5 is a block diagram illustrating an information processing apparatus according to the second embodiment. The information processing apparatus 100 according to the second embodiment includes an information acquisition unit 111, a type determination unit 112, an information generation unit 113, a collation unit 114, and a biometric feature information storage unit 141b. The information acquisition unit 111 is connected to a reading unit 142.

The information acquisition unit 111 acquires a biometric image of a person to be authenticated, such as a user of the information processing apparatus 100. The information acquisition unit 111 can also acquire the direction of the living body in the state in which the biometric image is acquired. The biometric image acquired by the information acquisition unit 111 is image information of a palm vein pattern. The direction of the living body takes one of two mutually orthogonal directions, depending on whether the hand is the left or the right hand.

The reading unit 142 is fixed to the upper part of the information processing apparatus 100. Based on the detection result of the living body detection unit 142a of the reading unit 142, the information acquisition unit 111 determines that the living body is positioned at a predetermined distance from the reading unit 142, and determines the direction of the living body with respect to the reading unit 142. The information acquisition unit 111 determines the direction of the living body by locating the direction feature portion of the living body in the image obtained by the living body detection unit 142a of the reading unit 142.

In the second embodiment, the two directions are orthogonal to each other, and each forms an oblique angle with the keyboard 131. The direction feature portion is a valley portion at the base of the fingers in the palm, and will be described later in detail. Note that the directions assigned to the left and right hands may be reversed.

Further, the information acquisition unit 111 acquires an image (biological image) including a living body imaged by the imaging unit 142b of the reading unit 142.
The type determination unit 112 determines the type of biological information based on the direction of the biological body determined by the information acquisition unit 111. The type indicates the left and right of the hand that is the basis for generating biometric information.

The information generation unit 113 generates biometric information indicating the features of the living body based on the biometric image acquired by the information acquisition unit 111. The information generation unit 113 also generates collation biometric feature information that includes the generated biometric information and the type of biometric information determined by the type determination unit 112. The collation biometric feature information is basically data having the same configuration as the biometric feature information stored in the biometric feature information storage unit 141b, and indicates the features of the biometric subject of the user to be authenticated (the palm veins in the second embodiment). It thus carries the biometric information and the type of the user to be collated for authentication.

The information generation unit 113 also generates, for example, biometric feature information that includes the biometric information based on the biometric image acquired by the information acquisition unit 111, the type of biometric information determined by the type determination unit 112, and identification information that identifies the individual corresponding to the biometric information, and stores it in the biometric feature information storage unit 141b. In this way, the biometric feature information used for authentication, including the biometric information and type of a user having legitimate authority, is registered in advance. The type differs between the left and right hands.

  When registering the biometric information of the user, the information generating unit 113 stores the generated biometric feature information in the biometric feature information storage unit 141b. At the time of authentication using the user's biometric information, the verification unit 114 performs authentication using the verification biometric feature information generated by the information generation unit 113.

The collation unit 114 performs authentication using the collation biometric feature information generated by the information generation unit 113. The collation unit 114 extracts, from the biometric feature information storage unit 141b, biometric feature information whose type matches that of the collation biometric feature information, and performs collation based on the biometric information of the collation biometric feature information and of the extracted biometric feature information. The information processing apparatus 100 then performs biometric authentication of the user based on the collation result. Since the collation targets are limited to those of the same type, an increase in the time and load required for the collation process can be suppressed.

The biometric feature information storage unit 141b stores biometric feature information indicating the biometric information and its type. The user's biometric information and its type are thus stored in association with each other.
FIG. 6 is a diagram illustrating a biometric feature table according to the second embodiment. The biometric feature table 141b1 illustrated in FIG. 6 is set in the biometric feature information storage unit 141b included in the information processing apparatus 100 according to the second embodiment. The biometric feature table 141b1 is a table that manages biometric feature information used for biometric authentication of the information processing apparatus 100.

  The biometric feature table 141b1 includes “number”, “ID”, “left / right”, and “feature data” as items. In the biometric feature table 141b1, values set in the above items are associated with each other as biometric feature information.

The number is a code that can uniquely identify the biometric feature information. Numbers are set on a one-to-one basis with respect to biometric feature information. Different numbers are set for different biometric feature information of the same user.
The ID is a code that can uniquely identify the user of the biometric feature information. The same ID is set in the biometric feature information of the same user. Different IDs are set in the biometric feature information of different users.

  The left and right indicate the type of palm vein indicated by the biometric feature information. “Right” is set for the biological feature information of the vein of the palm of the right hand. “Left” is set for the biological feature information of the vein of the palm of the left hand.

The feature data indicates a file name of data indicating biometric information.
The biometric feature table 141b1 illustrated in FIG. 6 is an example, and any item can be set in the biometric feature table.
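
As a concrete illustration, the table above maps naturally onto a list of records. The sketch below is merely one possible in-memory representation (the field names and sample values are hypothetical, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class BiometricFeatureRecord:
    number: int        # uniquely identifies this record ("number" column)
    user_id: str       # same for all records of one user ("ID" column)
    hand: str          # "left" or "right" ("left/right" column)
    feature_file: str  # file name of the vein feature data ("feature data")

# Hypothetical contents of the biometric feature table 141b1.
biometric_feature_table = [
    BiometricFeatureRecord(1, "user01", "right", "user01_right.dat"),
    BiometricFeatureRecord(2, "user01", "left", "user01_left.dat"),
    BiometricFeatureRecord(3, "user02", "right", "user02_right.dat"),
]
```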

FIG. 7 and FIG. 8 are diagrams illustrating the state at the time of reading the veins of the palm of the right hand according to the second embodiment. FIG. 7 shows the state in which the information processing apparatus 100 reads the veins of the palm of the right hand as viewed from above, and FIG. 8 shows the same state as viewed from the front. As illustrated in FIGS. 7 and 8, the information processing apparatus 100 includes a display unit 120 and a main body unit 130. A keyboard 131 and a reading unit 142 are provided on the upper surface of the main body 130. The reading unit 142 is located on the same top surface of the main body 130 as the keyboard 131, at the front center of the keyboard 131, and is arranged so that each side of the square vein sensor forms an angle of 45° with the front surface and the side surfaces of the information processing apparatus 100. The figures also show the user's head 201, torso 202, right upper arm 203, right lower arm 204, and right palm 205.

When the user has the reading unit 142 read the veins of the palm, as shown in FIG. 7, the user positions the palm whose veins are to be read (for example, the palm 205 of the right hand) at an angle of 45° to the side surface of the information processing apparatus 100 and parallel to the upper surface of the main body 130. At this time, with the fingers of the right hand open, the user positions the palm in the space separated from the vein sensor surface by a certain distance (for example, several centimeters) so that the center of the palm coincides with the center of the reading unit 142. At the time of reading, the user does not need to bend the wrist between the palm 205 of the right hand and the right lower arm 204 and can keep it almost straight. Accordingly, each finger of the user's right hand is straightened and opened sufficiently, and the four bases between the fingers of the right hand are sufficiently spaced apart. The palm 205 of the right hand therefore has no horizontal twist, and a correct image can be obtained quickly and reliably. As a result, correct features can be detected quickly and reliably, and registration of biometric information and user authentication can be performed quickly and reliably.

Further, as will be described later, by using the bases between the fingers to detect that the palm 205 of the right hand is placed over the reading unit 142, it can be determined quickly and reliably whether the hand is the right or the left hand. In addition, no unreasonable posture is forced on the user's right wrist, on the right lower arm 204 and right upper arm 203, on the elbow between the right lower arm 204 and the right upper arm 203, or on the right shoulder between the right upper arm 203 and the torso 202, so the burden on the user can be reduced. Although FIG. 7 illustrates the case where the veins of the palm of the user's right hand are read, the same applies to reading the veins of the palm of the left hand, and the description thereof is omitted.

Further, as shown in FIG. 8, each finger of the user's right hand, together with the palm 205, is positioned in the space separated from the vein sensor surface by a certain distance. This prevents the palm from touching the apparatus and causing an erroneous operation.

FIG. 9 is a diagram illustrating the detection of the direction feature portion of the hand according to the second embodiment. FIG. 9 shows the hand directions and the direction feature portions of the living body in the information processing apparatus 100 according to the second embodiment. FIG. 9A shows an acquired image 1421 of the palm of the right hand, and FIG. 9B shows an acquired image 1422 of the palm of the left hand.

The acquired images 1421 and 1422 are images acquired by the living body detection unit 142a of the reading unit 142, captured by, for example, an image sensor included in the living body detection unit 142a. In the acquired images 1421 and 1422, the upper side in FIG. 9 corresponds to the back side as viewed from the front of the information processing apparatus 100 (the rear-surface side of the apparatus), the lower side corresponds to the near side (the front-surface side of the apparatus), the right side corresponds to the right side as viewed from the front (the right-side-surface side), and the left side corresponds to the left side as viewed from the front (the left-side-surface side).

As shown in FIG. 9, in the image acquired by the living body detection unit 142a, a right-hand detection rectangular image area 1420a for detecting the valley portions of the finger bases of the palm of the right hand is set along the upper-left side, and a left-hand detection rectangular image area 1420b for detecting the valley portions of the finger bases of the palm of the left hand is set along the upper-right side. The right-hand detection rectangular image region 1420a and the left-hand detection rectangular image region 1420b are provided along two mutually orthogonal sides of the acquired image. The valley portions at the bases of the fingers function as direction feature portions.

When the reading unit 142 detects, with the distance sensor (not shown) of the living body detection unit 142a, the user's palm positioned above the reading unit 142, the reading unit 142 acquires a palm image with the image sensor (not shown) of the living body detection unit 142a and supplies it to the information acquisition unit 111. The information acquisition unit 111 determines whether valley portions of the finger bases are detected in the right-hand detection rectangular image region 1420a or the left-hand detection rectangular image region 1420b set in the acquired image. The palm image acquisition by the image sensor here is performed without irradiation of near-infrared light from the light source unit 142c; therefore, the image acquired here is not an image of the palm veins but an image of the external appearance of the palm. In the detection of the valley portions of the finger bases, this detection processing is performed at the positions where the right-hand detection rectangular image area 1420a and the left-hand detection rectangular image area 1420b are set in the image of the external appearance of the palm.

When the image of the external appearance of the palm is acquired by the living body detection unit 142a as described above, the information acquisition unit 111 determines whether a direction feature portion, that is, a valley portion of a finger base, exists in the right-hand detection rectangular image region 1420a or in the left-hand detection rectangular image region 1420b. For example, in the case of the acquired image 1421 shown in FIG. 9A, the direction feature portion 1421a1 exists in the right-hand detection rectangular image area 1420a but not in the left-hand detection rectangular image area 1420b, so it is determined that the palm is positioned at an angle of "0°". The information processing apparatus 100 thereby determines that the acquired image is an image of the palm of the right hand.

For example, in the case of the acquired image 1422 shown in FIG. 9B, the direction feature portion 1422b1 exists in the left-hand detection rectangular image region 1420b but not in the right-hand detection rectangular image region 1420a, so it is determined that the user's palm is positioned at an angle of "90°" clockwise. The information processing apparatus 100 thereby determines that the acquired image is an image of the palm of the left hand.

  In this manner, when the user's palm is detected, the reading unit 142 acquires an image of the direction feature portion in the palm. The information acquisition unit 111 determines the angle of the palm based on the presence / absence of the direction feature portion of the right hand detection rectangular image region 1420a or the left hand detection rectangular image region 1420b. The type determining unit 112 determines the type of palm (one of the left and right hands) arranged on the reading unit 142 based on the determined palm angle.

The detection of the direction feature portions in the second embodiment, that is, of the valley portions of the finger bases (for example, the valley between the bases of the index finger and the middle finger, the valley between the bases of the middle finger and the ring finger, and the valley between the bases of the ring finger and the little finger), may be based on all of these valleys, or on at least some combination of them, being open at predetermined intervals. The detection of the valley portions of the finger bases may be realized, for example, by identifying the positions of the bases of the fingers and the contours between the finger bases in the palm image acquired by the living body detection unit 142a.

Note that the acquired image for detecting the direction feature portion illustrated in FIG. 9 may be an image captured by the imaging unit 142b.
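
In code form, the left/right decision described above amounts to testing which of the two detection regions contains a finger-base valley. The following sketch assumes the acquired appearance image is a NumPy array; the region bounds and the valley test are deliberately crude, hypothetical stand-ins for the contour-based detection described above:

```python
import numpy as np

def detect_valley(region: np.ndarray, dark_fraction: float = 0.05) -> bool:
    """Rough stand-in for finger-base valley detection: report True when
    the region contains enough dark (shadow-like) pixels."""
    return float(np.mean(region < 64)) >= dark_fraction

def determine_hand_type(acquired_image: np.ndarray):
    """Classify the palm in the acquired appearance image as left or right.

    The slices below play the roles of the right-hand detection area 1420a
    (along the upper-left side) and the left-hand detection area 1420b
    (along the upper-right side); the exact bounds are assumptions.
    """
    right_region = acquired_image[:40, :200]   # stands in for 1420a
    left_region = acquired_image[:200, -40:]   # stands in for 1420b

    in_right = detect_valley(right_region)
    in_left = detect_valley(left_region)
    if in_right and not in_left:
        return "right"   # palm positioned at 0 degrees
    if in_left and not in_right:
        return "left"    # palm positioned at 90 degrees clockwise
    return None          # no direction feature found; prompt a retry
```

With detect_valley replaced by the contour-based detection described above, the same branching logic applies unchanged.
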
FIG. 10 is a flowchart illustrating biometric feature information acquisition registration processing according to the second embodiment. The biometric feature information acquisition / registration processing is processing for determining the type of palm to be registered, and generating and registering biometric feature information indicating the type of palm and the characteristics of veins. The biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a palm vein. In the following, the process illustrated in FIG. 10 will be described in order of step number.

[Step S11] The information acquisition unit 111 determines that a palm has been placed at a predetermined height above the reading unit 142 based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142. If it determines that a palm has been placed, the information acquisition unit 111 detects the valley portions of the finger bases, which are direction feature portions, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the hand to be registered based on the detected direction feature portions.

[Step S12] The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S11.
[Step S13] The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image and acquires a biological image in which the veins of the palm are reflected.

Note that the processing order of steps S12 and S13 may be reversed or simultaneous.
[Step S14] The information generation unit 113 extracts features of the living body based on the living body image acquired in step S13. The information generation unit 113 generates biometric information indicating the feature extraction result.

  The biometric information may be information indicating, for example, a feature point in a vein (such as a branch point of a vein) reflected in a biometric image. Alternatively, the biological information may be image information obtained by cutting out a region where a vein is reflected from a biological image. Alternatively, the biological information may be a biological image itself.

[Step S15] The information generation unit 113 generates biometric feature information including the type determined in step S12, the biometric information generated in step S14, and the user ID.
[Step S16] The information generation unit 113 stores the biometric feature information generated in step S15 in the biometric feature information storage unit 141b. Accordingly, the biometric feature information is registered in the biometric feature information storage unit 141b. Thereafter, the process ends.
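
Expressed as pseudocode, the registration flow of FIG. 10 strings these steps together. The sketch below is an illustrative outline only; the collaborator objects and their method names are assumptions, since the patent defines the units' roles but not an API:

```python
def register_biometric_feature(info_acquisition, type_determination,
                               info_generation, storage, user_id):
    """Illustrative outline of the registration flow of FIG. 10 (S11-S16)."""
    # S11: wait until the palm is at the prescribed height, then determine
    # the hand direction from the finger-base valley portions.
    direction = info_acquisition.determine_hand_direction()
    # S12: classify the palm as left or right from its direction.
    hand_type = type_determination.determine_type(direction)
    # S13: image the palm veins under near-infrared light.
    vein_image = info_acquisition.capture_vein_image()
    # S14: extract vein features (e.g. branch points) as biometric info.
    biometric_info = info_generation.extract_features(vein_image)
    # S15: bundle the type, the features, and the user ID together.
    record = {"id": user_id, "hand": hand_type, "features": biometric_info}
    # S16: persist the record for use in later authentication.
    storage.store(record)
    return record
```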

FIG. 11 is a flowchart illustrating the biometric feature information authentication process according to the second embodiment. The biometric feature information authentication process determines the type of the palm to be authenticated, generates collation biometric feature information indicating the type of the palm and the features of the veins, and performs authentication by collating it with biometric feature information registered in advance. The collation biometric feature information is data having the same configuration as the biometric feature information and indicates the features of the user's biometric subject (the palm veins in the second embodiment). The biometric feature information authentication process is executed, for example, when the user authenticates with a palm vein. In the following, the process illustrated in FIG. 11 will be described in order of step number.

[Step S21] The information acquisition unit 111 determines that a palm has been placed at a predetermined height above the reading unit 142 based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142. If it determines that a palm has been placed, the information acquisition unit 111 detects the valley portions of the finger bases, which are direction feature portions, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the hand to be authenticated based on the detected direction feature portions.

[Step S22] The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S21.
[Step S23] The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biological image in which the veins of the palm are reflected.

Note that the processing order of steps S22 and S23 may be reversed or simultaneous.
[Step S24] The information generation unit 113 extracts features of the living body based on the living body image acquired in step S23. The information generation unit 113 generates biometric information indicating the feature extraction result.

[Step S25] The information generation unit 113 generates verification biometric feature information including the type determined in step S22 and the biometric information generated in step S24.
[Step S26] The matching unit 114 performs biometric feature information matching processing (described later in FIG. 12) using the matching biometric feature information generated in step S25. Thereafter, the process ends.
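
The authentication flow mirrors registration up to the point where the collation biometric feature information is handed to the collation unit. As before, this is only an outline under the same assumed interfaces:

```python
def authenticate_biometric_feature(info_acquisition, type_determination,
                                   info_generation, collation_unit):
    """Illustrative outline of the authentication flow of FIG. 11 (S21-S26)."""
    direction = info_acquisition.determine_hand_direction()         # S21
    hand_type = type_determination.determine_type(direction)        # S22
    vein_image = info_acquisition.capture_vein_image()              # S23
    biometric_info = info_generation.extract_features(vein_image)   # S24
    # S25: collation feature info; no user ID is attached for 1-to-N use.
    probe = {"hand": hand_type, "features": biometric_info}
    return collation_unit.match(probe)                              # S26
```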

In the processes of FIGS. 10 and 11 described above, the determination of the direction of the living body and the extraction of the features of the living body are performed based on images from separate sensors. When these processes are instead performed based on images from a common sensor, the following procedure can be used. At the start of each process of FIGS. 10 and 11, the sensor captures an image and stores it in the RAM 102. Next, the information acquisition unit 111 determines the direction of the living body based on the image stored in the RAM 102 (corresponding to steps S11 and S21), and the type determination unit 112 determines the type based on the direction determination result (corresponding to steps S12 and S22). The sensor then captures an image again while near-infrared light is irradiated from the light source unit 142c, and stores that image in the RAM 102. Next, the information generation unit 113 extracts the features of the living body based on the image stored in the RAM 102 and generates biometric information (corresponding to steps S14 and S24). When extracting the features of the living body, the information generation unit 113 may acquire a biological image by extracting a predetermined region where veins exist from the image stored in the RAM 102 (corresponding to steps S13 and S23) and extract the features based on that biological image.
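
A minimal sketch of this shared-sensor variant, under the same assumed interfaces as the outlines above (the sensor, light source, and RAM objects are hypothetical):

```python
def acquire_with_common_sensor(sensor, light_source, info_acquisition,
                               type_determination, info_generation, ram):
    """Illustrative outline of the shared-sensor procedure described above."""
    # First capture, without near-infrared light: appearance of the palm.
    ram["appearance"] = sensor.capture()
    direction = info_acquisition.determine_direction(ram["appearance"])  # S11/S21
    hand_type = type_determination.determine_type(direction)             # S12/S22
    # Second capture, under near-infrared light: the vein pattern.
    light_source.on()
    ram["vein"] = sensor.capture()
    light_source.off()
    features = info_generation.extract_features(ram["vein"])             # S14/S24
    return hand_type, features
```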

  FIG. 12 is a flowchart illustrating biometric feature information matching processing according to the second embodiment. The biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance. The biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 12 will be described in order of step number.

[Step S31] The collation unit 114 acquires collation biometric feature information generated by the biometric feature information authentication process.
[Step S32] The collation unit 114 refers to the biometric feature information storage unit 141b and extracts the biometric feature information whose type matches that of the collation biometric feature information acquired in step S31. When one-to-one matching is performed, the collation unit 114 extracts, from the biometric feature information storage unit 141b, the biometric feature information in which both the type and the user ID match those of the collation biometric feature information.

[Step S33] The collation unit 114 selects one piece of the biometric feature information extracted in step S32 and collates the biometric information included in the selected biometric feature information with the biometric information included in the collation biometric feature information.

[Step S34] Based on the result of the collation in step S33, the collation unit 114 determines whether the collation between the selected biometric feature information and the collation biometric feature information has succeeded. If the collation succeeds (YES in step S34), the process proceeds to step S35. If the collation fails (NO in step S34), the process proceeds to step S36.

[Step S35] The collation unit 114 executes predetermined processing for successful authentication. The predetermined processing here may be, for example, granting the user the authority to log in to the information processing apparatus 100 or to a predetermined Internet site, to start an application, or to access data. Thereafter, the process returns.

[Step S36] The collation unit 114 determines whether all the biometric feature information extracted in step S32 has been selected in step S33. If all of it has been selected (YES in step S36), the process proceeds to step S37. If unselected information remains (NO in step S36), the process returns to step S33.

[Step S37] The collation unit 114 executes predetermined processing for failed authentication. The predetermined processing here may be, for example, denying the user login to the information processing apparatus 100 or to a predetermined Internet site, denying activation of an application, denying authorization such as data access permission, or outputting a message indicating the denial. Thereafter, the process returns.
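
Putting steps S31 through S37 together, the matching process filters the stored records by type before comparing feature data. The sketch below is an outline only; `compare`, `on_success`, and `on_failure` are assumed callables standing in for the feature comparison and the success/failure processing described above:

```python
def collate(probe, stored_records, compare, on_success, on_failure):
    """Illustrative outline of the matching flow of FIG. 12 (S31-S37)."""
    # S32: narrow the candidates to records of the same hand type (and,
    # for one-to-one matching, the same user ID).
    candidates = [r for r in stored_records
                  if r["hand"] == probe["hand"]
                  and ("id" not in probe or r["id"] == probe["id"])]
    # S33-S36: collate the probe against each remaining candidate in turn.
    for record in candidates:
        if compare(record["features"], probe["features"]):  # S34
            return on_success(record)                       # S35
    return on_failure()                                     # S37
```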

  FIG. 13 is a diagram illustrating a message window at the time of registration according to the second embodiment. A message window 121a illustrated in FIG. 13 is an example of a window displayed on the display screen of the LCD 121 included in the information processing apparatus 100. The message window 121a displays a message and an image for notifying the user that biometric feature information has been successfully registered based on the palm read.

  In the message window 121a, for example, a message “Registration of the palm of the right hand was successful” and a biological image showing the vein of the palm of the right hand imaged at the time of registration are displayed. The message window 121a has an OK button 121a1.

  The OK button 121a1 is a button for ending the display of the message window 121a. After confirming the displayed contents, the user can close the message window 121a by operating the OK button 121a1.

  According to the second embodiment described above, when acquiring biological information such as a palm vein image, the type of the living body, such as whether it is the right hand or the left hand, can be determined from the angle of the living body such as the palm. As a result, adverse effects caused by an increase in the number of objects to be collated can be suppressed.

  In addition, since the reading unit 142 can acquire biometric images in a plurality of directions, the user has a greater degree of freedom of posture when biometric information is acquired, and the burden on the user's shoulder, arm, and wrist joints can be reduced.

  In addition, the administrator or the user is saved the trouble of manually selecting the type each time, which improves convenience and allows biometric feature information to be registered and updated quickly and reliably.

  Further, by classifying and registering biometric feature information by type in advance, the collation targets can be narrowed down by type when one-to-N collation is performed, that is, when one piece of verification biometric feature information is collated against a plurality of pieces of biometric feature information stored in the biometric feature information storage unit 141b. This suppresses an increase in authentication processing load and processing time.

  In addition, since the palm can be positioned in either the left or right direction, the user's arm and wrist posture has a greater degree of freedom than when the left and right palms are read in a single direction, and an increase in the burden at the time of reading can be suppressed.

[First Modification of Second Embodiment]
Next, a first modification of the second embodiment will be described. Differences from the second embodiment will be mainly described, and description of similar matters will be omitted. The information processing apparatus according to the first modification of the second embodiment is different from the second embodiment in that biometric authentication is performed by communicating with a server connected via a network.

  FIG. 14 is a block diagram illustrating an information processing apparatus according to a first modification of the second embodiment. An information processing apparatus 100a according to a first modification of the second embodiment includes an information acquisition unit 111, a type determination unit 112, an information generation unit 113a, and a collation unit 114a. The information acquisition unit 111 is connected to a reading unit 142. The server 150 is connected to the information processing apparatus 100a via a network 151 that is a communication line such as a LAN or a WAN (Wide Area Network), and includes a biometric feature information storage unit 141b.

  The information acquisition unit 111 performs the same processing as the information acquisition unit 111 in FIG. 5. That is, the information acquisition unit 111 acquires a biometric image of a person to be authenticated, such as a user of the information processing apparatus 100a. The information acquisition unit 111 also has a function of detecting that a palm is placed at a predetermined height above the reading unit 142 and a function of determining the direction of the hand placed over the reading unit 142.

  Similar to the type determination unit 112 in FIG. 5, the type determination unit 112 determines the type of biometric information based on the direction of the living body determined by the information acquisition unit 111. The type indicates the left and right of the hand that is the basis for generating biometric information.

  Similar to the information generation unit 113 in FIG. 5, the information generation unit 113a generates verification biometric feature information that includes biometric information based on the biometric image acquired by the information acquisition unit 111 and the type of biometric information determined by the type determination unit 112. This verification biometric feature information indicates the biometric information and type of the user to be authenticated. The information generation unit 113a also generates biometric feature information that includes biometric information based on the acquired biometric image, the determined type, and identification information identifying the individual corresponding to the biometric information, and stores it in the biometric feature information storage unit 141b of the server 150. As a result, biometric feature information indicating the biometric information and type of a user who has due authority and is registered in advance for use in authentication is registered; the type differs between the left and right hands. Thus, when the user's biometric information is registered, the information generation unit 113a stores the generated biometric feature information in the biometric feature information storage unit 141b, and at the time of authentication by the user's biometric information, the collation unit 114a performs collation using the verification biometric feature information generated by the information generation unit 113a.

  The collation unit 114a extracts, from the biometric feature information stored in the biometric feature information storage unit 141b, biometric feature information whose type matches that of the verification biometric feature information, and collates the verification biometric feature information against the extracted biometric feature information based on their biometric information. The information processing apparatus 100a then performs biometric authentication of the user based on the collation result. Since the collation targets are limited to those of the same type, an increase in the time and load required for the collation process can be suppressed.

  The biometric feature information storage unit 141b stores the biometric feature information, generated by the information processing apparatus 100a, that indicates the biometric information and its type. The user's biometric information is thereby stored in association with its type. At the time of user authentication in the information processing apparatus 100a, the biometric feature information is transmitted to the information processing apparatus 100a via the network 151.
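
  To make the division of roles concrete, the following is a minimal sketch using an in-process stand-in for the server-side storage unit. The class and method names are hypothetical, and the embodiment leaves the actual transport over the network 151 open.

    # Hypothetical sketch of the server-side storage unit 141b; in the
    # actual configuration the apparatus 100a reaches it via the network 151.
    class BiometricFeatureStore:
        def __init__(self):
            self._records = []   # stored biometric feature information

        def register(self, record):
            self._records.append(record)

        def fetch(self, hand_type, user_id=None):
            # Candidates transmitted back to the apparatus for collation:
            # same type, plus same user ID for one-to-one matching.
            return [r for r in self._records
                    if r["type"] == hand_type
                    and (user_id is None or r["id"] == user_id)]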

The first modification of the second embodiment as described above also provides the same effects as those of the second embodiment.
In addition, when a large number of users register and update a large amount of biometric feature information from a large number of information processing apparatuses via the network 151, storing and managing the biometric feature information centrally can further increase security and management efficiency.

  In addition, since each user can register and update biometric feature information with a large number of information processing apparatuses via the network 151, the convenience of the administrator and the user can be improved.

  In addition, by classifying and registering biometric feature information by type in advance, the collation targets can be narrowed down by type when a large number of users perform one-to-N collation on a large number of information processing apparatuses via the network. This suppresses an increase in authentication processing load and processing time. In other words, since one-to-N collation can be performed at high speed, convenience for the user is increased.

  Further, the processing time for one-to-N collation is proportional to the number N of registered pieces of biometric feature information. Because narrowing by type reduces the candidates per attempt, the number of registered pieces of biometric feature information can be doubled, for example by registering both hands, while suppressing an increase in the time required for collation and for the authentication process.
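
  The proportionality argument can be checked with a small, purely illustrative calculation; the counts and per-comparison time below are assumed numbers, not measurements.

    # Illustrative arithmetic only: values are assumptions.
    n = 1000          # templates registered for one hand only
    t = 0.002         # assumed seconds per template comparison
    baseline = n * t  # 1-to-N time with no type filtering: 2.0 s

    doubled = 2 * n            # register both hands: twice the templates
    candidates = doubled // 2  # type filter halves the candidates per attempt
    filtered = candidates * t  # still 2.0 s despite doubling the registrations
    print(baseline, filtered)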

[Second Modification of Second Embodiment]
Next, a second modification of the second embodiment will be described. Differences from the second embodiment will be mainly described, and description of similar matters will be omitted. The information processing apparatus according to the second modification of the second embodiment differs from the second embodiment in that the processing for extracting the features of the palm veins and the processing for collating those features are performed with reference to the type of the living body.

  FIG. 15 is a flowchart illustrating biometric feature information acquisition registration processing according to the second modification of the second embodiment. The biometric feature information acquisition / registration processing is processing for determining the type of palm to be registered, and generating and registering biometric feature information indicating the type of palm and the characteristics of veins. The biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a palm vein. In the following, the process illustrated in FIG. 15 will be described in order of step number.

  [Step S41] The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects, from an image captured by the image sensor included in the living body detection unit 142a, the valleys at the bases of the fingers, which serve as direction feature portions, and determines the direction of the hand to be registered based on the detected direction feature portions.

[Step S42] The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S41.
[Step S43] The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image and acquires a biological image in which the veins of the palm are reflected.

Note that the processing order of steps S42 and S43 may be reversed or simultaneous.
[Step S44] Based on the biometric image acquired in step S43, the information generation unit 113a extracts the features of the living body with reference to the type determined in step S42 and generates biometric information.

  For example, the information generation unit 113a extracts biometric features by different processing procedures according to the type determined in step S42. It may select a template image corresponding to the type and extract biometric features from the biometric image using the selected template image. Alternatively, it may change the region in which veins are detected in the biometric image, or change the direction in which the biometric image is scanned when detecting veins, according to the type. These processes increase the processing efficiency and the accuracy of feature extraction.
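
  A sketch of one way such type-dependent extraction could be organized, assuming NumPy images: the per-type template images, the mirroring, and the correlation "feature" are illustrative placeholders, not the embodiment's actual extraction method.

    import numpy as np

    # Hypothetical per-type settings: one template image per hand type.
    TEMPLATES = {"right": np.ones((64, 64)), "left": np.ones((64, 64))}

    def extract_features(image, hand_type):
        # image is assumed to be a grayscale array at least template-sized.
        template = TEMPLATES[hand_type]      # select a template by type
        if hand_type == "left":
            image = image[:, ::-1]           # e.g. scan the image mirrored
        # Placeholder feature: correlation of a region with the template.
        region = image[:template.shape[0], :template.shape[1]]
        return float((region * template).sum())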

  For example, in step S43, the information acquisition unit 111 may perform processing according to the type determined in step S42. For example, in step S43, the information acquisition unit 111 may change the imaging region when capturing a biological image according to the type.

[Step S45] The information generation unit 113a generates biometric feature information including the type determined in step S42, the biometric information generated in step S44, and the user ID.
[Step S46] The information generation unit 113a stores the biometric feature information generated in step S45 in the biometric feature information storage unit 141b. Accordingly, the biometric feature information is registered in the biometric feature information storage unit 141b. Thereafter, the process ends.

  FIG. 16 is a flowchart illustrating biometric feature information authentication processing according to the second modification of the second embodiment. The biometric feature information authentication process is a process of determining the type of the palm to be authenticated, generating verification biometric feature information indicating the type of the palm and the features of its veins, and performing authentication by collating it with biometric feature information registered in advance. The biometric feature information authentication process is executed, for example, when the user authenticates with a palm vein. Hereinafter, the process illustrated in FIG. 16 will be described in order of step number.

  [Step S51] The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects, from an image captured by the image sensor included in the living body detection unit 142a, the valleys at the bases of the fingers, which serve as direction feature portions, and determines the direction of the hand to be authenticated based on the detected direction feature portions.

[Step S52] The type determining unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S51.
[Step S53] The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biological image in which the veins of the palm are reflected.

Note that the processing order of steps S52 and S53 may be reversed or simultaneous.
[Step S54] Based on the biometric image acquired in step S53, the information generation unit 113a extracts the features of the living body with reference to the type determined in step S52. In step S54, the information generation unit 113a extracts the biometric features and generates biometric information by the same processing procedure as in step S44 of FIG. 15.

  For example, in step S53, the information acquisition unit 111 may perform processing according to the type determined in step S52. For example, in step S53, the information acquisition unit 111 may change the imaging region when capturing a biological image according to the type.

[Step S55] The information generation unit 113a generates verification biometric feature information including the type determined in step S52 and the biometric information generated in step S54.
[Step S56] The matching unit 114a performs biometric feature information matching processing (described later in FIG. 17) using the matching biometric feature information generated in step S55. Thereafter, the process ends.

  FIG. 17 is a flowchart illustrating biometric feature information matching processing according to the second modification of the second embodiment. The biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance. The biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 17 will be described in order of step number.

[Step S61] The collation unit 114a acquires collation biometric feature information generated by the biometric feature information authentication process.
[Step S62] The collation unit 114a refers to the biometric feature information storage unit 141b and extracts biometric feature information that matches the type of the verification biometric feature information acquired in step S61. When one-to-one matching is performed, the collation unit 114a extracts, from the biometric feature information storage unit 141b, biometric feature information whose type and user ID both match those of the verification biometric feature information.

  [Step S63] The collation unit 114a selects an unselected piece of the biometric feature information extracted in step S62 and, with reference to the type determined in step S52 of the biometric feature information authentication process, collates the selected biometric feature information against the verification biometric feature information. At this time, because the collation unit 114a refers to the type of the living body and performs processing suited to that type, the efficiency of the processing can be improved. For example, the collation unit 114a changes the comparison order or the calculation method according to the type.

  [Step S64] As a result of the collation in step S63, the collation unit 114a determines whether the collation between the selected biometric feature information and the verification biometric feature information has succeeded. If the collation has succeeded (YES in step S64), the process proceeds to step S65. If it has failed (NO in step S64), the process proceeds to step S66.

[Step S65] The collation unit 114a performs a predetermined process when the authentication is successful. Thereafter, the process returns.
[Step S66] The collation unit 114a determines whether all the biometric feature information extracted in step S62 has been selected in step S63. If all have been selected (YES in step S66), the process proceeds to step S67. On the other hand, if there is an unselected item (NO in step S66), the process proceeds to step S63.

[Step S67] The collation unit 114a performs a predetermined process when authentication fails. Thereafter, the process returns.
The second modification example of the second embodiment as described above also provides the same effects as those of the second embodiment.

  In addition, by referring to the type of the living body (whether the veins are those of the left or the right palm) when acquiring the biometric image, extracting its features, and collating them, the processing efficiency of biometric information acquisition, feature extraction, and feature matching can be increased, and so can their accuracy.

[Third Modification of Second Embodiment]
Next, a third modification of the second embodiment will be described. Differences from the second embodiment will be mainly described, and description of similar matters will be omitted. The information processing apparatus according to the third modification of the second embodiment differs from the second embodiment in that the extraction of palm vein features and the collation of those features are performed after correcting the angle of the living body.

  Assume, for example, that in the biometric image acquired by the information processing apparatus 100, with the direction of the user's right hand as the reference, the direction of the left hand is rotated by a predetermined angle (for example, 90° clockwise) with respect to the placement direction of the right hand. In such a case, even if features can be extracted from a biometric image acquired in the right-hand direction, features may fail to be extracted from a biometric image acquired in the left-hand direction because of the difference in direction.

  In the third modification of the second embodiment, the biometric image of the left hand is therefore corrected by rotating it by a predetermined angle (for example, 90° counterclockwise) so that the directions of the biometric images are aligned before feature extraction and feature matching. The predetermined angle can be set arbitrarily.
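
  Treating the biometric image as a pixel array, the alignment can be sketched as follows; the 90° relationship and the left/right labels follow the example above, and the function name is an illustrative assumption.

    import numpy as np

    def normalize_direction(image, hand_type):
        # Per the example above, a left-hand image is assumed to arrive
        # rotated 90 degrees clockwise relative to the right-hand reference,
        # so rotate it 90 degrees counterclockwise before extraction/matching.
        if hand_type == "left":
            return np.rot90(image, k=1)  # k=1: 90 degrees counterclockwise
        return image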

  FIG. 18 is a flowchart illustrating biometric feature information acquisition and registration processing according to the third modification of the second embodiment. The biometric feature information acquisition / registration processing is processing for determining the type of palm to be registered, and generating and registering biometric feature information indicating the type of palm and the characteristics of veins. The biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a palm vein. In the following, the process illustrated in FIG. 18 will be described in order of step number.

  [Step S71] The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects, from an image captured by the image sensor included in the living body detection unit 142a, the valleys at the bases of the fingers, which serve as direction feature portions, and determines the direction of the hand to be registered based on the detected direction feature portions.

[Step S72] The type determining unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S71.
[Step S73] The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image and acquires a biometric image in which the palm veins are reflected.

Note that the processing order of steps S72 and S73 may be reversed or simultaneous.
[Step S74] Based on the biometric image acquired in step S73, the information generation unit 113 corrects the angle of the biometric image based on the direction determined in step S71, extracts the features of the biometric image, and generates biometric information. Here, the angle of the biometric image is corrected by rotating the biometric image acquired in step S73 in the direction opposite to the angle indicating the direction determined in step S71, so that the palm veins are oriented in the same direction regardless of the angle of the living body with respect to the reading unit 142. The present invention is not limited to this, however, and the angle may instead be corrected on the feature extraction side without rotating the biometric image.

[Step S75] The information generation unit 113 generates biometric feature information including the type determined in step S72, the biometric information generated in step S74, and the user ID.
[Step S76] The information generation unit 113 stores the biometric feature information generated in step S75 in the biometric feature information storage unit 141b. Accordingly, the biometric feature information is registered in the biometric feature information storage unit 141b. Thereafter, the process ends.

  FIG. 19 is a flowchart illustrating biometric feature information authentication processing according to the third modification of the second embodiment. The biometric feature information authentication process is a process of determining the type of the palm to be authenticated, generating verification biometric feature information indicating the type of the palm and the features of its veins, and performing authentication by collating it with biometric feature information registered in advance. The biometric feature information authentication process is executed, for example, when the user authenticates with a palm vein. In the following, the process illustrated in FIG. 19 will be described in order of step number.

  [Step S81] The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects, from an image captured by the image sensor included in the living body detection unit 142a, the valleys at the bases of the fingers, which serve as direction feature portions, and determines the direction of the hand to be authenticated based on the detected direction feature portions.

[Step S82] The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S81.
[Step S83] The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image and acquires a biological image in which the veins of the palm are reflected.

Note that the processing order of steps S82 and S83 may be reversed or simultaneous.
[Step S84] Based on the biometric image acquired in step S83, the information generation unit 113 corrects the angle of the biometric image based on the direction determined in step S81, extracts the features of the biometric image, and generates biometric information. The present invention is not limited to this, however, and the angle may instead be corrected on the feature extraction side without rotating the biometric image.

[Step S85] The information generation unit 113 generates verification biometric feature information including the type determined in step S82 and the biometric information generated in step S84.
[Step S86] The collation unit 114 executes biometric feature information collation processing (described later in FIG. 20) using the collation biometric feature information generated in step S85. Thereafter, the process ends.

  FIG. 20 is a flowchart illustrating the biometric feature information matching process according to the third modification of the second embodiment. The biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance. The biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 20 will be described in order of step number.

[Step S91] The collation unit 114 acquires collation biometric feature information generated by the biometric feature information authentication process.
[Step S92] The collation unit 114 refers to the biometric feature information storage unit 141b and extracts biometric feature information that matches the type of the verification biometric feature information acquired in step S91. When one-to-one matching is performed, the collation unit 114 extracts, from the biometric feature information storage unit 141b, biometric feature information whose type and user ID both match those of the verification biometric feature information.

  [Step S93] The matching unit 114 selects an unselected one of the biometric feature information extracted in step S92, and collates the biometric information included in each of the selected biometric feature information and the matching biometric feature information.

  [Step S94] As a result of the collation in step S93, the collation unit 114 determines whether the collation between the selected biometric feature information and the verification biometric feature information has succeeded. If the collation has succeeded (YES in step S94), the process proceeds to step S95. If it has failed (NO in step S94), the process proceeds to step S96.

[Step S95] The collation unit 114 executes a predetermined process when authentication is successful. Thereafter, the process returns.
[Step S96] The collation unit 114 determines whether all the biometric feature information extracted in step S92 has been selected in step S93. If all have been selected (YES in step S96), the process proceeds to step S97. On the other hand, if there is an unselected item (NO in step S96), the process proceeds to step S93.

[Step S97] The collation unit 114 executes a predetermined process when authentication fails. Thereafter, the process returns.
The third modification of the second embodiment as described above also has the same effect as that of the second embodiment.

  In addition, determining the direction of the living body and extracting the biometric features after correcting the angle based on the determined direction can improve both the efficiency and the accuracy of the feature extraction process.

[Fourth Modification of Second Embodiment]
Next, a fourth modification of the second embodiment will be described. Differences from the second embodiment will be mainly described, and description of similar matters will be omitted. The information processing apparatus according to the fourth modification of the second embodiment is different from the second embodiment in that the biometric feature information includes an angle indicating the direction of the living body instead of the type of the living body.

  Assume, for example, that in the biometric image acquired by the information processing apparatus 100, with the direction of the user's right hand as the reference, the direction of the left hand is rotated by a predetermined angle (for example, 90° clockwise) with respect to the placement direction of the right hand.

  In the fourth modification of the second embodiment, instead of information indicating the type of the living body (for example, whether the veins are those of the left or right palm), the biometric feature information contains an angle: 0° is set for the right hand and a predetermined angle is set for the left hand, and the type of the biometric information is specified by this angle. The predetermined angle can be set arbitrarily.

  FIG. 21 is a diagram illustrating a biometric feature table according to a fourth modification of the second embodiment. The biometric feature table 141b2 illustrated in FIG. 21 is set in the biometric feature information storage unit 141b included in the information processing apparatus 100. The biometric feature table 141b2 is a table that manages biometric feature information used for biometric authentication of the information processing apparatus 100.

  The biometric feature table 141b2 includes “number”, “ID”, “angle”, and “feature data” as items. In the biometric feature table 141b2, values set in the above items are associated with each other as biometric feature information.

  The angle indicates the angle at which the palm veins indicated by the biometric feature information were detected. For the biometric feature information of the veins of the right palm, “0”, indicating the reference angle of 0°, is set. For the biometric feature information of the veins of the left palm, “90” or “180”, indicating a predetermined angle rotated clockwise from the reference angle, is set. As shown in FIG. 21, a plurality of different angles can be set as the predetermined angle: for example, “0”, indicating 0°, is set for the biometric feature information of the right hand, while “90”, indicating 90°, and “180”, indicating 180°, are set for the biometric feature information of the left hand. In this case, biometric feature information whose angle is 0 is that of the right hand. Biometric feature information whose angle is 90 is that of a left hand acquired in a state where the left hand is rotated 90° clockwise with respect to the right hand, that is, where the directions of the two hands are orthogonal. Biometric feature information whose angle is 180 is that of a left hand acquired in a state where the directions of the right hand and the left hand are opposite. As described above, the fourth modification of the second embodiment is applicable even when there are a plurality of angles between the left-hand and right-hand directions.

Note that the biometric feature table 141b2 illustrated in FIG. 21 is an example, and any item can be set in the biometric feature table.
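
  One possible in-memory representation of a row of the biometric feature table 141b2, sketched with illustrative field names; the angle-based narrowing shown here mirrors the collation described later with FIG. 24.

    from dataclasses import dataclass

    @dataclass
    class BiometricFeature:
        number: int          # "number" item
        user_id: str         # "ID" item
        angle: int           # "angle" item: 0 = right hand; 90/180 = left hand
        feature_data: bytes  # "feature data" item

    def candidates_by_angle(table, angle):
        # Keep only the rows registered at the same angle as the probe.
        return [row for row in table if row.angle == angle]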
FIG. 22 is a flowchart illustrating biometric feature information acquisition and registration processing according to the fourth modification of the second embodiment. The biometric feature information acquisition / registration process is a process for determining the direction of the hand to be registered and generating and registering biometric feature information indicating the direction of the hand and the characteristics of the vein. The biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a palm vein. In the following, the process illustrated in FIG. 22 will be described in order of step number.

  [Step S101] The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects, from an image captured by the image sensor included in the living body detection unit 142a, the valleys at the bases of the fingers, which serve as direction feature portions, and determines the direction of the hand to be registered based on the detected direction feature portions.

[Step S102] The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biological image in which the veins of the palm are reflected.
[Step S103] The information generation unit 113 extracts features of the living body based on the biological image acquired in step S102. The information generation unit 113 generates biometric information indicating the feature extraction result.

  [Step S104] The information generation unit 113 generates biometric feature information including the angle indicating the direction of the hand determined in step S101, the biometric information generated in step S103, and the user ID.

  [Step S105] The information generation unit 113 stores the biometric feature information generated in step S104 in the biometric feature information storage unit 141b. Accordingly, the biometric feature information is registered in the biometric feature information storage unit 141b. Thereafter, the process ends.

  FIG. 23 is a flowchart illustrating biometric feature information authentication processing according to the fourth modification of the second embodiment. The biometric feature information authentication process is a process of determining the direction of the hand to be authenticated, generating verification biometric feature information indicating the direction of the hand and the features of its veins, and performing authentication by collating it with biometric feature information registered in advance. The biometric feature information authentication process is executed, for example, when the user authenticates with a palm vein. In the following, the process illustrated in FIG. 23 will be described in order of step number.

  [Step S111] The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects, from an image captured by the image sensor included in the living body detection unit 142a, the valleys at the bases of the fingers, which serve as direction feature portions, and determines the direction of the hand to be authenticated based on the detected direction feature portions.

[Step S112] The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biological image in which the veins of the palm are reflected.
[Step S113] The information generation unit 113 extracts features of the living body based on the living body image acquired in step S112. The information generation unit 113 generates biometric information indicating the feature extraction result.

  [Step S114] The information generation unit 113 generates verification biometric feature information including the angle indicating the direction of the hand determined in step S111 and the biometric information generated in step S113.

  [Step S115] The matching unit 114 performs a biometric feature information matching process (described later in FIG. 24) using the matching biometric feature information generated in step S114. Thereafter, the process ends.

  FIG. 24 is a flowchart illustrating biometric feature information matching processing according to the fourth modification example of the second embodiment. The biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance. The biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 24 will be described in order of step number.

[Step S121] The collation unit 114 acquires collation biometric feature information generated by the biometric feature information authentication process.
[Step S122] The collation unit 114 refers to the biometric feature information storage unit 141b and extracts biometric feature information that matches the angle of the verification biometric feature information acquired in step S121. When one-to-one matching is performed, the collation unit 114 extracts, from the biometric feature information storage unit 141b, biometric feature information whose angle and user ID both match those of the verification biometric feature information.

  [Step S123] The collation unit 114 selects an unselected piece of the biometric feature information extracted in step S122 and collates the biometric information included in the selected biometric feature information against the biometric information included in the verification biometric feature information.

  [Step S124] As a result of the collation in step S123, the collation unit 114 determines whether the collation between the selected biometric feature information and the verification biometric feature information has succeeded. If the collation has succeeded (YES in step S124), the process proceeds to step S125. If it has failed (NO in step S124), the process proceeds to step S126.

[Step S125] The collation unit 114 performs a predetermined process when the authentication is successful. Thereafter, the process returns.
[Step S126] The collation unit 114 determines whether all the biometric feature information extracted in step S122 has been selected in step S123. If all have been selected (YES in step S126), the process proceeds to step S127. On the other hand, if there is an unselected item (NO in step S126), the process proceeds to step S123.

[Step S127] The collation unit 114 executes predetermined processing when authentication fails. Thereafter, the process returns.
The fourth modification of the second embodiment as described above also has the same effect as that of the second embodiment.

  In addition, the direction of the living body with respect to the reading unit 142 is determined, and the features of the living body are extracted and collated using an angle based on the determined direction, so that extraction and collation of biometric feature information can be performed efficiently regardless of the direction in which the hand is placed.

[Fifth Modification of Second Embodiment]
Next, a fifth modification of the second embodiment will be described. Differences from the second embodiment will be mainly described, and description of similar matters will be omitted. The information processing apparatus according to the fifth modification of the second embodiment is different from the second embodiment in that the biological feature information includes an angle indicating the direction of the living body in addition to the type of the living body.

  Assume, for example, that in the biometric image acquired by the information processing apparatus 100, with the direction of the user's right hand as the reference, the direction of the left hand is rotated by a predetermined angle (for example, 90° clockwise) with respect to the placement direction of the right hand.

  In the fifth modification of the second embodiment, in addition to information indicating the type of the living body (for example, whether the veins are those of the left or right palm), the biometric feature information contains an angle, with 0° set for the right hand and a predetermined angle set for the left hand. When new biometric feature information is acquired, biometric feature information of the same type and angle can thereby be updated. The predetermined angle can be set arbitrarily.

  FIG. 25 is a diagram illustrating a biometric feature table of the fifth modification example of the second embodiment. The biometric feature table 141b3 illustrated in FIG. 25 is set in the biometric feature information storage unit 141b included in the information processing apparatus 100. The biometric feature table 141b3 is a table that manages biometric feature information used for biometric authentication of the information processing apparatus 100.

  In the biometric feature table 141b3, “number”, “ID”, “left / right”, “angle”, and “feature data” are provided as items. In the biometric feature table 141b3, values set in the above items are associated with each other as biometric feature information.

The fifth modification example of the second embodiment is also applicable when there are a plurality of types of hand angles for each of the left hand and the right hand.
Note that the biometric feature table 141b3 illustrated in FIG. 25 is an example, and any item can be set in the biometric feature table.

  FIG. 26 is a flowchart illustrating biometric feature information acquisition and registration processing according to the fifth modification of the second embodiment. The biometric feature information acquisition / registration process is a process for determining the direction and type of a hand to be registered, and generating and registering biometric feature information indicating the direction, type, and vein characteristics of the hand. The biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a palm vein. In the following, the process illustrated in FIG. 26 will be described in order of step number.

  [Step S131] The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects, from an image captured by the image sensor included in the living body detection unit 142a, the valleys at the bases of the fingers, which serve as direction feature portions, and determines the direction of the hand to be registered based on the detected direction feature portions.

[Step S132] The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S131.
[Step S133] The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image and acquires a biological image in which the veins of the palm are reflected.

Note that the processing order of steps S132 and S133 may be reversed or simultaneous.
[Step S134] The information generation unit 113 extracts features of biological information based on the biological image acquired in step S133. The information generation unit 113 generates biometric information indicating the feature extraction result.

[Step S135] The information generation unit 113 generates biometric feature information including the type determined in step S132, the biometric information generated in step S134, and the user ID.
[Step S136] The information generation unit 113 determines whether the biometric feature information storage unit 141b already contains biometric feature information of the same user as the biometric feature information generated in step S135, with the same type as determined in step S132 and the same angle, indicating the direction of the hand, as determined in step S131. If biometric feature information of the same user, type, and angle exists (YES in step S136), the process proceeds to step S137. If no such biometric feature information exists (NO in step S136), the process proceeds to step S138. Whether the users are the same may be determined, for example, by accepting input of personal information, such as a user ID or a name from which the ID can be identified, when the biometric feature information acquisition process is performed, and comparing it with the registered ID.

  [Step S137] The information generation unit 113 stores the biometric feature information generated in step S135 in the biometric feature information storage unit 141b. Thereby, the biometric feature information stored in the biometric feature information storage unit 141b is updated. Thereafter, the process ends.

  [Step S138] The information generation unit 113 newly stores the biometric feature information generated in step S135 in the biometric feature information storage unit 141b. Thereby, biometric feature information is newly registered in the biometric feature information storage unit 141b. Thereafter, the process ends.
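
  Steps S136 to S138 amount to an update-or-insert keyed on user ID, type, and angle. A minimal sketch under that reading follows; the dict-based record layout is an assumption for illustration.

    def register(store, new_info):
        # Key on (user ID, type, angle) as in steps S136-S138.
        key = (new_info["id"], new_info["type"], new_info["angle"])
        for i, info in enumerate(store):
            if (info["id"], info["type"], info["angle"]) == key:
                store[i] = new_info   # step S137: update the existing entry
                return "updated"
        store.append(new_info)        # step S138: register as a new entry
        return "registered"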

  FIG. 27 is a flowchart illustrating biometric feature information authentication processing according to the fifth modification of the second embodiment. The biometric feature information authentication process is a process of determining the direction and type of the hand to be authenticated, generating verification biometric feature information indicating the direction and type of the hand and the features of its veins, and performing authentication by collating it with biometric feature information registered in advance. The biometric feature information authentication process is executed, for example, when the user authenticates with a palm vein. In the following, the process illustrated in FIG. 27 will be described in order of step number.

  [Step S141] The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects, from an image captured by the image sensor included in the living body detection unit 142a, the valleys at the bases of the fingers, which serve as direction feature portions, and determines the direction of the hand to be authenticated based on the detected direction feature portions.

[Step S142] The type determining unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S141.
[Step S143] The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image and acquires a biological image in which the veins of the palm are reflected.

Note that the processing order of steps S142 and S143 may be reversed or simultaneous.
[Step S144] The information generation unit 113 extracts features of biological information based on the biological image acquired in step S143. The information generation unit 113 generates biometric information indicating the feature extraction result.

  [Step S145] The information generation unit 113 generates verification biometric feature information based on the angle (hand direction) determined in step S141, the type determined in step S142, and the biometric information extracted in step S144.

  [Step S146] The collation unit 114 executes biometric feature information collation processing (described later in FIG. 28) using the collation biometric feature information generated in step S145. Thereafter, the process ends.

  FIG. 28 is a flowchart illustrating biometric feature information matching processing according to the fifth modified example of the second embodiment. The biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance. The biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 28 will be described in order of step number.

[Step S151] The collation unit 114 acquires collation biometric feature information generated by the biometric feature information authentication process.
[Step S152] The collation unit 114 refers to the biometric feature information storage unit 141b and extracts biometric feature information that matches the angle (hand direction) and type of the verification biometric feature information acquired in step S151. When one-to-one matching is performed, the collation unit 114 extracts, from the biometric feature information storage unit 141b, biometric feature information whose angle, type, and user ID all match those of the verification biometric feature information.

  [Step S153] The collation unit 114 selects an unselected piece of the biometric feature information extracted in step S152 and collates the biometric information included in the selected biometric feature information against the biometric information included in the verification biometric feature information. At this time, if the angle of the selected biometric feature information matches the angle of the verification biometric feature information, the collation unit 114 performs the collation without correcting the angle of the living body. If the angles do not match, the collation is performed after correcting the angle of the living body so that the angles coincide.
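
  The conditional correction of step S153 can be sketched as follows. Here rotate_feature() is a placeholder for whatever angle correction the feature representation requires (a concrete coordinate rotation is sketched under the sixth modification below), and match_score() and the threshold are assumptions.

    def rotate_feature(feature, degrees):
        # Placeholder: rotate the stored representation by `degrees`; see the
        # coordinate-based sketch in the sixth modification below.
        return feature

    def collate_one(selected, verification, match_score, threshold=0.9):
        feature = selected["feature"]
        if selected["angle"] != verification["angle"]:
            # Align the angles before comparing (step S153, mismatch case).
            feature = rotate_feature(
                feature, verification["angle"] - selected["angle"])
        return match_score(feature, verification["feature"]) >= threshold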

  [Step S154] As a result of the collation in step S153, the collation unit 114 determines whether the collation between the selected biometric feature information and the verification biometric feature information has succeeded. If the collation has succeeded (YES in step S154), the process proceeds to step S155. If it has failed (NO in step S154), the process proceeds to step S156.

[Step S155] The collation unit 114 executes a predetermined process when authentication is successful. Thereafter, the process returns.
[Step S156] The collation unit 114 determines whether all the biometric feature information extracted in step S152 has been selected in step S153. If all have been selected (YES in step S156), the process proceeds to step S157. On the other hand, if there is an unselected item (NO in step S156), the process proceeds to step S153.

[Step S157] The collation unit 114 performs a predetermined process when authentication fails. Thereafter, the process returns.
The fifth modification of the second embodiment as described above also has the same effect as that of the second embodiment.

  In addition, biometric feature information of the same type but a different angle is registered as new information, while biometric feature information of the same type and the same angle is updated. Data for a plurality of angles can therefore be kept for each of the left-hand and right-hand directions and managed separately for each angle.

[Sixth Modification of Second Embodiment]
Next, a sixth modification of the second embodiment will be described. Differences from the second embodiment will be mainly described, and description of similar matters will be omitted. The information processing apparatus according to the sixth modification of the second embodiment differs from the second embodiment in that, after the features of the palm veins are extracted, the angle of the living body is corrected before registration.

  Assume, for example, that in the biometric image acquired by the information processing apparatus 100, with the direction of the user's right hand as the reference, the direction of the left hand is rotated by a predetermined angle (for example, 90° clockwise) with respect to the placement direction of the right hand. In such a case, biometric information generated from a biometric image acquired in the right-hand direction and biometric information generated from a biometric image acquired in the left-hand direction, which is rotated by the predetermined angle with respect to the right-hand placement direction, may fail to be collated because their directions differ.

  In the sixth modification of the second embodiment, the biometric feature information of the left hand generated from the biometric image is therefore corrected by rotating it by a predetermined angle (for example, 90° counterclockwise) so that the directions of the biometric feature information are aligned, which makes collation possible. The predetermined angle can be set arbitrarily.

  FIG. 29 is a flowchart illustrating biometric feature information acquisition and registration processing according to the sixth modified example of the second embodiment. The biometric feature information acquisition / registration processing is processing for determining the type of palm to be registered, and generating and registering biometric feature information indicating the type of palm and the characteristics of veins. The biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a palm vein. In the following, the process illustrated in FIG. 29 will be described in order of step number.

  [Step S161] The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects, from an image captured by the image sensor included in the living body detection unit 142a, the valleys at the bases of the fingers, which serve as direction feature portions, and determines the direction of the hand to be registered based on the detected direction feature portions.

[Step S162] The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S161.
[Step S163] The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image and acquires a biological image in which the veins of the palm are reflected.

Note that the processing order of steps S162 and S163 may be reversed or simultaneous.
[Step S164] The information generation unit 113 extracts features of the living body based on the living body image acquired in Step S163. The information generation unit 113 generates biometric information indicating the feature extraction result.

  [Step S165] The information generation unit 113 generates biometric feature information including the type determined in step S162, the biometric information generated in step S164, and the user ID. The information generation unit 113 also corrects the angle of the biometric information contained in the biometric feature information based on the direction determined in step S161. Here, the angle correction converts the biometric information generated in step S164 into the information that would be obtained if it were rotated in the direction opposite to the angle determined in step S161, so that the biometric information contained in the biometric feature information is based on the same direction regardless of the angle of the living body with respect to the reading unit 142.

Note that the information generation unit 113 may correct the angle of the biological information before generating the biological feature information.
[Step S166] The information generation unit 113 stores the biometric feature information generated in step S165 in the biometric feature information storage unit 141b. Accordingly, the biometric feature information is registered in the biometric feature information storage unit 141b. Thereafter, the process ends.
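  The overall flow of FIG. 29 can be summarized by the following sketch, which reuses rotate_features from the sketch above; the record layout and all names are illustrative assumptions, not the patent's implementation:

    from dataclasses import dataclass
    from typing import List, Tuple

    Point = Tuple[float, float]

    @dataclass
    class BiometricFeature:
        number: int          # record number in the biometric feature table
        user_id: str         # "ID" item
        hand: str            # "left/right" item: 'right' or 'left'
        feature_data: List[Point]

    def register(records: List[BiometricFeature], user_id: str,
                 direction_deg: int, raw_points: List[Point]) -> None:
        # S162: derive the hand type from the direction detected in S161
        # (0 degrees is the right-hand reference direction).
        hand = 'right' if direction_deg == 0 else 'left'
        # S165: direction_deg is the clockwise capture angle, so rotating
        # counterclockwise by the same amount restores the reference direction.
        normalized = rotate_features(raw_points, direction_deg)
        # S166: store the record in the biometric feature table.
        records.append(BiometricFeature(len(records) + 1, user_id, hand, normalized))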

  FIG. 30 is a flowchart illustrating the biometric feature information authentication process according to the sixth modification of the second embodiment. The biometric feature information authentication process is a process of determining the type of the palm to be authenticated, generating verification biometric feature information indicating the type of the palm and the vein features, and performing authentication by collating it with the biometric feature information registered in advance. The biometric feature information authentication process is executed, for example, when the user authenticates with a palm vein. In the following, the process illustrated in FIG. 30 will be described in order of step number.

  [Step S171] The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects a valley portion at the base of a finger, which is a direction feature portion, from the image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the hand to be authenticated based on the detected direction feature portion.

[Step S172] The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S171.
[Step S173] The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image and acquires a biological image in which the veins of the palm are reflected.

Note that the processing order of steps S172 and S173 may be reversed or simultaneous.
[Step S174] The information generation unit 113 extracts features of the living body based on the living body image acquired in step S173. The information generation unit 113 generates biometric information indicating the feature extraction result.

  [Step S175] The information generation unit 113 generates verification biometric feature information including the type determined in step S172 and the biometric information generated in step S174. The information generation unit 113 also corrects the angle of the biometric information contained in the verification biometric feature information based on the direction determined in step S171, in the same manner as in step S165 of FIG. 29.

  [Step S176] The matching unit 114 performs biometric feature information matching processing (described later in FIG. 31) using the matching biometric feature information generated in step S175. Thereafter, the process ends.
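  A corresponding sketch of the probe side (FIG. 30), under the same assumptions and reusing rotate_features and Point from the earlier sketches:

    def build_probe(direction_deg: int, raw_points: List[Point]) -> dict:
        # S171-S172: the hand type follows from the detected direction.
        hand = 'right' if direction_deg == 0 else 'left'
        # S175: the same counterclockwise correction as at registration,
        # so the probe and the stored templates share one reference direction.
        return {'hand': hand,
                'feature_data': rotate_features(raw_points, direction_deg)}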

  FIG. 31 is a flowchart illustrating biometric feature information matching processing according to the sixth modification of the second embodiment. The biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance. The biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 31 will be described in order of step number.

[Step S181] The collation unit 114 acquires collation biometric feature information generated by the biometric feature information authentication process.
[Step S182] The collation unit 114 refers to the biometric feature information storage unit 141b, and extracts biometric feature information that matches the type of collation biometric feature information acquired in step S181. When one-to-one matching is performed, the matching unit 114 extracts biometric feature information in which both the type of matching biometric feature information and the user ID match, from the biometric feature information storage unit 141b.

  [Step S183] The collation unit 114 selects one piece of the biometric feature information extracted in step S182, and collates the biometric information included in the selected biometric feature information against the biometric information included in the verification biometric feature information.

  [Step S184] The collation unit 114 determines whether or not the collation between the selected biometric feature information and the collation biometric feature information is successful as a result of the collation in step S183. If the verification is successful (step S184: YES), the process proceeds to step S185. On the other hand, if the verification fails (NO in step S184), the process proceeds to step S186.

[Step S185] The collation unit 114 executes a predetermined process when authentication is successful. Thereafter, the process returns.
[Step S186] The collation unit 114 determines whether all the biometric feature information extracted in step S182 has been selected in step S183. If all have been selected (YES in step S186), the process proceeds to step S187. On the other hand, if there is an unselected one (step S186 NO), the process proceeds to step S183.

[Step S187] The collation unit 114 executes a predetermined process when authentication fails. Thereafter, the process returns.
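  The matching loop of FIG. 31 narrows the candidates by type before comparing feature data. A minimal sketch, continuing the assumptions above (the similarity function and threshold are invented placeholders, not the patent's matcher):

    MATCH_THRESHOLD = 0.9  # hypothetical acceptance threshold

    def similarity(a: List[Point], b: List[Point]) -> float:
        # Placeholder score: fraction of exactly shared points. A real
        # system would use a dedicated vein-pattern matcher.
        return len(set(a) & set(b)) / max(len(a), 1)

    def authenticate(records: List[BiometricFeature], probe: dict):
        # S182: restrict the candidates to templates of the same type.
        candidates = [r for r in records if r.hand == probe['hand']]
        for rec in candidates:  # S183: select and collate one by one
            if similarity(rec.feature_data, probe['feature_data']) >= MATCH_THRESHOLD:
                return rec.user_id  # S184 YES -> S185: authentication success
        return None  # every candidate failed -> S187: authentication failure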
The sixth modification of the second embodiment as described above also has the same effect as that of the second embodiment.

  In addition, by determining the direction of the living body with respect to the reading unit 142 and correcting and registering the biometric information at an angle based on the determined direction, the efficiency of the feature matching process can be improved and the accuracy of feature matching can be increased.

[Seventh Modification of Second Embodiment]
Next, a seventh modification of the second embodiment will be described. Differences from the second embodiment will be mainly described, and description of similar matters will be omitted. The information processing apparatus according to the seventh modification of the second embodiment differs from the second embodiment in that the directions of the left and right living bodies (hands) are opposite to each other (rotated by 180°).

  FIG. 32 is a diagram illustrating an appearance of an information processing apparatus according to a seventh modification example of the second embodiment. An information processing apparatus 300 illustrated in FIG. 32 is a notebook personal computer to which a security function based on biometric authentication using a palm vein is added. The information processing apparatus 300 includes a display unit 120 having an LCD 121, a keyboard 131, and a main body unit 330 having a reading unit 342.

  Each of the display unit 120 and the main body unit 330 has a substantially rectangular parallelepiped housing having a front surface, a rear surface facing the front surface, and two side surfaces connecting them. The display unit 120 and the main body 330 are connected to each other near the rear surface of the main body 330 so as to be opened and closed by a hinge (not shown). When the display unit 120 and the main body unit 330 are in the closed state, the appearance of the information processing apparatus 300 is a substantially rectangular parallelepiped as a whole.

  The LCD 121 is a display device having a display screen for displaying characters or images. In addition to the LCD, other thin display devices such as an organic EL display may be used as the display device. The keyboard 131 is an input device for inputting characters and performing other operations.

  The reading unit 342 is an input device for inputting a biological image by reading the veins of a palm held over it by the user. The reading unit 342 includes a square vein sensor that acquires a biological image of the palm veins by reading the veins of the user's palm. The vein sensor is arranged so that each of its sides is parallel to a side of the reading unit 342. The reading unit 342 is arranged on the same upper surface of the main body 330 as the keyboard 131 of the information processing apparatus 300, at the front center of the keyboard 131, with each side of the square vein sensor parallel to the front and side surfaces of the information processing apparatus 300.

  Similar to the above embodiments, the vein sensor reads vein information by scanning the reading object. The main scanning direction of the vein sensor is parallel to one side of the square vein sensor, and therefore parallel to one side of the reading unit 342. In this modification, as an example, the main scanning direction of the vein sensor is the direction D31 in FIG. 32. The vein sensor is arranged so that the angle formed by the main scanning direction D31 and the direction D32 in which the front surface 330a of the main body 330 extends is 90°.

  For example, the vein sensor may be arranged so that the angle formed between the main scanning direction D31 and any one of the direction D32 in which the front surface 330a of the main body 330 extends, the arrangement direction D33 of the operation keys on the keyboard 131 (the longitudinal direction of the keyboard 131), and the main scanning direction D34 of the LCD 121 is 90°.

  Although the information processing apparatus 300 of the present modification has been described as a notebook personal computer, it is merely an example of an information processing apparatus. The user authentication function of the present modification can also be applied to mobile phones, PDAs and other mobile communication terminal devices, desktop personal computers, automatic teller machines for accepting deposits and dispensing withdrawals at banks, terminal devices of information processing systems, and the like.

  FIG. 33 is a diagram illustrating a biometric feature table according to a seventh modification example of the second embodiment. The biometric feature table 141b4 illustrated in FIG. 33 is set in the biometric feature information storage unit 141b included in the information processing apparatus 300 according to the seventh modification example of the second embodiment. The biometric feature table 141b4 is a table for managing biometric feature information used for biometric authentication of the information processing apparatus 300.

  The biometric feature table 141b4 is provided with “number”, “ID”, “left / right”, “angle”, and “feature data” as items. In the biometric feature table 141b4, values set in the above items are associated with each other as biometric feature information.

  As described above, the angle indicates the angle at which the palm veins indicated by the biometric feature information were detected. For the biometric feature information of the veins of the right palm, “0”, indicating the reference angle of 0°, is set. For the biometric feature information of the veins of the left palm, “90” or “180”, indicating a predetermined angle rotated clockwise from the reference angle, is set. Here, as shown in FIG. 33, a plurality of different angles can be set as the predetermined angle. In the seventh modification of the second embodiment, the direction of the right-hand biometric feature information and the direction of the left-hand biometric feature information are opposite (rotated by 180°). By means of the angle item, left-hand biometric feature information whose direction is opposite to that of the right hand (rotated by 180°) can be distinguished from left-hand biometric feature information whose direction is orthogonal to it (rotated by 90°). As described above, the seventh modification of the second embodiment is applicable even when there are a plurality of kinds of angles between the left-hand direction and the right-hand direction.

Note that the biometric feature table 141b4 illustrated in FIG. 33 is an example, and any item can be set in the biometric feature table.
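For illustration only, such a table might be populated as follows (the row values are invented; only the item layout follows FIG. 33):

    table_141b4 = [
        {'number': 1, 'id': 'user01', 'left_right': 'right', 'angle': 0,   'feature_data': []},
        {'number': 2, 'id': 'user01', 'left_right': 'left',  'angle': 180, 'feature_data': []},
        {'number': 3, 'id': 'user02', 'left_right': 'left',  'angle': 90,  'feature_data': []},
    ]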
FIG. 34 is a diagram illustrating a state when the veins of the palm of the right hand are read according to the seventh modification of the second embodiment, as viewed from above. As illustrated in FIG. 34, the information processing apparatus 300 includes a display unit 120 and a main body 330. A keyboard 131 and a reading unit 342 are provided on the upper surface of the main body 330 of the information processing apparatus 300. The reading unit 342 is arranged on the same upper surface of the main body 330 as the keyboard 131, at the center in front of the keyboard 131, with each side of the vein sensor parallel to the front and side surfaces of the information processing apparatus 300. Also shown are the user's head 201, torso 202, right upper arm 203, right lower arm 204, and palm 205 of the right hand.

  When the user causes the reading unit 342 to read the veins of the palm, as shown in FIG. 34, the user positions the palm whose veins are to be read (for example, the palm 205 of the right hand) so that it is parallel to the front surface of the information processing apparatus 300 and parallel to the upper surface of the main body 330. At this time, the user positions the palm 205 of the right hand, with the fingers open, in the space separated from the vein sensor surface by a certain distance (for example, several centimeters) so that the center of the palm coincides with the center of the reading unit 342. The user does not need to bend the wrist between the palm 205 of the right hand and the right lower arm 204 at the time of reading, and can keep it almost straight. Accordingly, each finger of the user's right hand is straightened and opened sufficiently, and the four valleys between the fingers of the right hand are sufficiently spaced. Therefore, the palm 205 of the right hand has no horizontal twist, and a correct image can be obtained quickly and reliably. Accordingly, correct features can be detected quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.

  In addition, by using the valleys between the fingers to detect that the palm 205 of the right hand is placed over the reading unit 342, the left and right hands can be determined quickly and reliably. Furthermore, no unreasonable posture is forced on the user's right wrist, on the right lower arm 204 and right upper arm 203, on the elbow between them, or on the right shoulder between the right upper arm 203 and the torso 202, so the user's burden can be reduced. Although FIG. 34 illustrates the case where the veins of the palm of the user's right hand are read, the present invention is not limited to this, and the same applies to reading the veins of the palm of the left hand.

  FIG. 35 is a diagram illustrating detection of the direction feature portion of the hand according to the seventh modification of the second embodiment. FIG. 35 shows the relationship between the direction of the hand and the direction feature portions in the information processing apparatus 300 according to the seventh modification of the second embodiment. FIG. 35A shows an acquired image 3421 of the palm of the right hand. FIG. 35B shows an acquired image 3422 of the palm of the left hand. The acquired images 3421 and 3422 are images acquired by the living body detection unit 142a of the reading unit 342, and are captured, for example, by an image sensor provided in the living body detection unit 142a.

  As shown in FIG. 35, in the image acquired by the living body detection unit 142a, a right-hand detection rectangular image region 3420a for detecting the valley portions at the bases of the fingers of the right palm and a left-hand detection rectangular image region 3420b for detecting the valley portions at the bases of the fingers of the left palm are set. The right-hand detection rectangular image region 3420a and the left-hand detection rectangular image region 3420b are provided along two parallel sides of the acquired image. The valley portions at the bases of the fingers function as direction feature portions.

  When the reading unit 342 detects the palm of a user positioned above it with the distance sensor (not shown) of the living body detection unit 142a, the reading unit 342 acquires a palm image with the image sensor (not shown) of the living body detection unit 142a and supplies it to the information acquisition unit 111. The information acquisition unit 111 determines whether a valley portion at the base of a finger is detected in the right-hand detection rectangular image region 3420a or the left-hand detection rectangular image region 3420b set in the acquired image. The palm image acquisition by the image sensor here is performed without irradiation with near-infrared light from the light source unit 142c; therefore, the image acquired here is not an image of the palm veins but an image of the appearance of the palm. The detection of the valley portions at the bases of the fingers is performed at the positions where the right-hand detection rectangular image region 3420a and the left-hand detection rectangular image region 3420b are set in the image of the appearance of the palm.

  When the image of the appearance of the palm is acquired by the living body detection unit 142a as described above, the information acquisition unit 111 determines whether a direction feature portion, that is, a valley portion at the base of a finger, exists in the right-hand detection rectangular image region 3420a or the left-hand detection rectangular image region 3420b. For example, in the acquired image 3421 shown in FIG. 35A, the direction feature portion 3421a1 exists in the right-hand detection rectangular image region 3420a but not in the left-hand detection rectangular image region 3420b, so it is determined that the palm is located at an angle of “0°”. As a result, the information processing apparatus 300 determines that the acquired image is an image of the veins of the palm of the right hand.

  Also, for example, in the acquired image 3422 shown in FIG. 35B, the direction feature portion 3422b1 exists in the left-hand detection rectangular image region 3420b but not in the right-hand detection rectangular image region 3420a, so it is determined that the palm is located at an angle of “180°”. Thereby, the information processing apparatus 300 determines that the acquired image is an image of the veins of the palm of the left hand.

  In this way, when the user's palm is detected, the reading unit 342 acquires an image of the direction feature portion in the palm. The information acquisition unit 111 determines the angle of the palm based on the presence or absence of the direction feature portion of the right-hand detection rectangular image region or the left-hand detection rectangular image region. The type determination unit 112 determines the type of the palm arranged on the reading unit 342 (one of the left and right hands) based on the determined palm angle.

Note that the acquired image for detecting the direction feature portion illustrated in FIGS. 35A and 35B may be an image captured by the imaging unit 142b.
The direction feature portions in the seventh modification of the second embodiment, that is, the valley portions at the bases of the fingers (for example, the valley between the index finger and the middle finger, the valley between the middle finger and the ring finger, and the valley between the ring finger and the little finger), may be detected based on all, or at least some combination, of these valleys being open at predetermined intervals. For example, the detection of the valley portions at the bases of the fingers may be realized by identifying the position of the base of each finger in the palm image acquired by the living body detection unit 142a and the contour between the bases of the fingers.
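A schematic Python sketch of this direction determination (region coordinates and names are assumptions for illustration):

    def point_in_rect(p, rect):
        # rect = (x0, y0, x1, y1) with x0 <= x1 and y0 <= y1.
        x, y = p
        x0, y0, x1, y1 = rect
        return x0 <= x <= x1 and y0 <= y <= y1

    def detect_palm_angle(valleys, right_region, left_region):
        # FIG. 35: valleys inside the right-hand detection region mean the
        # palm is at 0 degrees (right hand); valleys inside the left-hand
        # region mean 180 degrees (left hand).
        if valleys and all(point_in_rect(v, right_region) for v in valleys):
            return 0
        if valleys and all(point_in_rect(v, left_region) for v in valleys):
            return 180
        return None  # no direction feature found; prompt the user to retry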

The seventh modification of the second embodiment as described above also has the same effect as that of the second embodiment.
[Eighth Modification of Second Embodiment]
Next, an eighth modification of the second embodiment will be described. Differences from the second embodiment will be mainly described, and description of similar matters will be omitted. The information processing apparatus according to the eighth modification of the second embodiment differs from the second embodiment in that the position of the reading unit is offset from the center line of the main body, and in that the directions of the left and right living bodies (hands) intersect each side of the main body at a predetermined angle that is neither orthogonal nor parallel.

  FIG. 36 is a diagram illustrating an appearance of an information processing apparatus according to an eighth modification of the second embodiment. An information processing apparatus 400 illustrated in FIG. 36 is a notebook personal computer to which a security function based on biometric authentication using a palm vein is added. The information processing apparatus 400 includes a display unit 120 having an LCD 121, a keyboard 131, and a main body unit 430 having a reading unit 442.

  The LCD 121 is a display device having a display screen for displaying characters or images. In addition to the LCD, other thin display devices such as an organic EL display may be used as the display device. The keyboard 131 is an input device for inputting characters and performing other operations.

  The reading unit 442 is an input device for inputting a biological image by reading the veins of a palm held over it by the user. The reading unit 442 includes a square vein sensor that acquires a biological image of the palm veins by reading the veins of the user's palm. The vein sensor is arranged so that each of its sides is parallel to a side of the reading unit 442. The reading unit 442 is arranged on the same upper surface of the main body 430 as the keyboard 131 of the information processing apparatus 400, in front of the keyboard 131 and on the left side, with one side of the square vein sensor inclined with respect to the side surface of the main body 430 and the other side inclined at 65° with respect to the front surface of the main body 430. In the information processing apparatus 400 according to the eighth modification of the second embodiment, the reading unit 442 is provided on the left side of the upper surface of the main body 430, but it may be provided at another position, and it may be provided at an arbitrary angle with respect to the front surface and the side surfaces of the main body 430.

  Similar to the above embodiments, the vein sensor reads vein information by scanning the reading object. The main scanning direction of the vein sensor is parallel to one side of the square vein sensor, and therefore parallel to one side of the reading unit 442. In this modification, as an example, the main scanning direction of the vein sensor is the direction D41 in FIG. 36. The vein sensor is arranged so that the angle formed between the main scanning direction D41 and the direction D42 in which the front surface 430a of the main body 430 extends is 65°.

  For example, the vein sensor may be arranged so that the angle formed between the main scanning direction D41 and any one of the direction D42 in which the front surface 430a of the main body 430 extends, the arrangement direction D43 of the operation keys on the keyboard 131 (the longitudinal direction of the keyboard 131), and the main scanning direction D44 of the LCD 121 is 65°.

  Although the information processing apparatus 400 of the present modification has been described as a notebook personal computer, it is merely an example of an information processing apparatus. The user authentication function of the present modification can also be applied to mobile phones, PDAs and other mobile communication terminal devices, desktop personal computers, automatic teller machines for accepting deposits and dispensing withdrawals at banks, terminal devices of information processing systems, and the like.

  FIG. 37 is a diagram illustrating a state when the vein of the palm of the right hand is read according to the eighth modification example of the second embodiment. FIG. 37 is a diagram of the state when the information processing apparatus 400 reads the vein of the palm of the right hand as viewed from above. As illustrated in FIG. 37, the information processing apparatus 400 includes a display unit 120 and a main body unit 430. A keyboard 131 and a reading unit 442 are provided on the upper surface of the main body 430 of the information processing apparatus 400. In addition, a user's head 201, torso 202, upper right arm 203, right lower arm 204, and right hand palm 205 are shown.

  When the user causes the reading unit 442 to read the veins of the palm 205 of the right hand, as shown in FIG. 37, the user positions the palm 205 of the right hand at the predetermined angle with respect to the front surface of the information processing apparatus 400 and parallel to the upper surface of the main body 430. At this time, the user positions the palm 205 of the right hand, with the fingers open, in the space separated from the vein sensor surface by a certain distance (for example, several centimeters) so that the center of the palm coincides with the center of the reading unit 442. As shown in FIG. 37, since the reading unit 442 is installed on the left side of the main body 430, the user's right arm (the right upper arm 203 and the right lower arm 204) is bent toward the torso 202 and located nearer to the torso 202 than in the second embodiment. The user does not need to bend the wrist between the palm 205 of the right hand and the right lower arm 204 at the time of reading, and can keep it almost straight. Accordingly, each finger of the user's right hand is straightened and opened sufficiently, and the four valleys between the fingers of the right hand are sufficiently spaced. Therefore, the palm 205 of the right hand has no horizontal twist, and a correct image can be obtained quickly and reliably. Accordingly, correct features can be detected quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.

  Moreover, by using the valleys between the fingers to detect the palm 205 of the right hand with the vein sensor, the detection can be performed without any problem. Furthermore, no unreasonable posture is forced on the user's right wrist, on the right lower arm 204 and right upper arm 203, on the elbow between them, or on the right shoulder between the right upper arm 203 and the torso 202, so the user's burden can be reduced.

  FIG. 38 is a diagram illustrating a state when the vein of the palm of the left hand is read according to the eighth modification example of the second embodiment. FIG. 38 is a diagram of the state when the information processing apparatus 400 reads the vein of the palm of the left hand as viewed from above. As illustrated in FIG. 38, the information processing apparatus 400 includes a display unit 120 and a main body unit 430. A keyboard 131 and a reading unit 442 are provided on the upper surface of the main body 430 of the information processing apparatus 400. In addition, a user's head 201, torso 202, left upper arm 206, left lower arm 207, and left hand palm 208 are shown.

  When the user causes the reading unit 442 to read the veins of the palm 208 of the left hand, as shown in FIG. 38, the user positions the palm 208 of the left hand at the predetermined angle with respect to the front surface of the information processing apparatus 400 and parallel to the upper surface of the main body 430. At this time, the user positions the palm 208 of the left hand, with the fingers open, in the space separated from the vein sensor surface by a certain distance (for example, several centimeters) so that the center of the palm coincides with the center of the reading unit 442. As shown in FIG. 38, since the reading unit 442 is installed on the left side of the main body 430, the user's left arm (the left upper arm 206 and the left lower arm 207) is extended and located farther from the torso 202 than in the second embodiment or in the case of the right palm described above with reference to FIG. 37. The user does not need to twist the wrist between the palm 208 of the left hand and the left lower arm 207 at the time of reading, and can keep it almost straight. Accordingly, each finger of the user's left hand is straightened and opened sufficiently, and the four valleys between the fingers of the left hand are sufficiently spaced. Therefore, the palm 208 of the left hand has no horizontal twist, and a correct image can be obtained quickly and reliably. Accordingly, correct features can be detected quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.

  Further, by using the valleys between the fingers to detect that the palm 208 of the left hand is placed over the reading unit 442, the left and right hands can be determined quickly and reliably. Furthermore, no unreasonable posture is forced on the user's left wrist, on the left lower arm 207 and left upper arm 206, on the elbow between them, or on the left shoulder between the left upper arm 206 and the torso 202, so the user's burden can be reduced.

The eighth modification of the second embodiment as described above also provides the same effects as those of the second embodiment.
[Third Embodiment]
Next, a third embodiment will be described. Differences from the second embodiment will be mainly described, and description of similar matters will be omitted. The information processing apparatus according to the third embodiment is different from the second embodiment in that the biological information is feature information about the finger vein.

  FIG. 39 is a block diagram illustrating an information processing apparatus according to the third embodiment. An information processing apparatus 500 according to the third embodiment includes an information acquisition unit 511, a type determination unit 512, an information generation unit 513, a collation unit 514, and a biometric feature information storage unit 541b. In addition, a reading unit 542 is connected to the information acquisition unit 511.

  The information acquisition unit 511 acquires a biometric image of a person to be authenticated, such as a user of the information processing apparatus 500. The information acquisition unit 511 can also acquire the direction of the living body at the time the biological image is acquired. The biological image acquired by the information acquisition unit 511 is image information of a finger vein pattern. The direction of the living body is one of two mutually orthogonal directions depending on whether the hand is the left or the right. In addition, the information processing apparatus 500 includes a guide that indicates the direction of the finger to be read. Details of the guide will be described later.

  The reading unit 542 is fixed to the upper part of the information processing apparatus 500. The reading unit 542 includes the living body detection unit 142a, the imaging unit 142b, and the light source unit 142c illustrated in FIG. Based on the detection result by the living body detection unit 142a of the reading unit 542, the information acquisition unit 511 determines that the living body is arranged at a predetermined distance from the reading unit 542, and determines the direction of the living body with respect to the reading unit 542. The information acquisition unit 511 determines the direction of the living body by determining the position of the direction feature portion in the living body from the image obtained by the living body detection unit 142a of the reading unit 542. In the third embodiment, the two directions are orthogonal to each other, and each forms an oblique angle with the keyboard 131. The direction feature portion is a valley portion at the base of a finger of the palm.

Further, the information acquisition unit 511 acquires a biological image obtained by imaging a living body by the imaging unit 142b of the reading unit 542.
The information acquisition unit 511 receives an input from the user of the type of finger (thumb, index finger, middle finger, ring finger, or little finger) from which the vein biometric information is to be acquired.

The type determination unit 512 determines the type of biometric information based on the direction of the biometric acquired by the information acquisition unit 511. The type indicates the left and right of the hand that is the basis for generating biometric information.
The information generation unit 513 generates biometric information indicating the features of the living body based on the biological image acquired by the information acquisition unit 511. The information generation unit 513 generates verification biometric feature information including the generated biometric information and the type determined by the type determination unit 512; this represents the biometric information and type of the user to be collated for authentication. The information generation unit 513 also generates biometric feature information including, for example, the generated biometric information, the type determined by the type determination unit 512, the type of finger (thumb, index finger, middle finger, ring finger, or little finger) received by the information acquisition unit 511, and identification information identifying the individual corresponding to the biometric information, and stores it in the biometric feature information storage unit 541b. In this way, the biometric information, type, and finger type of a duly authorized user are registered in advance for use in authentication. When registering the user's biometric information, the information generation unit 513 stores the generated biometric feature information in the biometric feature information storage unit 541b. At the time of authentication using the user's biometric information, the collation unit 514 performs authentication using the verification biometric feature information generated by the information generation unit 513.

  The collation unit 514 extracts biometric feature information whose type matches the collation biometric feature information, and collates based on the biometric information of the collation biometric feature information and the extracted biometric feature information. Thereby, the information processing apparatus 500 performs biometric authentication of the user based on the collation result. Since collation is performed by limiting the collation targets to those of the same type, an increase in time and load required for the collation process can be suppressed.

  The biometric feature information storage unit 541b stores biometric feature information indicating the biometric information, the type of biometric information, and the type of finger. As a result, the user's biometric information, type, and finger type are stored in association with each other.

FIG. 40 is a diagram illustrating the reading unit according to the third embodiment. The reading unit 542 illustrated in FIG. 40 is an input device for inputting a biological image by reading the veins of a user's finger.
The reading unit 542 includes a vein sensor that acquires a biological image of finger veins by reading the veins of the user's finger. The reading unit 542 has a square shape and is inclined by 45°. A cross-shaped guide 542a, whose sides (side A and side B) are parallel to the sides of the reading unit 542, is provided on the upper surface of the reading unit 542. Side A of the guide 542a is inclined 45° to the left when viewed from the front of the information processing apparatus 500. Side B of the guide 542a is inclined 45° to the right when viewed from the front of the information processing apparatus 500.

  The user positions the finger from which a biological image is to be acquired along one of the arms of the cross of the guide 542a. That is, when having the information processing apparatus 500 acquire a biological image of the veins of a finger of the right hand, the user positions that finger along side A; when having it acquire a biological image of the veins of a finger of the left hand, the user positions that finger along side B. This allows biometric images of both right-hand and left-hand fingers to be acquired with the user in a natural and easy posture, and also allows the information processing apparatus 500 to determine whether the finger to be acquired is a finger of the right hand or of the left hand.

  Since finger veins are information inside the body, they are difficult to counterfeit, and they are not affected by the condition of the body surface such as rough, dry, or wet hands, so they are applicable to a high proportion of users. Furthermore, since reading is contactless, it is hygienic, offers natural operability, meets little user resistance, and enables high-speed authentication. Note that the reading unit 542 may read a fingerprint instead. The reading unit 542 is not limited to a square shape and may have any shape.

  In the information processing apparatus 500 according to the third embodiment, it is automatically determined whether the vein is that of the finger of the right hand or the finger of the left hand. On the other hand, the distinction between the thumb, the index finger, the middle finger, the ring finger, and the little finger of each of the right hand and the left hand is determined based on a user input when obtaining biometric information.

  Further, as will be described in detail later with reference to FIG. 43, the information processing apparatus 500 determines whether the finger to be acquired is a finger of the right hand or of the left hand based on the side along which the valley portions at the bases of the fingers, which are the direction feature portions, are detected. In addition, the information processing apparatus 500 acquires a biological image of the finger veins in response to the user positioning the finger to be acquired in a space separated from the vein sensor surface by a certain distance (for example, several centimeters).

  FIG. 41 is a diagram illustrating a biometric feature table according to the third embodiment. The biometric feature table 541b1 illustrated in FIG. 41 is set in the biometric feature information storage unit 541b included in the information processing apparatus 500 according to the third embodiment. The biometric feature table 541b1 is a table for managing biometric feature information used for biometric authentication of the information processing apparatus 500.

  The biometric feature table 541b1 includes “number”, “ID”, “left / right”, “finger type”, and “feature data” as items. In the biometric feature table 541b1, values set in the above items are associated with each other as biometric feature information.

  The left and right indicate whether the biometric feature information is for the right hand or the left hand. “Right” is set for the biometric feature information of the vein of any finger of the right hand. “Left” is set for the biometric feature information of the vein of any finger of the left hand.

  The finger type indicates the type of finger indicated by the biometric feature information. “1” is set for the biometric feature information of a thumb vein, “2” for an index finger vein, “3” for a middle finger vein, “4” for a ring finger vein, and “5” for a little finger vein. The left/right item and the finger type item together indicate the type in the biometric feature information.

  The biometric feature table 541b1 shown in FIG. 41 is an example, and any item can be set in the biometric feature table. For example, in addition to the left/right item, an angle item expressing left and right as angles (for example, “0°” for the right hand and “90°” for the left hand) may be provided. Alternatively, an angle item may be provided instead of the left/right item.
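  For illustration, the finger type codes and one record might look as follows (values are invented; only the item layout follows FIG. 41):

    FINGER_TYPES = {1: 'thumb', 2: 'index finger', 3: 'middle finger',
                    4: 'ring finger', 5: 'little finger'}

    # Left/right plus the finger code form the type used to narrow
    # the candidate templates during matching.
    record_541b1 = {'number': 1, 'id': 'user01', 'left_right': 'right',
                    'finger_type': 2, 'feature_data': []}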

  FIG. 42 is a diagram illustrating a state when the vein of the index finger of the right hand is read according to the third embodiment, as viewed from above. As illustrated in FIG. 42, the information processing apparatus 500 includes a display unit 120 and a main body 530. A keyboard 131 and a reading unit 542 are provided on the upper surface of the main body 530 of the information processing apparatus 500. The reading unit 542 is arranged on the same upper surface of the main body 530 as the keyboard 131, at the front center of the keyboard 131, with each side of the square vein sensor bearing the cross-shaped guide 542a at an angle of 45° to the front and side surfaces of the information processing apparatus 500. Also shown are the user's head 201, torso 202, right upper arm 203, right lower arm 204, and palm 205 of the right hand, together with the thumb 205a, index finger 205b, middle finger 205c, ring finger 205d, and little finger 205e of the right hand.

  When the user causes the reading unit 542 to read the veins of a finger, as illustrated in FIG. 42, the user positions the finger whose veins are to be read (for example, the index finger 205b of the right hand) inclined 45° to the left as viewed from the front of the information processing apparatus 500 and parallel to the upper surface of the main body 530. At this time, the user opens the fingers of the right hand so that the center line of the index finger 205b to be read coincides with the center line of one arm of the cross of the reading unit 542 (dotted line A-A in FIG. 40), and positions the finger in a space separated from the vein sensor surface by a certain distance (for example, several centimeters). The user does not need to bend the wrist between the palm 205 of the right hand and the right lower arm 204 at the time of reading, and can keep it almost straight. Accordingly, each finger of the user's right hand is straightened and opened sufficiently, and the four valleys between the fingers of the right hand are sufficiently spaced. Therefore, the palm 205 of the right hand has no horizontal twist, and a correct image of the direction feature portions can be obtained quickly and reliably. Accordingly, the right hand or the left hand can be determined quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.

  Furthermore, no unreasonable posture is forced on the user's right wrist, on the right lower arm 204 and right upper arm 203, on the elbow between them, or on the right shoulder between the right upper arm 203 and the torso 202, so the user's burden can be reduced. Although FIG. 42 illustrates the case where the vein of the index finger of the user's right hand is read, the present invention is not limited to this; the same applies to reading the veins of fingers other than the index finger of the right hand and of the fingers of the left hand, and the description thereof is omitted.

  In addition, since each finger of the user's right hand, together with the palm 205 of the right hand, is located in a space separated from the vein sensor surface by a certain distance, erroneous operations caused by a finger or the palm touching the keyboard 131 or another operation unit can be suppressed.

  FIG. 43 is a diagram illustrating detection of the direction feature portion of the finger according to the third embodiment. FIG. 43 shows the left/right of the hand and the direction feature portions of the living body in the information processing apparatus 500 according to the third embodiment. FIG. 43A shows an acquired image 5421 of a finger of the right hand. FIG. 43B shows an acquired image 5422 of a finger of the left hand.

  The acquired images 5421 and 5422 are images acquired by the living body detection unit 142a of the reading unit 542. The acquired images 5421 and 5422 are captured by, for example, an image sensor included in the living body detection unit 142a. In the acquired images 5421 and 5422, the upper side in FIG. 43 is the back side when viewed from the front of the information processing apparatus 500, the lower side in FIG. 43 is the front side when viewed from the front of the information processing apparatus 500, and the right side in FIG. 43 is the right side when viewed from the front of the information processing apparatus 500, and the left side in FIG. 43 is the left side when viewed from the front of the information processing apparatus 500.

  As shown in FIG. 43, in the image acquired by the living body detection unit 142a, a right-hand detection rectangular image region 5420a for detecting the valley portions at the bases of the fingers of the right hand is set along the lower right side, and a left-hand detection rectangular image region 5420b for detecting the valley portions at the bases of the fingers of the left hand is set along the lower left side. The right-hand detection rectangular image region 5420a and the left-hand detection rectangular image region 5420b are provided along two mutually orthogonal sides of the acquired image. The valley portions at the bases of the fingers function as direction feature portions.

  When the reading unit 542 detects a finger of the user's hand located above it with the distance sensor (not shown) of the living body detection unit 142a, the reading unit 542 acquires an image of the finger with the image sensor (not shown) of the living body detection unit 142a and supplies it to the information acquisition unit 511. The information acquisition unit 511 determines whether a valley portion at the base of a finger is detected in the right-hand detection rectangular image region 5420a or the left-hand detection rectangular image region 5420b set in the acquired image. The image acquisition by the image sensor here is performed without irradiation with near-infrared light from the light source unit 142c; therefore, the image acquired here is not an image of the finger veins but an image of the appearance of the finger. The detection of the valley portions at the bases of the fingers is performed at the positions where the right-hand detection rectangular image region 5420a and the left-hand detection rectangular image region 5420b are set in the image of the appearance of the finger.

  When the image of the appearance of the finger is acquired by the living body detection unit 142a as described above, the information acquisition unit 511 determines whether a direction feature portion, that is, a valley portion at the base of a finger, exists in the right-hand detection rectangular image region 5420a or the left-hand detection rectangular image region 5420b. For example, in the acquired image 5421 shown in FIG. 43A, the direction feature portion 5421a1 exists in the right-hand detection rectangular image region 5420a but not in the left-hand detection rectangular image region 5420b, so it is determined that the finger is located at an angle of “0°”. Thereby, the information processing apparatus 500 determines that the acquired image is an image of a finger of the right hand.

  Also, for example, in the acquired image 5422 shown in FIG. 43B, the direction feature portion 5422b1 exists in the left-hand detection rectangular image region 5420b but not in the right-hand detection rectangular image region 5420a, so it is determined that the finger is located at an angle of “90°” clockwise. As a result, the information processing apparatus 500 determines that the acquired image is an image of a finger of the left hand.

  In this manner, when a finger of the user's hand is detected, the reading unit 542 acquires an image of the direction feature portions at the bases of the fingers. The information acquisition unit 511 determines the angle of the hand based on the presence or absence of a direction feature portion in the right-hand detection rectangular image region or the left-hand detection rectangular image region, and determines the type of the finger (a finger of the right hand or of the left hand) in the acquired image based on the determined angle. In the following description of the third embodiment, the information indicated by the finger type is referred to as the “left/right of the hand”.

Note that the acquired image for detecting the direction feature portion illustrated in FIGS. 43A and 43B may be an image captured by the imaging unit 142b.
The direction feature portions in the third embodiment, that is, the valley portions at the bases of the fingers (for example, the valley between the index finger and the middle finger, the valley between the middle finger and the ring finger, and the valley between the ring finger and the little finger), may be detected based on all, or at least some combination, of these valleys being open at predetermined intervals. For example, the detection of the valley portions at the bases of the fingers may be realized by identifying the position of the base of each finger in the finger image acquired by the living body detection unit 142a and the contour between the bases of the fingers.
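A minimal sketch of this spacing-based valley detection, assuming the finger-base positions have already been located by contour identification (all names are illustrative):

    import math

    def find_valleys(finger_bases, min_gap, max_gap):
        # finger_bases: (x, y) positions of the finger bases, ordered
        # across the hand. A valley is accepted between two neighbouring
        # bases whose distance falls within the expected interval.
        valleys = []
        for (x0, y0), (x1, y1) in zip(finger_bases, finger_bases[1:]):
            gap = math.hypot(x1 - x0, y1 - y0)
            if min_gap <= gap <= max_gap:
                valleys.append(((x0 + x1) / 2, (y0 + y1) / 2))  # valley midpoint
        return valleys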

  FIG. 44 is a flowchart illustrating the biometric feature information acquisition and registration process according to the third embodiment. The biometric feature information acquisition and registration process is a process of determining the type of the finger to be registered and of generating and registering biometric feature information indicating the type (left/right of the hand), the finger type, and the vein features. The biometric feature information acquisition and registration process is executed, for example, when the user registers biometric feature information of a finger vein. In the following, the process illustrated in FIG. 44 will be described in order of step number.

[Step S191] The information acquisition unit 511 receives an input from the user of the type of finger from which the vein biometric information is to be acquired.
[Step S192] The information acquisition unit 511 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 542, that a finger is placed at a predetermined height above the reading unit 542. If it determines that a finger is placed, the information acquisition unit 511 detects a valley portion at the base of a finger, which is a direction feature portion, from the image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the finger to be registered based on the detected direction feature portion. In the third embodiment, the finger type is specified based on the user's input, and the direction feature portion is used to determine the left/right of the hand.

[Step S193] The type determination unit 512 determines the type of finger (left and right of the hand) based on the direction of the finger determined in step S192.
[Step S194] The information acquisition unit 511 causes the imaging unit 142b of the reading unit 542 to capture an image and acquires a biological image in which a finger vein is reflected.

Note that the processing order of steps S193 and S194 may be reversed or simultaneous.
[Step S195] The information generation unit 513 extracts features of the living body based on the living body image acquired in step S194. The information generation unit 513 generates biometric information indicating the feature extraction result.

  [Step S196] The information generation unit 513 generates biometric feature information including the type determined in step S193, the finger type received in step S191, the biometric information generated in step S195, and the user ID.

  [Step S197] The information generation unit 513 stores the biometric feature information generated in step S196 in the biometric feature information storage unit 541b. Thereby, the biometric feature information is registered in the biometric feature information storage unit 541b. Thereafter, the process ends.

  FIG. 45 is a flowchart illustrating the biometric feature information authentication process according to the third embodiment. The biometric feature information authentication process is a process of determining the type of the finger to be authenticated, generating verification biometric feature information indicating the vein features of the finger to be authenticated, and performing authentication by collating it with the biometric feature information registered in advance. The verification biometric feature information is data having the same configuration as the biometric feature information, and indicates the features of the living body of the user to be authenticated (finger veins in the third embodiment). The biometric feature information authentication process is executed, for example, when the user authenticates with a finger vein. In the following, the process illustrated in FIG. 45 will be described in order of step number.

  [Step S201] The information acquisition unit 511 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 542, that a finger is placed at a predetermined height above the reading unit 542. If it determines that a finger is placed, the information acquisition unit 511 detects a valley portion at the base of a finger, which is a direction feature portion, from the image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the finger to be authenticated based on the detected direction feature portion.

[Step S202] The type determination unit 512 determines the type of finger (left and right of the hand) based on the direction of the finger determined in step S201.
[Step S203] The information acquisition unit 511 causes the imaging unit 142b of the reading unit 542 to capture an image and acquires a biological image in which a finger vein is reflected.

Note that the processing order of steps S202 and S203 may be reversed, or the two steps may be performed simultaneously.
[Step S204] The information generation unit 513 extracts a feature of the living body based on the living body image acquired in step S203. The information generation unit 513 generates biometric information indicating the feature extraction result.

[Step S205] The information generation unit 513 generates verification biometric feature information including the type determined in step S202 and the biometric information generated in step S204.
[Step S206] The matching unit 514 performs biometric feature information matching processing (described later in FIG. 46) using the matching biometric feature information generated in step S205. Thereafter, the process ends.

  FIG. 46 is a flowchart illustrating biometric feature information matching processing according to the third embodiment. The biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance. The biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 46 will be described in order of step number.

[Step S211] The collation unit 514 acquires collation biometric feature information generated by the biometric feature information authentication process.
[Step S212] The collation unit 514 refers to the biometric feature information storage unit 541b and extracts the biometric feature information whose type (left or right) matches that of the collation biometric feature information acquired in step S211. When one-to-one matching is performed, the collation unit 514 extracts from the biometric feature information storage unit 541b the biometric feature information whose type and user ID both match those of the collation biometric feature information.

[Step S213] The collation unit 514 selects one piece of the biometric feature information extracted in step S212 and collates the biometric information included in the selected biometric feature information against the biometric information included in the collation biometric feature information.

[Step S214] The collation unit 514 determines whether or not the collation between the selected biometric feature information and the collation biometric feature information has succeeded as a result of the collation in step S213. If the collation is successful (YES in step S214), the process proceeds to step S215. If the collation fails (NO in step S214), the process proceeds to step S216.

[Step S215] The collation unit 514 executes predetermined processing when authentication is successful. Thereafter, the process returns.
[Step S216] The collation unit 514 determines whether all the biometric feature information extracted in Step S212 has been selected in Step S213. If all have been selected (YES in step S216), the process proceeds to step S217. On the other hand, if there is an unselected item (NO in step S216), the process proceeds to step S213.

[Step S217] The collation unit 514 executes predetermined processing when authentication fails. Thereafter, the process returns.
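The collation loop of steps S211 to S217 can be sketched in the same style. This is a minimal sketch rather than the actual implementation: similarity() is a hypothetical stand-in for the vein-pattern comparison, and the threshold value is illustrative.

    # Sketch of FIG. 46 (steps S211-S217): narrow by type, then collate.
    def collate(store, probe, similarity, user_id=None, threshold=0.9):
        # S212: keep only records of the same type (left/right); for
        # one-to-one matching, also require a matching user ID.
        candidates = [r for r in store
                      if r["left_right"] == probe["left_right"]
                      and (user_id is None or r["id"] == user_id)]
        # S213-S216: collate the probe against each candidate in turn.
        for record in candidates:
            if similarity(record["feature_data"], probe["feature_data"]) >= threshold:
                return record  # S215: authentication succeeded
        return None            # S217: authentication failed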
FIG. 47 is a diagram illustrating a message window at the time of registration according to the third embodiment. A message window 121b illustrated in FIG. 47 is an example of a window displayed on the display screen of the LCD 121 included in the information processing apparatus 500. In the message window 121b, a message and an image for notifying the user that biometric feature information has been successfully registered based on the read finger are displayed.

  In the message window 121b, for example, a message “successful registration of the index finger of the right hand” and a biological image showing the vein of the index finger of the right hand imaged at the time of registration are displayed. The message window 121b has an OK button 121b1.

  The OK button 121b1 is a button for ending the display of the message window 121b. When the user confirms the display contents of the message window 121b, the user can end the display of the message window 121b by operating the OK button 121b1.

According to the third embodiment as described above, it becomes possible to determine whether a finger belongs to the right hand or the left hand according to the angle of the living body when acquiring the biometric information about the vein of the finger.

In addition, since the reading unit 542 can acquire biometric images in a plurality of directions, the degree of freedom of the user's posture when acquiring the biometric information is increased, and the burden on the user's shoulder, arm, and wrist joints can be suppressed.

In addition, since the administrator or the user is spared the trouble of manually selecting left or right each time, convenience is improved for both, and the biometric feature information can be registered and updated quickly and reliably.

Further, by classifying and registering biometric feature information in advance according to type (left or right hand) and finger type, the biometric feature information stored in the biometric feature information storage unit 541b can be narrowed down by type before one piece of collation biometric feature information is collated against it. For this reason, an increase in authentication processing load and a decrease in processing speed can be suppressed.

[Fourth Embodiment]
Next, a fourth embodiment will be described. Differences from the third embodiment will be mainly described, and description of similar matters will be omitted. The information processing apparatus according to the fourth embodiment is different from the third embodiment in that it does not require designation of a finger type at the time of biometric information acquisition or registration.

  FIG. 48 is a block diagram illustrating an information processing apparatus according to the fourth embodiment. An information processing apparatus 600 according to the fourth embodiment includes an information acquisition unit 611, a type determination unit 612, an information generation unit 613, a collation unit 614, and a biometric feature information storage unit 641b. In addition, a reading unit 642 is connected to the information acquisition unit 611.

The information acquisition unit 611 acquires a biometric image of a person to be authenticated, such as a user of the information processing apparatus 600. The information acquisition unit 611 can acquire the direction of the living body at the time the biometric image is acquired. The biometric images acquired by the information acquisition unit 611 are image information of the vein patterns of a plurality of types of fingers. The direction of the living body combines two orthogonal directions based on the left and right of the hand with the directions of the individual fingers. In addition, the information processing apparatus 600 includes guides that indicate the direction corresponding to each of the plurality of types of fingers to be read.

The reading unit 642 is fixed to the upper surface of the information processing apparatus 600. The reading unit 642 includes the living body detection unit 142a, the imaging unit 142b, and the light source unit 142c illustrated in FIG. Based on the detection result of the living body detection unit 142a of the reading unit 642, the information acquisition unit 611 determines that the living body is placed at a predetermined distance from the reading unit 642, as well as the direction of the living body with respect to the reading unit 642. The information acquisition unit 611 determines the direction of the living body by determining the position of the direction feature portion of the living body in the image obtained by the living body detection unit 142a of the reading unit 642.

In the fourth embodiment, the directions of the left and right hands to be placed are orthogonal to each other, and each forms an oblique angle with the keyboard 131. The direction of each finger is a predetermined direction relative to the direction of the hand. The direction feature portion is a valley portion at the base of the fingers on the palm side of the finger to be read.

In addition, the information acquisition unit 611 acquires a biological image in which a living body is captured by the imaging unit 142b of the reading unit 642.
The type determination unit 612 determines the type of biometric information based on the direction of the biometric acquired by the information acquisition unit 611. The type indicates the left and right hand and finger type that are the basis for generating biometric information.

The information generation unit 613 generates biometric information indicating the characteristics of the living body based on the biometric image acquired by the information acquisition unit 611. The information generation unit 613 generates collation biometric feature information including the generated biometric information and the type of biometric information determined by the type determination unit 612; this represents the biometric information and type of the user to be collated for authentication. The information generation unit 613 also generates biometric feature information including, for example, the generated biometric information, the type of biometric information determined by the type determination unit 612, and identification information that identifies the individual corresponding to the biometric information, and stores it in the biometric feature information storage unit 641b. As a result, biometric feature information indicating the biometric information and type of a user who has proper authority and is registered in advance for use in authentication is registered. The type includes information indicating whether the finger belongs to the right hand or the left hand, and information indicating whether it is the thumb, index finger, middle finger, ring finger, or little finger. In the following description of the fourth embodiment, the former is referred to as “left and right hands” and the latter as “finger types”.

  The information generation unit 613 stores the generated biometric feature information in the biometric feature information storage unit 641b when the user's biometric information is registered. Further, at the time of authentication using the user's biometric information, the verification unit 614 performs authentication using the verification biometric feature information generated by the information generation unit 613.

  The collation unit 614 extracts biometric feature information whose type matches the collation biometric feature information, and collates based on the biometric information of the collation biometric feature information and the extracted biometric feature information. Thereby, the information processing apparatus 600 performs biometric authentication of the user based on the collation result. Since collation is performed by limiting the collation targets to those of the same type, an increase in time and load required for the collation process can be suppressed. The type indicates the right and left of the hand and the type of finger.

The biometric feature information storage unit 641b stores biometric feature information indicating the biometric information and the type of the biometric information. Thereby, the user's biometric information and its type are stored in association with each other.
FIG. 49 is a diagram illustrating a reading unit according to the fourth embodiment. A reading unit 642 illustrated in FIG. 49 is an input device that allows a user to read a vein of a finger and input a biological image.

  The reading unit 642 includes a vein sensor that acquires biological information of the finger vein by reading the vein of the user's finger. The reading unit 642 has a square shape and is inclined by 45 °.

Around the reading unit 642, guides 642c, 642d1, 642d2, 642e1, 642e2, 642f1, 642f2, 642g1, 642g2, 642h1, 642h2, 642i1, 642i2, 642j1, 642j2, 642k1, 642k2, and 642l are provided. A guide 642m is provided at the center of the reading unit 642.

  A direction C connecting the guide 642c and the guide 642m is set to have a predetermined angle (for example, 180 °) clockwise with respect to the left-right direction toward the information processing apparatus 600. A direction D connecting the guide 642d1 and the guide 642d2 is set to have a predetermined angle (for example, 150 °) clockwise with respect to the left-right direction toward the information processing apparatus 600. A direction E connecting the guide 642e1 and the guide 642e2 is set to have a predetermined angle (for example, 135 °) clockwise with respect to the left-right direction toward the information processing apparatus 600. A direction F connecting the guide 642f1 and the guide 642f2 is set to have a predetermined angle (for example, 120 °) clockwise with respect to the left-right direction toward the information processing apparatus 600. A direction G connecting the guide 642g1 and the guide 642g2 is set to have a predetermined angle (for example, 105 °) clockwise with respect to the left-right direction toward the information processing apparatus 600. A direction H connecting the guide 642h1 and the guide 642h2 is set to have a predetermined angle (for example, 75 °) clockwise with respect to the left-right direction toward the information processing apparatus 600. A direction I connecting the guide 642i1 and the guide 642i2 is set to have a predetermined angle (for example, 60 °) clockwise with respect to the left-right direction toward the information processing apparatus 600. A direction J connecting the guide 642j1 and the guide 642j2 is set to have a predetermined angle (for example, 45 °) clockwise with respect to the left-right direction toward the information processing apparatus 600. A direction K connecting the guide 642k1 and the guide 642k2 is set to have a predetermined angle (for example, 30 °) clockwise with respect to the left-right direction toward the information processing apparatus 600. A direction L connecting the guides 642l and 642m is set to have a predetermined angle (for example, 0 °) clockwise with respect to the left-right direction toward the information processing apparatus 600.

  Each guide corresponds to a finger type. That is, the guide 642c and the guide 642m correspond to the thumb of the left hand. The guide 642d1 and the guide 642d2 correspond to the left index finger. The guide 642e1 and the guide 642e2 correspond to the middle finger of the left hand. Guide 642f1 and guide 642f2 correspond to the ring finger of the left hand. The guide 642g1 and the guide 642g2 correspond to the little finger of the left hand. The guide 642h1 and the guide 642h2 correspond to the little finger of the right hand. The guide 642i1 and the guide 642i2 correspond to the ring finger of the right hand. The guide 642j1 and the guide 642j2 correspond to the middle finger of the right hand. The guide 642k1 and the guide 642k2 correspond to the index finger of the right hand. The guide 642l and the guide 642m correspond to the thumb of the right hand.

The user places the finger whose biometric information is to be acquired between one of the above combinations of guides. That is, when the user causes the information processing apparatus 600 to acquire the biometric information of the vein of the right index finger, the user positions the pad of the index finger of the right hand along the direction K between the guide 642k1 and the guide 642k2. The same applies to the other fingers, and the description thereof is omitted. Thereby, the biometric information of the user's finger can be acquired in a natural, easy posture, and the information processing apparatus 600 can determine, from the direction along which the finger vein lies in the acquired image, whether the acquisition target finger belongs to the right hand or the left hand and which type of finger it is.
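The correspondence between guide directions and fingers described above can be written as a lookup table. The sketch below uses the angles of FIG. 49; the matching tolerance is an illustrative assumption, not a value taken from the implementation.

    # Angles are clockwise from the left-right direction, as in FIG. 49.
    FINGER_BY_ANGLE = {
        0:   ("right", "thumb"),   # direction L
        30:  ("right", "index"),   # direction K
        45:  ("right", "middle"),  # direction J
        60:  ("right", "ring"),    # direction I
        75:  ("right", "little"),  # direction H
        105: ("left",  "little"),  # direction G
        120: ("left",  "ring"),    # direction F
        135: ("left",  "middle"),  # direction E
        150: ("left",  "index"),   # direction D
        180: ("left",  "thumb"),   # direction C
    }

    def classify_finger(angle_deg, tolerance=10):
        """Return (hand, finger) for the guide direction nearest to the
        measured angle, or None if no direction is within tolerance."""
        nearest = min(FINGER_BY_ANGLE, key=lambda a: abs(a - angle_deg))
        if abs(nearest - angle_deg) <= tolerance:
            return FINGER_BY_ANGLE[nearest]
        return None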

Note that the reading unit 642 may read a fingerprint. In addition, the reading unit 642 is not limited to a square shape, and can have any shape.
In the information processing apparatus 600 according to the fourth embodiment, as in the third embodiment, whether the finger to be read belongs to the right hand or the left hand, and its type, are determined based on the side opposite to the side on which the valley portion at the base of the fingers, which is the direction feature portion, is located. Further, in the information processing apparatus 600, a biometric image of the finger vein is acquired when the user positions the acquisition target finger in a space separated from the vein sensor surface by a certain distance (for example, several centimeters).

  FIG. 50 is a diagram illustrating a biometric feature table according to the fourth embodiment. The biometric feature table 641b1 illustrated in FIG. 50 is set in the biometric feature information storage unit 641b included in the information processing apparatus 600 according to the fourth embodiment. The biometric feature table 641b1 is a table for managing biometric feature information used for biometric authentication of the information processing apparatus 600.

  The biometric feature table 641b1 includes “number”, “ID”, “left / right”, “finger type”, “angle”, and “feature data” as items. In the biometric feature table 641b1, values set in the above items are associated with each other as biometric feature information.

The angle indicates the angle of the finger when the finger vein indicated by the biometric feature information is detected. In the fourth embodiment, the angle is associated with the type of finger. For example, for the biometric feature information of the vein of the thumb of the right hand, “0” is set, which is the reference angle and indicates the 0° angle of direction L. For the index finger of the right hand, “30” is set, indicating the 30° angle of direction K. For the middle finger of the right hand, “45” is set, indicating the 45° angle of direction J. For the ring finger of the right hand, “60” is set, indicating the 60° angle of direction I. For the little finger of the right hand, “75” is set, indicating the 75° angle of direction H. For the thumb of the left hand, “180” is set, indicating the 180° angle of direction C. For the index finger of the left hand, “150” is set, indicating the 150° angle of direction D. For the middle finger of the left hand, “135” is set, indicating the 135° angle of direction E. For the ring finger of the left hand, “120” is set, indicating the 120° angle of direction F. For the little finger of the left hand, “105” is set, indicating the 105° angle of direction G.

With the type specified by the left/right and finger type values, each piece of biometric feature information can thus be identified as belonging to a particular finger of a particular hand.
The biometric feature table 641b1 illustrated in FIG. 50 is an example, and any item can be set in the biometric feature table. For example, the left and right and finger type items may be omitted. Alternatively, the angle item may be omitted.
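For concreteness, one row of the biometric feature table can be modelled as a small record type. The field names mirror the items of FIG. 50; the concrete values are illustrative only.

    from dataclasses import dataclass

    @dataclass
    class BiometricFeature:
        number: int          # "number" item
        user_id: str         # "ID"
        left_right: str      # "left/right" of the hand
        finger_type: str     # thumb / index / middle / ring / little
        angle: int           # guide direction angle in degrees
        feature_data: bytes  # extracted vein-pattern feature data

    row = BiometricFeature(1, "user01", "right", "index", 30, b"\x00\x01")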

FIG. 51 is a diagram illustrating the state when the vein of the index finger of the right hand is read according to the fourth embodiment, viewed from above the information processing apparatus 600. As illustrated in FIG. 51, the information processing apparatus 600 includes a display unit 120 and a main body unit 630. A keyboard 131 and a reading unit 642 are provided on the upper surface of the main body 630 of the information processing apparatus 600. The reading unit 642 is located on the same upper surface of the main body 630 as the keyboard 131, at the center of the front side of the keyboard 131, with each side of the square vein sensor surrounded by the guides 642c to 642m arranged at a 45° angle to the front and side surfaces of the information processing apparatus 600. Also shown are the user's head 201, torso 202, upper right arm 203, right lower arm 204, palm 205 of the right hand, thumb 205a of the right hand, index finger 205b of the right hand, middle finger 205c of the right hand, ring finger 205d of the right hand, and little finger 205e of the right hand.

When the user causes the reading unit 642 to read the vein of a finger, the user positions the finger whose vein is to be read (for example, the index finger 205b of the right hand) parallel to the upper surface of the main body 630, at an angle of 30° with respect to the front surface of the information processing apparatus 600, as shown in FIG. 51. At this time, the user opens the fingers of the right hand and positions them in a space separated from the vein sensor surface by a certain distance (for example, several centimeters) so that the center line of the right index finger 205b to be read coincides with direction K. The user does not need to bend the wrist between the palm 205 of the right hand and the right lower arm 204 at the time of reading, and can keep it almost straight. Accordingly, each finger of the user's right hand is straightened and opened sufficiently, and the four valleys between the fingers of the right hand are sufficiently spaced. Therefore, there is no horizontal twist in the palm 205 of the right hand, and a correct image of the direction feature portion can be obtained quickly and reliably. Consequently, right hand or left hand can be determined quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.

Further, no unreasonable posture is forced on the user's right wrist, on the elbow between the right lower arm 204 and the upper right arm 203, or on the right shoulder between the upper right arm 203 and the torso 202, so the user's burden can be reduced. Although FIG. 51 illustrates the case where the vein of the index finger of the user's right hand is read, the same applies to reading the veins of fingers other than the right index finger and of the fingers of the left hand, and the description thereof is omitted.

Further, since each finger of the user's right hand, together with the palm 205 of the right hand, is located in a space a certain distance away from the vein sensor surface, erroneous operations caused by a finger or the palm touching the keyboard 131 or another operation unit can be suppressed.

FIGS. 52 and 53 are diagrams illustrating a finger direction determination method according to the fourth embodiment. FIGS. 52 and 53 show acquired images 6421 and 6422 obtained when reading the vein of the middle finger of the right hand.

The acquired images 6421 and 6422 are images acquired by the living body detection unit 142a of the reading unit 642, captured by, for example, an image sensor included in the living body detection unit 142a. In the acquired images 6421 and 6422, the upper side in FIGS. 52 and 53 is the back side when viewed from the front of the information processing apparatus 600, and the lower side is the front side. The right side in FIGS. 52 and 53 is the right side when viewed from the front of the information processing apparatus 600, and the left side is the left side. FIG. 52 shows an image 205b1 of the index finger of the user's right hand, an image 205c1 of the middle finger of the right hand, an image 205d1 of the ring finger of the right hand, and an image 205e1 of the little finger of the right hand. FIG. 53 shows an image 205b2 of the index finger of the user's right hand, an image 205c2 of the middle finger of the right hand, an image 205d2 of the ring finger of the right hand, and an image 205e2 of the little finger of the right hand.

As shown in FIG. 52, in the fourth embodiment, right-hand detection rectangular image regions 6421n0, 6421n1, 6421n2, 6421n3, and 6421n4 for detecting a valley portion at the base of the fingers of the right palm, and left-hand detection rectangular image regions 6421n5, 6421n6, 6421n7, 6421n8, and 6421n9 for detecting a valley portion at the base of the fingers of the left palm, are set in the acquired image 6421. The right-hand detection rectangular image regions 6421n0 to 6421n4 and the left-hand detection rectangular image regions 6421n5 to 6421n9 are provided along the inside of the two lower sides of the acquired image 6421. The right-hand detection rectangular image regions 6421n0 to 6421n4 correspond to the thumb, index finger, middle finger, ring finger, and little finger of the right hand, respectively, and to the directions L to H, respectively. The left-hand detection rectangular image regions 6421n5 to 6421n9 correspond to the little finger, ring finger, middle finger, index finger, and thumb of the left hand, respectively, and to the directions G to C, respectively. When a direction feature portion, that is, a valley portion at the base of the fingers, is detected in any of the right-hand detection rectangular image regions 6421n0 to 6421n4 or the left-hand detection rectangular image regions 6421n5 to 6421n9, the information processing apparatus 600 determines that the finger corresponding to the region in which the direction feature portion was detected is the reading target. The valley portion at the base of the fingers thus functions as the direction feature portion.
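The region-based decision of FIG. 52 amounts to scanning the detection rectangles and reporting the finger whose region contains a valley. In the sketch below, crop() and has_valley() are hypothetical stand-ins for the image cropping and valley-detection processing.

    # Each detection rectangle corresponds to one finger and direction.
    RIGHT_REGIONS = {"thumb": "6421n0", "index": "6421n1", "middle": "6421n2",
                     "ring": "6421n3", "little": "6421n4"}
    LEFT_REGIONS = {"little": "6421n5", "ring": "6421n6", "middle": "6421n7",
                    "index": "6421n8", "thumb": "6421n9"}

    def determine_reading_target(image, crop, has_valley):
        """Return (hand, finger) for the first region whose crop contains
        a direction feature portion (finger-base valley), else None."""
        for hand, regions in (("right", RIGHT_REGIONS), ("left", LEFT_REGIONS)):
            for finger, region_id in regions.items():
                if has_valley(crop(image, region_id)):
                    return hand, finger
        return None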

The acquired images 6421 and 6422 are obtained by an image sensor (not shown) of the living body detection unit 142a of the reading unit 642. The image acquisition here is performed without irradiation with near-infrared light from the light source unit 142c, so the image acquired is not an image of the finger veins but an image of the external appearance of the fingers. The detection processing of the valley portions at the base of the fingers is performed at the positions where the rectangular image regions are set in this appearance image.

Here, in the acquired image 6421, a direction feature portion 6421n21, which is the valley portion at the bases of the index finger, middle finger, and ring finger of the user's right hand, exists in the right-hand detection rectangular image region 6421n2 corresponding to the middle finger of the right hand.

In the information processing apparatus 600 according to the fourth embodiment, when the user has the vein of the middle finger of the right hand read, the palm with the fingers opened is positioned in a space separated from the vein sensor surface by a certain distance (for example, several centimeters). At this time, based on the guides 642j1 and 642j2 corresponding to the middle finger of the right hand, the user positions the base of the middle finger of the right hand above the right-hand detection rectangular image region 6421n2 corresponding to the middle finger, and aligns the center line of the middle finger of the right hand with the direction J passing through the guide 642m at the center of the vein sensor. When the information processing apparatus 600 detects with a distance sensor (not shown) of the living body detection unit 142a that the middle finger of the right hand is positioned at the predetermined height, it determines, for each of the right-hand detection rectangular image regions 6421n0 to 6421n4 and the left-hand detection rectangular image regions 6421n5 to 6421n9 corresponding to the fingers of both hands, whether or not a valley portion at the base of the fingers is present. At this time, a valley portion at the base of the fingers (direction feature portion 6421n21) exists in the right-hand detection rectangular image region 6421n2 corresponding to the middle finger of the right hand. For this reason, the information processing apparatus 600 determines that the finger to be read is the middle finger of the right hand. The valley portions at the base of the fingers may be detected based on, for example, all of, or at least a combination of, the valley between the index and middle fingers, the valley between the middle and ring fingers, and the valley between the ring and little fingers being open at predetermined intervals. The detection may also be realized from the positions of the bases of the fingers and the identification of the contours between them in an acquired image of the palm obtained by the imaging unit (not shown). Although the determination example for the middle finger of the right hand has been shown, the same applies to the other fingers, and the description thereof is omitted.

Further, the information processing apparatus 600 according to the fourth embodiment prevents an erroneous determination of the type of finger in the acquired image 6422, as shown in FIG. 53. In FIG. 53, the direction J passes through the right-hand detection rectangular image region 6422n2 corresponding to the middle finger of the right hand and through the guide 642m at the center of the vein sensor. The tangent line P is a tangent to the contour of the index finger of the right hand positioned so as to pass over the guide 642m. The tangent line P may be taken in the vicinity of the guide 642m on the target finger.

In the example of FIG. 53, the user has positioned the right index finger, shown as image 205b2, above the reading unit 642. In FIG. 53, a right-hand detection rectangular image region 6422n2 corresponding to the middle finger, other right-hand detection rectangular image regions (not shown), and left-hand detection rectangular image regions (not shown) are set in the acquired image 6422. When the information processing apparatus 600 detects the finger with the distance sensor, it starts reading the finger vein. At this time, the direction feature portion 6422n21 is detected in the right-hand detection rectangular image region 6422n2 corresponding to the middle finger of the right hand from the image acquired by the living body detection unit 142a. If the determination were made according to the direction feature portion 6422n21 alone, the reading target would be judged to be the middle finger of the right hand, which is erroneous. In such a case, however, the information processing apparatus 600 obtains the tangent line P of the contour of the right index finger image 205b2 to be read, and determines whether or not the intersection angle R between the tangent line P and the direction J is equal to or greater than a predetermined angle Rt. If, as a result, the intersection angle R is equal to or greater than the predetermined angle Rt, the information processing apparatus 600 determines that the position of the finger to be read is incorrect, outputs a message such as “Remove the palm from the sensor and place it in the correct position”, and may read the finger again after prompting the user to correct the position of the finger to be read. The message may be displayed on the LCD 121 or output by voice from a speaker (not shown).
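The tangent check reduces to comparing two line directions. This is a sketch only: the threshold Rt is not specified in the text, so the value used here is an assumption, and the 30°/45° example angles are illustrative.

    def crossing_angle(tangent_deg, guide_deg):
        """Smallest angle between two lines given by their direction angles."""
        d = abs(tangent_deg - guide_deg) % 180
        return min(d, 180 - d)

    def finger_position_ok(tangent_deg, guide_deg, rt_deg=10):
        # Reject the reading when the intersection angle R is >= Rt.
        return crossing_angle(tangent_deg, guide_deg) < rt_deg

    # E.g. a finger contour tangent at 30 degrees while the valley suggests
    # direction J (45 degrees) gives R = 15 degrees, rejected for Rt <= 15.
    print(finger_position_ok(30, 45))  # False: prompt the user to reposition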

FIG. 54 is a diagram illustrating another method of determining the finger direction according to the fourth embodiment. The information processing apparatus 600 according to the fourth embodiment may prevent an erroneous determination of the finger type as shown in FIG. 54. In FIG. 54, the upper side is the back side when viewed from the front of the information processing apparatus 600, the lower side is the front side, the right side is the right side, and the left side is the left side. Also shown are an image 205b3 of the index finger of the user's right hand, an image 205c3 of the middle finger of the right hand, an image 205d3 of the ring finger of the right hand, and an image 205e3 of the little finger of the right hand.

In the determination method shown in FIG. 54, finger-position-impossible regions 6423o1 and 6423o2, in which an image of the user's finger must not be located, are set in the acquired image 6423. After the direction feature portions 6423n21 and 6423n22 in the right-hand detection rectangular image region 6423n2 are detected by the information processing apparatus 600, the finger-position-impossible regions 6423o1 and 6423o2 are formed so as to extend from the direction feature portions to the opposite side of the acquired image 6423. Likewise, when three or more direction feature portions are detected in a right-hand detection rectangular image region (or left-hand detection rectangular image region), finger-position-impossible regions are formed extending from the direction feature portions of that region to the opposite side of the acquired image 6423.

In the example of FIG. 54, the user has positioned the right index finger to be read, shown as image 205b3, above the reading unit 642. In FIG. 54, a right-hand detection rectangular image region 6423n2 corresponding to the middle finger, other right-hand detection rectangular image regions (not shown), and left-hand detection rectangular image regions (not shown) are set in the acquired image 6423. When the information processing apparatus 600 detects the finger with the distance sensor, it starts reading the finger vein. At this time, the direction feature portions 6423n21 and 6423n22 are detected in the right-hand detection rectangular image region 6423n2 corresponding to the middle finger of the right hand from the image acquired by the living body detection unit 142a. If the determination were made according to the direction feature portions 6423n21 and 6423n22 alone, the reading target would be judged to be the middle finger of the right hand, which is erroneous. In such a case, however, the information processing apparatus 600 determines whether or not the right index finger image 205b3 overlaps at least one of the finger-position-impossible regions 6423o1 and 6423o2, as shown in FIG. 54. If, as a result, it overlaps a finger-position-impossible region (in the example of FIG. 54, the right index finger image 205b3 overlaps the finger-position-impossible region 6423o1), the information processing apparatus 600 determines that the position of the finger to be read is incorrect, outputs a message such as “Remove the palm from the sensor and place it in the correct position”, and may read the finger again after prompting the user to correct the position of the finger to be read. The message may be displayed on the LCD 121 or output by voice from a speaker (not shown).
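The overlap test of FIG. 54 can be sketched with simple rectangle geometry. Modelling the finger image and the finger-position-impossible regions as axis-aligned rectangles (x0, y0, x1, y1) is an illustrative simplification, not the actual image processing.

    def rects_overlap(a, b):
        ax0, ay0, ax1, ay1 = a
        bx0, by0, bx1, by1 = b
        return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

    def placement_valid(finger_rect, impossible_regions):
        """False if the finger image enters any finger-position-impossible
        region, in which case the user is prompted to reposition."""
        return not any(rects_overlap(finger_rect, r) for r in impossible_regions)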

In FIGS. 52, 53, and 54, the acquired image used for detecting the direction feature portion may instead be an image captured by the imaging unit 142b.
FIG. 55 is a flowchart illustrating biometric feature information acquisition and registration processing according to the fourth embodiment. The biometric feature information acquisition / registration process is a process for determining the type of a finger to be registered, and generating and registering biometric feature information indicating the type of finger (left / right / finger type) and vein characteristics. The biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a finger vein. In the following, the process illustrated in FIG. 55 will be described in order of step number.

[Step S221] The information acquisition unit 611 determines, based on the detection result of the distance sensor provided in the living body detection unit 142a of the reading unit 642, that a finger is placed at a predetermined height above the reading unit 642. If it determines that a finger is placed, the information acquisition unit 611 detects a valley portion at the base of the finger, which is a direction feature portion, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the finger to be registered based on the detected direction feature portion. In the fourth embodiment, the contour of the finger to be registered is also detected, and the direction of the finger is determined from the detected contour, thereby preventing erroneous detection of the finger placement. The direction feature portion is used to determine the left/right of the hand and the finger type.

[Step S222] The type determination unit 612 determines the type of finger (left/right of the hand and finger type) based on the direction of the finger determined in step S221.
[Step S223] The information acquisition unit 611 causes the imaging unit 142b of the reading unit 642 to capture an image, and acquires a biological image in which a finger vein is reflected.

Note that the processing order of steps S222 and S223 may be reversed, or the two steps may be performed simultaneously.
[Step S224] The information generation unit 613 extracts features of biometric information based on the biometric image acquired in step S223. The information generation unit 613 generates biometric information indicating the feature extraction result.

[Step S225] The information generation unit 613 generates biometric feature information including the type determined in step S222, the biometric information generated in step S224, and the user ID.
[Step S226] The information generation unit 613 stores the biometric feature information generated in step S225 in the biometric feature information storage unit 641b. Thereby, the biometric feature information is registered in the biometric feature information storage unit 641b. Thereafter, the process ends.

  FIG. 56 is a flowchart illustrating biometric feature information authentication processing according to the fourth embodiment. The biometric feature information authentication process determines the type of finger to be authenticated (left and right of the hand and the type of finger), generates verification biometric feature information indicating the type of finger to be authenticated and the characteristics of the vein, and is registered in advance. This is a process of verifying against biometric feature information. The collation biometric feature information is data having the same configuration as that of the biometric feature information, and indicates the features of the authentication target user's biometric (in the fourth embodiment, the finger vein). The biometric feature information authentication process is executed, for example, when the user authenticates with a finger vein. In the following, the process illustrated in FIG. 56 will be described in order of step number.

[Step S231] The information acquisition unit 611 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 642, that a finger is placed at a predetermined height above the reading unit 642. If it determines that a finger is placed, the information acquisition unit 611 detects a valley portion at the base of the finger, which is a direction feature portion, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the finger to be authenticated based on the detected direction feature portion. In the fourth embodiment, the contour of the finger to be authenticated is also detected, and the direction of the finger is determined from the detected contour, thereby preventing erroneous detection of the finger placement. The direction feature portion is used to determine the left/right of the hand and the finger type.

[Step S232] The type determination unit 612 determines the type of finger (left and right hand and finger type) based on the direction of the finger determined in step S231.
[Step S233] The information acquisition unit 611 causes the imaging unit 142b of the reading unit 642 to capture an image, and acquires a biological image in which a finger vein is reflected.

Note that the processing order of steps S232 and S233 may be reversed, or the two steps may be performed simultaneously.
[Step S234] The information generation unit 613 extracts features of biometric information based on the biometric image acquired in step S233. The information generation unit 613 generates biometric information indicating the feature extraction result.

  [Step S235] The information generation unit 613 generates collation biometric feature information including the type (left and right hand and finger type) determined in step S232 and the biometric information generated in step S234.

  [Step S236] The collation unit 614 executes biometric feature information collation processing (described later in FIG. 57) using the collation biometric feature information generated in step S235. Thereafter, the process ends.

  FIG. 57 is a flowchart illustrating biometric feature information matching processing according to the fourth embodiment. The biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance. The biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 57 will be described in order of step number.

[Step S241] The collation unit 614 acquires collation biometric feature information generated by the biometric feature information authentication process.
[Step S242] The collation unit 614 refers to the biometric feature information storage unit 641b and extracts the biometric feature information whose type (left/right of the hand and finger type) matches that of the collation biometric feature information acquired in step S241. When one-to-one matching is performed, the collation unit 614 extracts from the biometric feature information storage unit 641b the biometric feature information whose type and user ID both match those of the collation biometric feature information.

  [Step S243] The collation unit 614 selects one of the unselected biometric feature information extracted in step S242, and collates the biometric information included in each of the selected biometric feature information and the collated biometric feature information.

[Step S244] The collation unit 614 determines whether or not the collation between the selected biometric feature information and the collation biometric feature information has succeeded as a result of the collation in step S243. If the collation is successful (YES in step S244), the process proceeds to step S245. If the collation fails (NO in step S244), the process proceeds to step S246.

[Step S245] The collation unit 614 executes a predetermined process when the authentication is successful. Thereafter, the process returns.
[Step S246] The collation unit 614 determines whether all the biometric feature information extracted in Step S242 has been selected in Step S243. If all have been selected (YES in step S246), the process proceeds to step S247. On the other hand, if there is an unselected item (NO in step S246), the process proceeds to step S243.

[Step S247] The collation unit 614 performs a predetermined process when authentication fails. Thereafter, the process returns.
According to the fourth embodiment as described above, the same effects as those of the third embodiment can be obtained.

In addition, at the time of acquiring the biometric information of a finger vein, it is possible to determine which of the ten fingers is presented according to the angle of the living body. This eliminates the need for the user to input not only left or right but also all ten finger types, and suppresses the burden of operations during registration. Moreover, when one-to-N collation is performed, the collation targets can be narrowed down according to the classification into the ten finger types. For this reason, one-to-N collation can be performed at high speed. Alternatively, the number of pieces of registered biometric feature information can be increased while suppressing an increase in the time required for collation and authentication processing.

[Fifth Embodiment]
Next, a fifth embodiment will be described. Differences from the second embodiment will be mainly described, and description of similar matters will be omitted. The information processing apparatus according to the fifth embodiment is different from the second embodiment in that both palm veins and finger veins can be used to generate biometric information.

  FIG. 58 is a block diagram illustrating an information processing apparatus according to the fifth embodiment. An information processing apparatus 700 according to the fifth embodiment includes an information acquisition unit 711, a type determination unit 712, an information generation unit 713, a collation unit 714, and a biometric feature information storage unit 741b. The information acquisition unit 711 is connected to a reading unit 742.

The information acquisition unit 711 acquires a biometric image of a person to be authenticated, such as a user of the information processing apparatus 700. The information acquisition unit 711 can acquire the direction of the living body at the time the biometric image is acquired. The biometric images acquired by the information acquisition unit 711 are image information of palm vein patterns and image information of finger vein patterns. The directions for the palm are two orthogonal directions based on the left and right of the hand, and the directions for the fingers are likewise two orthogonal directions based on the left and right of the hand. In addition, the information processing apparatus 700 includes a guide that indicates the direction of the finger to be read. The information acquisition unit 711 determines the direction of the living body by determining the position of the direction feature portion in the acquired image.

The reading unit 742 is fixed to the upper surface of the information processing apparatus 700. The reading unit 742 includes the living body detection unit 142a, the imaging unit 142b, and the light source unit 142c illustrated in FIG. Based on the detection result of the living body detection unit 142a of the reading unit 742, the information acquisition unit 711 determines that the living body is placed at a predetermined distance from the reading unit 742, as well as the direction of the living body with respect to the reading unit 742. The information acquisition unit 711 determines the direction of the living body by determining the position of the direction feature portion of the living body in the image obtained by the living body detection unit 142a of the reading unit 742. In the fifth embodiment, the directions are orthogonal to each other, and each forms an oblique angle with the keyboard 131. For both the palm and the fingers, the direction feature portion is a valley portion at the base of the fingers.

  In addition, the information acquisition unit 711 determines whether the direction is correct based on the tangent line of the contour line of the finger from the image obtained by the living body detection unit 142a. The determination is not limited to this, and the determination may be made based on the center line of the contour line of the finger.

The information acquisition unit 711 also accepts the user's input of the type of the finger whose vein biometric information is to be registered.
In addition, the information acquisition unit 711 acquires a biological image in which a living body is captured by the imaging unit 142b of the reading unit 742.

  The type determination unit 712 determines the type of biometric information based on the direction of the biometric acquired by the information acquisition unit 711. The type includes information indicating the left and right of the hand that is the basis for generating biometric information. The type includes information indicating whether the information is about palm or information about a finger. As will be described in detail later, the type determination unit 712 determines whether the type of the living body is information related to the palm or information related to the finger based on the determination result of the position of the direction feature portion.

The information generation unit 713 generates biometric information indicating the characteristics of the living body based on the biometric image acquired by the information acquisition unit 711. The information generation unit 713 generates collation biometric feature information including the generated biometric information and the biometric type determined by the type determination unit 712; this represents the biometric information and type of the user to be collated for authentication. The information generation unit 713 also generates biometric feature information including, for example, the generated biometric information, the type of the living body determined by the type determination unit 712, the type of finger received from the user, and identification information that identifies the individual corresponding to the biometric information, and stores it in the biometric feature information storage unit 741b. As a result, biometric feature information indicating the biometric information and type of a user who has proper authority and is registered in advance for use in authentication is registered. The type includes information indicating the left/right of the hand and information indicating whether the living body is a palm or a finger. When registering the user's biometric information, the information generation unit 713 stores the generated biometric feature information in the biometric feature information storage unit 741b. At the time of authentication using the user's biometric information, the collation unit 714 performs authentication using the collation biometric feature information generated by the information generation unit 713.

  The collation unit 714 extracts biometric feature information whose type matches the collation biometric feature information, and collates based on the biometric information of the collation biometric feature information and the extracted biometric feature information. Thereby, the information processing apparatus 700 performs biometric authentication of the user based on the collation result. Since collation is performed by limiting the collation targets to those of the same type, an increase in time and load required for the collation process can be suppressed.

The biometric feature information storage unit 741b stores biometric feature information indicating the biometric information and the type of the biometric information. Thereby, the user's biometric information and its type are stored in association with each other.
FIG. 59 is a diagram illustrating a reading unit according to the fifth embodiment. A reading unit 742 illustrated in FIG. 59 is an input device that allows a user to read a finger vein or a palm vein and input a biometric image.

The reading unit 742 includes a vein sensor that acquires a biometric image of finger veins or palm veins by reading the veins of the user's finger or palm. The reading unit 742 has a square shape and is inclined by 45°. A cross-shaped guide 742a having sides (side A, side B) parallel to the sides of the reading unit 742 is provided on the upper surface of the reading unit 742. The side A of the guide 742a is inclined 45° to the left when viewed from the front of the information processing apparatus 700. The side B of the guide 742a is inclined 45° to the right when viewed from the front of the information processing apparatus 700. In order to be able to acquire a biometric image of the palm veins, the vein sensor of the reading unit 742 can acquire a vein image regardless of whether the living body is inside or outside the guide 742a.

When acquiring a biometric image of the palm veins, the user places the palm whose biometric image is to be acquired above the upper surface of the reading unit 742, as in the information processing apparatus 100 according to the second embodiment. Thus, a biometric image of the palm of the right hand or of the left hand can be acquired while the user maintains a natural, easy posture, and the information processing apparatus 700 can determine whether the imaged palm is that of the right hand or the left hand.

Further, when acquiring a biometric image of a finger vein, the user positions the finger whose biometric image is to be acquired along one of the arms of the cross-shaped guide 742a, as in the third embodiment. Thus, a biometric image of a finger of the right hand or of the left hand can be acquired while the user maintains a natural, easy posture, and the information processing apparatus 700 can determine whether the imaged finger belongs to the right hand or the left hand.

Note that the reading unit 742 may read a fingerprint of a finger or a palm print of a palm. Further, the reading unit 742 is not limited to a square shape, and may have any shape.
In the information processing apparatus 700 according to the fifth embodiment, although details will be described later with reference to FIGS. 63 and 64, it is automatically determined, as in the second and third embodiments, whether the biometric information to be read is a palm vein or a finger vein, and whether the hand is the left or the right. On the other hand, the distinction among the thumb, index finger, middle finger, ring finger, and little finger of each hand is determined based on a user input at the time of acquiring the biometric information. Further, the information processing apparatus 700 acquires the biometric information of the palm or finger veins in response to the user positioning the palm or finger to be read in a space separated from the vein sensor surface by a certain distance (for example, several centimeters).
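Since the criterion that separates a palm reading from a finger reading is described only with FIGS. 63 and 64, which are outside this excerpt, the sketch below substitutes a simple assumed rule: a palm covers a large share of the sensor area, while a single finger covers only a narrow stripe. Both the coverage threshold and the rule itself are placeholders.

    def classify_living_body(coverage_ratio):
        """Assumed rule: treat the reading as a palm when the hand covers
        more than half of the sensor area, otherwise as a finger."""
        return "palm" if coverage_ratio > 0.5 else "finger"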

  FIG. 60 is a diagram illustrating a biometric feature table according to the fifth embodiment. The biometric feature table 741b1 illustrated in FIG. 60 is set in the biometric feature information storage unit 741b included in the information processing apparatus 700 according to the fifth embodiment. The biometric feature table 741b1 is a table for managing biometric feature information used for biometric authentication of the information processing apparatus 700.

  The biometric feature table 741b1 is provided with “number”, “ID”, “left / right”, “finger type”, “finger”, “hand”, and “feature data” as items. In the biometric feature table 741b1, values set in the above items are associated with each other as biometric feature information.

  The finger item indicates that the biometric feature information is biometric feature information of a finger, and the hand item indicates that it is biometric feature information of a palm. For example, when the biometric feature information is that of a finger, "0" is set for the finger item and the hand item is left unset (indicated here by "−"). When the biometric feature information is that of a palm, "0" is set for the hand item and the finger item is left unset.

  The biometric feature table 741b1 illustrated in FIG. 60 is an example, and any items can be set in the biometric feature table. For example, the finger item may be omitted, in which case setting a value in the finger type item can indicate that the record is biometric feature information of a finger.
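  As a rough illustration only, the biometric feature table can be modeled as a list of records keyed by the items above. The following Python sketch is a minimal model of such a table; the field names mirror the items of FIG. 60, while the class name, example values, and file names are assumptions introduced for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BiometricFeature:
    number: int                  # "number": record number
    user_id: str                 # "ID": identifies the registered user
    left_right: str              # "left/right": "right" or "left"
    finger_type: Optional[str]   # "finger type": e.g. "index"; None for a palm record
    finger: Optional[int]        # "finger": 0 for finger feature data, else None ("-")
    hand: Optional[int]          # "hand": 0 for palm feature data, else None ("-")
    feature_data: str            # "feature data": vein feature data (file name)

# One finger record and one palm record, loosely following FIG. 60.
table = [
    BiometricFeature(1, "user01", "right", "index", 0, None, "r_index_vein.bin"),
    BiometricFeature(2, "user01", "right", None, None, 0, "r_palm_vein.bin"),
]
```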

  FIG. 61 is a diagram illustrating a state when the vein of the palm of the right hand is read according to the fifth embodiment. FIG. 61 is a view from above of the information processing apparatus 700 reading the vein of the palm of the right hand. As illustrated in FIG. 61, the information processing apparatus 700 includes a display unit 120 and a main body unit 730, as in the information processing apparatus 100 according to the second embodiment. A keyboard 131 and a reading unit 742 are provided on the upper surface of the main body 730 of the information processing apparatus 700. The reading unit 742 is provided on the same upper surface of the main body 730 as the keyboard 131, at the front center of the keyboard 131, with each side of the square vein sensor forming a 45° angle with the front and side surfaces of the information processing apparatus 700. The reading unit 742 includes a guide 742a, but the guide 742a is not used when reading the palm vein. In addition, a user's head 201, torso 202, upper right arm 203, right lower arm 204, and right hand palm 205 are shown.

  When the user causes the reading unit 742 to read the vein of the palm, as shown in FIG. 61, the user positions the palm whose vein is to be read (for example, the palm 205 of the right hand) at a 45° angle to the left of the side surface of the information processing apparatus 700, parallel to the upper surface of the main body 730. At this time, with the fingers open, the user positions the palm 205 of the right hand in a space separated from the vein sensor surface by a certain distance (for example, several centimeters) so that the center of the palm coincides with the center of the reading unit 742. The user does not have to twist the wrist between the palm 205 of the right hand and the right lower arm 204 when the vein of the palm is read, and can keep it almost straight. Accordingly, each finger of the user's right hand is straightened and opened sufficiently, and the four valleys between the fingers of the user's right hand are sufficiently spaced. Therefore, there is no horizontal twist in the palm 205 of the right hand, and a correct image can be obtained quickly and reliably. Accordingly, correct features can be detected quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.

  In addition, by using the valleys at the bases of the fingers to detect the palm 205 of the right hand, the left and right hands can be distinguished quickly and reliably. Further, no unreasonable posture is forced on the user's right wrist, on the right lower arm 204 and upper right arm 203 extending from the right wrist, on the elbow between the right lower arm 204 and the upper right arm 203, or on the right shoulder between the upper right arm 203 and the torso 202, so the user's burden can be reduced. Although FIG. 61 illustrates the case where the vein of the palm of the user's right hand is read, reading the vein of the palm of the left hand is similar, and its description is omitted.

  Further, since each finger of the user's right hand is positioned, together with the palm 205 of the right hand, in a space separated from the vein sensor surface, erroneous operations caused by a finger or palm touching the keyboard 131 or another operation unit can be suppressed.

  FIG. 62 is a diagram illustrating a state when the vein of the index finger of the right hand is read according to the fifth embodiment. FIG. 62 is a view from above of the information processing apparatus 700 reading the vein of the right index finger. As illustrated in FIG. 62, the information processing apparatus 700 includes a display unit 120 and a main body unit 730. A keyboard 131 and a reading unit 742 are provided on the upper surface of the main body 730 of the information processing apparatus 700. The reading unit 742 is provided on the same upper surface of the main body 730 as the keyboard 131, at the front center of the keyboard 131, with each side of the square vein sensor having the cross-shaped guide 742a forming a 45° angle with the front and side surfaces of the information processing apparatus 700. Also shown are the user's head 201, torso 202, upper right arm 203, right lower arm 204, right hand palm 205, right hand thumb 205a, right hand index finger 205b, right hand middle finger 205c, right hand ring finger 205d, and right hand little finger 205e.

  When the user causes the reading unit 742 to read the vein of a finger, as shown in FIG. 62, the user positions the finger whose vein is to be read (for example, the index finger 205b of the right hand) at a 45° angle to the left of the side surface of the information processing apparatus 700, parallel to the upper surface of the main body 730. At this time, with the fingers of the right hand open, the user positions the finger in a space separated from the vein sensor surface by a certain distance (for example, several centimeters) so that the center line of the index finger 205b of the right hand coincides with the center line of one side of the cross of the reading unit 742. As when the vein of the palm is read, the user does not need to twist the wrist between the palm 205 of the right hand and the right lower arm 204, and can keep it almost straight. Accordingly, each finger of the user's right hand is straightened and opened sufficiently, and the four valleys between the fingers of the user's right hand are sufficiently spaced. Therefore, there is no horizontal twist in the palm 205 of the right hand, and a correct image of the direction feature portion can be obtained quickly and reliably. Therefore, the right or left hand can be determined quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.

  Further, no unreasonable posture is forced on the user's right wrist, on the right lower arm 204 and upper right arm 203 extending from the right wrist, on the elbow between the right lower arm 204 and the upper right arm 203, or on the right shoulder between the upper right arm 203 and the torso 202, so the user's burden can be reduced. Although FIG. 62 illustrates the case where the vein of the index finger of the user's right hand is read, the present embodiment is not limited to this; reading the vein of a right-hand finger other than the index finger, or of a finger of the left hand, is similar, and its description is omitted.

  Further, since each finger of the user's right hand is positioned, together with the palm 205 of the right hand, in a space separated from the vein sensor surface by a certain distance, erroneous operations caused by a finger or palm touching the keyboard 131 or another operation unit can be suppressed.

  FIG. 63 and FIG. 64 are diagrams illustrating detection of a direction feature portion according to the fifth embodiment. FIG. 63(A) shows an acquired image 7421 when the vein of the palm of the right hand is read. FIG. 63(B) shows an acquired image 7422 when the vein of the palm of the left hand is read. FIG. 64(A) shows an acquired image 7423 when a finger vein of the right hand is read. FIG. 64(B) shows an acquired image 7424 when a finger vein of the left hand is read.

  The acquired images 7421 to 7424 are images acquired by the living body detection unit 142a of the reading unit 742, captured by, for example, an image sensor included in the living body detection unit 142a. In the acquired images 7421 to 7424, the upper side in FIGS. 63 and 64 corresponds to the back side and the lower side to the front side, as viewed from the front of the information processing apparatus 700; likewise, the right side in FIGS. 63 and 64 corresponds to the right side and the left side to the left side, as viewed from the front of the information processing apparatus 700.

  As shown in FIGS. 63 and 64, in the image acquired by the living body detection unit 142a, a right hand detection rectangular image region 7420a for detecting the valley portions at the finger bases of the right palm is set along the upper left side, a left hand detection rectangular image region 7420b for detecting the valley portions at the finger bases of the left palm is set along the upper right side, a left finger detection rectangular image region 7420c for detecting the valley portions at the finger bases of the left hand is set along the lower left side, and a right finger detection rectangular image region 7420d for detecting the valley portions at the finger bases of the right hand is set along the lower right side. The right hand detection rectangular image region 7420a and the left hand detection rectangular image region 7420b are provided along two mutually orthogonal sides of the acquired image, as are the right finger detection rectangular image region 7420d and the left finger detection rectangular image region 7420c. The valley portions at the bases of the fingers function as direction feature portions.

  Here, the image acquisition by the image sensor of the living body detection unit 142a is performed without irradiation of near-infrared light from the light source unit 142c. Therefore, the image acquired here is not a vein image but an image of the external appearance of the palm or fingers. The detection processing for the valley portions at the finger bases is performed, within this appearance image, at the positions where the rectangular image regions are set.

  In the acquired image 7421 of the right palm shown in FIG. 63(A), the right hand detection rectangular image region 7420a contains a direction feature portion 7421a1, which consists of the valley portions at the bases of the index finger, middle finger, ring finger, and little finger of the user's right hand. Accordingly, the information processing apparatus 700 can determine that the palm of the user's right hand is located above the vein sensor of the reading unit 742.

  In the acquired image 7422 of the left palm shown in FIG. 63(B), the left hand detection rectangular image region 7420b contains a direction feature portion 7422b1, which consists of the valley portions at the bases of the index finger, middle finger, ring finger, and little finger of the user's left hand. Accordingly, the information processing apparatus 700 can determine that the palm of the user's left hand is located above the vein sensor of the reading unit 742.

  In the acquired image 7423 of the right hand finger shown in FIG. 64(A), the right finger detection rectangular image region 7420d contains a direction feature portion 7423d1, which consists of the valley portions at the bases of any three adjacent fingers among the index finger, middle finger, ring finger, and little finger of the user's right hand. Accordingly, the information processing apparatus 700 can determine that a finger of the user's right hand is located above the vein sensor of the reading unit 742.

  In the acquired image 7424 of the left hand finger shown in FIG. 64(B), the left finger detection rectangular image region 7420c contains a direction feature portion 7424c1, which consists of the valley portions at the bases of any three adjacent fingers among the index finger, middle finger, ring finger, and little finger of the user's left hand. Accordingly, the information processing apparatus 700 can determine that a finger of the user's left hand is located above the vein sensor of the reading unit 742.

  When the reading unit 742 detects, with a distance sensor (not shown) of the living body detection unit 142a, a palm or a finger of the user positioned above the reading unit 742, it acquires an image of the palm or finger and determines whether valley portions at the finger bases are detected in the right hand detection rectangular image region 7420a, the left hand detection rectangular image region 7420b, the left finger detection rectangular image region 7420c, or the right finger detection rectangular image region 7420d of the acquired image.

  In this manner, when the reading unit 742 detects the palm or a finger of the user, it acquires an image of the palm or finger. Based on the presence or absence of a direction feature portion in the right hand detection rectangular image region 7420a, the left hand detection rectangular image region 7420b, the left finger detection rectangular image region 7420c, or the right finger detection rectangular image region 7420d, the reading unit 742 determines whether the acquisition target is a finger or a palm, and whether it belongs to the left or right hand (the type of the acquisition target).
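  In outline, this type determination reduces to checking which of the four detection regions contains the finger-base valleys. A minimal sketch under that reading (the region names mirror FIGS. 63 and 64; the valley-detection callable is an assumed interface):

```python
# Maps the detection region containing the finger-base valleys to the type
# of the acquisition target, following FIGS. 63 and 64. `valleys_in` is an
# assumed callable returning True when a direction feature portion is
# found in the named region of the acquired image.
REGION_TO_TYPE = {
    "upper_left":  ("palm",   "right"),  # region 7420a: right palm
    "upper_right": ("palm",   "left"),   # region 7420b: left palm
    "lower_left":  ("finger", "left"),   # region 7420c: left-hand finger
    "lower_right": ("finger", "right"),  # region 7420d: right-hand finger
}

def determine_type(valleys_in):
    for region, target in REGION_TO_TYPE.items():
        if valleys_in(region):
            return target  # e.g. ("palm", "right")
    return None  # no direction feature portion detected
```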

  Note that, in FIGS. 63(A), 63(B), 64(A), and 64(B), the acquired image used for detecting the direction feature portion may instead be an image captured by the imaging unit 142b.

  The detection of the direction feature portions in the fifth embodiment, that is, the valley portions at the bases of the fingers, may be based on all, or at least some combination, of the valley between the index finger and the middle finger, the valley between the middle finger and the ring finger, and the valley between the ring finger and the little finger being open at a predetermined interval. For example, the detection of the valley portions at the finger bases may be realized by identifying, in the palm image acquired by the living body detection unit 142a, the positions of the bases of the fingers and the contours between them.
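  As one possible reading of this, the valleys could be found as sufficiently deep, sufficiently spaced local minima along the hand outline. The following sketch is only an assumed illustration of that idea, not the method of this embodiment; all thresholds are invented:

```python
def find_valleys(contour_y, min_gap=20, min_depth=10):
    """contour_y: heights of the hand outline sampled left to right.
    Returns x positions of valley candidates: local minima whose combined
    rise to the two neighbouring samples is at least `min_depth` and that
    are at least `min_gap` samples apart. All thresholds are illustrative."""
    valleys = []
    for x in range(1, len(contour_y) - 1):
        is_minimum = contour_y[x - 1] > contour_y[x] <= contour_y[x + 1]
        deep = (contour_y[x - 1] - contour_y[x]) + (contour_y[x + 1] - contour_y[x]) >= min_depth
        spaced = not valleys or x - valleys[-1] >= min_gap
        if is_minimum and deep and spaced:
            valleys.append(x)
    return valleys
```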

  FIG. 65 is a flowchart illustrating biometric feature information acquisition and registration processing according to the fifth embodiment. The biometric feature information acquisition and registration processing determines whether the biometric object to be registered is a finger or a palm and whether it belongs to the left or right hand, accepts the input of the finger type from the user in the case of a finger, and generates and registers biometric feature information indicating the type of the palm (left or right hand) or the type of the finger (left or right hand and finger type) together with the features of the vein. The biometric feature information acquisition and registration processing is executed, for example, when the user registers biometric feature information of a finger vein or a palm vein. In the following, the processing illustrated in FIG. 65 will be described in order of step number.

  [Step S251] The information acquisition unit 711 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 742, that a palm or a finger is placed at a predetermined height above the reading unit 742. If it is determined that a palm or a finger is placed, the information acquisition unit 711 detects the valley portions at the finger bases, which constitute a direction feature portion, from an image captured by the image sensor provided in the living body detection unit 142a, and determines the position of the detected direction feature portion. In the fifth embodiment, the direction feature portion is used to determine whether the acquisition target is a palm or a finger and whether it belongs to the left or right hand. In the case of a finger, the contour of the finger to be registered may also be detected and the direction of the finger determined from the detected contour, preventing erroneous detection of finger placement.

  [Step S252] If the result of the determination in step S251 is that the acquisition target is a finger, the information acquisition unit 711 displays a message window requesting the user to input the finger type, and accepts the user's input of the type of the target finger.

  [Step S253] The type determination unit 712 determines the type of acquisition target (whether it is a finger or a palm and the left and right of the hand) based on the position of the direction feature portion determined in step S251.

[Step S254] The information acquisition unit 711 causes the imaging unit 142b of the reading unit 742 to capture an image, and acquires a biological image in which a palm or finger vein is reflected.
It should be noted that the processing order of steps S253 and S254 may be reversed or simultaneous when the palm and finger vein reading operations are the same.

  [Step S255] The information generation unit 713 extracts features of the biological information based on the biological image acquired in step S254. The information generation unit 713 generates biometric information indicating the feature extraction result.

  [Step S256] The information generation unit 713 generates biometric feature information including the type determined in step S253, the finger type received in step S252 in the case of a finger, the biometric information generated in step S255, and the user's ID.

  [Step S257] The information generation unit 713 stores the biometric feature information generated in step S256 in the biometric feature information storage unit 741b. Thereby, the biometric feature information is registered in the biometric feature information storage unit 741b. Thereafter, the process ends.
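  Summarizing steps S251 through S257, the registration flow can be sketched as follows. This is a schematic paraphrase of the flowchart in FIG. 65 (steps S252 and S253 are reordered for readability); `reader`, `store`, and `ask_finger_type` are assumed interfaces, not components defined in this embodiment:

```python
def register_biometric_feature(reader, store, user_id, ask_finger_type):
    """Schematic paraphrase of FIG. 65; `reader`, `store`, and
    `ask_finger_type` are assumed interfaces."""
    # S251: wait until a palm or finger is held at the prescribed height,
    # then locate the direction feature portion (finger-base valleys).
    valley_position = reader.wait_and_detect_valleys()
    # S253: determine the acquisition target (finger or palm, left or right).
    target, side = reader.type_from(valley_position)
    # S252: for a finger, the finger type is entered by the user.
    finger_type = ask_finger_type() if target == "finger" else None
    # S254: capture the vein image with the imaging unit.
    image = reader.capture_vein_image()
    # S255: extract vein features from the biological image.
    features = reader.extract_features(image)
    # S256/S257: assemble the biometric feature information and register it.
    store.save({"ID": user_id, "left/right": side, "target": target,
                "finger type": finger_type, "feature data": features})
```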

  FIG. 66 is a flowchart illustrating biometric feature information authentication processing according to the fifth embodiment. The biometric feature information authentication processing determines whether the biometric object to be authenticated is a finger or a palm and whether it belongs to the left or right hand, generates collation biometric feature information indicating the type of the palm or finger and the features of the vein, and collates it against biometric feature information registered in advance. The collation biometric feature information is data having the same configuration as the biometric feature information, and indicates the features of the user's biometric object (in the fifth embodiment, a finger vein or a palm vein). The biometric feature information authentication processing is executed, for example, when the user authenticates with a finger vein or a palm vein. In the following, the processing illustrated in FIG. 66 will be described in order of step number.

  [Step S261] The information acquisition unit 711 determines, based on the detection result of the distance sensor provided in the living body detection unit 142a of the reading unit 742, that a palm or a finger is placed at a predetermined height above the reading unit 742. If it is determined that a palm or a finger is placed, the information acquisition unit 711 detects the valley portions at the finger bases, which constitute a direction feature portion, from an image captured by the image sensor provided in the living body detection unit 142a, and determines the position of the detected direction feature portion. In the case of a finger, the contour of the finger to be authenticated may also be detected and the direction of the finger determined from the detected contour, preventing erroneous detection of finger placement.

  [Step S262] The type determination unit 712 determines the type of the acquisition target (whether it is a finger or a palm and the left and right of the hand) based on the position of the direction feature portion determined in step S261.

[Step S263] The information acquisition unit 711 causes the imaging unit 142b of the reading unit 742 to capture an image and acquires a biological image in which a palm or finger vein is reflected.
Note that the processing order of steps S262 and S263 may be reversed or simultaneous when the palm and finger vein reading operations are the same.

  [Step S264] The information generation unit 713 extracts features of the biological information based on the biological image acquired in step S263. The information generation unit 713 generates biometric information indicating the feature extraction result.

[Step S265] The information generation unit 713 generates collation biometric feature information including the type determined in step S262 and the biometric information generated in step S264.
[Step S266] The collation unit 714 executes biometric feature information collation processing (described later in FIG. 67) using the collation biometric feature information generated in step S265. Thereafter, the process ends.

  FIG. 67 is a flowchart illustrating biometric feature information matching processing according to the fifth embodiment. The biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance. The biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 67 will be described in order of step number.

[Step S271] The collation unit 714 obtains collation biometric feature information generated by the biometric feature information authentication process.
[Step S272] The collation unit 714 refers to the biometric feature information storage unit 741b and extracts the biometric feature information whose type (whether it is a finger or a palm, and the left or right hand) matches that of the collation biometric feature information acquired in step S271. When one-to-one matching is performed, the collation unit 714 extracts from the biometric feature information storage unit 741b the biometric feature information that matches both the type and the user ID of the collation biometric feature information.

  [Step S273] The collation unit 714 selects one piece of the biometric feature information extracted in step S272, and collates the biometric information included in the selected biometric feature information against the biometric information included in the collation biometric feature information.

  [Step S274] The collation unit 714 determines whether or not the collation in step S273 between the selected biometric feature information and the collation biometric feature information has succeeded. If the collation is successful (YES in step S274), the process proceeds to step S275. On the other hand, if the collation fails (NO in step S274), the process proceeds to step S276.

[Step S275] The collation unit 714 executes a predetermined process when the authentication is successful. Thereafter, the process returns.
[Step S276] The collation unit 714 determines whether all the biometric feature information extracted in step S272 has been selected in step S273. If all have been selected (YES in step S276), the process proceeds to step S277. On the other hand, if any remains unselected (NO in step S276), the process proceeds to step S273.

[Step S277] The collation unit 714 executes predetermined processing when authentication fails. Thereafter, the process returns.
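The matching loop of steps S271 through S277 amounts to filtering the registered records by type and trying each candidate in turn. A minimal sketch under the same assumptions as the registration sketch above (`collate` is an assumed vein-comparison function):

```python
def match_biometric_feature(records, probe, collate, user_id=None):
    """Schematic paraphrase of FIG. 67; `collate` is an assumed comparison
    function returning True when two sets of vein features match."""
    # S272: keep only registered records of the same type as the probe;
    # for one-to-one matching, additionally require the same user ID.
    candidates = [r for r in records
                  if r["target"] == probe["target"]
                  and r["left/right"] == probe["left/right"]
                  and (user_id is None or r["ID"] == user_id)]
    # S273-S276: collate the probe against each candidate until one succeeds.
    for record in candidates:
        if collate(record["feature data"], probe["feature data"]):
            return True   # S275: processing for successful authentication
    return False          # S277: processing for failed authentication
```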
The fifth embodiment as described above also provides the same effects as those of the second and third embodiments.

  Further, the user can arbitrarily select finger vein biometric information or palm vein biometric information and use it for authentication. In addition, the user does not need to input which kind of living body, palm or finger, is presented, so the burden of operation during authentication and registration can be suppressed. Furthermore, user authentication can be performed using both finger vein biometric information and palm vein biometric information, so the accuracy of authentication can be improved.

[Sixth Embodiment]
Next, a sixth embodiment will be described. Differences from the seventh modification of the second embodiment will be mainly described, and description of similar matters will be omitted. The information processing apparatus according to the sixth embodiment is different from the seventh modification of the second embodiment in that the biological information of the palm vein and the biological information of one specific finger vein are read simultaneously.

  FIG. 68 is a block diagram illustrating an information processing apparatus according to the sixth embodiment. An information processing apparatus 800 according to the sixth embodiment includes an information acquisition unit 811, a type determination unit 812, an information generation unit 813, a collation unit 814, and a biometric feature information storage unit 841b. In addition, a reading unit 842 is connected to the information acquisition unit 811.

  The information acquisition unit 811 acquires a biometric image of a person to be authenticated, such as a user of the information processing apparatus 800. The information acquisition unit 811 can also acquire the direction of the living body at the time the biometric image is acquired. The biometric images acquired by the information acquisition unit 811 are image information of a palm vein pattern and image information of the vein pattern of one finger specified in advance. The directions of the left and right palms (hand directions) are mutually opposite (rotated by 180°), as are the directions of the left and right fingers. The information acquisition unit 811 determines the direction of the living body by determining the position of the direction feature portion in the acquired image.

  The reading unit 842 is fixed to the upper part of the information processing apparatus 800. The reading unit 842 includes the living body detection unit 142a, the imaging unit 142b, and the light source unit 142c illustrated in FIG. Based on the detection result of the living body detection unit 142a of the reading unit 842, the information acquisition unit 811 determines that the living body is positioned at a predetermined distance from the reading unit 842 and determines the direction of the living body with respect to the reading unit 842. The information acquisition unit 811 determines the direction of the living body by determining the position of the direction feature portion in the living body from the image obtained by the living body detection unit 142a of the reading unit 842. In the sixth embodiment, the directions of the left and right hands are mutually opposite (rotated by 180°), and the hands are parallel to the keyboard 131. The direction feature portion is the valley portions at the finger bases of the palm.

Further, the information acquisition unit 811 acquires a biological image obtained by imaging a living body by the imaging unit 142b of the reading unit 842.
The type determination unit 812 determines the type of the palm biometric information and the type of the finger biometric information based on the hand direction acquired by the information acquisition unit 811. The type indicates whether the hand is left or right. In the sixth embodiment, the type of finger is specified in advance, such as the middle finger of the left or right hand. As will be described later in detail, the type determination unit 812 determines whether generated information relates to the palm or to the finger based on the determination result of the position of the direction feature portion.

  The information generation unit 813 generates biometric information indicating the features of the living body based on the biometric image acquired by the information acquisition unit 811. The information generation unit 813 generates collation biometric feature information including the generated biometric information and the type of the biometric information determined by the type determination unit 812; this indicates the biometric information and type of the user to be collated for authentication. The information generation unit 813 also generates biometric feature information including, for example, the generated biometric information, the type of the biometric information determined by the type determination unit 812, and identification information identifying the individual corresponding to the biometric information, and stores it in the biometric feature information storage unit 841b. As a result, biometric feature information indicating the biometric information and type of a duly authorized user, registered in advance and used for authentication, is registered. When registering the user's biometric information, the information generation unit 813 stores the generated biometric feature information in the biometric feature information storage unit 841b; at the time of authentication using the user's biometric information, the collation unit 814 performs authentication using the collation biometric feature information generated by the information generation unit 813.

  The collation unit 814 extracts biometric feature information whose type matches the collation biometric feature information, and collates with each palm and finger based on the biometric information of the collation biometric feature information and the extracted biometric feature information. Thereby, the information processing apparatus 800 performs biometric authentication of the user based on the collation result. Since collation is performed by limiting the collation targets to those of the same type, an increase in time and load required for the collation process can be suppressed.

  The biometric feature information storage unit 841b stores biometric feature information indicating the biometric information of the palm acquired by the information acquiring unit 811 and the biometric information of the finger and the type of the biometric information determined by the type determining unit 812. Thereby, the biometric information of the palm of the user and the biometric information of the finger are stored in association with the left and right of the hand.

  FIG. 69 is a diagram illustrating a biometric feature table according to the sixth embodiment. The biometric feature table 841b1 illustrated in FIG. 69 is set in the biometric feature information storage unit 841b included in the information processing apparatus 800 according to the sixth embodiment. The biometric feature table 841b1 is a table for managing biometric feature information used for biometric authentication of the information processing apparatus 800.

  The biometric feature table 841b1 is provided with “number”, “ID”, “left / right”, and “feature data” as items. The feature data has “hand” and “finger 3” as items. In the biometric feature table 841b1, values set in the above items are associated with each other as biometric feature information.

  The hand indicates the file name of the biometric information indicating the characteristics of either the vein of the palm of the right hand or the vein of the palm of the left hand. The finger 3 indicates the file name of the biometric information indicating the characteristics of either the middle finger vein of the right hand or the middle finger vein of the left hand. Whether the biometric information in a given record belongs to the right hand or the left hand is indicated by the left/right item.

  Note that the biometric feature table 841b1 illustrated in FIG. 69 is an example, and any item can be set in the biometric feature table. For example, in the sixth embodiment, feature data of the middle finger vein is used, but feature data of another finger vein may be used.
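  Concretely, one row of this table bundles the palm feature file and the middle-finger feature file of one hand under a single left/right type. A minimal sketch of such a record (the file names are invented for illustration):

```python
# A minimal model of one record of the biometric feature table 841b1 in
# FIG. 69: a single registration bundles the palm feature file and the
# middle-finger ("finger 3") feature file of one hand.
record = {
    "number": 1,
    "ID": "user01",
    "left/right": "right",
    "feature data": {
        "hand":     "right_palm_vein.bin",    # palm vein features
        "finger 3": "right_middle_vein.bin",  # middle finger vein features
    },
}
```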

  FIG. 70 is a diagram illustrating a state when reading the palm of the right hand and the vein of the middle finger according to the sixth embodiment. FIG. 70 is a diagram of the state when the information processing apparatus 800 reads the palm of the right hand and the vein of the middle finger as viewed from above. As illustrated in FIG. 70, the information processing apparatus 800 includes a display unit 120 and a main body unit 830. A keyboard 131 and reading units 8425 and 8426 are provided on the upper surface of the main body 830 of the information processing apparatus 800.

  The reading units 8425 and 8426 each include the living body detection unit 142a, the imaging unit 142b, and the light source unit 142c shown in FIG. The reading units 8425 and 8426 are provided on the same upper surface of the main body 830 as the keyboard 131 of the information processing apparatus 800, arranged side by side in front of the keyboard 131, with each side of the square vein sensors (imaging units 142b) parallel to the front and side surfaces of the information processing apparatus 800. The imaging unit 142b of the reading unit 8425 can read the vein of the middle finger of the right hand or the vein of the palm of the left hand. The imaging unit 142b of the reading unit 8426 can read the vein of the middle finger of the left hand or the vein of the palm of the right hand. In addition, a user's head 201, torso 202, upper right arm 203, right lower arm 204, right hand palm 205, and middle finger 205c of the right hand are shown.

  As in FIG. 32, each vein sensor in the reading units 8425 and 8426 in FIG. 70 is arranged so as to make a 90° angle with respect to the direction in which the front surface 830a of the main body 830 extends. However, each vein sensor may also be arranged so that the angle between its main scanning direction and, for example, any of the direction in which the front surface 830a of the main body 830 extends, the arrangement direction of the operation keys of the keyboard 131 (the longitudinal direction of the keyboard 131), or the main scanning direction of the LCD 121 is 90°.

  When the user causes the reading unit 8425 or 8426 to read the palm vein and the vein of the finger specified in advance, as shown in FIG. 70, the user positions the palm whose vein is to be read (for example, the palm 205 of the right hand) and the finger (for example, the middle finger 205c of the right hand) parallel to the front surface of the information processing apparatus 800 and parallel to the upper surface of the main body 830. When the right hand is to be read, the user positions the hand so that the center of the palm 205 of the right hand coincides with the center of the reading unit 8426 and the center line of the middle finger 205c of the right hand coincides with the horizontal center line of the reading unit 8425. In addition, with the fingers open, the user positions the palm 205 of the right hand so that the fingers other than the middle finger 205c to be read do not cover the vicinity of the horizontal center line of the reading unit 8425. The user positions the palm 205 of the right hand and the middle finger 205c of the right hand in a space separated from the vein sensor surfaces of the reading units 8426 and 8425 by a certain distance (for example, several centimeters). The user does not need to bend the wrist between the palm 205 of the right hand and the right lower arm 204 at the time of reading, and can keep it almost straight. Accordingly, each finger of the user's right hand is straightened and opened sufficiently, and the four valleys between the fingers of the user's right hand are sufficiently spaced. Therefore, there is no horizontal twist in the palm 205 of the right hand, and a correct image can be obtained quickly and reliably. Accordingly, correct features can be detected quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.

  In addition, since the valleys between the fingers of the palm 205 of the right hand are detected in a detection rectangular region (not shown) set on the left side of the acquired image acquired by the image sensor of the living body detection unit 142a included in the reading unit 8426, it can be determined quickly and reliably that the right hand has been placed. This detection rectangular region is set on the left side of the acquired image, like the right hand detection rectangular image region 7420a shown in FIGS. 63 and 64. Further, no unreasonable posture is forced on the user's right wrist, on the right lower arm 204 and upper right arm 203 extending from the right wrist, on the elbow between the right lower arm 204 and the upper right arm 203, or on the right shoulder between the upper right arm 203 and the torso 202, so the user's burden can be reduced.

  When the left hand is to be read, the user positions the hand so that the center of the palm of the left hand coincides with the center of the reading unit 8425 and the center line of the middle finger of the left hand coincides with the horizontal center line of the reading unit 8426. At this time, the valleys between the fingers are detected in a detection rectangular region (not shown) set on the right side of the acquired image acquired by the image sensor of the living body detection unit 142a included in the reading unit 8425, and it is thereby determined that the left hand has been placed. This detection rectangular region is set on the right side of the acquired image, like the left hand detection rectangular image region 7420b shown in FIGS. 63 and 64.

Note that the acquired image for detecting the direction feature portion may be an image captured by the imaging unit 142b.
FIG. 71 is a flowchart illustrating biometric feature information acquisition and registration processing according to the sixth embodiment. The biometric feature information acquisition and registration processing determines whether the palm and finger of the biometric object to be registered belong to the left or right hand, and generates and registers biometric feature information indicating the type of the palm and finger (left or right hand) together with the features of the veins. The biometric feature information acquisition and registration processing is executed, for example, when the user registers biometric feature information of a finger vein and a palm vein. In the following, the processing illustrated in FIG. 71 will be described in order of step number.

  [Step S281] The information acquisition unit 811 determines, based on the detection results of the living body detection units 142a included in the reading units 8425 and 8426, that a palm and a finger are placed at a predetermined height above the reading units 8425 and 8426. If it is determined that a palm and a finger are placed, the information acquisition unit 811 detects the valley portions at the finger bases, which constitute a direction feature portion, from the images captured by the image sensors of the living body detection units 142a of the reading units 8425 and 8426, and determines the position of the detected direction feature portion. In the sixth embodiment, the type of finger is determined in advance as a predetermined finger (for example, the middle finger), and the direction feature portion is used to determine whether the acquisition target is the left or right hand. The information acquisition unit 811 determines that the hand to be read is the right hand when the direction feature portion is detected in the detection rectangular region (not shown) set on the left side of the acquired image acquired by the reading unit 8426. Conversely, the information acquisition unit 811 determines that the hand to be read is the left hand when the direction feature portion is detected in the detection rectangular region (not shown) set on the right side of the acquired image acquired by the reading unit 8425.

[Step S282] The type determination unit 812 determines the type (left and right) of the acquisition target based on the position of the direction feature portion determined in step S281.
[Step S283] The information acquisition unit 811 causes the imaging unit 142b included in each of the reading units 8425 and 8426 to capture an image, and acquires a biological image in which the palm and finger veins are reflected.

Note that the processing order of steps S282 and S283 may be reversed or simultaneous when the palm and finger vein reading operations are the same.
[Step S284] The information generation unit 813 extracts features of the biological information based on the biological image acquired in step S283. The information generation unit 813 generates biometric information indicating the feature extraction result.

[Step S285] The information generation unit 813 generates biometric feature information including the type determined in step S282, the biometric information generated in step S284, and the user ID.
[Step S286] The information generation unit 813 stores the biometric feature information generated in step S285 in the biometric feature information storage unit 841b. Thereby, the biometric feature information is registered in the biometric feature information storage unit 841b. Thereafter, the process ends.

  FIG. 72 is a flowchart illustrating biometric feature information authentication processing according to the sixth embodiment. The biometric feature information authentication processing determines whether the palm and finger of the biometric object to be authenticated belong to the left or right hand, generates collation biometric feature information indicating the type of the palm and finger (left or right hand) together with the features of the veins, and collates it against biometric feature information registered in advance. The collation biometric feature information is data having the same configuration as the biometric feature information, and indicates the features of the user's biometric object (in the sixth embodiment, a finger vein and a palm vein). The biometric feature information authentication processing is executed, for example, when the user authenticates with a finger vein and a palm vein. In the following, the processing illustrated in FIG. 72 will be described in order of step number.

  [Step S291] The information acquisition unit 811 determines, based on the detection results of the living body detection units 142a included in the reading units 8425 and 8426, that a palm and a finger are placed at a predetermined height above the reading units 8425 and 8426. If it is determined that a palm and a finger are placed, the information acquisition unit 811 detects the valley portions at the finger bases, which constitute a direction feature portion, from the images captured by the image sensors of the living body detection units 142a of the reading units 8425 and 8426, and determines the position of the detected direction feature portion. In the sixth embodiment, the type of finger is determined in advance as a predetermined finger (for example, the middle finger), and the direction feature portion is used to determine whether the acquisition target is the left or right hand. The information acquisition unit 811 determines that the hand to be read is the right hand when the direction feature portion is detected in the detection rectangular region (not shown) set on the left side of the acquired image acquired by the reading unit 8426. Conversely, the information acquisition unit 811 determines that the hand to be read is the left hand when the direction feature portion is detected in the detection rectangular region (not shown) set on the right side of the acquired image acquired by the reading unit 8425.

[Step S292] The type determination unit 812 determines the type (left and right) of the acquisition target based on the position of the direction feature portion determined in step S291.
[Step S293] The information acquisition unit 811 causes the imaging unit 142b included in each of the reading units 8425 and 8426 to capture an image, and acquires a biological image in which the palm and finger veins are reflected.

Note that the processing order of steps S292 and S293 may be reversed or simultaneous when the palm and finger vein reading operations are the same.
[Step S294] The information generation unit 813 extracts features of the biological information based on the biological image acquired in step S293. The information generation unit 813 generates biometric information indicating the feature extraction result.

[Step S295] The information generation unit 813 generates collation biometric feature information including the type determined in step S292 and the biometric information generated in step S294.
[Step S296] The collation unit 814 executes biometric feature information collation processing (described later in FIG. 73) using the collation biometric feature information generated in step S295. Thereafter, the process ends.

  FIG. 73 is a flowchart illustrating biometric feature information matching processing according to the sixth embodiment. The biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance. The biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 73 will be described in order of step number.

[Step S301] The collation unit 814 obtains collation biometric feature information generated by the biometric feature information authentication process.
[Step S302] The collation unit 814 refers to the biometric feature information storage unit 841b and extracts the biometric feature information whose type (the left or right hand of the palm and finger) matches that of the collation biometric feature information acquired in step S301. When one-to-one matching is performed, the collation unit 814 extracts from the biometric feature information storage unit 841b the biometric feature information that matches both the type and the user ID of the collation biometric feature information.

  [Step S303] The collation unit 814 selects one piece of the biometric feature information extracted in step S302, and collates the biometric information included in the selected biometric feature information against the biometric information included in the collation biometric feature information. In the sixth embodiment, when biometric feature information and collation biometric feature information are collated, both the palm veins and the finger veins are collated.

  [Step S304] The collation unit 814 determines whether or not the collation in step S303 between the selected biometric feature information and the collation biometric feature information has succeeded. If the collation is successful (YES in step S304), the process proceeds to step S305. On the other hand, if the collation fails (NO in step S304), the process proceeds to step S306. For example, the collation may be determined to be successful when both the palm vein and the finger vein match, and unsuccessful in all other cases. Alternatively, when the collation results are evaluated as numerical scores or grades, the individual results may be judged comprehensively.
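  The decision policy described in step S304 can be sketched as follows: either both parts must individually match, or per-part scores are combined. The thresholds and weights below are invented for illustration and are not values given in this embodiment:

```python
def combined_decision(palm_score, finger_score,
                      strict=True, threshold=0.8, weights=(0.6, 0.4)):
    """palm_score, finger_score: similarity scores in [0, 1] from the palm
    and finger collations. strict policy: both parts must pass on their own.
    score policy: a weighted sum is compared against a single threshold.
    All numeric values are illustrative assumptions."""
    if strict:
        return palm_score >= threshold and finger_score >= threshold
    combined = weights[0] * palm_score + weights[1] * finger_score
    return combined >= threshold
```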

[Step S305] The collation unit 814 executes predetermined processing when authentication is successful. Thereafter, the process returns.
[Step S306] The collation unit 814 determines whether all the biometric feature information extracted in step S302 has been selected in step S303. If all have been selected (YES in step S306), the process proceeds to step S307. On the other hand, if any remains unselected (NO in step S306), the process proceeds to step S303.

[Step S307] The collation unit 814 executes a predetermined process when authentication fails. Thereafter, the process returns.
In the sixth embodiment, the type of the finger to be read is specified in advance (the middle finger of the right hand or the left hand). However, the present embodiment is not limited to this; the information processing apparatus 800 may instead determine the finger type based on the direction of the finger itself. Further, any finger other than the middle finger may be specified in advance as the reading target.

According to the sixth embodiment described above, the same effects as those of the second embodiment and the third embodiment can be obtained in combination by placing the hand only once.
In addition, since user authentication is performed using biometric information of both finger veins and palm veins, it is possible to improve the accuracy of authentication.

  In the sixth embodiment, it is determined that the right hand is placed based on the image acquired from the image sensor of the reading unit 8426, and that the left hand is placed based on the image acquired from the image sensor of the reading unit 8425. However, as another example, the direction of the hand may be determined from only one of the reading units 8425 and 8426.

  For example, the information acquisition unit 811 sets the detection rectangular region on the left side of the image acquired from the image sensor of the reading unit 8426. The information acquisition unit 811 determines that the right hand is placed when the detection rectangular region contains the valley portions at the bases of the four fingers of the right hand as the direction feature portion, as in the right hand detection rectangular image region 7420a of FIG. 63(A). On the other hand, the information acquisition unit 811 determines that the left hand is placed when the detection rectangular region contains the valley portions at the bases of three fingers of the left hand, as in the left finger detection rectangular image region 7420c of FIG. 64(B). By determining the direction of the hand from only one of the reading units 8425 and 8426 in this way, an image sensor needs to be provided in only one of them, so the manufacturing cost can be reduced.

  As yet another example, the information acquisition unit 811 may always determine the direction of the hand based on the acquired images from both image sensors of the reading units 8425 and 8426. For example, the information acquisition unit 811 determines that the right hand is placed when the valley portions at the bases of the four fingers of the right hand are detected in the detection rectangular region set on the left side of the image acquired from the image sensor of the reading unit 8426, and the valley portions at the bases of three fingers of the right hand are detected in the detection rectangular region set on the right side of the image acquired from the image sensor of the reading unit 8425. This can increase the accuracy of determining the direction of the hand.
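  Both variants reduce to counting the finger-base valleys seen by each sensor: four valleys indicate a palm over that sensor, three indicate the extended fingers. A minimal sketch of the single-sensor rule and the dual-sensor cross-check (the left-hand case of the cross-check is a mirrored assumption, since only the right-hand case is described above):

```python
def hand_from_one_sensor(valley_count_8426_left):
    """Single-sensor rule: in the left-side detection region of the image
    from reading unit 8426, four finger-base valleys indicate the right
    palm and three indicate the fingers of the left hand."""
    return {4: "right", 3: "left"}.get(valley_count_8426_left)

def hand_from_both_sensors(valleys_8426_left, valleys_8425_right):
    """Dual-sensor cross-check: the right hand requires four valleys on
    unit 8426 and three on unit 8425; the left-hand case is mirrored
    (an assumption, as only the right-hand case is described)."""
    if valleys_8426_left == 4 and valleys_8425_right == 3:
        return "right"
    if valleys_8426_left == 3 and valleys_8425_right == 4:
        return "left"
    return None  # inconsistent detections, or nothing placed
```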

[Seventh Embodiment]
Next, a seventh embodiment will be described. Differences from the sixth embodiment will be mainly described, and description of similar matters will be omitted. The information processing apparatus according to the seventh embodiment is different from the sixth embodiment in that the biological information of the palm vein and the biological information of a plurality of specific finger veins are read simultaneously.

  FIG. 74 is a block diagram illustrating an information processing apparatus according to the seventh embodiment. The information processing apparatus 900 according to the seventh embodiment includes an information acquisition unit 911, a type determination unit 912, an information generation unit 913, a collation unit 914, and a biometric feature information storage unit 941b. In addition, a reading unit 942 is connected to the information acquisition unit 911.

  The information acquisition unit 911 acquires a biometric image of a person to be authenticated, such as a user of the information processing apparatus 900. The information acquisition unit 911 can also acquire the direction of the living body at the time the biometric image is acquired. The biometric images acquired by the information acquisition unit 911 are image information of a palm vein pattern and image information of a plurality of finger vein patterns. The directions of the left and right palms (hand directions) are mutually opposite (rotated by 180°). The direction of each finger combines this left/right reversal with the direction of the individual finger. The information acquisition unit 911 determines the direction of the living body by determining the position of the direction feature portion in the acquired image.

  The reading unit 942 is fixed to the upper part of the information processing apparatus 900. The reading unit 942 includes the living body detection unit 142a, the imaging unit 142b, and the light source unit 142c illustrated in FIG. Based on the detection result of the living body detection unit 142a of the reading unit 942, the information acquisition unit 911 determines that the living body is positioned at a predetermined distance from the reading unit 942 and determines the direction of the living body with respect to the reading unit 942. The information acquisition unit 911 determines the direction of the living body by determining the position of the direction feature portion in the living body from the image obtained by the living body detection unit 142a of the reading unit 942. In the seventh embodiment, the directions of the left and right hands are mutually opposite (rotated by 180°), and the hands are parallel to the keyboard 131. The direction feature portion is the valley portions at the finger bases of the palm. The information acquisition unit 911 also obtains a contour image of the fingers from the image obtained by the living body detection unit 142a of the reading unit 942.

In addition, the information acquisition unit 911 acquires a biological image obtained by imaging a living body by the imaging unit 142b of the reading unit 942.
The type determination unit 912 determines the type (left or right hand) of the palm and finger biometric information based on the hand direction acquired by the information acquisition unit 911. The type determination unit 912 also determines the type of each finger from the finger contour image acquired by the information acquisition unit 911. For example, in the case of the right hand, when viewed from above the reading unit 942, the finger extending in the direction emanating from the left of the base of the middle finger is determined to be the index finger, and the finger extending in the direction emanating from the right of that base is determined to be the ring finger. In the seventh embodiment, there are a plurality of finger types, which can be specified in advance, such as the index finger, middle finger, and ring finger of each hand. As will be described later in detail, the type determination unit 912 determines whether generated information relates to the palm or to a finger based on the determination result of the position of the direction feature portion.
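That rule can be sketched as follows, taking the base of the middle finger as the reference point. The coordinate convention is an assumption, and the left-hand case is mirrored, which is likewise an assumption here:

```python
def classify_fingers(middle_base_x, finger_bases, hand="right"):
    """middle_base_x: x position of the middle finger's base in the contour
    image; finger_bases: x positions of all detected finger bases.
    For the right hand viewed from above, a finger whose base lies to the
    left of the middle finger's base is the index finger and one to the
    right is the ring finger; the left hand is treated as mirrored."""
    names = {}
    for x in finger_bases:
        if x == middle_base_x:
            names[x] = "middle"
        elif (x < middle_base_x) == (hand == "right"):
            names[x] = "index"
        else:
            names[x] = "ring"
    return names
```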

  The information generation unit 913 generates biometric information indicating the features of the living body based on the biometric image acquired by the information acquisition unit 911. The information generation unit 913 generates collation biometric feature information including the generated biometric information and the type of the biometric information determined by the type determination unit 912; this indicates the biometric information and type of the user to be collated for authentication. The information generation unit 913 also generates biometric feature information including, for example, the generated biometric information, the type of the biometric information determined by the type determination unit 912, and identification information identifying the individual corresponding to the biometric information, and stores it in the biometric feature information storage unit 941b. As a result, biometric feature information indicating the biometric information and type of a duly authorized user, registered in advance and used for authentication, is registered. The types are the left or right hand and the finger type. When registering the user's biometric information, the information generation unit 913 stores the generated biometric feature information in the biometric feature information storage unit 941b; at the time of authentication using the user's biometric information, the collation unit 914 performs authentication using the collation biometric feature information generated by the information generation unit 913.

  The collation unit 914 extracts the registered biometric feature information whose type matches that of the collation biometric feature information, and collates the biometric information of the collation biometric feature information against the extracted biometric feature information. The information processing apparatus 900 then performs biometric authentication of the user based on the collation result. Because collation targets are limited to records of the same type, the increase in time and processing load required for collation is suppressed.
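  The type-limited collation can be sketched as follows. The record layout, the `match_score` comparison function, and the threshold are placeholders; the actual vein-pattern comparison method is not specified in this description.

```python
from typing import Callable, Optional

def authenticate(probe: dict,
                 enrolled: list[dict],
                 match_score: Callable[[bytes, bytes], float],
                 threshold: float = 0.8) -> Optional[str]:
    """1-to-N collation narrowed by type.

    Only registered records whose type (left/right of hand, kind of
    finger or palm) matches the probe are compared, which keeps the
    collation time and load from growing with unrelated records.
    """
    candidates = [r for r in enrolled if r["type"] == probe["type"]]
    for record in candidates:
        if match_score(probe["features"], record["features"]) >= threshold:
            return record["id"]   # authentication succeeded
    return None                   # no enrolled record matched
```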

  The biometric feature information storage unit 941b stores biometric feature information indicating the palm biometric information and finger biometric information acquired by the information acquisition unit 911, together with the types determined by the type determination unit 912. In this way, the user's palm and finger biometric information is stored in association with the left/right of the hand and the kind of finger.

  FIG. 75 is a diagram illustrating a biometric feature table according to the seventh embodiment. The biometric feature table 941b1 illustrated in FIG. 75 is set in the biometric feature information storage unit 941b included in the information processing apparatus 900 according to the seventh embodiment. The biometric feature table 941b1 is a table for managing biometric feature information used for biometric authentication of the information processing apparatus 900.

  The biometric feature table 941b1 is provided with “number”, “ID”, “left / right”, and “feature data” as items. The feature data includes “hand”, “finger 2”, “finger 3”, and “finger 4” as items. In the biometric feature table 941b1, values set in the above items are associated with each other as biometric feature information.

  The finger 2 item indicates the file name of biometric information indicating the features of the index-finger vein of either the right or left hand. The finger 3 item indicates the file name of biometric information indicating the features of the middle-finger vein of either the right or left hand, as in the sixth embodiment. The finger 4 item indicates the file name of biometric information indicating the features of the ring-finger vein of either the right or left hand.

  Note that the biometric feature table 941b1 illustrated in FIG. 75 is an example, and any items can be set in the biometric feature table. For example, the seventh embodiment uses vein feature data of the index finger, middle finger, and ring finger, but feature data of the veins of other fingers may be used, and any combination of some of these fingers may be used.
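  One possible in-memory representation of such a table row is sketched below; the field types and file names are illustrative assumptions, since the table only defines the item names.

```python
from dataclasses import dataclass

@dataclass
class BiometricFeatureRecord:
    """One row of a biometric feature table like 941b1."""
    number: int     # "number": row number
    user_id: str    # "ID": identifies the individual
    side: str       # "left/right": which hand the row describes
    palm: str       # "hand": file name of palm-vein feature data
    finger2: str    # "finger 2": index-finger vein feature file
    finger3: str    # "finger 3": middle-finger vein feature file
    finger4: str    # "finger 4": ring-finger vein feature file

table = [
    BiometricFeatureRecord(1, "user01", "right",
                           "r_palm.dat", "r_f2.dat", "r_f3.dat", "r_f4.dat"),
    BiometricFeatureRecord(2, "user01", "left",
                           "l_palm.dat", "l_f2.dat", "l_f3.dat", "l_f4.dat"),
]
```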

  FIG. 76 is a diagram illustrating the reading of the right palm and finger veins according to the seventh embodiment. FIG. 76 shows, from above the information processing apparatus 900, the state in which the palm of the right hand and the veins of its index finger, middle finger, and ring finger are read. As illustrated in FIG. 76, the information processing apparatus 900 includes a display unit 120 and a main body unit 930. A keyboard 131 and reading units 9425 and 9426 are provided on the upper surface of the main body 930.

  The reading units 9425 and 9426 each include the living body detection unit 142a, the imaging unit 142b, and the light source unit 142c shown in FIG. The reading units 9425 and 9426 are on the same top surface of the main body 930 as the keyboard 131, arranged left and right in front of the keyboard 131; each has a rectangular vein sensor (imaging unit 142b) that is long in the vertical direction, with its sides parallel to the front and side surfaces of the information processing apparatus 900. The imaging unit 142b of the reading unit 9425 can read the veins of the index finger, middle finger, and ring finger of the right hand, or the palm vein of the left hand. The imaging unit 142b of the reading unit 9426 can read the veins of the index finger, middle finger, and ring finger of the left hand, or the palm vein of the right hand. FIG. 76 also shows the user's head 201, torso 202, right upper arm 203, right lower arm 204, right palm 205, right thumb 205a, right index finger 205b, right middle finger 205c, right ring finger 205d, and right little finger 205e.

  Like the vein sensors of the reading units 8425 and 8426 in FIG. 70, each vein sensor of the reading units 9425 and 9426 in FIG. 76 is arranged so that its main scanning direction makes an angle of 90° with the direction in which the front surface 930a of the main body 930 extends. However, each vein sensor may instead be arranged so that its main scanning direction makes an angle of 90° with, for example, any one of the direction in which the front surface 930a of the main body 930 extends, the arrangement direction of the operation keys on the keyboard 131 (the longitudinal direction of the keyboard 131), or the main scanning direction of the LCD 121.

  When the user has the reading units 9425 and 9426 read the palm vein and the veins of the plurality of fingers specified in advance, as shown in FIG. 76, the user positions the palm to be read (for example, the right palm 205) and the fingers (for example, the right index finger 205b, right middle finger 205c, and right ring finger 205d) parallel to the front surface of the information processing apparatus 900 and parallel to the upper surface of the main body 930. To have the right hand read, the user positions the hand so that the center of the right palm 205 coincides with the center of the reading unit 9426 and the centerline of the right middle finger 205c coincides with the horizontal centerline of the reading unit 9425. The user also spreads the fingers of the right hand so that the fingers other than the right index finger 205b, right middle finger 205c, and right ring finger 205d (that is, the right thumb 205a and the right little finger 205e) do not lie over the reading range of the reading unit 9425. The user positions the right palm 205, right index finger 205b, right middle finger 205c, and right ring finger 205d over the reading ranges of the reading units 9425 and 9426, in the space a certain distance (for example, several centimeters) above each vein sensor surface. The user need not bend the wrist between the right palm 205 and the right lower arm 204 during reading, and can keep it almost straight. Accordingly, each finger of the right hand is straightened and spread sufficiently, and the four valleys between the fingers are spaced sufficiently apart. The right palm 205 is therefore not twisted in the horizontal plane, and a correct image can be obtained quickly and reliably. Consequently, correct features can be detected quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.
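  The placement conditions described above (palm centered on the sensor, hand floating a few centimeters over it) can be expressed as a simple acceptance check. All tolerance values below are assumed for illustration; the description only states that the hand hovers a certain distance, for example several centimeters, above the sensor surface.

```python
def placement_ok(palm_center: tuple[float, float],
                 sensor_center: tuple[float, float],
                 height_mm: float,
                 xy_tol_mm: float = 10.0,
                 height_range_mm: tuple[float, float] = (20.0, 60.0)) -> bool:
    """Accept the hand pose when the palm is centered over the vein
    sensor and held within an assumed height band above it."""
    dx = abs(palm_center[0] - sensor_center[0])
    dy = abs(palm_center[1] - sensor_center[1])
    lo, hi = height_range_mm
    return dx <= xy_tol_mm and dy <= xy_tol_mm and lo <= height_mm <= hi
```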

  In addition, the finger-base valleys of the hand are detected in a detection rectangular area set on the left side (not shown) of the acquired image captured by the image sensor of the living body detection unit 142a of the reading unit 9426, which lies under the right palm 205; this makes it possible to determine quickly and reliably that the right hand has been placed. This detection rectangular area is set on the left side of the acquired image, like the right-hand detection rectangular image area 7420a shown in FIGS. Furthermore, the user's right wrist, the right lower arm 204 and right upper arm 203 beyond the wrist, the elbow between the right lower arm 204 and the right upper arm 203, and the right shoulder between the right upper arm 203 and the torso 202 are not forced into an unreasonable posture, so the burden on the user can be reduced.

  To have the left hand read, the user positions the hand so that the center of the left palm coincides with the center of the reading unit 9425 and the centerline of the left middle finger coincides with the horizontal centerline of the reading unit 9426. In this case, the finger-base valleys of the hand are detected in a detection rectangular area set on the right side (not shown) of the acquired image captured by the image sensor of the living body detection unit 142a of the reading unit 9425, and it is determined that the left hand has been placed. This detection rectangular area is set on the right side of the acquired image, like the left-hand detection rectangular image area 7420b shown in FIGS.
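  The left/right decision based on the two detection rectangles can be outlined as follows. The rectangle width (a quarter of the image) and the point representation are assumptions; the embodiment only states that the right-hand rectangle lies on the left edge of one unit's image and the left-hand rectangle on the right edge of the other's.

```python
from typing import Optional

def detect_hand_side(valley_points: list[tuple[float, float]],
                     image_width: int,
                     rect_frac: float = 0.25) -> Optional[str]:
    """Decide which hand was placed from where the finger-base
    valleys fall in the living body detection image."""
    if not valley_points:
        return None
    left_edge = image_width * rect_frac
    right_edge = image_width * (1.0 - rect_frac)
    if all(x < left_edge for x, _ in valley_points):
        return "right"   # valleys in the left-side detection rectangle
    if all(x > right_edge for x, _ in valley_points):
        return "left"    # valleys in the right-side detection rectangle
    return None          # valleys not inside either rectangle
```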

  As in the sixth embodiment described above, the information acquisition unit 911 may determine the direction of the hand from only one of the reading units 9425 and 9426. Alternatively, the information acquisition unit 911 may always determine the direction of the hand based on the acquired images from both image sensors of the reading units 9425 and 9426.

The acquired image used to detect the direction feature portion may instead be an image captured by the imaging unit 142b.
In the seventh embodiment, the types of the fingers to be read are specified in advance (the index finger, middle finger, and ring finger of the right or left hand). However, the embodiment is not limited to this: the user may input the types of the fingers to be read, or the information processing apparatus 900 may determine them from the direction of each finger itself. Any combination of fingers other than the index finger, middle finger, and ring finger may also be specified in advance as reading targets.

As described above, the seventh embodiment also provides the same effects as the sixth embodiment.
In addition, user authentication uses the biometric information of a plurality of finger veins together with the biometric information of the palm vein, so authentication accuracy can be improved. When finger-vein biometric information is acquired, the apparatus can determine from the angle of the finger which of the six finger types (three fingers on each of the left and right hands) is presented. This spares the user from having to input the left/right of the hand and the six finger types, reducing the operational burden during authentication and registration. Moreover, in 1-to-N collation, the collation targets can be narrowed down by the six finger types, so 1-to-N collation can be performed at high speed. Alternatively, the number of registered pieces of biometric feature information can be increased while suppressing the increase in collation time and authentication processing time.

[Eighth Embodiment]
Next, an eighth embodiment will be described. The description focuses on differences from the eighth modification of the second embodiment; the same reference numerals are used for the same elements, and their descriptions are omitted.

  The eighth embodiment differs from the eighth modification of the second embodiment in that the information processing apparatus is an automatic transaction apparatus, specifically an automatic teller machine that accepts and pays out deposits at banks and the like.

  FIG. 77 is a diagram illustrating the appearance of an automatic transaction apparatus according to the eighth embodiment. The automatic transaction apparatus 1000 includes an operation screen 1081, a banknote deposit/withdrawal unit 1082a, a coin deposit/withdrawal unit 1082b, a passbook receiving unit 1083, a card receiving unit 1084, a receipt issuing unit 1085, a reading unit 1086, and a speaker 1087.

  The operation screen 1081 has a display screen that displays images showing transaction contents and messages guiding the user, and a touch panel that receives user input. The banknote deposit/withdrawal unit 1082a accepts and dispenses banknotes for the user's deposits and withdrawals. The coin deposit/withdrawal unit 1082b accepts and dispenses coins for the user's deposits and withdrawals. The passbook receiving unit 1083 accepts a passbook when the user makes a deposit or withdrawal, or otherwise requests a passbook entry. The card receiving unit 1084 receives a cash card or the like when the user uses the apparatus. The receipt issuing unit 1085 issues a receipt recording the transaction details. To authenticate the user, the reading unit 1086 reads the biometric information of the vein of the user's palm; the apparatus generates collation biometric feature information and authenticates the user by collating it against the registered biometric feature information. The reading unit 1086 is provided on the right side of the automatic transaction apparatus 1000 as viewed from the front. This allows the user to position the left and right palms in different directions when the automatic transaction apparatus 1000 reads the palm vein. The biometric feature information may be managed by a server, as in the first modification of the second embodiment. The automatic transaction apparatus 1000 performs authentication using the reading result obtained by the reading unit 1086. The speaker 1087 outputs voice guidance and warning sounds to inform the user of the transaction status and operations.

  In the eighth embodiment, the reading unit 1086 is provided on the right side of the automatic transaction apparatus 1000 as viewed from the front, but it is not limited to this and may be provided on the left side or in the center of the apparatus. The reading unit 1086 reads the vein of the user's palm, but it is not limited to this and may read finger veins, fingerprints, or palm prints, or any combination of these. The automatic transaction apparatus 1000 is not limited to an automatic teller machine and may be any apparatus that performs biometric authentication of a user, such as a vending machine.

As described above, according to the eighth embodiment, the automatic transaction apparatus 1000 has the same effects as those of the second embodiment.
The above processing functions can be realized by a computer. In that case, a program describing the processing contents of the functions that the information processing apparatuses 100 to 900 and the automatic transaction apparatus 1000 should have is provided. By executing the program on a computer, the above processing functions are realized on the computer. The program describing the processing contents can be recorded on a computer-readable recording medium. Examples of the computer-readable recording medium include a magnetic storage device, an optical disk, a magneto-optical recording medium, and a semiconductor memory. Magnetic storage devices include hard disk devices (HDD), flexible disks (FD), magnetic tapes, and the like. Examples of the optical disc include a DVD (Digital Versatile Disc), a DVD-RAM, and a CD-ROM / RW (Rewritable). Magneto-optical recording media include MO (Magneto-Optical disk).

  When the program is distributed, for example, portable recording media such as DVDs and CD-ROMs on which the program is recorded are sold. It is also possible to store the program in a storage device of a server computer and transfer it from the server computer to another computer via a network.

  The computer that executes the program stores, for example, the program recorded on the portable recording medium, or the program transferred from the server computer, in its own storage device. The computer then reads the program from its own storage device and executes processing according to the program. The computer can also read the program directly from the portable recording medium and execute processing according to it. Alternatively, each time a program is transferred from a server computer connected via a network, the computer can sequentially execute processing according to the received program.

  Further, at least a part of the above processing functions can be realized by an electronic circuit such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or a PLD (Programmable Logic Device).

  The above merely illustrates the principle of the present invention. Many modifications and variations will be apparent to those skilled in the art, and the present invention is not limited to the precise configurations and applications shown and described above; all corresponding modifications and equivalents are regarded as falling within the scope of the invention as defined by the appended claims and their equivalents.

DESCRIPTION OF SYMBOLS: 10 Information processing apparatus; 10a Operation surface; 10b End part; 11 Sensor; 12 Key input part; 13 Display apparatus; 20 Operator; 21 Right hand; 22 Right elbow; 23 Forearm

Claims (4)

  1. An information processing apparatus comprising:
    a first sensor that scans a hand and reads finger information; and
    a second sensor that scans the hand and reads palm information,
    wherein the first sensor and the second sensor are fixedly arranged so that the scanning direction of each of the first sensor and the second sensor perpendicularly intersects a direction in which an end portion, facing the operator, of the information processing apparatus extends, and the first sensor and the second sensor are arranged along the direction in which the end portion extends.
  2. The information processing apparatus according to claim 1, wherein the end portion is the end, on the operator's side, of an operation surface of a housing on which operation keys are arranged.
  3. An information processing apparatus comprising:
    a plurality of operation keys arranged in a predetermined direction;
    a first sensor that scans a hand and reads finger information; and
    a second sensor that scans the hand and reads palm information,
    wherein the first sensor and the second sensor are fixedly arranged so that the scanning direction of each of the first sensor and the second sensor perpendicularly intersects the arrangement direction of the operation keys, and the first sensor and the second sensor are arranged along the arrangement direction of the operation keys.
  4. The information processing apparatus according to claim 3, wherein the first sensor and the second sensor are arranged so that the scanning direction intersects a longitudinal direction of a key arrangement region in which the plurality of operation keys are arranged.
JP2013545731A 2011-11-25 2011-11-25 Information processing device Active JP5794310B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/077119 WO2013076858A1 (en) 2011-11-25 2011-11-25 Information processing device

Publications (2)

Publication Number Publication Date
JPWO2013076858A1 JPWO2013076858A1 (en) 2015-04-27
JP5794310B2 true JP5794310B2 (en) 2015-10-14

Family

ID=48469336

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2013545731A Active JP5794310B2 (en) 2011-11-25 2011-11-25 Information processing device

Country Status (2)

Country Link
JP (1) JP5794310B2 (en)
WO (1) WO2013076858A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107851298A (en) * 2015-12-18 2018-03-27 株式会社日立制作所 Organism authentication apparatus and system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7539329B2 (en) * 2004-07-01 2009-05-26 Hewlett-Packard Development Company, L.P. Method and apparatus for enhancing the usability of an electronic device having an integrated fingerprint sensor
JP4675660B2 (en) * 2005-03-29 2011-04-27 富士通株式会社 Multiple simultaneous biometrics authentication device
JP2007226623A (en) * 2006-02-24 2007-09-06 Murata Mach Ltd Fingerprint input device
JP2008136729A (en) * 2006-12-04 2008-06-19 Oki Electric Ind Co Ltd Bioinformation acquisition sensor, bioinformation acquisition device having sensor and consumer transaction facility
JP4748199B2 (en) * 2008-09-30 2011-08-17 ソニー株式会社 Vein imaging apparatus and vein imaging method
JP4862924B2 (en) * 2009-07-24 2012-01-25 ミツミ電機株式会社 Fingerprint image forming apparatus and fingerprint image forming method

Also Published As

Publication number Publication date
JPWO2013076858A1 (en) 2015-04-27
WO2013076858A1 (en) 2013-05-30


Legal Events

Date Code Title Description
A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20150210

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20150507

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20150519

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20150714

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20150727

R150 Certificate of patent or registration of utility model

Ref document number: 5794310

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313113

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350