WO2013076858A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2013076858A1
Authority: WIPO (PCT)
Prior art keywords: information, finger, biometric feature, unit, feature information
Application number: PCT/JP2011/077119
Other languages: English (en), Japanese (ja)
Inventor: Ken Okazaki
Original Assignee: Fujitsu Limited
Application filed by Fujitsu Limited
Priority to JP2013545731A (patent JP5794310B2)
Priority to PCT/JP2011/077119 (WO2013076858A1)
Publication of WO2013076858A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/60 Static or dynamic means for assisting the user to position a body part for biometric acquisition
    • G06V40/63 Static or dynamic means for assisting the user to position a body part for biometric acquisition by static guides

Description

  • The present disclosure relates to information processing apparatuses.
  • Biometric authentication, which can identify an individual with high accuracy using human biological features such as fingerprints, irises, and veins, is widely used as a means of identity verification.
  • When a sensor that reads biological information is provided in an information processing apparatus such as a personal computer, the operator may, depending on how the sensor is arranged, be forced into an unreasonable posture when holding a hand over the sensor. Further, as the operator's posture becomes unnatural, the hand is less likely to be at an appropriate position relative to the sensor, which may reduce the quality of the registered biometric information and the authentication accuracy.
  • The present disclosure has been made in view of these points, and an object thereof is to provide an information processing apparatus that allows hand information to be read while the operator maintains a natural posture.
  • According to one aspect, an information processing apparatus is provided that includes a sensor having a plurality of scanning directions at different angles, the sensor scanning a hand and reading information on the hand.
  • In this apparatus, the sensor is fixedly arranged so that the main scanning direction among the plurality of scanning directions intersects the direction in which the edge of the casing of the information processing apparatus facing the operator extends.
  • According to another aspect, an information processing apparatus is provided that includes a plurality of operation keys arranged in a predetermined direction and a sensor having a plurality of scanning directions at different angles for scanning a hand to read hand information.
  • In this apparatus, the sensor is fixedly arranged so that the main scanning direction among the plurality of scanning directions intersects the arrangement direction of the operation keys.
  • According to yet another aspect, an information processing apparatus is provided that includes a display device and a sensor having a plurality of scanning directions at different angles, the sensor scanning a hand and reading hand information.
  • In this apparatus, the sensor is fixedly arranged so that the main scanning direction among the plurality of scanning directions intersects the scanning direction of the display device.
  • FIG. 1 is a diagram illustrating an appearance of an information processing apparatus according to the first embodiment and a state of an operator.
  • The information processing apparatus 10 in FIG. 1 is an apparatus that can receive operation input from the operator 20 and execute processing.
  • The information processing apparatus 10 includes a sensor 11 that scans the hand of the operator 20 and reads hand information.
  • The information processing apparatus 10 also includes, for example, a key input unit 12 in which a plurality of operation keys are arranged, and a display device 13 that displays images.
  • FIG. 1 shows the information processing apparatus 10 viewed from vertically above the operation surface 10a on which the key input unit 12 is arranged.
  • The display device 13 is provided so as to be rotatable with respect to the operation surface 10a.
  • The sensor 11 reads, for example, fingerprints, finger veins, palm veins, and the like as the hand information of the operator 20.
  • The sensor 11 has a plurality of scanning directions, and the main scanning direction among the plurality of scanning directions is the D1 direction in FIG. 1.
  • Here, the sensor 11 is assumed to be rectangular.
  • The main scanning direction of the rectangular sensor 11 is a direction along one of its sides. Note that the sensor 11 is not necessarily rectangular; even if it is not, the sensor 11 reads hand information by scanning with the D1 direction as the main scanning direction.
  • The sensor 11 is arranged so that the main scanning direction D1 intersects the D2 direction, in which the edge 10b of the casing of the information processing apparatus 10 on the side facing the operator 20 extends.
  • For example, the sensor 11 is arranged so that the main scanning direction D1 is at an oblique angle with respect to the D2 direction.
  • FIG. 1 shows a state where the right hand 21 is arranged on the upper surface of the key input unit 12 as an example.
  • To have hand information read, the operator 20 rotates the right hand 21 counterclockwise in FIG. 1 around the right elbow 22 and places it above the sensor 11.
  • Thus, while maintaining a natural posture, the operator can hold the right hand 21 over the sensor 11 so that the direction of the right hand 21 is perpendicular to the main scanning direction D1 of the sensor 11 (that is, along the sub-scanning direction of the sensor 11).
  • Accordingly, the operator 20 does not have to bend the wrist forcibly to align the direction of the hand with the sub-scanning direction of the sensor 11, nor change the position of the trunk for that purpose.
  • As a result, the sensor 11 can read the hand information accurately, so it is possible to improve the quality of the hand information registered as biometric information for collation and the accuracy of processing at the time of collation.
  • In FIG. 1, the main scanning direction D1 of the sensor 11 is oblique with respect to the direction D2, but the sensor 11 may instead be arranged so that the main scanning direction D1 is perpendicular to the direction D2 (that is, so that the sub-scanning direction of the sensor 11 is parallel to D2).
  • In this case, the operator 20 puts the elbow 22 slightly forward and rotates the right hand 21 around the elbow 22 so that the forearm 23 is parallel to the direction D2, thereby aligning the right hand 21 with the sub-scanning direction of the sensor 11.
  • Alternatively, the sensor 11 may be arranged so that the main scanning direction D1 intersects the operation key arrangement direction D3 in the key input unit 12 (the longitudinal direction of the region of the key input unit 12) or the main scanning direction D4 of the display device 13. In any of these cases, as described above, the hand can be held over the sensor 11 in a natural posture, and the hand information can be read accurately by the sensor 11. A small geometric sketch of this arrangement condition follows.
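  • As an illustrative aside (not part of the patent text), the arrangement condition is purely geometric: the main scanning direction must intersect the reference direction, obliquely in this example. Below is a minimal Python sketch of that check; the 45° value and the vector names are assumptions for the example.

```python
import math

def angle_between(v1, v2):
    """Angle in degrees between two 2-D direction vectors."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# D2: direction in which the operator-facing edge of the casing extends.
# D1: main scanning direction of the sensor, here at an assumed oblique 45 degrees.
d2 = (1.0, 0.0)
d1 = (math.cos(math.radians(45)), math.sin(math.radians(45)))

theta = angle_between(d1, d2)
print(f"main scanning direction meets the housing edge at {theta:.0f} degrees")
assert 0 < theta < 90  # intersecting (not parallel) and oblique (not perpendicular)
```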
  • FIG. 2 is a diagram illustrating an appearance of the information processing apparatus according to the second embodiment.
  • The information processing apparatus 100 shown in FIG. 2 is a notebook (laptop) personal computer to which a security function based on biometric authentication using palm veins is added.
  • The information processing apparatus 100 includes a display unit 120 having an LCD (Liquid Crystal Display) 121, and a main body unit 130 having a keyboard 131 and a reading unit 142.
  • Each of the display unit 120 and the main body unit 130 has a substantially rectangular parallelepiped housing having a front surface, a rear surface facing the front surface, and two side surfaces connecting them.
  • The display unit 120 and the main body unit 130 are connected near the rear surface of the main body unit 130 so as to be opened and closed, by a hinge (not shown).
  • The LCD 121 is a display device having a display screen for displaying characters and images.
  • Another thin display device, such as an organic EL (electroluminescence) display, may be used as the display device instead.
  • The keyboard 131 is an input device for inputting characters and performing other operations.
  • The reading unit 142 is an input device that inputs biometric information by reading the veins of the palm that the user holds over it.
  • The reading unit 142 includes a square vein sensor that acquires a biological image of the palm veins by reading the veins of the user's palm.
  • The vein sensor is arranged so that each of its sides is parallel to a side of the reading unit 142.
  • The reading unit 142 is on the same top surface of the main body 130 as the keyboard 131, at the front center of the keyboard 131, and is arranged so that each side of the square vein sensor forms an angle of 45° with the front surface and the side surfaces of the information processing apparatus 100.
  • The vein sensor reads vein information by scanning the object to be read.
  • The main scanning direction of the vein sensor is parallel to one side of the square vein sensor. Accordingly, the main scanning direction of the vein sensor is parallel to one side of the reading unit 142.
  • The main scanning direction of the vein sensor is the D11 direction in FIG. 2.
  • The vein sensor is arranged so that the angle formed by the main scanning direction D11 and the direction D12, in which the front surface 130a of the main body 130 extends, is 45°.
  • More generally, the vein sensor may be arranged so that the angle formed by the main scanning direction D11 and any one of the direction D12 of the front surface 130a of the main body 130, the operation key arrangement direction D13 on the keyboard 131 (the longitudinal direction of the keyboard 131), or the main scanning direction D14 of the LCD 121 is 45°.
  • In the present embodiment, a notebook personal computer has been described, but the information processing apparatus 100 is merely one example. The user authentication function of the present embodiment can also be applied to other information processing apparatuses that perform user authentication, such as mobile communication terminal devices including mobile phones and PDAs (Personal Digital Assistants), desktop personal computers, automated teller machines (ATMs) that handle deposits and withdrawals at banks, and terminal devices of information processing systems.
  • FIG. 3 is a diagram illustrating a reading unit according to the second embodiment.
  • The reading unit 142 illustrated in FIG. 3 is an input device by which the user has the veins of the palm read to input biometric information.
  • The reading unit 142 includes a vein sensor that acquires biological information of the palm veins by reading the veins of the user's palm.
  • Palm veins have high identification capability because they carry a larger amount of information than other veins, and because the veins themselves are thick, stable authentication that is not easily affected by temperature is possible.
  • In addition, since veins are in-vivo information, they are difficult to counterfeit, and since they are not affected by body surface conditions such as rough, dry, or wet hands, the applicability rate is high.
  • The reading unit 142 may also read the palm print of the palm.
  • The reading unit 142 may also read finger veins or fingerprints.
  • FIG. 4 is a diagram illustrating a hardware configuration of the information processing apparatus according to the second embodiment.
  • The information processing apparatus 100 shown in FIG. 4 is a notebook personal computer as described above, and the entire apparatus is controlled by a CPU (Central Processing Unit) 101.
  • A RAM (Random Access Memory) 102, a hard disk drive (HDD) 103, a graphic processing device 104, an input interface 105, and a communication interface 106 are connected to the CPU 101 via a bus 107.
  • The RAM 102 temporarily stores at least part of the OS (Operating System) program and the application programs to be executed by the CPU 101.
  • The RAM 102 also stores various data necessary for processing by the CPU 101.
  • The HDD 103 stores the OS and application programs.
  • A display device such as the LCD 121 is connected to the graphic processing device 104.
  • The graphic processing device 104 displays images on the display screen of the display device such as the LCD 121 in accordance with commands from the CPU 101. The graphic processing device 104 and the LCD 121 are connected by, for example, a serial communication cable, and control signals and image signals are transmitted and received over it.
  • The input interface 105 is connected to input devices such as the keyboard 131 and a mouse 132.
  • The input interface 105 outputs signals sent from the input devices such as the keyboard 131 to the CPU 101 via the bus 107.
  • An authentication unit 141 is also connected to the input interface 105.
  • The communication interface 106 can be connected to a communication line such as a LAN (Local Area Network).
  • The communication interface 106 sends and receives data to and from other computers via the communication line.
  • The authentication unit 141 receives input of biometric information acquired from the veins of the user's palm and generates biometric feature information indicating the features of that biometric information. It also performs authentication based on the received biometric information. When the authentication unit 141 succeeds in the authentication, the information processing apparatus 100 executes predetermined processing, such as enabling the information processing apparatus 100 to be activated.
  • The authentication unit 141 includes an authentication control unit 141a and a biometric feature information storage unit 141b.
  • The authentication control unit 141a controls authentication using the biometric feature information of palm veins.
  • The biometric feature information storage unit 141b stores the biometric feature information used for authentication performed by the authentication unit 141.
  • The biometric feature information storage unit 141b includes an HDD.
  • The biometric feature information storage unit 141b may instead include an EEPROM (Electrically Erasable and Programmable Read Only Memory).
  • The authentication unit 141 can store biometric feature information and identification/authentication information used for authentication in the HDD of the biometric feature information storage unit 141b, and performs authentication based on the biometric feature information stored in the HDD and the biometric feature information acquired by the reading unit 142.
  • The reading unit 142 is an input device that inputs biometric information by reading the veins of the palm that the user holds over it.
  • The reading unit 142 includes a living body detection unit 142a, an imaging unit 142b that acquires a vein image of the palm, and a light source unit 142c that emits near-infrared light when the veins are imaged.
  • The living body detection unit 142a detects the bases of the fingers of the palm and also detects the height of the palm from the upper surface of the reading unit 142.
  • The bases of the fingers of the palm refer to a region including the valleys between adjacent fingers.
  • The living body detection unit 142a includes, for example, an image sensor as a mechanism for detecting the bases of the fingers of the palm. The detection result of the image sensor is used to determine the direction of the hand with respect to the reading unit 142. Note that this image sensor may also serve as the imaging unit 142b.
  • The living body detection unit 142a also includes, for example, a distance sensor as a mechanism for detecting the height of the palm.
  • The imaging unit 142b is a vein sensor that images the veins of a living body.
  • The light source unit 142c is a light source that irradiates near-infrared light.
  • When the palm is detected by the living body detection unit 142a, the palm is irradiated with near-infrared light from the light source unit 142c, and the palm is imaged by the imaging unit 142b.
  • Reduced hemoglobin in the veins of the subcutaneous tissue of the palm absorbs near-infrared light, so the veins appear black in the image, and a mesh-like biological image is acquired. A toy illustration of this contrast follows.
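  • As an illustrative aside, since vein pixels return darker than the surrounding tissue under near-infrared light, even a simple intensity threshold exposes the mesh pattern. The Python sketch below is only a toy model; the threshold value and the synthetic image are assumptions, not values from this embodiment.

```python
import numpy as np

def vein_mask(nir_image: np.ndarray, threshold: int = 80) -> np.ndarray:
    """Mark pixels dark enough to be vein candidates in a near-infrared image.

    Reduced hemoglobin absorbs near-infrared light, so veins image darker
    than the surrounding tissue; thresholding keeps the mesh-like pattern."""
    return nir_image < threshold

# Synthetic 4x4 "image" whose dark pixels trace a vein path.
img = np.array([[200,  60, 210, 220],
                [205,  65,  70, 215],
                [210, 208,  75,  64],
                [220, 215, 212,  66]], dtype=np.uint8)
print(vein_mask(img).astype(int))
```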
  • When causing the reading unit 142 to read the palm veins, the user turns the palm toward the reading unit 142. Thereby, the reading unit 142 can read the veins of the user's palm.
  • The reading unit 142 may also be connected externally to the information processing apparatus 100.
  • The living body detection unit 142a may also have the function of determining the direction of the hand with respect to the reading unit 142.
  • FIG. 5 is a block diagram illustrating an information processing apparatus according to the second embodiment.
  • The information processing apparatus 100 according to the second embodiment includes an information acquisition unit 111, a type determination unit 112, an information generation unit 113, a collation unit 114, and the biometric feature information storage unit 141b.
  • The information acquisition unit 111 is connected to the reading unit 142.
  • The information acquisition unit 111 acquires a biometric image of a person to be authenticated, such as a user of the information processing apparatus 100.
  • The information acquisition unit 111 can also acquire the direction of the living body in the state in which the biometric image is acquired.
  • The biometric image acquired by the information acquisition unit 111 is image information of a palm vein pattern.
  • The direction of the living body is one of two mutually orthogonal directions corresponding to the left and right hands.
  • The reading unit 142 is fixed to the upper part of the information processing apparatus 100.
  • The information acquisition unit 111 determines, based on the detection result of the living body detection unit 142a of the reading unit 142, that the living body is placed at a predetermined distance from the reading unit 142, as well as the direction of the living body with respect to the reading unit 142.
  • The information acquisition unit 111 determines the direction of the living body by determining the position of the direction feature portion of the living body in the image obtained by the living body detection unit 142a of the reading unit 142.
  • The two directions are orthogonal to each other, and each forms an oblique angle with the keyboard 131.
  • The direction feature portion is a valley portion at the base of the fingers of the palm. The direction feature portion will be described later in detail. Note that the directions of the left and right hands may be opposite to each other.
  • The information acquisition unit 111 acquires an image (biometric image) including the living body imaged by the imaging unit 142b of the reading unit 142.
  • The type determination unit 112 determines the type of the biometric information based on the direction of the living body determined by the information acquisition unit 111. The type indicates whether the hand on which the biometric information is based is the left or the right.
  • The information generation unit 113 generates biometric information indicating the features of the living body based on the biometric image acquired by the information acquisition unit 111.
  • The information generation unit 113 generates collation biometric feature information including the generated biometric information and the type of biometric information determined by the type determination unit 112.
  • The collation biometric feature information is basically data having the same configuration as the biometric feature information stored in the biometric feature information storage unit 141b.
  • It indicates the features of the living body of the user to be authenticated (palm veins in the second embodiment). Thereby, the biometric information and type of the user to be collated for authentication are indicated.
  • For registration, the information generation unit 113 generates biometric feature information including biometric information based on the biometric image acquired by the information acquisition unit 111, the type of biometric information determined by the type determination unit 112, and identification information identifying the individual corresponding to the biometric information, and stores it in the biometric feature information storage unit 141b.
  • In the biometric feature information storage unit 141b, biometric feature information that includes the biometric information and type of users having legitimate authority is registered in advance for use in authentication.
  • The type differs between the left and right hands.
  • The information generation unit 113 stores the generated biometric feature information in the biometric feature information storage unit 141b.
  • The collation unit 114 performs authentication using the collation biometric feature information generated by the information generation unit 113.
  • The collation unit 114 extracts, from the biometric feature information storage unit 141b, biometric feature information whose type matches that of the collation biometric feature information, and performs collation based on the biometric information of the collation biometric feature information and of the extracted biometric feature information.
  • The information processing apparatus 100 performs biometric authentication of the user based on the collation result. Since the collation targets are limited to those of the same type, an increase in the time and load required for the collation process can be suppressed.
  • The biometric feature information storage unit 141b stores biometric feature information indicating the biometric information and the type of that biometric information. Thereby, a user's biometric information and its type are stored in association with each other. A structural sketch of these units follows.
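  • To make the division of labor concrete, the following Python sketch mirrors the units of FIG. 5 under stated assumptions: the reading unit is abstracted behind two callables, feature extraction is a placeholder, and all names are illustrative rather than taken from an actual implementation.

```python
class InformationAcquisition:
    """Wraps the reading unit 142: direction from the detection unit 142a,
    biometric image from the imaging unit 142b (both passed in as callables)."""
    def __init__(self, read_direction, read_image):
        self.read_direction = read_direction
        self.read_image = read_image

class TypeDetermination:
    """Maps the detected direction (angle) to a hand type."""
    def type_of(self, angle: int) -> str:
        return "right" if angle == 0 else "left"

class InformationGeneration:
    """Builds (collation) biometric feature information from an image."""
    def features(self, image) -> int:
        return hash(image)  # placeholder for real vein-feature extraction
    def collation_info(self, feats, side: str) -> dict:
        return {"side": side, "features": feats}

class Collation:
    """Compares collation info against stored records of the same type."""
    def __init__(self, storage: list):
        self.storage = storage  # stands in for storage unit 141b
    def verify(self, probe: dict) -> bool:
        same_type = (r for r in self.storage if r["side"] == probe["side"])
        return any(r["features"] == probe["features"] for r in same_type)
```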
  • FIG. 6 is a diagram illustrating a biometric feature table according to the second embodiment.
  • The biometric feature table 141b1 illustrated in FIG. 6 is set in the biometric feature information storage unit 141b included in the information processing apparatus 100 according to the second embodiment.
  • The biometric feature table 141b1 manages the biometric feature information used for biometric authentication in the information processing apparatus 100.
  • The biometric feature table 141b1 has “number”, “ID”, “left/right”, and “feature data” as items. In the biometric feature table 141b1, the values set in these items are associated with each other as one piece of biometric feature information.
  • The number is a code that uniquely identifies the biometric feature information. Numbers are set one-to-one with respect to pieces of biometric feature information; different numbers are set for different biometric feature information of the same user.
  • The ID is a code that uniquely identifies the user of the biometric feature information. The same ID is set in the biometric feature information of the same user, and different IDs are set in the biometric feature information of different users.
  • Left/right indicates the type of the palm vein indicated by the biometric feature information. “Right” is set for biometric feature information of the veins of the right palm; “left” is set for that of the left palm.
  • The feature data indicates the file name of the data holding the biometric information.
  • The biometric feature table 141b1 illustrated in FIG. 6 is an example, and other items may be set in the biometric feature table. A sketch of this record layout follows.
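  • The items of the table translate directly into a small record type. Below is a Python sketch of the layout of FIG. 6; the example rows and file names are hypothetical, since the actual table contents are not given.

```python
from dataclasses import dataclass

@dataclass
class BiometricFeature:
    """One row of the biometric feature table 141b1 (items of FIG. 6)."""
    number: int        # uniquely identifies this record
    user_id: str       # identifies the user; one user may have several records
    side: str          # "right" or "left": which palm the vein data came from
    feature_data: str  # file name of the data holding the biometric information

# Hypothetical rows for illustration only.
table = [
    BiometricFeature(1, "user001", "right", "user001_r.dat"),
    BiometricFeature(2, "user001", "left",  "user001_l.dat"),
    BiometricFeature(3, "user002", "right", "user002_r.dat"),
]
```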
  • FIGS. 7 and 8 are diagrams illustrating a state when reading the vein of the palm of the right hand according to the second embodiment.
  • FIG. 7 is a diagram of the state when the information processing apparatus 100 reads the vein of the palm of the right hand as viewed from above.
  • FIG. 8 is a view of the state when the information processing apparatus 100 reads the vein of the palm of the right hand as viewed from the front.
  • The information processing apparatus 100 includes a display unit 120 and a main body unit 130.
  • A keyboard 131 and a reading unit 142 are provided on the upper surface of the main body 130 of the information processing apparatus 100.
  • The reading unit 142 is on the same top surface of the main body 130 as the keyboard 131, at the front center of the keyboard 131, and each side of the square vein sensor is arranged at an angle of 45° with the front surface and the side surfaces of the information processing apparatus 100. In addition, the user's head 201, torso 202, right upper arm 203, right lower arm 204, and right palm 205 are shown.
  • To have the veins read, the user positions the palm (for example, the right palm 205) at an angle of 45° to the side surface of the information processing apparatus 100 and parallel to the upper surface of the main body 130. At this time, with the fingers of the right hand open, the user positions the palm in the space separated from the vein sensor surface by a certain distance (for example, several centimeters) so that the center of the palm coincides with the center of the reading unit 142.
  • At the time of reading, the user does not need to bend the wrist between the right palm 205 and the right lower arm 204, which can be kept almost straight.
  • In this state, each finger of the user's right hand is straight and sufficiently open, and the four valleys between the fingers are sufficiently spaced. Therefore, the right palm 205 is not twisted in the horizontal plane, and a correct image can be obtained quickly and reliably. Accordingly, correct features can be detected quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.
  • In addition, the left and right hands can be determined quickly and reliably. Further, no unreasonable posture is imposed on the user's right wrist, the right lower arm 204 and right upper arm 203 beyond the wrist, the elbow between the right lower arm 204 and the right upper arm 203, or the right shoulder between the right upper arm 203 and the torso 202, so the user's burden can be reduced.
  • Although FIG. 7 illustrates the case where the veins of the user's right palm are read, the same applies when reading the veins of the left palm, so the description thereof is omitted.
  • Further, since each finger of the user's right hand, together with the right palm 205, is positioned in the space separated from the vein sensor surface by a certain distance, it is possible to prevent an erroneous operation caused by the palm touching the apparatus.
  • FIG. 9 is a diagram illustrating detection of a directional feature portion of the hand according to the second embodiment.
  • FIG. 9 shows hand direction and biological direction characteristic portions in the information processing apparatus 100 according to the second embodiment.
  • FIG. 9A shows an acquired image 1421 of the palm of the right hand.
  • FIG. 9B shows an acquired image 1422 of the palm of the left hand.
  • The acquired images 1421 and 1422 are images acquired by the living body detection unit 142a of the reading unit 142.
  • The acquired images 1421 and 1422 are captured by, for example, the image sensor included in the living body detection unit 142a.
  • In FIG. 9, viewed from the front of the information processing apparatus 100, the upper side corresponds to the back side (the rear side of the apparatus), the lower side to the front side, the right side to the right side surface, and the left side to the left side surface.
  • In the image acquired by the living body detection unit 142a, a right-hand detection rectangular image area 1420a for detecting the valley portions at the finger bases of the right palm is set along the upper-left side, and a left-hand detection rectangular image area 1420b for detecting the valley portions at the finger bases of the left palm is set along the upper-right side.
  • The right-hand detection rectangular image area 1420a and the left-hand detection rectangular image area 1420b are provided along two mutually orthogonal sides of the acquired image.
  • The valley portions at the bases of the fingers function as the direction feature portions.
  • When the reading unit 142 detects a user's palm positioned above it with the distance sensor (not shown) of the living body detection unit 142a, it acquires a palm image with the image sensor (not shown) of the living body detection unit 142a and supplies the image to the information acquisition unit 111.
  • The information acquisition unit 111 determines whether a valley portion at the base of the fingers is detected in the right-hand detection rectangular image area 1420a or the left-hand detection rectangular image area 1420b set in the acquired image.
  • Note that the palm image acquisition by the image sensor at this stage is performed without irradiation with near-infrared light from the light source unit 142c.
  • Accordingly, the image acquired here is not an image of the veins of the palm but an image of the appearance of the palm.
  • This detection process is performed on the positions where the right-hand detection rectangular image area 1420a and the left-hand detection rectangular image area 1420b are set in the image of the appearance of the palm.
  • The information acquisition unit 111 determines whether a direction feature portion, that is, a valley portion at the base of the fingers, exists in the right-hand detection rectangular image area 1420a or the left-hand detection rectangular image area 1420b. For example, in the acquired image 1421 shown in FIG. 9A, the direction feature portion 1421a1 exists in the right-hand detection rectangular image area 1420a but not in the left-hand detection rectangular image area 1420b, so it is determined that the palm is positioned at an angle of 0°. Thereby, the information processing apparatus 100 determines that the acquired image is an image of the palm of the right hand.
  • In the acquired image 1422 shown in FIG. 9B, the direction feature portion 1422b1 exists in the left-hand detection rectangular image area 1420b but not in the right-hand detection rectangular image area 1420a, so it is determined that the palm is positioned at an angle of 90° clockwise. As a result, the information processing apparatus 100 determines that the acquired image is an image of the palm of the left hand.
  • In this way, the reading unit 142 acquires an image of the direction feature portion of the palm.
  • The information acquisition unit 111 determines the angle of the palm based on the presence or absence of the direction feature portion in the right-hand detection rectangular image area 1420a or the left-hand detection rectangular image area 1420b.
  • The type determination unit 112 then determines the type of the palm (left or right hand) placed over the reading unit 142 based on the determined palm angle.
  • The detection of the direction feature portion in the second embodiment, that is, the valley portions at the bases of the fingers (for example, the valley between the index finger and the middle finger, and the valley between the middle finger and the ring finger), may be based on detecting that all, or at least some combinations, of these valleys are open at predetermined intervals.
  • Alternatively, the detection of the valley portions at the bases of the fingers may be realized by identifying the positions of the finger bases and the contours between them in the palm image acquired by the living body detection unit 142a. A sketch of this left/right decision follows.
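  • Reduced to code, the left/right decision is: look for valley structure in each detection rectangle and infer the hand from whichever rectangle responds. The Python sketch below assumes a stand-in valley detector (has_valleys), since the exact criterion is left open; the rectangles correspond to areas 1420a and 1420b.

```python
import numpy as np

def detect_hand_side(image: np.ndarray, right_region: tuple, left_region: tuple):
    """Return "right", "left", or None from the direction-feature check.

    right_region / left_region are (row0, row1, col0, col1) rectangles
    standing in for areas 1420a and 1420b."""
    def has_valleys(region) -> bool:
        r0, r1, c0, c1 = region
        patch = image[r0:r1, c0:c1].astype(float)
        # Hypothetical criterion: enough intensity variation to be finger valleys.
        return patch.std() > 25.0

    right_hit, left_hit = has_valleys(right_region), has_valleys(left_region)
    if right_hit and not left_hit:
        return "right"  # palm at 0 degrees -> right hand
    if left_hit and not right_hit:
        return "left"   # palm at 90 degrees clockwise -> left hand
    return None         # direction could not be determined
```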
  • FIG. 10 is a flowchart illustrating biometric feature information acquisition registration processing according to the second embodiment.
  • The biometric feature information acquisition registration process determines the type of the palm to be registered, and generates and registers biometric feature information indicating the type of the palm and the features of its veins.
  • The biometric feature information acquisition registration process is executed, for example, when the user registers biometric feature information of palm veins. In the following, the process illustrated in FIG. 10 is described in order of step number.
  • Step S11 The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that a palm is placed at a predetermined height above the reading unit 142. If it determines that a palm is placed, the information acquisition unit 111 detects the valley portions at the bases of the fingers, which are the direction feature portions, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the hand to be registered based on the detected direction feature portions.
  • Step S12 The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S11.
  • Step S13 The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biometric image in which the veins of the palm are reflected.
  • Step S14 The information generation unit 113 extracts the features of the living body based on the biometric image acquired in step S13, and generates biometric information indicating the feature extraction result.
  • The biometric information may be, for example, information indicating feature points (for example, branch points of veins) in the veins reflected in the biometric image.
  • The biometric information may also be image information obtained by cutting out the region in which the veins are reflected from the biometric image.
  • The biometric information may also be the biometric image itself.
  • Step S15 The information generation unit 113 generates biometric feature information including the type determined in step S12, the biometric information generated in step S14, and the user's ID.
  • Step S16 The information generation unit 113 stores the biometric feature information generated in step S15 in the biometric feature information storage unit 141b. Accordingly, the biometric feature information is registered in the biometric feature information storage unit 141b. Thereafter, the process ends. The whole flow is condensed in the sketch following this list.
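  • For orientation, here is the registration flow of FIG. 10 condensed into one Python function. The three callables stand in for the detection, imaging, and extraction units, and the record layout follows the table sketched earlier; all of this is illustrative, not the patent's implementation.

```python
def register_palm(user_id, store, get_direction, capture_image, extract_features):
    """Steps S11-S16 of the biometric feature information acquisition registration."""
    angle = get_direction()                    # S11: direction from finger valleys
    side = "right" if angle == 0 else "left"   # S12: type (left/right) from angle
    image = capture_image()                    # S13: near-infrared vein image
    features = extract_features(image)         # S14: biometric information
    record = {"id": user_id, "side": side,     # S15: biometric feature information
              "features": features}
    store.append(record)                       # S16: register in storage unit 141b
    return record

# Usage with trivial stand-ins:
store = []
register_palm("user001", store,
              get_direction=lambda: 0,
              capture_image=lambda: b"raw-image",
              extract_features=lambda img: b"feature-vector")
```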
  • FIG. 11 is a flowchart illustrating the biometric feature information authentication process according to the second embodiment.
  • The biometric feature information authentication process determines the type of the palm to be authenticated, generates collation biometric feature information indicating the type of the palm and the features of its veins, and performs authentication by collating it with the biometric feature information registered in advance.
  • The collation biometric feature information is data having the same configuration as the biometric feature information, and indicates the features of the living body of the user (palm veins in the second embodiment).
  • The biometric feature information authentication process is executed, for example, when the user authenticates with palm veins. In the following, the process illustrated in FIG. 11 is described in order of step number.
  • Step S21 The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that a palm is placed at a predetermined height above the reading unit 142. If it determines that a palm is placed, the information acquisition unit 111 detects the valley portions at the bases of the fingers, which are the direction feature portions, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the hand to be authenticated based on the detected direction feature portions.
  • Step S22 The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S21.
  • Step S23 The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biometric image in which the veins of the palm are reflected.
  • Step S24 The information generation unit 113 extracts the features of the living body based on the biometric image acquired in step S23, and generates biometric information indicating the feature extraction result.
  • Step S25 The information generation unit 113 generates collation biometric feature information including the type determined in step S22 and the biometric information generated in step S24.
  • Step S26 The collation unit 114 performs the biometric feature information matching process (described later with FIG. 12) using the collation biometric feature information generated in step S25. Thereafter, the process ends. A condensed sketch of this flow follows.
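  • The authentication entry point mirrors registration up to the hand-off to the matching process. A minimal Python sketch under the same assumptions as before, with match standing in for the FIG. 12 routine:

```python
def authenticate(store, get_direction, capture_image, extract_features, match):
    """Steps S21-S26: build collation biometric feature information, then match."""
    angle = get_direction()                    # S21: direction of the presented hand
    side = "right" if angle == 0 else "left"   # S22: type determination
    image = capture_image()                    # S23: vein image acquisition
    features = extract_features(image)         # S24: biometric information
    probe = {"side": side,                     # S25: collation feature information
             "features": features}
    return match(probe, store)                 # S26: matching process (FIG. 12)
```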
  • In the above processing, the determination of the direction of the living body and the extraction of the features of the living body are each performed based on images from the respective sensors.
  • First, the sensor captures an image and stores it in the RAM 102.
  • The information acquisition unit 111 determines the direction of the living body based on the image stored in the RAM 102 (corresponding to steps S11 and S21), and the type determination unit 112 determines the type based on the direction determination result (corresponding to steps S12 and S22).
  • The sensor then captures an image again while near-infrared light is irradiated from the light source unit 142c, and stores the image in the RAM 102.
  • The information generation unit 113 extracts the features of the living body based on the image stored in the RAM 102, and generates biometric information (corresponding to steps S14 and S24).
  • When extracting the features of the living body, the information generation unit 113 may acquire a biometric image by extracting a predetermined region in which veins exist from the image stored in the RAM 102 (corresponding to steps S13 and S23), and extract the features based on that biometric image.
  • FIG. 12 is a flowchart illustrating the biometric feature information matching process according to the second embodiment.
  • The biometric feature information matching process collates the collation biometric feature information of the person to be authenticated with the biometric feature information registered in advance.
  • The biometric feature information matching process is executed, for example, when called from the biometric feature information authentication process. In the following, the process illustrated in FIG. 12 is described in order of step number.
  • Step S31 The collation unit 114 acquires the collation biometric feature information generated by the biometric feature information authentication process.
  • Step S32 The collation unit 114 refers to the biometric feature information storage unit 141b, and extracts biometric feature information whose type matches that of the collation biometric feature information acquired in step S31.
  • Alternatively, the collation unit 114 may extract from the biometric feature information storage unit 141b biometric feature information in which both the type and the user ID match those of the collation biometric feature information.
  • Step S33 The collation unit 114 selects one unselected item of the biometric feature information extracted in step S32, and collates the biometric information included in the selected biometric feature information with the biometric information included in the collation biometric feature information.
  • Step S34 The collation unit 114 determines whether the collation between the selected biometric feature information and the collation biometric feature information has succeeded as a result of step S33. If the collation succeeds (YES in step S34), the process proceeds to step S35. If it fails (NO in step S34), the process proceeds to step S36.
  • Step S35 The collation unit 114 executes a predetermined process for successful authentication. The predetermined process may be, for example, granting the user authority to log in to the information processing apparatus 100 or a predetermined Internet site, starting an application, permitting data access, and the like. Thereafter, the process returns.
  • Step S36 The collation unit 114 determines whether all the biometric feature information extracted in step S32 has been selected in step S33. If all items have been selected (YES in step S36), the process proceeds to step S37. If there is an unselected item (NO in step S36), the process returns to step S33.
  • Step S37 The collation unit 114 executes a predetermined process for failed authentication. The predetermined process may be, for example, denying the user login to the information processing apparatus 100 or a predetermined Internet site, denying activation of an application, denying authorization such as data access permission, or outputting a message indicating the denial. Thereafter, the process returns. The type-limited one-to-N loop is sketched below.
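  • The essential point of FIG. 12 is that the one-to-N loop visits only records of the matching type. A Python sketch under the same assumptions as the earlier flows; compare is a placeholder for the actual vein-pattern comparison, which the patent does not specify:

```python
def match(probe, store):
    """Steps S31-S37: collate probe info against same-type stored records."""
    def compare(a, b):
        return a == b  # placeholder; a real comparison scores vein features

    candidates = [r for r in store if r["side"] == probe["side"]]  # S31-S32
    for record in candidates:                                      # S33
        if compare(record["features"], probe["features"]):         # S34
            return True   # S35: predetermined process for success
    return False          # S37: predetermined process for failure
```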
  • FIG. 13 is a diagram illustrating a message window at the time of registration according to the second embodiment.
  • The message window 121a illustrated in FIG. 13 is an example of a window displayed on the display screen of the LCD 121 included in the information processing apparatus 100.
  • The message window 121a displays a message and an image notifying the user that biometric feature information has been successfully registered based on the palm that was read.
  • In the message window 121a, for example, a message “successful registration of the palm of the right hand” and a biometric image showing the veins of the right palm imaged at the time of registration are displayed.
  • The message window 121a has an OK button 121a1.
  • The OK button 121a1 is a button for closing the message window 121a.
  • The user can close the message window 121a by operating the OK button 121a1.
  • As described above, the information processing apparatus 100 can determine the type of the living body, such as right hand or left hand, according to the angle of the living body, such as the palm, at the time of acquiring biometric information about palm veins or the like. As a result, adverse effects caused by an increase in the number of objects to be collated can be suppressed.
  • Since the reading unit 142 can acquire biometric images in a plurality of directions, the degree of freedom of the user's posture when acquiring biometric information is increased, and the burden on the user's shoulder, arm, and wrist joints can be suppressed.
  • In the information processing apparatus 100, one-to-N collation is performed, in which one piece of collation biometric feature information is collated against the plurality of pieces of biometric feature information stored in the biometric feature information storage unit 141b.
  • In addition, since the palm can be positioned in either the left or right direction, the degree of freedom of the posture of the user's arm and wrist is increased compared with the case where the left and right palms are both read in a single direction, and an increase in the user's burden at the time of reading can be suppressed.
  • FIG. 14 is a block diagram illustrating an information processing apparatus according to a first modification of the second embodiment.
  • An information processing apparatus 100a according to a first modification of the second embodiment includes an information acquisition unit 111, a type determination unit 112, an information generation unit 113a, and a collation unit 114a.
  • The information acquisition unit 111 is connected to the reading unit 142.
  • The server 150 is connected to the information processing apparatus 100a via a network 151, which is a communication line such as a LAN or a WAN (Wide Area Network), and includes the biometric feature information storage unit 141b.
  • The information acquisition unit 111 performs the same processing as the information acquisition unit 111 of FIG. 5. That is, the information acquisition unit 111 acquires a biometric image of a person to be authenticated, such as a user of the information processing apparatus 100a.
  • The information acquisition unit 111 also has a function of detecting that a palm is placed at a predetermined height above the reading unit 142 and a function of determining the direction of the hand placed over the reading unit 142.
  • Like the type determination unit 112 in FIG. 5, the type determination unit 112 determines the type of the biometric information based on the direction of the living body determined by the information acquisition unit 111.
  • The type indicates whether the hand on which the biometric information is based is the left or the right.
  • For collation, the information generation unit 113a generates collation biometric feature information that includes biometric information based on the biometric image acquired by the information acquisition unit 111 and the type of biometric information determined by the type determination unit 112. Thereby, the biometric information and type of the user to be collated for authentication are indicated.
  • For registration, the information generation unit 113a generates biometric feature information including biometric information based on the biometric image acquired by the information acquisition unit 111, the type of biometric information determined by the type determination unit 112, and identification information identifying the individual corresponding to the biometric information, and stores it in the biometric feature information storage unit 141b of the server 150.
  • In the biometric feature information storage unit 141b, biometric feature information indicating the biometric information and type of users having legitimate authority is registered in advance for use in authentication.
  • The type differs between the left and right hands.
  • The information generation unit 113a stores the generated biometric feature information in the biometric feature information storage unit 141b.
  • The collation unit 114a performs collation using the collation biometric feature information generated by the information generation unit 113a.
  • The collation unit 114a extracts, from the biometric feature information stored in the biometric feature information storage unit 141b, biometric feature information whose type matches that of the collation biometric feature information, and performs collation based on the biometric information of the collation biometric feature information and of the extracted biometric feature information. Thereby, the information processing apparatus 100a performs biometric authentication of the user based on the collation result. Since the collation targets are limited to those of the same type, an increase in the time and load required for the collation process can be suppressed.
  • The biometric feature information storage unit 141b stores the biometric feature information, indicating biometric information and its type, generated by the information processing apparatus 100a. Thereby, a user's biometric information and its type are stored in association with each other.
  • the first modification of the second embodiment as described above also provides the same effects as those of the second embodiment.
  • Further, by storing biometric feature information centrally in the server 150, security and management efficiency can be further increased through centralized management.
  • Since each user can register and update biometric feature information from many information processing apparatuses via the network 151, the convenience of both the administrator and the users can be improved.
  • The processing time of one-to-N collation is proportional to the number N of registered pieces of biometric feature information. Since narrowing the collation targets by type roughly halves the candidates, it is possible to about double the number of registered pieces of biometric feature information while suppressing an increase in the time required for collation and authentication processing.
  • FIG. 15 is a flowchart showing the biometric feature information acquisition registration process of the second modification of the second embodiment.
  • The biometric feature information acquisition registration process determines the type of the palm to be registered, and generates and registers biometric feature information indicating the type of the palm and the features of its veins.
  • The biometric feature information acquisition registration process is executed, for example, when the user registers biometric feature information of palm veins. In the following, the process illustrated in FIG. 15 is described in order of step number.
  • Step S41 The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that a palm is placed at a predetermined height above the reading unit 142. If it determines that a palm is placed, the information acquisition unit 111 detects the valley portions at the bases of the fingers, which are the direction feature portions, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the hand to be registered based on the detected direction feature portions.
  • Step S42 The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S41.
  • Step S43 The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biometric image in which the veins of the palm are reflected.
  • Step S44 The information generation unit 113a extracts the features of the living body with reference to the type determined in step S42, based on the biometric image acquired in step S43, and generates biometric information.
  • In step S44, the information generation unit 113a extracts the features of the living body by different processing procedures according to the type determined in step S42. For example, the information generation unit 113a may select a template image corresponding to the type and extract the features of the living body from the biometric image using the selected template image. Alternatively, according to the type, the information generation unit 113a may change the region in which veins are detected in the biometric image, or change the direction in which the biometric image is scanned when detecting veins. These processes can increase the processing efficiency or the extraction accuracy when extracting the features of the living body.
  • Also in step S43, the information acquisition unit 111 may perform processing according to the type determined in step S42. For example, the information acquisition unit 111 may change the imaging region used when capturing the biometric image according to the type. A sketch of this type-aware extraction appears after step S46.
  • Step S45 The information generation unit 113a generates biometric feature information including the type determined in step S42, the biometric information generated in step S44, and the user's ID.
  • Step S46 The information generation unit 113a stores the biometric feature information generated in step S45 in the biometric feature information storage unit 141b. Accordingly, the biometric feature information is registered in the biometric feature information storage unit 141b. Thereafter, the process ends.
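  • One way to read the type-aware step S44 is as a dispatch on the hand type before extraction. The Python sketch below is deliberately assumption-laden: the per-type templates, region bounds, and the placeholder "feature" computation are invented for illustration, since the patent leaves these details open.

```python
import numpy as np

# Hypothetical per-type templates; real ones would encode expected vein layout.
TEMPLATES = {
    "right": np.ones((8, 8)),
    "left":  np.tril(np.ones((8, 8))),
}

def extract_features(image, side):
    """Type-aware extraction (step S44): pick region and template by hand type."""
    region = image[:64, :64] if side == "right" else image[-64:, -64:]
    patch = region[:8, :8].astype(float)
    # Placeholder "feature": elementwise product of patch and chosen template.
    return patch * TEMPLATES[side]

# Usage on a blank stand-in image:
feats = extract_features(np.zeros((128, 128)), "left")
```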
  • FIG. 16 is a flowchart showing a biometric feature information authentication process according to the second modification of the second embodiment.
  • The biometric feature information authentication process determines the type of the palm to be authenticated, generates collation biometric feature information indicating the type of the palm and the features of its veins, and performs authentication by collating it with the biometric feature information registered in advance.
  • The biometric feature information authentication process is executed, for example, when the user authenticates with palm veins. In the following, the process illustrated in FIG. 16 is described in order of step number.
  • Step S51 The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, that a palm is placed at a predetermined height above the reading unit 142. If it determines that a palm is placed, the information acquisition unit 111 detects the valley portions at the bases of the fingers, which are the direction feature portions, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the hand to be authenticated based on the detected direction feature portions.
  • Step S52 The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S51.
  • Step S53 The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biometric image in which the veins of the palm are reflected.
  • Step S54 The information generation unit 113a extracts the features of the living body with reference to the type determined in step S52, based on the biometric image acquired in step S53. In step S54, the information generation unit 113a extracts the features of the living body and generates biometric information by the same processing procedure as in step S44 of FIG. 15.
  • Also in step S53, the information acquisition unit 111 may perform processing according to the type determined in step S52. For example, the information acquisition unit 111 may change the imaging region used when capturing the biometric image according to the type.
  • Step S55 The information generation unit 113a generates collation biometric feature information including the type determined in step S52 and the biometric information generated in step S54.
  • Step S56 The collation unit 114a performs the biometric feature information matching process (described later with FIG. 17) using the collation biometric feature information generated in step S55. Thereafter, the process ends.
  • FIG. 17 is a flowchart showing biometric feature information matching processing according to the second modification of the second embodiment.
  • The biometric feature information matching process collates the collation biometric feature information of the person to be authenticated with the biometric feature information registered in advance.
  • The biometric feature information matching process is executed, for example, when called from the biometric feature information authentication process. In the following, the process illustrated in FIG. 17 is described in order of step number.
  • Step S61 The collation unit 114a acquires the collation biometric feature information generated by the biometric feature information authentication process.
  • Step S62 The collation unit 114a refers to the biometric feature information storage unit 141b, and extracts biometric feature information whose type matches that of the collation biometric feature information acquired in step S61.
  • Alternatively, the collation unit 114a may extract from the biometric feature information storage unit 141b biometric feature information in which both the type and the user ID match those of the collation biometric feature information.
  • Step S63 The collation unit 114a selects one unselected item of the biometric feature information extracted in step S62, and collates the selected biometric feature information with the collation biometric feature information, referring to the type determined in step S52 of the biometric feature information authentication process.
  • Since the collation unit 114a refers to the type of the living body and performs processing suited to that type, the efficiency of the processing can be improved. For example, the collation unit 114a may change the order or the calculation method of comparing the biometric information according to the type.
  • Step S64 The collation unit 114a determines whether or not the collation between the selected biometric feature information and the collation biometric feature information is successful as a result of the collation in Step S63. If the verification is successful (step S64: YES), the process proceeds to step S65. On the other hand, if the verification fails (NO in step S64), the process proceeds to step S66.
  • Step S65 The collation unit 114a performs a predetermined process when the authentication is successful. Thereafter, the process returns.
  • Step S66 The collation unit 114a determines whether all the biometric feature information extracted in step S62 has been selected in step S63. If all have been selected (YES in step S66), the process proceeds to step S67. On the other hand, if there is an unselected item (NO in step S66), the process proceeds to step S63.
  • Step S67 The collation unit 114a performs a predetermined process when authentication fails. Thereafter, the process returns.
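  • The type-prefiltered matching loop of steps S61 to S67 can be sketched as follows. The dict-based records and the toy similarity function are assumptions for illustration; the storage format of the biometric feature information storage unit 141b is not specified at this level of detail.

```python
"""Sketch of the FIG. 17 matching loop (steps S61-S67); record layout assumed."""


def similarity(a: str, b: str) -> float:
    # Toy stand-in for vein-pattern comparison.
    same = sum(x == y for x, y in zip(a, b))
    return same / max(len(a), len(b), 1)


def match(verification: dict, registered: list, threshold: float = 0.9):
    # S62: keep only records whose hand type matches the probe
    # (a user-ID filter can be added for one-to-one verification).
    candidates = [r for r in registered if r["type"] == verification["type"]]
    for record in candidates:  # S63: collate each candidate in turn
        if similarity(record["features"], verification["features"]) >= threshold:
            return record      # S64 YES -> S65: authentication succeeded
    return None                # every candidate failed -> S67


db = [{"type": "right", "user": "u1", "features": "abcd"},
      {"type": "left", "user": "u1", "features": "wxyz"}]
print(match({"type": "right", "features": "abcd"}, db))
```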
  • The second modification of the second embodiment described above also provides the same effects as the second embodiment.
  • Next, a third modification of the second embodiment will be described. The third modification differs from the second embodiment in that the extraction of the palm veins and the matching of the features are performed after correcting the angle of the living body.
  • In the third modification, when the direction of the user's right hand at the time of use is taken as a reference, the direction of the left hand is rotated by a predetermined angle (for example, 90° clockwise) with respect to the placement direction of the right hand.
  • Features may fail to be extracted from a biological image acquired in the left-hand direction, which is rotated by the predetermined angle with respect to the placement direction of the right hand, because its direction differs from the reference.
  • Therefore, in the third modification, the biometric image of the left hand is rotated by the predetermined angle in the opposite direction (for example, 90° counterclockwise) to align the direction of the biometric images before performing feature extraction and feature matching.
  • the predetermined angle can be set arbitrarily.
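  • This alignment can be expressed compactly in code. The sketch below assumes the biological image is a two-dimensional numpy array and that the predetermined angle is exactly 90°; both are illustrative assumptions.

```python
import numpy as np


def align_image(image: np.ndarray, hand_type: str) -> np.ndarray:
    # The right hand is the reference (0 degrees). A left hand placed
    # rotated 90 degrees clockwise is corrected by rotating the image
    # 90 degrees counterclockwise before feature extraction.
    if hand_type == "left":
        return np.rot90(image, k=1)  # k=1 rotates 90 degrees counterclockwise
    return image


img = np.arange(6).reshape(2, 3)       # stand-in for a captured vein image
print(align_image(img, "left").shape)  # (3, 2): the directions now align
```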
  • FIG. 18 is a flowchart showing biometric feature information acquisition registration processing according to the third modification of the second embodiment.
  • the biometric feature information acquisition / registration processing is processing for determining the type of palm to be registered, and generating and registering biometric feature information indicating the type of palm and the characteristics of veins.
  • the biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a palm vein. In the following, the process illustrated in FIG. 18 will be described in order of step number.
  • Step S71 The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, whether the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects the valley portions at the bases of the fingers, which serve as direction feature portions, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the registration target hand based on the detected direction feature portions.
  • Step S72 The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S71.
  • Step S73 The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biological image in which the veins of the palm are reflected.
  • Step S74 Based on the biological image acquired in step S73, the information generation unit 113 corrects the angle of the biological image based on the direction determined in step S71, extracts the features of the biological image, and generates biometric information.
  • The angle of the biological image is corrected by rotating the image acquired in step S73 in the direction opposite to the angle indicating the direction determined in step S71, so that the veins of the palm are oriented in the same direction regardless of the angle of the living body with respect to the reading unit 142.
  • The present invention is not limited to this; the angle may instead be corrected on the processing side that extracts the features, without rotating the biological image.
  • Step S75 The information generation unit 113 generates biometric feature information including the type determined in step S72, the biometric information generated in step S74, and the user ID.
  • Step S76 The information generation unit 113 stores the biometric feature information generated in step S75 in the biometric feature information storage unit 141b. Accordingly, the biometric feature information is registered in the biometric feature information storage unit 141b. Thereafter, the process ends.
  • FIG. 19 is a flowchart showing a biometric feature information authentication process according to the third modification of the second embodiment.
  • The biometric feature information authentication process determines the type of the palm to be authenticated, generates verification biometric feature information indicating that type and the features of the veins, and performs authentication by matching it against biometric feature information registered in advance.
  • the biometric feature information authentication process is executed, for example, when the user authenticates with a palm vein. In the following, the process illustrated in FIG. 19 will be described in order of step number.
  • Step S81 The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, whether the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects the valley portions at the bases of the fingers, which serve as direction feature portions, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the hand to be authenticated based on the detected direction feature portions.
  • Step S82 The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S81.
  • Step S83 The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biological image in which the veins of the palm are reflected.
  • Step S84 Based on the biological image acquired in step S83, the information generation unit 113 corrects the angle of the biological image based on the direction determined in step S81, extracts the features of the biological image, and generates biometric information.
  • the present invention is not limited to this, and the angle may be corrected on the processing side for extracting features without rotating the biological image.
  • Step S85 The information generation unit 113 generates verification biometric feature information including the type determined in step S82 and the biometric information generated in step S84.
  • Step S86 The collation unit 114 executes the biometric feature information matching process (described later in FIG. 20) using the verification biometric feature information generated in step S85. Thereafter, the process ends.
  • FIG. 20 is a flowchart showing biometric feature information matching processing according to the third modification of the second embodiment.
  • the biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance.
  • the biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 20 will be described in order of step number.
  • Step S91 The collation unit 114 acquires the verification biometric feature information generated by the biometric feature information authentication process.
  • Step S92 The collation unit 114 refers to the biometric feature information storage unit 141b and extracts the biometric feature information that matches the type of the verification biometric feature information acquired in step S91.
  • The collation unit 114 may also extract, from the biometric feature information storage unit 141b, biometric feature information in which both the type of the verification biometric feature information and the user ID match.
  • Step S93 The collation unit 114 selects an unselected piece of the biometric feature information extracted in step S92, and collates the biometric information included in the selected biometric feature information with that included in the verification biometric feature information.
  • Step S94 The collation unit 114 determines whether or not the collation between the selected biometric feature information and the collation biometric feature information has succeeded as a result of the collation in step S93. If the verification is successful (YES in step S94), the process proceeds to step S95. On the other hand, if the verification fails (NO in step S94), the process proceeds to step S96.
  • Step S95 The collation unit 114 executes a predetermined process when authentication is successful. Thereafter, the process returns.
  • Step S96 The collation unit 114 determines whether all the biometric feature information extracted in step S92 has been selected in step S93. If all have been selected (YES in step S96), the process proceeds to step S97. On the other hand, if there is an unselected item (NO in step S96), the process proceeds to step S93.
  • Step S97 The collation unit 114 executes a predetermined process when authentication fails. Thereafter, the process returns.
  • the third modification of the second embodiment as described above also has the same effect as that of the second embodiment.
  • In the third modification, when the direction of the user's right hand at the time of use is taken as a reference, the direction of the left hand is rotated by a predetermined angle (for example, 90° clockwise) with respect to the placement direction of the right hand, and the type of the biological information is specified from the determined direction.
  • The predetermined angle can be set arbitrarily.
  • FIG. 21 is a diagram illustrating a biometric feature table of the fourth modification example of the second embodiment.
  • the biometric feature table 141b2 illustrated in FIG. 21 is set in the biometric feature information storage unit 141b included in the information processing apparatus 100.
  • the biometric feature table 141b2 is a table that manages biometric feature information used for biometric authentication of the information processing apparatus 100.
  • the biometric feature table 141b2 is provided with “number”, “ID”, “angle”, and “feature data” as items. In the biometric feature table 141b2, values set in the above items are associated with each other as biometric feature information.
  • The angle indicates the angle at which the palm vein indicated by the biometric feature information was detected.
  • For the biometric feature information of the vein of the right palm, “0”, indicating the reference angle of 0°, is set. For the biometric feature information of the vein of the left palm, “90” or “180”, indicating a predetermined angle rotated clockwise from the reference angle, is set.
  • A plurality of different angles can be set as the predetermined angle.
  • In the example of FIG. 21, the angle is set to “0”, indicating 0°, for the biometric feature information of the right hand, and to “90”, indicating 90°, or “180”, indicating 180°, for the biometric feature information of the left hand.
  • the biometric feature information whose angle is 0 is the biometric feature information of the right hand.
  • The biometric feature information whose angle is 90 is the biometric feature information of the left hand acquired in a state where the left hand is rotated 90° clockwise with respect to the right hand, that is, where the direction of the right hand and the direction of the left hand are orthogonal.
  • the biometric feature information whose angle is 180 is the biometric feature information of the left hand acquired in a state where the direction of the right hand and the direction of the left hand are opposite.
  • the fourth modification of the second embodiment is applicable even when there are a plurality of types of angles between the left hand direction and the right hand direction.
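  • As an illustration, the rows of the biometric feature table 141b2 can be modeled as simple records. The field names below mirror the items named in the text; the in-memory layout and the sample user IDs are assumptions.

```python
from dataclasses import dataclass


@dataclass
class BiometricFeature:
    number: int
    user_id: str
    angle: int          # 0 for the right hand; 90 or 180 for the left hand
    feature_data: bytes


table = [
    BiometricFeature(1, "a123", 0, b"..."),    # right hand, reference angle
    BiometricFeature(2, "a123", 90, b"..."),   # left hand rotated 90 deg clockwise
    BiometricFeature(3, "b456", 180, b"..."),  # left hand rotated 180 deg
]

# Selecting candidates whose angle matches an authentication attempt:
print([r.number for r in table if r.angle == 90])
```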
  • FIG. 22 is a flowchart illustrating biometric feature information acquisition and registration processing according to the fourth modification of the second embodiment.
  • the biometric feature information acquisition / registration process is a process for determining the direction of the hand to be registered, and generating and registering biometric feature information indicating the direction of the hand and the characteristics of the vein.
  • the biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a palm vein. In the following, the process illustrated in FIG. 22 will be described in order of step number.
  • Step S101 The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, whether the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects the valley portions at the bases of the fingers, which serve as direction feature portions, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the registration target hand based on the detected direction feature portions.
  • Step S102 The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biological image in which the veins of the palm are reflected.
  • Step S103 The information generation unit 113 extracts the features of the living body from the biological image acquired in step S102 and generates biometric information indicating the feature extraction result.
  • Step S104 The information generation unit 113 generates biometric feature information including the angle indicating the direction of the hand determined in step S101, the biometric information generated in step S103, and the user ID.
  • Step S105 The information generation unit 113 stores the biometric feature information generated in step S104 in the biometric feature information storage unit 141b. Accordingly, the biometric feature information is registered in the biometric feature information storage unit 141b. Thereafter, the process ends.
  • FIG. 23 is a flowchart showing biometric feature information authentication processing according to the fourth modification of the second embodiment.
  • The biometric feature information authentication process determines the direction of the hand to be authenticated, generates verification biometric feature information indicating that direction and the features of the veins, and performs authentication by matching it against biometric feature information registered in advance.
  • the biometric feature information authentication process is executed, for example, when the user authenticates with a palm vein. In the following, the process illustrated in FIG. 23 will be described in order of step number.
  • Step S111 The information acquisition unit 111 determines that the palm is placed at a predetermined height position on the reading unit 142 based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142. If it is determined that the palm is placed, the information acquisition unit 111 detects a valley portion at the base of the finger, which is a direction feature portion, from an image captured by the image sensor included in the living body detection unit 142a, and based on the detected direction feature portion. Determine the direction of the hand to be authenticated.
  • Step S112 The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biological image in which the veins of the palm are reflected.
  • Step S113 The information generation unit 113 extracts the features of the living body from the biological image acquired in step S112 and generates biometric information indicating the feature extraction result.
  • Step S114 The information generation unit 113 generates collation biometric feature information including the angle indicating the direction of the hand determined in step S111 and the biometric information generated in step S113.
  • Step S115 The collation unit 114 executes biometric feature information collation processing (described later in FIG. 24) using the collation biometric feature information generated in step S114. Thereafter, the process ends.
  • FIG. 24 is a flowchart showing biometric feature information matching processing according to the fourth modification of the second embodiment.
  • the biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance.
  • the biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 24 will be described in order of step number.
  • Step S121 The collation unit 114 acquires the verification biometric feature information generated by the biometric feature information authentication process.
  • Step S122 The collation unit 114 refers to the biometric feature information storage unit 141b and extracts the biometric feature information that matches the angle of the verification biometric feature information acquired in step S121.
  • The collation unit 114 may also extract, from the biometric feature information storage unit 141b, biometric feature information in which both the angle of the verification biometric feature information and the user ID match.
  • Step S123 The collation unit 114 selects one of the biometric feature information extracted in step S122 and selects biometric information included in each of the selected biometric feature information and collation biometric feature information.
  • Step S124 The collation unit 114 determines whether or not the collation between the selected biometric feature information and the collation biometric feature information has succeeded as a result of the collation in step S123. If the verification is successful (step S124: YES), the process proceeds to step S125. On the other hand, if the verification fails (NO in step S124), the process proceeds to step S126.
  • Step S125 The collation unit 114 performs a predetermined process when the authentication is successful. Thereafter, the process returns.
  • Step S126 The collation unit 114 determines whether all the biometric feature information extracted in step S122 has been selected in step S123. If all have been selected (YES in step S126), the process proceeds to step S127. On the other hand, if there is an unselected item (NO in step S126), the process proceeds to step S123.
  • Step S127 The collation unit 114 executes predetermined processing when authentication fails. Thereafter, the process returns.
  • the fourth modification of the second embodiment as described above also has the same effect as that of the second embodiment.
  • In the fourth modification, the direction of the living body with respect to the reading unit 142 is determined, and the features of the living body are extracted and collated at an angle based on the determined direction, so that the extraction and collation of biometric feature information can be performed regardless of the direction of the hand.
  • When the direction of the user's right hand at the time of use is taken as a reference, the direction of the left hand is rotated by a predetermined angle (for example, 90° clockwise) with respect to the placement direction of the right hand.
  • By setting an angle in the biometric feature information (0° for the right hand and the predetermined angle for the left hand) in addition to the information indicating the type of the living body (for example, the left or right palm vein), it becomes possible, when new biometric feature information is acquired, to update the biometric feature information of the same type and angle.
  • The predetermined angle can be set arbitrarily.
  • FIG. 25 is a diagram illustrating a biometric feature table of the fifth modification example of the second embodiment.
  • the biometric feature table 141b3 illustrated in FIG. 25 is set in the biometric feature information storage unit 141b included in the information processing apparatus 100.
  • the biometric feature table 141b3 is a table that manages biometric feature information used for biometric authentication of the information processing apparatus 100.
  • The biometric feature table 141b3 is provided with “number”, “ID”, “left/right”, “angle”, and “feature data” as items. In the biometric feature table 141b3, the values set in these items are associated with each other as biometric feature information.
  • the fifth modification example of the second embodiment is also applicable when there are a plurality of types of hand angles for each of the left hand and the right hand.
  • the biometric feature table 141b3 illustrated in FIG. 25 is an example, and any item can be set in the biometric feature table.
  • FIG. 26 is a flowchart showing a biometric feature information acquisition / registration process according to a fifth modification of the second embodiment.
  • the biometric feature information acquisition / registration process is a process for determining the direction and type of a hand to be registered, and generating and registering biometric feature information indicating the direction, type, and vein characteristics of the hand.
  • The biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a palm vein. In the following, the process illustrated in FIG. 26 will be described in order of step number.
  • Step S131 The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, whether the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects the valley portions at the bases of the fingers, which serve as direction feature portions, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the registration target hand based on the detected direction feature portions.
  • Step S132 The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S131.
  • Step S133 The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biological image in which the veins of the palm are reflected.
  • Step S134 The information generation unit 113 extracts the features of the living body from the biological image acquired in step S133 and generates biometric information indicating the feature extraction result.
  • Step S135 The information generation unit 113 generates biometric feature information including the type determined in step S132, the biometric information generated in step S134, and the user ID.
  • Step S136 The information generation unit 113 determines whether there exists biometric feature information that belongs to the same user as the biometric feature information generated in step S135 and that has the same type as determined in step S132 and the same angle, indicating the direction of the hand, as determined in step S131. If biometric feature information of the same user with the same type and angle exists (YES in step S136), the process proceeds to step S137. On the other hand, if no such biometric feature information exists (NO in step S136), the process proceeds to step S138.
  • Whether the users of the biometric feature information are the same may be determined, for example, by accepting input of personal information that can identify the user, such as a user ID or a name, when performing the biometric feature information acquisition process, and comparing it with the registered IDs.
  • Step S137 The information generation unit 113 stores the biometric feature information generated in step S135 in the biometric feature information storage unit 141b. Thereby, the biometric feature information stored in the biometric feature information storage unit 141b is updated. Thereafter, the process ends.
  • Step S138 The information generation unit 113 newly stores the biometric feature information generated in step S135 in the biometric feature information storage unit 141b. Thereby, biometric feature information is newly registered in the biometric feature information storage unit 141b. Thereafter, the process ends.
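  • Steps S136 to S138 amount to an update-or-insert decision keyed on the user, type, and angle. The sketch below assumes list-of-dict storage; the actual layout of the biometric feature information storage unit 141b is not specified at this level of detail.

```python
"""Sketch of the update-or-insert decision in steps S136-S138 (storage assumed)."""


def register(store: list, new: dict) -> str:
    key = (new["user"], new["type"], new["angle"])
    for i, rec in enumerate(store):
        if (rec["user"], rec["type"], rec["angle"]) == key:
            store[i] = new  # S137: same user, type, and angle -> update in place
            return "updated"
    store.append(new)       # S138: otherwise register as new information
    return "inserted"


db = [{"user": "u1", "type": "left", "angle": 90, "features": b"old"}]
print(register(db, {"user": "u1", "type": "left", "angle": 90, "features": b"new"}))
print(register(db, {"user": "u1", "type": "right", "angle": 0, "features": b"r"}))
```

  • Keying the store on the user, type, and angle is what allows re-registration to overwrite stale data instead of accumulating duplicate entries, which matches the behavior described for steps S136 to S138.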
  • FIG. 27 is a flowchart showing biometric feature information authentication processing according to the fifth modification of the second embodiment.
  • The biometric feature information authentication process determines the direction and type of the hand to be authenticated, generates verification biometric feature information indicating that direction and type and the features of the veins, and performs authentication by matching it against biometric feature information registered in advance.
  • the biometric feature information authentication process is executed, for example, when the user authenticates with a palm vein. In the following, the process illustrated in FIG. 27 will be described in order of step number.
  • Step S141 The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, whether the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects the valley portions at the bases of the fingers, which serve as direction feature portions, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the hand to be authenticated based on the detected direction feature portions.
  • Step S142 The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S141.
  • Step S143 The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biological image in which the veins of the palm are reflected.
  • Step S144 The information generation unit 113 extracts features of biological information based on the biological image acquired in step S143.
  • the information generation unit 113 generates biometric information indicating the feature extraction result.
  • Step S145 The information generation unit 113 generates verification biometric feature information based on the angle (hand direction) determined in step S141, the type determined in step S142, and the biometric information extracted in step S144.
  • Step S146 The matching unit 114 executes biometric feature information matching processing (described later in FIG. 28) using the matching biometric feature information generated in step S145. Thereafter, the process ends.
  • FIG. 28 is a flowchart illustrating biometric feature information matching processing according to the fifth modification of the second embodiment.
  • the biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance.
  • the biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 28 will be described in order of step number.
  • Step S151 The collation unit 114 acquires the verification biometric feature information generated by the biometric feature information authentication process.
  • Step S152 The collation unit 114 refers to the biometric feature information storage unit 141b and extracts the biometric feature information that matches the angle (hand direction) and type of the verification biometric feature information acquired in step S151.
  • The collation unit 114 may also extract, from the biometric feature information storage unit 141b, biometric feature information in which the angle, type, and user ID of the verification biometric feature information all match.
  • Step S153 The collation unit 114 selects one piece of the biometric feature information extracted in step S152, and collates the biometric information included in the selected biometric feature information with that included in the verification biometric feature information. At this time, when the angle of the selected biometric feature information matches the angle of the verification biometric feature information, the collation unit 114 performs the collation without correcting the angle of the living body. On the other hand, when the angles do not match, the collation is performed after correcting the angle of the living body so that the respective angles match.
  • Step S154 The collation unit 114 determines whether or not the collation between the selected biometric feature information and the verification biometric feature information has succeeded as a result of the collation in step S153. If the verification is successful (YES in step S154), the process proceeds to step S155. On the other hand, if the verification fails (NO in step S154), the process proceeds to step S156.
  • Step S155 The collation unit 114 executes a predetermined process when authentication is successful. Thereafter, the process returns.
  • Step S156 The collation unit 114 determines whether all the biometric feature information extracted in step S152 has been selected in step S153. If all have been selected (YES in step S156), the process proceeds to step S157. On the other hand, if there is an unselected item (NO in step S156), the process proceeds to step S153.
  • Step S157 The collation unit 114 performs a predetermined process when authentication fails. Thereafter, the process returns.
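  • The conditional correction in step S153 can be sketched as follows, modeling the feature data as two-dimensional coordinates purely for illustration; the actual feature representation and the rotation sign convention are assumptions.

```python
import math


def rotate_points(points, degrees):
    # Rotate 2-D feature coordinates about the origin.
    rad = math.radians(degrees)
    c, s = math.cos(rad), math.sin(rad)
    return [(round(x * c - y * s, 6), round(x * s + y * c, 6)) for x, y in points]


def collate(stored: dict, probe: dict) -> bool:
    features = stored["features"]
    if stored["angle"] != probe["angle"]:
        # Angles differ: correct the angle of the living body so that
        # both sets of features are based on the same direction.
        features = rotate_points(features, probe["angle"] - stored["angle"])
    return features == probe["features"]  # toy equality in place of scoring


stored = {"angle": 0, "features": [(1.0, 0.0)]}
probe = {"angle": 90, "features": [(0.0, 1.0)]}  # same pattern, rotated 90 deg
print(collate(stored, probe))  # True after correction
```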
  • the fifth modification of the second embodiment as described above also has the same effect as that of the second embodiment.
  • In the fifth modification, when the direction of the user's right hand at the time of use is taken as a reference, the direction of the left hand is rotated by a predetermined angle (for example, 90° clockwise) with respect to the placement direction of the right hand.
  • Biometric information generated from a biological image acquired in the direction of the right hand and biometric information generated from a biological image acquired in the left-hand direction, which is rotated by the predetermined angle with respect to the right-hand placement direction, may fail to match because their directions differ.
  • Therefore, the biometric feature information of the left hand generated from the biometric image is corrected by rotating it by the predetermined angle in the opposite direction (for example, 90° counterclockwise); aligning the directions of the biometric feature information in this way makes collation possible.
  • the predetermined angle can be set arbitrarily.
  • FIG. 29 is a flowchart showing a biometric feature information acquisition / registration process according to a sixth modification of the second embodiment.
  • the biometric feature information acquisition / registration processing is processing for determining the type of palm to be registered, and generating and registering biometric feature information indicating the type of palm and the characteristics of veins.
  • the biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a palm vein. In the following, the process illustrated in FIG. 29 will be described in order of step number.
  • Step S161 The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, whether the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects the valley portions at the bases of the fingers, which serve as direction feature portions, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the registration target hand based on the detected direction feature portions.
  • Step S162 The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S161.
  • Step S163 The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biological image in which the veins of the palm are reflected.
  • Step S164 The information generation unit 113 extracts features of the living body based on the living body image acquired in Step S163.
  • the information generation unit 113 generates biometric information indicating the feature extraction result.
  • Step S165 The information generation unit 113 generates biometric feature information including the type determined in step S162, the biometric information generated in step S164, and the user ID. Moreover, the information generation unit 113 corrects the angle of the biometric information based on the direction determined in step S161.
  • The angle of the biometric information is corrected by converting the biometric information generated in step S164 into the information that would result from rotating it in the direction opposite to the angle determined in step S161, so that the biometric information included in the biometric feature information is based on the same direction regardless of the direction of the living body.
  • The information generation unit 113 may correct the angle of the biometric information before generating the biometric feature information.
  • Step S166 The information generation unit 113 stores the biometric feature information generated in step S165 in the biometric feature information storage unit 141b. Accordingly, the biometric feature information is registered in the biometric feature information storage unit 141b. Thereafter, the process ends.
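  • In the sixth modification, it is the generated biometric information, not the image, that is normalized to the reference direction before storage. The sketch below represents the features as an N x 2 coordinate array, which is an assumption for illustration.

```python
import numpy as np


def normalize_features(points: np.ndarray, determined_angle: float) -> np.ndarray:
    # Rotate opposite to the determined hand angle so that the stored
    # biometric information is always based on the same direction
    # (cf. steps S161 and S165).
    rad = np.deg2rad(-determined_angle)
    rotation = np.array([[np.cos(rad), -np.sin(rad)],
                         [np.sin(rad), np.cos(rad)]])
    return points @ rotation.T


left_hand_points = np.array([[0.0, 1.0]])                  # captured at 90 deg
print(normalize_features(left_hand_points, 90.0).round(6))  # -> [[1. 0.]]
```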
  • FIG. 30 is a flowchart showing a biometric feature information authentication process according to the sixth modification of the second embodiment.
  • The biometric feature information authentication process determines the type of the palm to be authenticated, generates verification biometric feature information indicating that type and the features of the veins, and performs authentication by matching it against biometric feature information registered in advance.
  • the biometric feature information authentication process is executed, for example, when the user authenticates with a palm vein. In the following, the process illustrated in FIG. 30 will be described in order of step number.
  • Step S171 The information acquisition unit 111 determines, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 142, whether the palm is placed at a predetermined height above the reading unit 142. If it determines that the palm is placed, the information acquisition unit 111 detects the valley portions at the bases of the fingers, which serve as direction feature portions, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the hand to be authenticated based on the detected direction feature portions.
  • Step S172 The type determination unit 112 determines the type (left or right) of the palm based on the direction of the hand determined in step S171.
  • Step S173 The information acquisition unit 111 causes the imaging unit 142b of the reading unit 142 to capture an image, and acquires a biological image in which the veins of the palm are reflected.
  • Step S174 The information generation unit 113 extracts features of the living body based on the living body image acquired in step S173.
  • the information generation unit 113 generates biometric information indicating the feature extraction result.
  • Step S175 The information generation unit 113 generates verification biometric feature information including the type determined in step S172 and the biometric information generated in step S174. Moreover, the information generation unit 113 corrects the angle of the biometric information based on the direction determined in step S171.
  • Step S176 The matching unit 114 executes biometric feature information matching processing (described later in FIG. 31) using the matching biometric feature information generated in step S175. Thereafter, the process ends.
  • FIG. 31 is a flowchart showing a biometric feature information matching process according to the sixth modification of the second embodiment.
  • the biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance.
  • the biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 31 will be described in order of step number.
  • Step S181 The collation unit 114 acquires the verification biometric feature information generated by the biometric feature information authentication process.
  • Step S182 The collation unit 114 refers to the biometric feature information storage unit 141b and extracts the biometric feature information that matches the type of the verification biometric feature information acquired in step S181.
  • The collation unit 114 may also extract, from the biometric feature information storage unit 141b, biometric feature information in which both the type of the verification biometric feature information and the user ID match.
  • Step S183 The collation unit 114 selects one of the biometric feature information extracted in step S182 and selects biometric information included in each of the selected biometric feature information and collation biometric feature information.
  • Step S184 The collation unit 114 determines whether or not the collation between the selected biometric feature information and the collation biometric feature information is successful as a result of the collation in step S183. If the verification is successful (step S184: YES), the process proceeds to step S185. On the other hand, if the verification fails (NO in step S184), the process proceeds to step S186.
  • Step S185 The collation unit 114 executes a predetermined process when authentication is successful. Thereafter, the process returns.
  • Step S186 The collation unit 114 determines whether all the biometric feature information extracted in step S182 has been selected in step S183. If all have been selected (YES in step S186), the process proceeds to step S187. On the other hand, if there is an unselected item (NO in step S186), the process proceeds to step S183.
  • Step S187 The collation unit 114 executes a predetermined process when authentication fails. Thereafter, the process returns.
  • the sixth modification of the second embodiment as described above also has the same effect as that of the second embodiment.
  • FIG. 32 is a diagram illustrating an appearance of an information processing apparatus according to a seventh modification example of the second embodiment.
  • An information processing apparatus 300 illustrated in FIG. 32 is a notebook personal computer to which a security function based on biometric authentication using a palm vein is added.
  • the information processing apparatus 300 includes a display unit 120 having an LCD 121, a keyboard 131, and a main body unit 330 having a reading unit 342.
  • Each of the display unit 120 and the main body unit 330 has a substantially rectangular parallelepiped housing having a front surface, a rear surface facing the front surface, and two side surfaces connecting them.
  • the display unit 120 and the main body 330 are connected to each other near the rear surface of the main body 330 so as to be opened and closed by a hinge (not shown).
  • the LCD 121 is a display device having a display screen for displaying characters or images.
  • other thin display devices such as an organic EL display may be used as the display device.
  • the keyboard 131 is an input device for inputting characters and performing other operations.
  • The reading unit 342 is an input device that inputs a biological image by reading the veins of a palm that the user holds over it.
  • the reading unit 342 includes a square vein sensor that acquires a biological image of the palm vein by reading the vein of the user's palm.
  • the vein sensor is arranged so that each side of the vein sensor is parallel to each side of the reading unit 342.
  • The reading unit 342 is on the same top surface of the main body 330 as the keyboard 131 of the information processing apparatus 300, and is arranged at the front center of the keyboard 131 so that each side of the square vein sensor is parallel to the front surface or a side surface of the information processing apparatus 300.
  • the vein sensor reads vein information by scanning the reading object.
  • the main scanning direction of the vein sensor is parallel to one side of the square vein sensor. Therefore, the main scanning direction of the vein sensor is parallel to one side of the reading unit 342.
  • the main scanning direction of the vein sensor is the D31 direction in FIG.
  • the vein sensor is arranged so that the angle formed by the main scanning direction D31 and the direction D32 in which the front surface 330a of the main body 330 extends is 90 °.
  • The vein sensor may be arranged, for example, so that the angle formed between the main scanning direction D31 and any one of the direction D32 of the front surface 330a of the main body 330, the arrangement direction D33 of the operation keys on the keyboard 131 (the longitudinal direction of the keyboard 131), and the main scanning direction D34 of the LCD 121 is 90°.
  • As the information processing apparatus 300 of the present modification, a notebook personal computer has been described, but the information processing apparatus 300 is merely an example. The user authentication function of the present embodiment can also be applied to mobile phones, PDAs and other mobile communication terminal devices, desktop personal computers, automatic teller machines that accept deposits to and withdrawals from banks, terminal devices of information processing systems, and the like.
  • FIG. 33 is a diagram illustrating a biometric feature table of the seventh modification example of the second embodiment.
  • the biometric feature table 141b4 illustrated in FIG. 33 is set in the biometric feature information storage unit 141b included in the information processing apparatus 300 according to the seventh modification example of the second embodiment.
  • the biometric feature table 141b4 is a table for managing biometric feature information used for biometric authentication of the information processing apparatus 300.
  • The biometric feature table 141b4 is provided with “number”, “ID”, “left/right”, “angle”, and “feature data” as items. In the biometric feature table 141b4, the values set in these items are associated with each other as biometric feature information.
  • the angle indicates the angle when the palm vein indicated by the biometric feature information is detected as described above.
  • For the biometric feature information of the vein of the right palm, “0”, indicating the reference angle of 0°, is set. For the biometric feature information of the vein of the left palm, “90” or “180”, indicating a predetermined angle rotated clockwise from the reference angle, is set.
  • a plurality of different angles can be set as the predetermined angle.
  • For example, suppose the direction of the biometric feature information of the right hand and the direction of the biometric feature information of the left hand are opposite (rotated by 180°). Such left-hand biometric feature information, whose direction is opposite to that of the right hand, can be distinguished from left-hand biometric feature information whose direction is orthogonal (rotated by 90°) to that of the right hand.
  • the seventh modification of the second embodiment is applicable even when there are a plurality of types of angles between the left hand direction and the right hand direction.
  • FIG. 34 is a diagram illustrating a state when the vein of the palm of the right hand is read according to the seventh modification example of the second embodiment.
  • FIG. 34 is a diagram of the state when the information processing apparatus 300 reads the vein of the palm of the right hand as viewed from above.
  • the information processing apparatus 300 includes a display unit 120 and a main body unit 330.
  • a keyboard 131 and a reading unit 342 are provided on the upper surface of the main body 330 of the information processing apparatus 300.
  • The reading unit 342 is on the same top surface of the main body 330 as the keyboard 131 of the information processing apparatus 300, and is arranged at the front center of the keyboard 131 so that each side of the horizontally long rectangular vein sensor is parallel to the front surface or a side surface of the information processing apparatus 300.
  • a user's head 201, torso 202, upper right arm 203, right lower arm 204, and right hand palm 205 are shown.
  • When the user causes the reading unit 342 to read the veins of the palm, as shown in FIG. 34, the user positions the palm whose veins are to be read (for example, the palm 205 of the right hand) so that it is parallel to the front surface of the information processing apparatus 300 and parallel to the upper surface of the main body 330 of the information processing apparatus 300.
  • The user positions the palm 205 of the right hand, with the fingers open, in the space above the vein sensor surface separated from it by a certain distance (for example, several centimeters), so that the center of the palm coincides with the center of the reading unit 342.
  • the user does not need to bend the wrist between the palm 205 of the right hand and the right lower arm 204 at the time of reading, and can be made almost straight.
  • Each finger of the user's right hand is straightened and opened sufficiently, and the four valley portions between the fingers of the user's right hand are sufficiently spaced. Therefore, the palm 205 of the right hand is not twisted out of the horizontal plane, and a correct image can be obtained quickly and reliably. Accordingly, correct features can be detected quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.
  • By detecting the valley portions between the fingers of the hand when the palm 205 of the right hand is placed over the reading unit 342, it is possible to quickly and reliably determine the left and right hands. Further, the user is not forced into an unreasonable posture at the right wrist, the right lower arm 204 and the right upper arm 203 beyond the wrist, the elbow between the right lower arm 204 and the right upper arm 203, or the right shoulder between the right upper arm 203 and the torso 202, so the user's burden can be reduced. In FIG. 34, the case where the vein of the palm of the user's right hand is read has been described; however, the present invention is not limited to this, and the same applies to the case of reading the vein of the palm of the left hand.
  • FIG. 35 is a diagram illustrating detection of a direction characteristic portion of the hand according to the seventh modification example of the second embodiment.
  • FIG. 35 illustrates the direction of the hand and the direction feature portions of the biological information in the information processing apparatus 300.
  • FIG. 35A shows an acquired image 3421 of the palm of the right hand.
  • FIG. 35B shows an acquired image 3422 of the palm of the left hand.
  • the acquired images 3421 and 3422 are images acquired by the living body detection unit 142a of the reading unit 342.
  • the acquired images 3421 and 3422 are captured by, for example, an image sensor provided in the living body detection unit 142a.
  • In the image acquired by the living body detection unit 142a, a right-hand detection rectangular image region 3420a for detecting the valley portions at the bases of the fingers of the right palm and a left-hand detection rectangular image region 3420b for detecting the valley portions at the bases of the fingers of the left palm are set.
  • the right-hand detection rectangular image region 3420a and the left-hand detection rectangular image region 3420b are provided along two parallel sides of the acquired image.
  • the valley portion at the base of the finger functions as a direction feature portion.
  • When the reading unit 342 detects a user's palm positioned above it with the distance sensor (not shown) of the living body detection unit 142a, the reading unit 342 acquires a palm image using the image sensor (not shown) of the living body detection unit 142a and supplies it to the information acquisition unit 111.
  • the information acquisition unit 111 determines whether or not a valley portion at the base of the finger is detected in the right-hand detection rectangular image region 3420a or the left-hand detection rectangular image region 3420b set in the acquired image.
  • The palm image acquisition by the image sensor here is performed without irradiating the palm with near-infrared light from the light source unit 142c.
  • the image acquired here is not an image of the vein of the palm but an image of the appearance of the palm.
  • this detection process is performed for the position where the right hand detection rectangular image area 3420a or the left hand detection rectangular image area 3420b is set in the image of the appearance of the palm.
  • The information acquisition unit 111 determines whether a direction feature portion, that is, a valley portion at the base of the fingers, exists in the right-hand detection rectangular image region 3420a or the left-hand detection rectangular image region 3420b. For example, in the acquired image 3421 shown in FIG. 35A, the direction feature portion 3421a1 exists in the right-hand detection rectangular image region 3420a but not in the left-hand detection rectangular image region 3420b, so it is determined that the palm is located at an angle of “0°”. As a result, the information processing apparatus 300 determines that the acquired image is an image of the veins of the palm of the right hand.
  • Conversely, when the direction feature portion exists in the left-hand detection rectangular image region 3420b, as in the acquired image 3422 shown in FIG. 35B, the information processing apparatus 300 determines that the acquired image is an image of the veins of the palm of the left hand.
  • the reading unit 342 acquires an image of the direction feature portion in the palm.
  • the information acquisition unit 111 determines the angle of the palm based on the presence or absence of the direction feature portion of the right-hand detection rectangular image region or the left-hand detection rectangular image region.
  • the type determination unit 112 determines the type of the palm arranged on the reading unit 342 (one of the left and right hands) based on the determined palm angle.
  • the acquired image for detecting the direction feature portion illustrated in FIGS. 35A and 35B may be an image captured by the imaging unit 142b.
  • The direction feature portions in the seventh modification of the second embodiment, that is, the valley portions at the bases of the fingers (for example, the valley between the bases of the index finger and the middle finger, the valley between the bases of the middle finger and the ring finger, and the valley between the bases of the ring finger and the little finger), may be detected based on all, or at least a partial combination, of these valleys being open at predetermined intervals.
  • The detection of the valley portions at the bases of the fingers may also be realized by identifying the position of the base of each finger and the contour between the finger bases in the palm image acquired by the living body detection unit 142a.
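  • The left/right decision of FIG. 35 reduces to asking which detection rectangle contains the finger-base valleys. The sketch below is a simplified stand-in: the region geometry and the brightness-based valley test are assumptions, not the method prescribed by the text.

```python
import numpy as np


def find_hand(appearance: np.ndarray, band: int = 8):
    # 3420a / 3420b: detection rectangles along two parallel sides
    # of the acquired appearance image.
    w = appearance.shape[1]
    right_region = appearance[:, :band]
    left_region = appearance[:, w - band:]

    def has_valleys(region: np.ndarray) -> bool:
        # Stand-in test: bright finger pixels separated by dark gaps.
        return bool((region > 128).any() and (region < 32).any())

    if has_valleys(right_region):
        return "right"  # feature in 3420a -> angle 0 deg, right palm
    if has_valleys(left_region):
        return "left"   # feature in 3420b -> rotated placement, left palm
    return None         # no direction feature portion detected


img = np.zeros((64, 64), dtype=np.uint8)
img[:, :8] = 200       # finger region along one side
img[10:20, :8] = 0     # valley gaps between the fingers
print(find_hand(img))  # -> "right"
```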
  • the seventh modification of the second embodiment as described above also has the same effect as that of the second embodiment.
  • an eighth modification of the second embodiment will be described. Differences from the second embodiment will be mainly described, and description of similar matters will be omitted.
  • The eighth modification differs from the second embodiment in that the position of the reading unit is offset from the center line of the main body, and in that the directions of the left and right living bodies (hands) intersect the respective sides of the main body at a predetermined angle other than orthogonal or parallel.
  • FIG. 36 is a diagram illustrating an appearance of an information processing apparatus according to an eighth modification of the second embodiment.
  • An information processing apparatus 400 illustrated in FIG. 36 is a notebook personal computer to which a security function based on biometric authentication using a palm vein is added.
  • the information processing apparatus 400 includes a display unit 120 having an LCD 121, a keyboard 131, and a main body unit 430 having a reading unit 442.
  • the LCD 121 is a display device having a display screen for displaying characters or images.
  • other thin display devices such as an organic EL display may be used as the display device.
  • the keyboard 131 is an input device for inputting characters and performing other operations.
  • The reading unit 442 is an input device that inputs a biological image by reading the veins of a palm that the user holds over it.
  • the reading unit 442 includes a square vein sensor that acquires a biological image of a palm vein by reading the vein of the palm of the user.
  • the vein sensor is arranged so that each side of the vein sensor is parallel to each side of the reading unit 442.
  • The reading unit 442 is on the same top surface of the main body 430 as the keyboard 131 of the information processing apparatus 400, in front of the keyboard 131 and on the left side, and is arranged so that one side of the square vein sensor is inclined at 65° with respect to the side surface of the main body 430 and the other side is inclined at 65° with respect to the front surface of the main body 430.
  • The reading unit 442 is provided on the left side of the upper surface of the main body 430 in this example, but it may be provided at another position, and it may be provided at an arbitrary angle with respect to the front surface and the side surface of the main body 430.
  • the vein sensor reads vein information by scanning the reading object.
  • the main scanning direction of the vein sensor is parallel to one side of the square vein sensor. Therefore, the main scanning direction of the vein sensor is parallel to one side of the reading unit 442.
  • the main scanning direction of the vein sensor is the D41 direction in FIG.
  • the vein sensor is arranged such that an angle formed between the main scanning direction D41 and the direction D42 in which the front surface 430a of the main body 430 extends is 65 °.
  • the vein sensor may be arranged such that, for example, the angle between the main scanning direction D41 and any one of the direction D42 of the front surface 430a of the main body 430, the operation key arrangement direction D43 on the keyboard 131 (the longitudinal direction of the keyboard 131), and the main scanning direction D44 of the LCD 121 is 65°.
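  • as a concrete illustration of this mounting relation, the following minimal Python sketch (not part of the patent disclosure; the direction vectors are illustrative assumptions) checks that the main scanning direction D41 meets a reference direction such as D42 at 65°:

      import math

      def angle_between(u, v):
          """Angle in degrees between two 2-D direction vectors."""
          dot = u[0] * v[0] + u[1] * v[1]
          norm = math.hypot(u[0], u[1]) * math.hypot(v[0], v[1])
          return math.degrees(math.acos(dot / norm))

      d42 = (1.0, 0.0)  # assumed: direction in which the front surface 430a extends
      d41 = (math.cos(math.radians(65)), math.sin(math.radians(65)))  # main scanning direction
      assert round(angle_between(d41, d42)) == 65  # the 65 degree mounting condition holds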
  • the notebook type personal computer has been described, but the information processing apparatus 400 is only an example; the user authentication function according to the present embodiment can also be applied to mobile phones, PDAs and other mobile communication terminal devices, desktop personal computers, automatic teller machines that accept deposits and withdrawals at banks, terminal devices of information processing systems, and the like.
  • FIG. 37 is a diagram showing a state at the time of reading the vein of the palm of the right hand according to the eighth modified example of the second embodiment.
  • FIG. 37 is a diagram of the state when the information processing apparatus 400 reads the vein of the palm of the right hand as viewed from above.
  • the information processing apparatus 400 includes a display unit 120 and a main body unit 430.
  • a keyboard 131 and a reading unit 442 are provided on the upper surface of the main body 430 of the information processing apparatus 400.
  • a user's head 201, torso 202, upper right arm 203, right lower arm 204, and right hand palm 205 are shown.
  • when the user causes the reading unit 442 to read the vein of the palm 205 of the right hand, the user holds the palm 205 of the right hand at a predetermined angle with respect to the front surface of the information processing apparatus 400, as shown in FIG. 37, and positions it so as to be parallel to the upper surface of the main body portion 430.
  • the user positions the palm 205 of the right hand, with the fingers open and the center of the palm coinciding with the center of the reading unit 442, in a space separated from the vein sensor surface by a certain distance (for example, several centimeters).
  • as shown in FIG. 37, since the reading unit 442 is installed on the left side of the main body unit 430, the user's right arm (upper right arm 203, lower right arm 204) is bent toward the body portion 202 and is located nearer to the body portion 202 than in the second embodiment.
  • the user does not need to bend the wrist between the palm 205 of the right hand and the right lower arm 204 at the time of reading, and can be made almost straight.
  • each finger of the user's right hand is straightened and opened sufficiently, and the four bases between the fingers of the right hand are sufficiently spaced. Therefore, the palm 205 of the right hand has no horizontal twist, and a correct image can be obtained quickly and reliably. Accordingly, correct features can be detected quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.
  • the direction feature portion can likewise be detected without any problem. Further, no unreasonable posture is forced on the user's right wrist, on the lower right arm 204 and upper right arm 203 extending from the right wrist, on the elbow between the lower right arm 204 and the upper right arm 203, or on the right shoulder between the upper right arm 203 and the torso 202, so the user's burden can be reduced.
  • FIG. 38 is a diagram illustrating a state when the vein of the palm of the left hand is read according to the eighth modification example of the second embodiment.
  • FIG. 38 is a diagram of the state when the information processing apparatus 400 reads the vein of the palm of the left hand as viewed from above.
  • the information processing apparatus 400 includes a display unit 120 and a main body unit 430.
  • a keyboard 131 and a reading unit 442 are provided on the upper surface of the main body 430 of the information processing apparatus 400.
  • a user's head 201, torso 202, left upper arm 206, left lower arm 207, and left hand palm 208 are shown.
  • when the user causes the reading unit 442 to read the vein of the palm 208 of the left hand, the user holds the palm 208 of the left hand at a predetermined angle with respect to the front surface of the information processing apparatus 400, as shown in FIG. 38, and positions it so as to be parallel to the upper surface of the main body portion 430.
  • at this time, the user positions the palm 208 of the left hand, with the fingers open and the center of the palm coinciding with the center of the reading unit 442, in a space separated from the vein sensor surface by a certain distance (for example, several centimeters).
  • as shown in FIG. 38, the user's left arm (upper left arm 206 and lower left arm 207) is in a stretched state and is located farther from the body portion 202 than in the second embodiment or in the case of the palm of the right hand described above with FIG. 37. The user does not need to twist the wrist between the palm 208 of the left hand and the lower left arm 207 at the time of reading, and can keep it almost straight. Along with this, each finger of the user's left hand is straightened and opened sufficiently, and the four bases between the fingers of the left hand are sufficiently spaced. Therefore, the palm 208 of the left hand has no horizontal twist, and a correct image can be obtained quickly and reliably. Accordingly, correct features can be detected quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.
  • since the valleys at the bases of the fingers can be used to detect that the palm 208 of the left hand is placed on the reading unit 442, the left and right hands can be determined quickly and reliably. Also, no unreasonable posture is forced on the user's left wrist, on the lower left arm 207 and upper left arm 206 extending from the left wrist, on the elbow between the lower left arm 207 and the upper left arm 206, or on the left shoulder between the upper left arm 206 and the torso 202, so the burden on the user can be reduced.
  • the eighth modification of the second embodiment as described above also provides the same effects as those of the second embodiment.
  • a third embodiment will be described. Differences from the second embodiment will be mainly described, and description of similar matters will be omitted.
  • the information processing apparatus according to the third embodiment is different from the second embodiment in that the biological information is feature information about the finger vein.
  • FIG. 39 is a block diagram illustrating an information processing apparatus according to the third embodiment.
  • An information processing apparatus 500 according to the third embodiment includes an information acquisition unit 511, a type determination unit 512, an information generation unit 513, a collation unit 514, and a biometric feature information storage unit 541b.
  • a reading unit 542 is connected to the information acquisition unit 511.
  • the information acquisition unit 511 acquires a biometric image of a person to be authenticated such as a user of the information processing apparatus 500.
  • the information acquisition unit 511 can acquire the direction of the living body in a state where the biological image is acquired.
  • the biological image acquired by the information acquisition unit 511 is image information of a finger vein pattern.
  • the direction of the living body takes one of two mutually orthogonal directions, depending on whether the hand is the left or the right.
  • the information processing apparatus 500 includes a guide that indicates the direction of a finger to be read. Details of the guide will be described later.
  • the reading unit 542 is fixed to the upper part of the information processing apparatus 500.
  • the reading unit 542 includes the living body detection unit 142a, the imaging unit 142b, and the light source unit 142c illustrated in FIG.
  • the information acquisition unit 511 determines, based on the detection result of the living body detection unit 142a of the reading unit 542, that the living body is arranged at a predetermined distance from the reading unit 542, as well as the direction of the living body with respect to the reading unit 542.
  • the information acquisition unit 511 determines the direction of the living body by determining the position of the direction feature portion in the living body from the image obtained by the living body detection unit 142a of the reading unit 542.
  • the directions in which the left and right hands are placed are orthogonal to each other, and each forms an oblique angle with the keyboard 131.
  • the direction feature portion is a valley portion at the base of the finger in the palm.
  • the information acquisition unit 511 acquires a biological image obtained by imaging a living body by the imaging unit 142b of the reading unit 542.
  • the information acquisition unit 511 receives, from the user, an input of the type of finger (the distinction among thumb, index finger, middle finger, ring finger, and little finger) from which the vein biometric information is acquired.
  • the type determination unit 512 determines the type of biometric information based on the direction of the biometric acquired by the information acquisition unit 511. The type indicates the left and right of the hand that is the basis for generating biometric information.
  • the information generation unit 513 generates biological information indicating the characteristics of the living body based on the living body image acquired by the information acquisition unit 511.
  • the information generation unit 513 generates verification biometric feature information including the generated biometric information and the type of biometric information determined by the type determination unit 512. The verification biometric feature information thus indicates the biometric information and type of the user to be verified at authentication.
  • the information generation unit 513 includes, for example, the biological information acquired by the information generation unit 513, the type of biological information determined by the type determination unit 512, and the type of finger received by the information acquisition unit 511 (thumb, index finger, Biometric feature information including a middle finger, a ring finger, and a little finger) and identification information for identifying an individual corresponding to the biometric information is generated and stored in the biometric feature information storage unit 541b.
  • in the biometric feature information storage unit 541b, biometric feature information indicating the biometric information, the type, and the finger type of authorized users is registered in advance for use in authentication.
  • the information generating unit 513 stores the generated biometric feature information in the biometric feature information storage unit 541b.
  • the verification unit 514 performs authentication using the verification biometric feature information generated by the information generation unit 513.
  • the collation unit 514 extracts biometric feature information whose type matches the collation biometric feature information, and collates based on the biometric information of the collation biometric feature information and the extracted biometric feature information. Thereby, the information processing apparatus 500 performs biometric authentication of the user based on the collation result. Since collation is performed by limiting the collation targets to those of the same type, an increase in time and load required for the collation process can be suppressed.
  • the biometric feature information storage unit 541b stores biometric feature information indicating the biometric information, the type of biometric information, and the type of finger. As a result, the user's biometric information, type, and finger type are stored in association with each other.
  • FIG. 40 is a diagram illustrating a reading unit according to the third embodiment.
  • a reading unit 542 illustrated in FIG. 40 is an input device that allows a user to read a vein of a finger and input a biological image.
  • the reading unit 542 includes a vein sensor that acquires a biological image of a finger vein by reading the vein of the user's finger.
  • the reading unit 542 has a square shape and is inclined by 45 °.
  • a cross-shaped guide 542a having sides (side A and side B) parallel to the sides of the reading unit 542 is provided on the upper surface of the reading unit 542.
  • the side A of the guide 542a is inclined 45 ° to the left when viewed from the front of the information processing apparatus 500.
  • the side B of the guide 542a is inclined 45 ° to the right when viewed from the front of the information processing apparatus 500.
  • the user positions the finger for acquiring a biological image along one of the cross-shaped sides of the guide 542a. That is, when having the information processing apparatus 500 acquire a biological image of the vein of a finger of the right hand, the user positions the target finger along the side A; when having it acquire a biological image of the vein of a finger of the left hand, the user positions the target finger along the side B. Thereby, the biometric image of a finger of either hand can be acquired while the user keeps a natural and easy posture, and the information processing apparatus 500 can also determine whether the acquisition target finger is a finger of the right hand or of the left hand.
  • Finger veins are difficult to counterfeit because they are in-vivo information, and they have high applicability because they are not affected by body-surface conditions such as rough, dry, or wet hands. Further, since reading is contactless, hygienic and natural operability is realized, user resistance is low, and high-speed authentication is possible.
  • the reading unit 542 may read a fingerprint. Further, the reading unit 542 is not limited to a square shape, and may have any shape.
  • in the information processing apparatus 500, whether the read vein is that of a finger of the right hand or of the left hand is determined automatically.
  • the distinction between the thumb, the index finger, the middle finger, the ring finger, and the little finger of each of the right hand and the left hand is determined based on a user input when obtaining biometric information.
  • a biometric image of a finger vein is acquired in response to the user positioning the finger to be acquired on a space separated from the vein sensor surface by a certain distance (for example, several centimeters).
  • FIG. 41 is a diagram illustrating a biometric feature table according to the third embodiment.
  • the biometric feature table 541b1 illustrated in FIG. 41 is set in the biometric feature information storage unit 541b included in the information processing apparatus 500 according to the third embodiment.
  • the biometric feature table 541b1 is a table for managing biometric feature information used for biometric authentication of the information processing apparatus 500.
  • the biometric feature table 541b1 includes “number”, “ID”, “left / right”, “finger type”, and “feature data” as items. In the biometric feature table 541b1, values set in the above items are associated with each other as biometric feature information.
  • Left and right indicate whether the biometric feature information is for the right or left finger. “Right” is set for the biometric feature information of the vein of any finger of the right hand. “Left” is set for the biometric feature information of the vein of any finger of the left hand.
  • the finger type indicates the type of finger represented by the biometric feature information. For the biometric feature information of the thumb vein, “1” is set.
  • “2” is set for the biometric feature information of the index finger vein.
  • “3” is set for the biometric feature information of the middle finger vein.
  • “4” is set for the biometric feature information of the vein of the ring finger.
  • “5” is set for the biometric feature information of the vein of the little finger.
  • the left and right item indicates the type in the biometric feature information.
  • biometric feature table 541b1 shown in FIG. 41 is an example, and any item can be set in the biometric feature table.
  • for example, an angle item indicating the left and right as angles (“0°” for the right hand and “90°” for the left hand) may be provided instead of the left and right item.
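  • for illustration, the biometric feature table of FIG. 41 could be held in memory as follows; this is a minimal sketch, and the field names and record layout are assumptions rather than part of the disclosure:

      from dataclasses import dataclass

      @dataclass
      class BiometricFeature:
          number: int          # "number": record index in the table
          user_id: str         # "ID": identifies the individual
          side: str            # "left/right": "right" or "left"
          finger_type: int     # "finger type": 1=thumb ... 5=little finger
          feature_data: bytes  # "feature data": encoded vein feature information

      feature_table = [
          BiometricFeature(1, "user01", "right", 2, b"..."),  # right index finger
          BiometricFeature(2, "user01", "left", 3, b"..."),   # left middle finger
      ]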
  • FIG. 42 is a diagram illustrating a state at the time of reading the vein of the index finger of the right hand according to the third embodiment.
  • FIG. 42 is a diagram of the state when the information processing apparatus 500 reads the vein of the index finger of the right hand as viewed from above.
  • the information processing apparatus 500 includes a display unit 120 and a main body unit 530.
  • a keyboard 131 and a reading unit 542 are provided on the upper surface of the main body 530 of the information processing apparatus 500.
  • the reading unit 542 is on the same top surface of the main body 530 as the keyboard 131 of the information processing apparatus 500, at the front center of the keyboard 131, with each side of the square vein sensor having the cross-shaped guide 542a arranged at a 45° angle to the front and side surfaces of the main body 530.
  • the user's head 201, torso 202, upper right arm 203, lower right arm 204, palm 205 of the right hand, thumb 205a, index finger 205b, middle finger 205c, ring finger 205d, and little finger 205e of the right hand are shown.
  • when the user causes the reading unit 542 to read a vein, the user positions the finger for reading (for example, the index finger 205b of the right hand), as illustrated in FIG. 42, at a 45° angle to the side surface of the information processing apparatus 500, inclined to the left, and parallel to the upper surface of the main body 530.
  • the user opens the right hand finger so that the center line of the index finger 205b of the right hand to be read coincides with the center line of one side of the cross shape of the reading unit 542 (dotted line AA in FIG. 40).
  • the finger is positioned in a space separated from the vein sensor surface by a certain distance (for example, several centimeters).
  • the user does not need to bend the wrist between the palm 205 of the right hand and the right lower arm 204 at the time of reading, and can be made almost straight.
  • each finger of the user's right hand is straightened and opened sufficiently, and the four bases between the fingers of the right hand are sufficiently spaced. Therefore, the palm 205 of the right hand has no horizontal twist, and a correct image of the direction feature portion can be obtained quickly and reliably. Accordingly, the right hand or left hand can be determined quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.
  • no unreasonable posture is forced on the user's right wrist, on the lower right arm 204 and upper right arm 203 extending from the right wrist, on the elbow between the lower right arm 204 and the upper right arm 203, or on the right shoulder between the upper right arm 203 and the torso 202, so the user's burden can be reduced. In FIG. 42, the case where the vein of the index finger of the user's right hand is read has been described; however, this is not limiting, and the same applies to reading the veins of the right-hand fingers other than the index finger and of the fingers of the left hand, so the description is omitted.
  • each finger of the user's right hand is located, together with the palm 205 of the right hand, in a space a certain distance away from the vein sensor surface, so erroneous operations caused by a finger or the palm touching the keyboard 131 or another operation unit can be suppressed.
  • FIG. 43 is a diagram illustrating detection of a direction characteristic portion of a finger according to the third embodiment.
  • FIG. 43 illustrates the relation between the left and right hands and the direction feature portion of the living body in the information processing apparatus 500 according to the third embodiment.
  • FIG. 43A shows an acquired image 5421 of a finger of the right hand.
  • FIG. 43B shows an acquired image 5422 of a finger of the left hand.
  • the acquired images 5421 and 5422 are images acquired by the living body detection unit 142a of the reading unit 542.
  • the acquired images 5421 and 5422 are captured by, for example, an image sensor included in the living body detection unit 142a.
  • the upper side in FIG. 43 is the back side when viewed from the front of the information processing apparatus 500
  • the lower side in FIG. 43 is the front side when viewed from the front of the information processing apparatus 500
  • the right side in FIG. 43 is the right side when viewed from the front of the information processing apparatus 500
  • the left side in FIG. 43 is the left side when viewed from the front of the information processing apparatus 500.
  • in the image acquired by the living body detection unit 142a, a right hand detection rectangular image region 5420a for detecting a valley portion at the base of the fingers of the palm of the right hand is set along the lower right side, and a left hand detection rectangular image region 5420b for detecting a valley portion at the base of the fingers of the palm of the left hand is set along the lower left side.
  • the right-hand detection rectangular image region 5420a and the left-hand detection rectangular image region 5420b are provided along two sides that are orthogonal to each other in the acquired image.
  • the valley portion at the base of the finger functions as a direction feature portion.
  • when the reading unit 542 detects a finger of the user's hand located above the reading unit 542 with a distance sensor (not shown) of the living body detection unit 142a, the reading unit 542 acquires an image of the finger with an image sensor (not shown) of the living body detection unit 142a and outputs it to the information acquisition unit 511. The information acquisition unit 511 determines whether a valley portion at the base of the fingers is detected in the right hand detection rectangular image region 5420a or the left hand detection rectangular image region 5420b set in the acquired image. The image acquisition by the image sensor here is performed without irradiation with near-infrared light from the light source unit 142c.
  • the image acquired here is not an image of a finger vein but an image of the appearance of the finger.
  • this detection processing is performed for the position where the right hand detection rectangular image region 5420a or the left hand detection rectangular image region 5420b is set in the image of the appearance of the finger.
  • the information acquisition unit 511 determines whether a direction feature portion, that is, a valley portion at the base of the fingers, exists in the right hand detection rectangular image region 5420a or in the left hand detection rectangular image region 5420b. For example, in the acquired image 5421 shown in FIG. 43A, the direction feature portion 5421a1 exists in the right hand detection rectangular image region 5420a but not in the left hand detection rectangular image region 5420b, so the hand is determined to be located at an angle of “0°”. Thereby, the information processing apparatus 500 determines that the acquired image is an image of a finger of the right hand.
  • in the acquired image 5422 shown in FIG. 43B, the direction feature portion 5422b1 exists in the left hand detection rectangular image region 5420b but not in the right hand detection rectangular image region 5420a, so the hand is determined to be located at an angle of “90°” clockwise. As a result, the information processing apparatus 500 determines that the acquired image is an image of a finger of the left hand.
  • the reading unit 542 acquires an image of the direction feature portion at the base of the finger.
  • the information acquisition unit 511 determines the angle of the hand based on the presence or absence of the direction feature portion of the right hand detection rectangular image region or the left hand detection rectangular image region.
  • the information acquisition unit 511 determines the type of finger (right finger or left hand finger) in the acquired image based on the determined hand angle. In the following description of the third embodiment, the information indicated by the finger type is described as “hand left and right”.
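  • the left/right determination described above can be sketched as follows (a minimal Python sketch, assuming a hypothetical detect_valley() helper; the regions stand in for those of FIG. 43):

      def determine_hand(image, right_region, left_region, detect_valley):
          """Return (hand angle in degrees, hand side) or None if undetermined."""
          if detect_valley(image, right_region):   # e.g. region 5420a, lower right side
              return 0, "right"                    # angle "0 degrees": right hand
          if detect_valley(image, left_region):    # e.g. region 5420b, lower left side
              return 90, "left"                    # angle "90 degrees" clockwise: left hand
          return None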
  • the acquired image for detecting the direction feature portion illustrated in FIGS. 43A and 43B may be an image captured by the imaging unit 142b.
  • the detection of the direction feature portion in the third embodiment, that is, the valley portion at the base of the fingers, may be based on all, or at least some combination, of the valley between the index finger and the middle finger, the valley between the middle finger and the ring finger, and the valley between the ring finger and the little finger being open at a predetermined interval.
  • the detection of the valley portion at the base of the fingers may also be realized from the position of the base of each finger in the finger image acquired by the living body detection unit 142a and from contour identification between the finger bases.
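  • the spacing condition mentioned above can be sketched as follows (a minimal sketch; the interval threshold and the upstream detection of the valley positions are assumptions):

      import math

      MIN_GAP = 8.0  # assumed predetermined interval between adjacent valleys

      def valleys_sufficiently_open(base_points):
          """base_points: (x, y) valley positions ordered index/middle,
          middle/ring, ring/little; true if all gaps reach the interval."""
          gaps = [math.dist(base_points[i], base_points[i + 1])
                  for i in range(len(base_points) - 1)]
          return all(gap >= MIN_GAP for gap in gaps)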
  • FIG. 44 is a flowchart illustrating the biometric feature information acquisition registration process according to the third embodiment.
  • the biometric feature information acquisition / registration process is a process for determining the type of finger to be registered, and generating and registering biometric feature information indicating the type of finger (left and right of the hand), the type of finger, and the characteristics of veins.
  • the biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a finger vein. In the following, the process illustrated in FIG. 44 will be described in order of step number.
  • Step S191 The information acquisition unit 511 receives, from the user, an input of the type of finger from which the vein biometric information is to be acquired.
  • Step S192 The information acquisition unit 511 determines that a finger is placed at a predetermined height above the reading unit 542 based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 542. If it is determined that a finger is placed, the information acquisition unit 511 detects a valley portion at the base of the fingers, which is the direction feature portion, from an image captured by the image sensor provided in the living body detection unit 142a, and determines the direction of the finger to be registered based on the detected direction feature portion.
  • the type of finger is specified based on the user's input, and the direction feature portion is used to determine the left and right of the hand.
  • Step S193 The type determination unit 512 determines the type of finger (left and right of the hand) based on the direction of the finger determined in step S192.
  • Step S194 The information acquisition unit 511 causes the imaging unit 142b of the reading unit 542 to capture an image and acquires a biological image in which the finger vein is reflected.
  • Step S195 The information generation unit 513 extracts features of the living body based on the living body image acquired in step S194.
  • the information generation unit 513 generates biometric information indicating the feature extraction result.
  • Step S196 The information generation unit 513 generates biometric feature information including the type determined in step S193, the finger type received in step S191, the biometric information generated in step S195, and the user's ID.
  • Step S197 The information generation unit 513 stores the biometric feature information generated in step S196 in the biometric feature information storage unit 541b. Thereby, the biometric feature information is registered in the biometric feature information storage unit 541b. Thereafter, the process ends.
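  • the flow of FIG. 44 can be summarized in code as follows (a minimal sketch; reader, store, and the helper functions are hypothetical stand-ins for the units described above):

      def determine_hand_side(direction_deg):
          """Placeholder mapping a detected direction to a hand side."""
          return "right" if direction_deg == 0 else "left"

      def extract_vein_features(image):
          """Placeholder for vein feature extraction (Step S195)."""
          raise NotImplementedError

      def register_biometric_feature(reader, store, user_id):
          finger_type = reader.receive_finger_type_input()           # Step S191
          direction = reader.wait_for_finger_and_detect_direction()  # Step S192
          side = determine_hand_side(direction)                      # Step S193
          image = reader.capture_vein_image()                        # Step S194
          features = extract_vein_features(image)                    # Step S195
          record = {"id": user_id, "side": side,                     # Step S196
                    "finger_type": finger_type, "feature_data": features}
          store.save(record)                                         # Step S197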
  • FIG. 45 is a flowchart illustrating biometric feature information authentication processing according to the third embodiment.
  • the biometric feature information authentication process is a process for determining the type of the finger to be authenticated, generating collation biometric information indicating the characteristics of the vein of the finger to be authenticated, and performing authentication by collating with the biometric feature information registered in advance. It is.
  • the collation biometric feature information is data having the same configuration as that of the biometric feature information, and indicates the feature of the authentication target user's biometric (in the third embodiment, the finger vein).
  • the biometric feature information authentication process is executed, for example, when the user authenticates with a finger vein. In the following, the process illustrated in FIG. 45 will be described in order of step number.
  • Step S201 The information acquisition unit 511 determines that a finger is placed at a predetermined height above the reading unit 542 based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 542. If it is determined that a finger is placed, the information acquisition unit 511 detects a valley portion at the base of the fingers, which is the direction feature portion, from an image captured by the image sensor provided in the living body detection unit 142a, and determines the direction of the finger to be authenticated based on the detected direction feature portion.
  • Step S202 The type determination unit 512 determines the type of finger (left and right of the hand) based on the direction of the finger determined in step S201.
  • Step S203 The information acquisition unit 511 causes the imaging unit 142b of the reading unit 542 to capture an image and acquires a biological image in which the finger vein is reflected.
  • Step S204 The information generation unit 513 extracts a feature of the living body based on the living body image acquired in step S203.
  • the information generation unit 513 generates biometric information indicating the feature extraction result.
  • Step S205 The information generation unit 513 generates verification biometric feature information including the type determined in step S202 and the biometric information generated in step S204.
  • Step S206 The matching unit 514 performs the biometric feature information matching process (described later with FIG. 46) using the verification biometric feature information generated in step S205. Thereafter, the process ends.
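  • the flow of FIG. 45 mirrors the registration flow, producing verification biometric feature information and handing it to the matcher (a minimal sketch; the helpers are the hypothetical placeholders of the previous sketch, passed in as parameters):

      def authenticate(reader, matcher, determine_hand_side, extract_vein_features):
          direction = reader.wait_for_finger_and_detect_direction()  # Step S201
          side = determine_hand_side(direction)                      # Step S202
          image = reader.capture_vein_image()                        # Step S203
          features = extract_vein_features(image)                    # Step S204
          probe = {"side": side, "feature_data": features}           # Step S205
          return matcher.match(probe)                                # Step S206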
  • FIG. 46 is a flowchart illustrating the biometric feature information matching process according to the third embodiment.
  • the biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance.
  • the biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 46 will be described in order of step number.
  • Step S211 The collation unit 514 acquires the verification biometric feature information generated by the biometric feature information authentication process.
  • Step S212 The collation unit 514 refers to the biometric feature information storage unit 541b and extracts biometric feature information whose type (left and right) matches that of the verification biometric feature information acquired in step S211.
  • the collation unit 514 may instead extract from the biometric feature information storage unit 541b biometric feature information in which both the type and the user ID match those of the verification biometric feature information.
  • Step S213 The collation unit 514 selects one piece of the biometric feature information extracted in step S212, and collates the biometric information included in the selected biometric feature information against that of the verification biometric feature information.
  • Step S214 The collation unit 514 determines whether or not the collation between the selected biometric feature information and the collation biometric feature information is successful as a result of the collation in step S213. If the verification is successful (step S214: YES), the process proceeds to step S215. On the other hand, if the verification fails (step S214: NO), the process proceeds to step S216.
  • Step S215 The collation unit 514 executes predetermined processing when authentication is successful. Thereafter, the process returns.
  • Step S216 The collation unit 514 determines whether all the biometric feature information extracted in Step S212 has been selected in Step S213. If all have been selected (YES in step S216), the process proceeds to step S217. On the other hand, if there is an unselected item (NO in step S216), the process proceeds to step S213.
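  • the narrowing-by-type described in FIG. 46 can be sketched as follows (a minimal sketch; records are assumed to be dictionaries, and compare_features() and on_success() are hypothetical helpers):

      def match(probe, feature_table, compare_features, on_success):
          candidates = [r for r in feature_table
                        if r["side"] == probe["side"]]     # Steps S211-S212
          for record in candidates:                        # Step S213
              if compare_features(record["feature_data"],
                                  probe["feature_data"]):  # Step S214
                  on_success(record)                       # Step S215
                  return True
          return False  # every candidate tried without success (Steps S216-S217)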
  • FIG. 47 is a diagram illustrating a message window at the time of registration according to the third embodiment.
  • a message window 121b illustrated in FIG. 47 is an example of a window displayed on the display screen of the LCD 121 included in the information processing apparatus 500.
  • in the message window 121b, a message and an image notifying the user that the biometric feature information has been successfully registered based on the read finger are displayed.
  • in the message window 121b, for example, a message “Registration of the index finger of the right hand succeeded” and a biological image showing the vein of the index finger of the right hand captured at the time of registration are displayed.
  • the message window 121b has an OK button 121b1.
  • the OK button 121b1 is a button for ending display of the message window 121b.
  • after confirming the display contents of the message window 121b, the user can end the display of the message window 121b by operating the OK button 121b1.
  • according to the third embodiment, whether a finger belongs to the right hand or the left hand can be determined from the angle of the living body at the time of acquiring the biometric information about the finger vein.
  • since biometric images can be acquired in a plurality of directions at the reading unit 542, the degree of freedom of the user's posture when acquiring biometric information increases, and the burden on the user's shoulder, arm, and wrist joints can be suppressed.
  • by classifying and registering biometric feature information in advance according to type (left and right of the hand) and finger type, the pieces of biometric feature information stored in the biometric feature information storage unit 541b can be narrowed down by type when one piece of verification biometric feature information is matched against them. For this reason, an increase in the processing load and processing time of authentication can be suppressed.
  • FIG. 48 is a block diagram illustrating an information processing apparatus according to the fourth embodiment.
  • An information processing apparatus 600 according to the fourth embodiment includes an information acquisition unit 611, a type determination unit 612, an information generation unit 613, a collation unit 614, and a biometric feature information storage unit 641b.
  • a reading unit 642 is connected to the information acquisition unit 611.
  • the information acquisition unit 611 acquires a biometric image of a person to be authenticated such as a user of the information processing apparatus 600.
  • the information acquisition unit 611 can acquire the direction of the living body in a state where the biological image is acquired.
  • the biological image acquired by the information acquisition unit 611 is image information of a plurality of types of finger vein patterns.
  • the direction of the living body combines two mutually orthogonal directions based on the left and right of the hand with the directions of the individual fingers.
  • the information processing apparatus 600 includes a guide that indicates a direction corresponding to each of a plurality of types of fingers to be read.
  • the reading unit 642 is fixed to the upper part of the information processing apparatus 600.
  • the reading unit 642 includes the living body detection unit 142a, the imaging unit 142b, and the light source unit 142c illustrated in FIG.
  • the information acquisition unit 611 determines, based on the detection result of the living body detection unit 142a of the reading unit 642, that the living body is arranged at a predetermined distance from the reading unit 642, as well as the direction of the living body with respect to the reading unit 642.
  • the information acquisition unit 611 determines the direction of the living body by determining the position of the direction feature portion in the living body from the image obtained by the living body detection unit 142a of the reading unit 642.
  • the directions in which the left and right hands are placed are orthogonal to each other, and each forms an oblique angle with the keyboard 131. Further, the direction of each finger is a predetermined direction with respect to the direction of the hand. The direction feature portion is a valley portion at the base of the fingers in the palm of the hand containing the finger to be read.
  • the information acquisition unit 611 acquires a biological image in which a living body is captured by the imaging unit 142b of the reading unit 642.
  • the type determination unit 612 determines the type of biometric information based on the direction of the biometric acquired by the information acquisition unit 611. The type indicates the left and right hand and finger type that are the basis for generating biometric information.
  • the information generation unit 613 generates biological information indicating the characteristics of the living body based on the living body image acquired by the information acquisition unit 611.
  • the information generation unit 613 generates verification biometric feature information including the generated biometric information and the type of biometric information determined by the type determination unit 612. The verification biometric feature information thus indicates the biometric information and type of the user to be verified at authentication.
  • the information generation unit 613 includes, for example, biological features including the biological information acquired by the information generation unit 613, the type of biological information determined by the type determination unit 612, and identification information that identifies an individual corresponding to the biological information. Information is generated and stored in the biometric feature information storage unit 641b.
  • in the biometric feature information storage unit 641b, biometric feature information indicating the biometric information and type of authorized users is registered in advance for use in authentication.
  • the type includes information indicating whether the finger belongs to the right hand or the left hand, and information indicating whether it is the thumb, index finger, middle finger, ring finger, or little finger.
  • the contents indicated by the former are described as “left and right hands”, and the contents indicated by the latter are described as “finger types”.
  • the information generation unit 613 stores the generated biometric feature information in the biometric feature information storage unit 641b when registering the biometric information of the user. Further, at the time of authentication using the user's biometric information, the verification unit 614 performs authentication using the verification biometric feature information generated by the information generation unit 613.
  • the collation unit 614 extracts biometric feature information whose type matches the collation biometric feature information, and collates based on the biometric information of the collation biometric feature information and the extracted biometric feature information. Thereby, the information processing apparatus 600 performs biometric authentication of the user based on the collation result. Since collation is performed by limiting the collation targets to those of the same type, an increase in time and load required for the collation process can be suppressed.
  • the type indicates the right and left of the hand and the type of finger.
  • the biometric feature information storage unit 641b stores biometric feature information indicating the biometric information and the type of the biometric information. Thereby, the user's biometric information and type are stored in association with each other.
  • FIG. 49 is a diagram illustrating a reading unit according to the fourth embodiment.
  • a reading unit 642 illustrated in FIG. 49 is an input device that allows a user to read a vein of a finger and input a biological image.
  • the reading unit 642 includes a vein sensor that acquires biological information of the finger vein by reading the vein of the user's finger.
  • the reading unit 642 has a square shape and is inclined by 45 °.
  • guides 642c, 642d1, 642d2, 642e1, 642e2, 642f1, 642f2, 642g1, 642g2, 642h1, 642h2, 642i1, 642i2, 642j1, 642j2, 642k1, 642k2, 642l are provided.
  • a guide 642m is provided in the center of the reading unit 642.
  • the direction C connecting the guide 642c and the guide 642m is set to have a predetermined angle (for example, 180 °) clockwise with respect to the left-right direction toward the information processing apparatus 600.
  • a direction D connecting the guide 642d1 and the guide 642d2 is set to have a predetermined angle (for example, 150 °) clockwise with respect to the left-right direction toward the information processing apparatus 600.
  • a direction E connecting the guide 642e1 and the guide 642e2 is set to have a predetermined angle (for example, 135 °) clockwise with respect to the left-right direction toward the information processing apparatus 600.
  • a direction F connecting the guide 642f1 and the guide 642f2 is set to have a predetermined angle (for example, 120 °) clockwise with respect to the left-right direction toward the information processing apparatus 600.
  • a direction G connecting the guide 642g1 and the guide 642g2 is set to have a predetermined angle (for example, 105 °) clockwise with respect to the left-right direction toward the information processing apparatus 600.
  • a direction H connecting the guide 642h1 and the guide 642h2 is set to have a predetermined angle (for example, 75 °) clockwise with respect to the left-right direction toward the information processing apparatus 600.
  • a direction I connecting the guide 642i1 and the guide 642i2 is set to have a predetermined angle (for example, 60 °) clockwise with respect to the left-right direction toward the information processing apparatus 600.
  • a direction J connecting the guide 642j1 and the guide 642j2 is set to have a predetermined angle (for example, 45 °) clockwise with respect to the left-right direction toward the information processing apparatus 600.
  • a direction K connecting the guide 642k1 and the guide 642k2 is set to have a predetermined angle (for example, 30 °) clockwise with respect to the left-right direction toward the information processing apparatus 600.
  • a direction L connecting the guides 642l and 642m is set to have a predetermined angle (for example, 0 °) clockwise with respect to the left-right direction toward the information processing apparatus 600.
  • each guide corresponds to the type of finger. That is, the guide 642c and the guide 642m correspond to the thumb of the left hand.
  • the guide 642d1 and the guide 642d2 correspond to the left index finger.
  • the guide 642e1 and the guide 642e2 correspond to the middle finger of the left hand.
  • Guide 642f1 and guide 642f2 correspond to the ring finger of the left hand.
  • the guide 642g1 and the guide 642g2 correspond to the little finger of the left hand.
  • the guide 642h1 and the guide 642h2 correspond to the little finger of the right hand.
  • the guide 642i1 and the guide 642i2 correspond to the ring finger of the right hand.
  • the guide 642j1 and the guide 642j2 correspond to the middle finger of the right hand.
  • the guide 642k1 and the guide 642k2 correspond to the index finger of the right hand.
  • the guide 642l and the guide 642m correspond to the thumb of the right hand.
  • the user places the finger for acquiring biometric information along one of the above guide combinations. That is, when having the information processing apparatus 600 acquire the biometric information of the vein of the right index finger, the user positions the belly of the index finger of the right hand along the direction K between the guide 642k1 and the guide 642k2. The same applies to the other fingers, and a description thereof is omitted. Thereby, the biometric information of a finger can be acquired while the user keeps a natural and easy posture, and the information processing apparatus 600 can determine, from the direction along which the finger vein lies in the acquired image, whether the acquisition target finger belongs to the right hand or the left hand and which type of finger it is.
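  • the correspondence between guide directions and fingers can be sketched as a lookup table (a minimal sketch; the representation is an assumption, while the angles follow the directions C to L listed above):

      GUIDE_ANGLE_TO_FINGER = {
          0: ("right", "thumb"),     # direction L
          30: ("right", "index"),    # direction K
          45: ("right", "middle"),   # direction J
          60: ("right", "ring"),     # direction I
          75: ("right", "little"),   # direction H
          105: ("left", "little"),   # direction G
          120: ("left", "ring"),     # direction F
          135: ("left", "middle"),   # direction E
          150: ("left", "index"),    # direction D
          180: ("left", "thumb"),    # direction C
      }

      def finger_from_angle(angle_deg):
          """Look up which finger a detected direction angle corresponds to."""
          return GUIDE_ANGLE_TO_FINGER.get(angle_deg)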
  • the reading unit 642 may read a fingerprint.
  • the reading unit 642 is not limited to a square shape, and can have any shape.
  • in the information processing apparatus 600, whether the finger to be acquired is a finger of the right hand or of the left hand, and the kind of the finger, are determined based on the side opposite to the side where the valley portion at the base of the fingers, which is the direction feature portion, appears. Further, in the information processing apparatus 600, a biometric image of the finger vein is acquired when the user positions the acquisition target finger in a space separated from the vein sensor surface by a certain distance (for example, several centimeters).
  • FIG. 50 is a diagram illustrating a biometric feature table according to the fourth embodiment.
  • the biometric feature table 641b1 illustrated in FIG. 50 is set in the biometric feature information storage unit 641b included in the information processing apparatus 600 according to the fourth embodiment.
  • the biometric feature table 641b1 is a table for managing biometric feature information used for biometric authentication of the information processing apparatus 600.
  • the biometric feature table 641b1 is provided with “number”, “ID”, “left / right”, “finger type”, “angle”, and “feature data” as items. In the biometric feature table 641b1, values set in the above items are associated with each other as biometric feature information.
  • the angle indicates the angle of the finger when the finger vein indicated by the biometric feature information is detected.
  • the angle is associated with the type of finger. For example, with respect to the biometric feature information of the vein of the thumb of the right hand, “0”, which is a reference angle and indicates 0 ° that is an angle in the direction L, is set. For the biometric feature information of the right index finger vein, “30” indicating the direction K angle of 30 ° is set. For the biometric feature information of the middle finger vein of the right hand, “45” indicating 45 degrees that is the angle of direction J is set. For the biometric feature information of the vein of the ring finger of the right hand, “60” indicating 60 degrees that is the angle of direction I is set.
  • for the biometric feature information of the vein of the little finger of the right hand, “75”, indicating 75°, the angle of direction H, is set.
  • for the biometric feature information of the vein of the thumb of the left hand, “180”, indicating 180°, the angle of direction C, is set.
  • for the biometric feature information of the vein of the index finger of the left hand, “150”, indicating 150°, the angle of direction D, is set.
  • for the biometric feature information of the vein of the middle finger of the left hand, “135”, indicating 135°, the angle of direction E, is set.
  • for the biometric feature information of the vein of the ring finger of the left hand, “120”, indicating 120°, the angle of direction F, is set.
  • for the biometric feature information of the vein of the little finger of the left hand, “105”, indicating 105°, the angle of direction G, is set.
  • such biometric feature information can be distinguished as to which finger of which hand it represents, according to the type specified by the left/right and finger type items.
  • the biometric feature table 641b1 illustrated in FIG. 50 is an example, and any item can be set in the biometric feature table.
  • for example, since the angle is uniquely associated with the left and right of the hand and the finger type, the left and right and finger type items may be omitted when the angle item is provided; conversely, the angle item may be omitted when the left and right and finger type items are provided.
  • FIG. 51 is a diagram illustrating a state when reading the vein of the index finger of the right hand according to the fourth embodiment.
  • FIG. 51 is a diagram of the state when the information processing apparatus 600 reads the vein of the index finger of the right hand as viewed from above.
  • the information processing apparatus 600 includes a display unit 120 and a main body unit 630.
  • a keyboard 131 and a reading unit 642 are provided on the upper surface of the main body 630 of the information processing apparatus 600.
  • the reading unit 642 is on the same top surface of the main body 630 as the keyboard 131 of the information processing apparatus 600, at the front center of the keyboard 131, with each side of the square vein sensor surrounded by the guides 642c to 642m arranged at a 45° angle to the front and side surfaces of the main body 630. Also, the user's head 201, torso 202, upper right arm 203, lower right arm 204, palm 205 of the right hand, thumb 205a, index finger 205b, middle finger 205c, ring finger 205d, and little finger 205e of the right hand are shown.
  • when the user causes the reading unit 642 to read the vein of a finger of the hand, the user positions the finger for reading (for example, the index finger 205b of the right hand), as shown in FIG. 51, at an angle of 30° clockwise with respect to the front surface of the information processing apparatus 600 and parallel to the upper surface of the main body 630.
  • the user opens the right hand finger on a space separated from the vein sensor surface by a certain distance (for example, several centimeters) so that the center line of the right index finger 205b to be read coincides with the direction K. Position your finger.
  • the user does not need to bend the wrist between the palm 205 of the right hand and the right lower arm 204 at the time of reading, and can be made almost straight.
  • each finger of the user's right hand is straightened and opened sufficiently, and the four bases between the fingers of the right hand are sufficiently spaced. Therefore, the palm 205 of the right hand has no horizontal twist, and a correct image of the direction feature portion can be obtained quickly and reliably. Accordingly, the right hand or left hand can be determined quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.
  • although FIG. 51 has described the case where the vein of the index finger of the user's right hand is read, this is not limiting, and the same applies to reading the veins of the right-hand fingers other than the index finger and of the fingers of the left hand, so the description is omitted.
  • each finger of the user's right hand is located, together with the palm 205 of the right hand, in a space a certain distance away from the vein sensor surface, so erroneous operations caused by a finger or the palm touching the keyboard 131 or another operation unit can be suppressed.
  • FIGS. 52 and 53 are diagrams illustrating a finger direction determination method according to the fourth embodiment. FIGS. 52 and 53 show acquired images 6421 and 6422 at the time of reading the vein of the middle finger of the right hand.
  • Acquired images 6421 and 6422 are images acquired by the living body detection unit 142a of the reading unit 642.
  • the acquired images 6421 and 6422 are captured by, for example, an image sensor included in the living body detection unit 142a.
  • the upper side in FIGS. 52 and 53 is the back side when viewed from the front of the information processing apparatus 600
  • the lower side in FIGS. 52 and 53 is the front side when viewed from the front of the information processing apparatus 600.
  • the right side is the right side when viewed from the front of the information processing apparatus 600
  • the left side of FIGS. 52 and 53 is the left side when viewed from the front of the information processing apparatus 600.
  • FIG. 52 shows an index finger image 205b1 of the right hand of the user, an image 205c1 of the middle finger of the right hand, an image 205d1 of the ring finger of the right hand, and an image 205e1 of the little finger of the right hand.
  • FIG. 53 shows an index finger image 205b2 of the right hand of the user, an image 205c2 of the middle finger of the right hand, an image 205d2 of the ring finger of the right hand, and an image 205e2 of the little finger of the right hand.
  • in the acquired image 6421, right hand detection rectangular image regions 6421n0, 6421n1, 6421n2, 6421n3, and 6421n4 for detecting a valley portion at the base of the fingers of the palm of the right hand, and left hand detection rectangular image regions 6421n5, 6421n6, 6421n7, 6421n8, and 6421n9 for detecting a valley portion at the base of the fingers of the palm of the left hand, are set.
  • the rectangular image areas for right hand detection 6421n0 to 6421n4 and the rectangular image areas for left hand detection 6421n5 to 6421n9 are provided inside the lower two sides of the acquired image 6421.
  • the right hand detection rectangular image areas 6421n0 to 6421n4 correspond to the thumb of the right hand, the index finger of the right hand, the middle finger of the right hand, the ring finger of the right hand, and the little finger of the right hand, and also correspond to the directions L to H, respectively.
  • the left hand detection rectangular image regions 6421n5 to 6421n9 correspond to the little finger of the left hand, the ring finger of the left hand, the middle finger of the left hand, the index finger of the left hand, the thumb of the left hand, and correspond to the directions G to C, respectively.
  • the information processing apparatus 600 detects a directional feature portion when a directional feature portion that is a valley portion at the base of the finger is detected in any of the rectangular image regions 6421n0 to 6421n4 for detecting the right hand and the rectangular image regions 6421n5 to 6421n9 for detecting the left hand. It is determined that the finger corresponding to the right-hand detection rectangular image region or the left-hand detection rectangular image region in which is detected is a reading target.
  • the valley portion at the base of the finger functions as a direction feature portion.
  • the acquired images 6421 and 6422 are images acquired by an image sensor (not shown) of the living body detection unit 142a of the reading unit 642.
  • the image acquisition of the finger by the image sensor here is performed without being irradiated with near-infrared light from the light source unit 142c. Therefore, the image acquired here is not an image of a finger vein but an image of the appearance of the finger. In the detection processing of the valley portion at the base of the finger, this detection processing is performed for the position where each rectangular image region is set in the image of the appearance of the finger.
  • a directional feature portion 6421n21, which is the valley portions at the bases of the index, middle, and ring fingers of the user's right hand, exists in the right hand detection rectangular image region 6421n2 corresponding to the middle finger of the right hand.
  • the palm, with the fingers opened, is positioned in a space separated from the vein sensor surface by a certain distance (for example, several centimeters).
  • the user positions the root of the middle finger of the right hand above the right hand detection rectangular image region 6421n2 corresponding to the middle finger, based on the guides 642j1 and 642j2 corresponding to the middle finger of the right hand, and aligns the center line of the middle finger of the right hand with the direction J passing through the guide 642m at the center of the vein sensor.
  • when the information processing apparatus 600 detects, by a distance sensor (not shown) of the living body detection unit 142a, that the middle finger of the right hand is positioned at a predetermined height, it determines whether or not a valley portion at the base of the fingers exists in each of the right hand detection rectangular image regions 6421n0 to 6421n4 and the left hand detection rectangular image regions 6421n5 to 6421n9 corresponding to the fingers of both hands. At this time, a valley portion (direction feature portion 6421n21) at the base of the fingers exists in the right hand detection rectangular image region 6421n2 corresponding to the middle finger of the right hand. For this reason, the information processing apparatus 600 determines that the finger to be read is the middle finger of the right hand.
  • the detection of the valley portions at the bases of the fingers may be performed based on, for example, all or at least some combinations of the valley between the index and middle fingers, the valley between the middle and ring fingers, and the valley between the ring and little fingers being open at a predetermined interval, as in the sketch below.
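A minimal sketch of that criterion, assuming the openings between adjacent finger-base valleys are measured in pixels; the interval threshold and the key names are illustrative, not values from the text.

```python
def valleys_detected(valley_gap_px: dict[str, float],
                     min_gap_px: float = 20.0) -> bool:
    """True when the index-middle, middle-ring, and ring-little valley
    pairs are each open by at least the predetermined interval."""
    required = ("index-middle", "middle-ring", "ring-little")
    return all(valley_gap_px.get(pair, 0.0) >= min_gap_px for pair in required)
```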
  • alternatively, the detection of the valley portions at the bases of the fingers may be realized by identifying the position of the base of each finger and the contour between the finger bases in the acquired image of the palm captured by the imaging unit (not shown).
  • although the determination example for the middle finger of the right hand has been shown, the same applies to the other fingers, and the description thereof is omitted.
  • the information processing apparatus 600 prevents an erroneous determination of the finger type in the acquired image 6422 as shown in FIG. 53.
  • the direction J passes through the right hand detection rectangular image region 6422n2 corresponding to the middle finger of the right hand and the guide 642m at the center of the vein sensor.
  • the tangent line P is a tangent line of the contour of the index finger of the right hand positioned so as to pass through the guide 642m.
  • the tangent line P may be a tangent line of the contour of the target finger in the vicinity of the guide 642m.
  • the index finger image 205b2 of the right hand to be read is positioned above the reading unit 642 by the user.
  • a right hand detection rectangular image region 6422n2 corresponding to the middle finger, other right hand detection rectangular image regions (not shown), and a left hand rectangular image region (not shown) are set.
  • the information processing device 600 starts reading the finger vein.
  • the direction feature portion 6422n21 is detected in the right hand detection rectangular image region 6422n2 corresponding to the middle finger of the right hand from the image acquired by the living body detection unit 142a.
  • the information processing apparatus 600 obtains the tangent line P of the contour line of the index finger image 205b2 of the right hand to be read, and determines whether or not the intersection angle R between the tangent line P and the direction J is equal to or greater than a predetermined angle Rt.
  • when the intersection angle R is equal to or greater than the predetermined angle Rt, the position of the finger to be read is incorrect; for example, the apparatus may output a message such as “remove the palm from the sensor once and place it at the correct position”, prompt the user to correct the position of the finger to be read, and then read the finger again.
  • the message may be displayed on the LCD 121 or may be output by voice from a speaker (not shown).
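A minimal sketch of this angle test, assuming the tangent P and direction J are available as line angles in degrees; the threshold Rt is not given a value in the text, so the default below is illustrative.

```python
def finger_misaligned(tangent_p_deg: float, direction_j_deg: float,
                      threshold_rt_deg: float = 15.0) -> bool:
    """Return True when the intersection angle R between tangent P and
    direction J is at or above the threshold Rt (finger misplaced)."""
    r = abs(tangent_p_deg - direction_j_deg) % 180.0
    r = min(r, 180.0 - r)  # crossing angle of two undirected lines, 0-90 degrees
    return r >= threshold_rt_deg
```

When the function returns True, the flow described above would prompt the user to reposition the hand and retry the read.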
  • FIG. 54 is a diagram illustrating another method for determining the finger direction according to the fourth embodiment.
  • the information processing apparatus 600 according to the fourth embodiment may prevent an erroneous determination of the finger type as shown in FIG. 54. The upper side in FIG. 54 is the back side when viewed from the front of the information processing apparatus 600, the lower side is the front side, and the right side is the right side when viewed from the front of the information processing apparatus 600.
  • the left side in FIG. 54 is the left side when viewed from the front side of the information processing apparatus 600.
  • FIG. 54 shows an index finger image 205b3 of the user's right hand, a middle finger image 205c3 of the right hand, a ring finger image 205d3 of the right hand, and a little finger image 205e3 of the right hand.
  • finger position impossibility areas 6423o1 and 6423o2 in which the image of the user's finger cannot be located are set in the acquired image 6423.
  • after the information processing apparatus 600 detects the direction feature portions 6423n21 and 6423n22 in the right-hand detection rectangular image region 6423n2, the finger position impossibility regions 6423o1 and 6423o2 are formed so as to extend from the direction feature portions to the opposite sides of the acquired image 6423.
  • when a direction feature portion is detected in another right-hand detection rectangular image region (or left-hand detection rectangular image region), a finger position impossibility region is similarly formed so as to extend from that direction feature portion to the opposite side of the acquired image 6423.
  • the index finger image 205b3 of the right hand to be read is positioned above the reading unit 642 by the user.
  • a right-hand detection rectangular image region 6423n2 corresponding to the middle finger, other right-hand detection rectangular image regions (not shown), and a left-hand rectangular image region (not shown) are set in the acquired image 6423.
  • the information processing device 600 starts reading the finger vein.
  • the direction characteristic portions 6423n21 and 6423n22 are detected in the right hand detection rectangular image region 6423n2 corresponding to the middle finger of the right hand from the image acquired by the living body detection unit 142a.
  • the information processing apparatus 600 determines whether or not the finger image overlaps at least one of the finger position impossibility regions 6423o1 and 6423o2, as shown in FIG. 54. When it overlaps (in the example of FIG. 54, the image 205b3 of the right hand overlaps the finger position impossibility region 6423o1), the position of the finger to be read is incorrect; the apparatus may output a message such as “remove the palm from the sensor once and place it at the correct position”, prompt the user to correct the position of the finger to be read, and then read the finger again.
  • the message may be displayed on the LCD 121 or may be output by voice from a speaker (not shown).
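A sketch of the overlap check against the finger position impossibility regions might look as follows; treating the regions and the finger image as axis-aligned rectangles is an assumption, since the text does not fix their geometry.

```python
Rect = tuple[float, float, float, float]  # (left, top, right, bottom)

def rects_overlap(a: Rect, b: Rect) -> bool:
    """Axis-aligned rectangle intersection test."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def placement_invalid(finger_bbox: Rect, forbidden: list[Rect]) -> bool:
    """True when the finger image overlaps any finger position
    impossibility region such as 6423o1 or 6423o2."""
    return any(rects_overlap(finger_bbox, r) for r in forbidden)
```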
  • FIG. 55 is a flowchart illustrating biometric feature information acquisition and registration processing according to the fourth embodiment.
  • the biometric feature information acquisition / registration process is a process for determining the type of a finger to be registered, and generating and registering biometric feature information indicating the type of finger (left / right / finger type) and vein characteristics.
  • the biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a finger vein.
  • the process illustrated in FIG. 55 will be described in order of step number.
  • the information acquisition unit 611 determines that the finger is placed at a predetermined height on the reading unit 642 based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 642. If it is determined that a finger is placed, the information acquisition unit 611 detects a valley portion at the base of the fingers, which is a directional feature portion, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the finger to be registered based on the detected directional feature portion. In the fourth embodiment, the contour of the finger to be registered is also detected, and the direction of the finger is determined from the detected contour, thereby preventing erroneous detection of finger placement. The directional feature portion is used to determine the left and right of the hand and the finger type.
  • the type determining unit 612 determines the type of finger (the left or right hand and the finger type) based on the direction of the finger determined in step S221.
  • the information acquisition unit 611 causes the imaging unit 142b of the reading unit 642 to capture an image, and acquires a biological image in which a finger vein is reflected.
  • Step S224 The information generation unit 613 extracts features of biometric information based on the biometric image acquired in step S223.
  • the information generation unit 613 generates biometric information indicating the feature extraction result.
  • the information generation unit 613 generates biometric feature information including the type determined in step S222, the biometric information generated in step S224, and the user ID.
  • the information generation unit 613 stores the biometric feature information generated in step S225 in the biometric feature information storage unit 641b. Thereby, the biometric feature information is registered in the biometric feature information storage unit 641b. Thereafter, the process ends.
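The registration flow of FIG. 55 (steps S221 to S226) could be summarized in code roughly as below; the callables stand in for the sensor and storage interfaces, which the text does not specify, so all names here are assumptions.

```python
def register_finger_vein(detect_direction, classify, capture_vein,
                         extract_features, store, user_id: str) -> dict:
    """Sketch of steps S221-S226 of the biometric feature information
    acquisition and registration process."""
    direction = detect_direction()            # S221: valley / contour detection
    hand, finger = classify(direction)        # S222: left/right and finger type
    vein_image = capture_vein()               # S223: near-infrared biometric image
    features = extract_features(vein_image)   # S224: feature extraction
    record = {"id": user_id, "hand": hand,    # S225: biometric feature information
              "finger": finger, "features": features}
    store(record)                             # S226: register in the storage unit
    return record
```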
  • FIG. 56 is a flowchart showing a biometric feature information authentication process according to the fourth embodiment.
  • the biometric feature information authentication process is a process that determines the type of the finger to be authenticated (the left or right hand and the finger type), generates verification biometric feature information indicating the type of the finger to be authenticated and the characteristics of the vein, and verifies it against biometric feature information registered in advance.
  • the collation biometric feature information is data having the same configuration as that of the biometric feature information, and indicates the features of the authentication target user's biometric (in the fourth embodiment, the finger vein).
  • the biometric feature information authentication process is executed, for example, when the user authenticates with a finger vein. In the following, the process illustrated in FIG. 56 will be described in order of step number.
  • the information acquisition unit 611 determines that the finger is placed at a predetermined height on the reading unit 642 based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 642. If it is determined that a finger is placed, the information acquisition unit 611 detects a valley portion at the base of the fingers, which is a directional feature portion, from an image captured by the image sensor included in the living body detection unit 142a, and determines the direction of the finger to be authenticated based on the detected directional feature portion. In the fourth embodiment, the contour of the finger to be authenticated is also detected, and the direction of the finger is determined from the detected contour, thereby preventing erroneous detection of finger placement. The directional feature portion is used to determine the left and right of the hand and the finger type.
  • the type determination unit 612 determines the type of finger (left and right hand and finger type) based on the direction of the finger determined in step S231.
  • the information acquisition unit 611 causes the imaging unit 142b of the reading unit 642 to capture an image, and acquires a biological image in which a finger vein is reflected.
  • Step S234 The information generation unit 613 extracts features of biometric information based on the biometric image acquired in step S233.
  • the information generation unit 613 generates biometric information indicating the feature extraction result.
  • the information generation unit 613 generates verification biometric feature information including the type (left and right hand and finger type) determined in step S232 and the biometric information generated in step S234.
  • Step S236 The matching unit 614 executes biometric feature information matching processing (described later in FIG. 57) using the matching biometric feature information generated in step S235. Thereafter, the process ends.
  • FIG. 57 is a flowchart illustrating the biometric feature information matching process according to the fourth embodiment.
  • the biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance.
  • the biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 57 will be described in order of step number.
  • the collation unit 614 acquires collation biometric feature information generated by the biometric feature information authentication process.
  • the collation unit 614 refers to the biometric feature information storage unit 641b, and extracts biometric feature information that matches the type of collation biometric feature information (left and right hand and finger type) acquired in step S241.
  • alternatively, the matching unit 614 may extract, from the biometric feature information storage unit 641b, biometric feature information in which both the type and the user ID match those of the matching biometric feature information.
  • the collation unit 614 selects one of the biometric feature information extracted in step S242, and collates the biometric information included in each of the selected biometric feature information and the collated biometric feature information.
  • Step S244 The collation unit 614 determines whether or not the collation between the selected biometric feature information and the collation biometric feature information is successful as a result of the collation in Step S243. If the verification is successful (step S244: YES), the process proceeds to step S245. On the other hand, if the verification fails (NO in step S244), the process proceeds to step S246.
  • Step S245 The collation unit 614 executes a predetermined process when the authentication is successful. Thereafter, the process returns.
  • Step S246 The collation unit 614 determines whether all the biometric feature information extracted in Step S242 has been selected in Step S243. If all have been selected (YES in step S246), the process proceeds to step S247. On the other hand, if there is an unselected item (NO in step S246), the process proceeds to step S243.
  • Step S247 The collation unit 614 performs a predetermined process when authentication fails. Thereafter, the process returns. According to the fourth embodiment as described above, the same effects as those of the third embodiment can be obtained.
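The matching loop of FIG. 57 (steps S241 to S247) could be sketched as below, assuming each record carries its type as a (hand, finger) pair plus a feature payload, and `verify` is some feature comparator; these names are assumptions, not the patent's interface.

```python
def collate(candidate: dict, registered: list[dict], verify) -> dict | None:
    """Sketch of steps S241-S247: narrow the registered records to those
    whose type matches the candidate's, then verify them one by one."""
    pool = [r for r in registered                      # S242: extract by type
            if (r["hand"], r["finger"]) == (candidate["hand"], candidate["finger"])]
    for record in pool:                                # S243/S246: try each record
        if verify(record["features"], candidate["features"]):
            return record                              # S245: authentication success
    return None                                        # S247: authentication failure
```

Narrowing the pool by type before verifying is what keeps the 1-to-N loop short as the number of registered records grows.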
  • at the time of acquiring the biological information of the finger vein, it is possible to determine which of the ten fingers is presented according to the angle of the living body. This eliminates the need for the user to input the finger type and the left or right hand, and can suppress the burden of operations during registration. Moreover, when performing 1-to-N collation, the collation targets can be narrowed down according to the classification into the ten finger types. For this reason, it becomes possible to perform 1-to-N collation at high speed. Alternatively, it is possible to increase the number of pieces of registered biometric feature information while suppressing an increase in the time required for collation and authentication processing.
  • FIG. 58 is a block diagram illustrating an information processing apparatus according to the fifth embodiment.
  • An information processing apparatus 700 according to the fifth embodiment includes an information acquisition unit 711, a type determination unit 712, an information generation unit 713, a collation unit 714, and a biometric feature information storage unit 741b.
  • the information acquisition unit 711 is connected to a reading unit 742.
  • the information acquisition unit 711 acquires a biometric image of a person to be authenticated such as a user of the information processing apparatus 700.
  • the information acquisition unit 711 can acquire the direction of the living body in a state where the biological image is acquired.
  • the biological images acquired by the information acquisition unit 711 are palm vein pattern image information and finger vein pattern image information.
  • the direction of the palm can take two different, mutually orthogonal directions depending on the left and right of the hand.
  • the direction of the finger can likewise take two different, mutually orthogonal directions depending on the left and right of the hand.
  • the information processing apparatus 700 includes a guide that indicates the direction of the finger to be read.
  • the information acquisition unit 711 determines the direction of the living body by determining the position of the direction feature portion in the acquired image.
  • the reading unit 742 is fixed to the upper part of the information processing apparatus 700.
  • the reading unit 742 includes the living body detection unit 142a, the imaging unit 142b, and the light source unit 142c illustrated in FIG.
  • the information acquisition unit 711 determines, based on the detection result by the living body detection unit 142a of the reading unit 742, that the living body is arranged at a predetermined distance from the reading unit 742, and also determines the direction of the living body with respect to the reading unit 742.
  • the information acquisition unit 711 determines the direction of the living body by determining the position of the direction feature portion in the living body from the image obtained by the living body detection unit 142a of the reading unit 742.
  • the two directions are orthogonal to each other, and each makes an oblique angle with the keyboard 131.
  • the direction characteristic portion is a valley portion at the base of the finger.
  • the information acquisition unit 711 determines whether the direction is correct based on the tangent line of the finger outline from the image obtained by the living body detection unit 142a. The determination is not limited to this, and the determination may be made based on the center line of the contour line of the finger.
  • the information acquisition unit 711 accepts an input of a finger type in which biometric information of veins is registered by the user. In addition, the information acquisition unit 711 acquires a biological image in which a living body is captured by the imaging unit 142b of the reading unit 742.
  • the type determination unit 712 determines the type of biological information based on the direction of the biological body acquired by the information acquisition unit 711.
  • the type includes information indicating the left and right of the hand that is the basis for generating biometric information.
  • the type includes information indicating whether the information is about palm or information about a finger. As will be described in detail later, the type determination unit 712 determines whether the type of the living body is information related to the palm or information related to the finger based on the determination result of the position of the direction feature portion.
  • the information generation unit 713 generates biological information indicating the characteristics of the biological body based on the biological image acquired by the information acquisition unit 711.
  • the information generation unit 713 generates collation biometric feature information including the generated biometric information and the biometric type determined by the type determination unit 712. The collation biometric feature information thus indicates the biometric information and type of the user to be collated for authentication.
  • the information generation unit 713 generates, for example, biometric feature information including the biological information generated by the information generation unit 713, the type of the living body determined by the type determination unit 712, the type of finger received from the user, and identification information that identifies the individual corresponding to the biological information, and stores it in the biometric feature information storage unit 741b.
  • in the biometric feature information storage unit 741b, biometric feature information indicating the biometric information and type of a user who has normal authority is registered in advance and used for authentication.
  • the type includes information indicating the left and right of the hand and information indicating whether the hand is a palm or a finger.
  • the information generating unit 713 stores the generated biometric feature information in the biometric feature information storage unit 741b.
  • the matching unit 714 performs authentication using the authentication biometric feature information generated by the information generation unit 713.
  • the collation unit 714 extracts biometric feature information whose type matches the collation biometric feature information, and collates based on the biometric information of the collation biometric feature information and the extracted biometric feature information. Thereby, the information processing apparatus 700 performs biometric authentication of the user based on the collation result. Since collation is performed by limiting the collation targets to those of the same type, an increase in time and load required for the collation process can be suppressed.
  • the biometric feature information storage unit 741b stores biometric feature information indicating the biometric information and the type of the biometric information. Thereby, the user's biometric information and its type are stored in association with each other.
  • FIG. 59 is a diagram illustrating a reading unit according to the fifth embodiment.
  • a reading unit 742 illustrated in FIG. 59 is an input device that allows a user to read a finger vein or a palm vein and input a biometric image.
  • the reading unit 742 includes a vein sensor that acquires a biological image of a finger vein or a palm vein by reading a vein of the user's finger or a palm vein.
  • the reading unit 742 has a square shape and is inclined by 45 °.
  • a cross-shaped guide 742 a having sides (side A, side B) parallel to the sides of the reading unit 742 is provided on the upper surface of the reading unit 742.
  • the side A of the guide 742a is inclined 45 ° to the left when viewed from the front of the information processing apparatus 700.
  • the side B of the guide 742a is inclined 45 ° to the right when viewed from the front of the information processing apparatus 700.
  • in order to be able to acquire a biological image of a palm vein, the vein sensor of the reading unit 742 can acquire a biological image of a vein both inside and outside the guide 742a.
  • When acquiring a biological image of a palm vein, the user positions the palm from which the biological image is acquired above the upper surface of the reading unit 742, as in the information processing apparatus 100 of the second embodiment.
  • Accordingly, the user can have a biological image of the palm of the right hand or of the left hand acquired in a natural, easy posture, and the information processing apparatus 700 can determine whether the presented palm is the palm of the right hand or the palm of the left hand.
  • When acquiring a biological image of a finger vein, the user positions the finger from which the biological image is acquired along one of the cross-shaped bars of the guide 742a, as in the third embodiment. Accordingly, the user can have a biological image of a finger of the right hand or of the left hand acquired in a natural, easy posture, and the information processing apparatus 700 can determine whether the presented finger is a finger of the right hand or a finger of the left hand.
  • the reading unit 742 may read a fingerprint of a finger or a palm print of a palm. Further, the reading unit 742 is not limited to a square shape and may have any shape. In the information processing apparatus 700 according to the fifth embodiment, as will be described in detail later with reference to FIGS. 63 and 64, it is automatically determined, as in the second and third embodiments, whether the biometric information to be read is a palm vein or a finger vein and whether it belongs to the left or right hand. On the other hand, the distinction among the thumb, index finger, middle finger, ring finger, and little finger of each hand is determined based on a user input when obtaining the biometric information. Further, the information processing apparatus 700 acquires the biometric information of the palm or finger vein in response to the user positioning the palm or finger to be acquired in a space separated from the vein sensor surface by a certain distance (for example, several centimeters).
  • FIG. 60 is a diagram illustrating a biometric feature table according to the fifth embodiment.
  • the biometric feature table 741b1 illustrated in FIG. 60 is set in the biometric feature information storage unit 741b included in the information processing apparatus 700 according to the fifth embodiment.
  • the biometric feature table 741b1 is a table for managing biometric feature information used for biometric authentication of the information processing apparatus 700.
  • the biometric feature table 741b1 is provided with “number”, “ID”, “left / right”, “finger type”, “finger”, “hand”, and “feature data” as items. In the biometric feature table 741b1, values set in the above items are associated with each other as biometric feature information.
  • Finger indicates that the biometric feature information is the biometric feature information of the finger.
  • the hand indicates that the biometric feature information is the biometric feature information of the hand. For example, when the biometric feature information is the biometric feature information of the finger, “0” is set for the finger and not set for the hand (here, indicated by “-”). When the biometric feature information is the biometric feature information of the palm, “0” is set for the hand and not for the finger.
  • biometric feature table 741b1 shown in FIG. 60 is an example, and any item can be set in the biometric feature table. For example, if the finger item is omitted and a value is set for the finger type item, it may be indicated that it is the biometric feature information of the finger.
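One plausible in-memory shape for a row of a table like 741b1 is sketched below; the field names paraphrase the items listed above, and the concrete types are assumptions, since the text only names the items.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BiometricFeatureRecord:
    """One row of a table like 741b1 (fifth embodiment)."""
    number: int                  # "number"
    user_id: str                 # "ID"
    left_right: str              # "left/right": "left" or "right"
    finger_type: Optional[str]   # "finger type", e.g. "middle"; None for a palm
    is_finger: bool              # "finger": set for finger records
    is_hand: bool                # "hand": set for palm records
    feature_data: bytes          # "feature data": vein feature payload
```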
  • FIG. 61 is a diagram showing a state when reading the vein of the palm of the right hand according to the fifth embodiment.
  • FIG. 61 is a view of the state when the information processing apparatus 700 reads the vein of the palm of the right hand as viewed from above.
  • the information processing apparatus 700 includes a display unit 120 and a main body unit 730, as in the information processing apparatus 100 according to the second embodiment.
  • a keyboard 131 and a reading unit 742 are provided on the upper surface of the main body 730 of the information processing apparatus 700.
  • the reading unit 742 is on the same top surface of the main body 730 as the keyboard 131 of the information processing apparatus 700, at the front center of the keyboard 131, with each side of the square vein sensor arranged at a 45° angle to the front and side surfaces of the information processing apparatus 700.
  • the reading unit 742 includes a guide 742a, but the guide 742a is not used when reading the palm vein.
  • a user's head 201, torso 202, upper right arm 203, right lower arm 204, and right hand palm 205 are shown.
  • When the user causes the reading unit 742 to read the vein of the palm, the user positions the palm from which the vein is read (for example, the palm 205 of the right hand) tilted 45° to the left with respect to the side surface of the information processing apparatus 700 and parallel to the upper surface of the main body 730.
  • the user positions the palm 205 of the right hand, with the fingers open, in a space separated from the vein sensor surface by a certain distance (for example, several centimeters) so that the center of the palm coincides with the center of the reading unit 742.
  • the user does not have to twist the wrist between the palm 205 of the right hand and the right lower arm 204 when reading the vein of the palm, and the wrist can be kept almost straight.
  • each finger of the user's right hand is straightened and opened sufficiently, and the four spaces between the fingers of the user's right hand are sufficiently wide. Therefore, there is no horizontal twist in the palm 205 of the right hand, and a correct image can be obtained quickly and reliably. Accordingly, correct features can be detected quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.
  • although FIG. 61 illustrates the case where the vein of the palm of the user's right hand is read, the same applies when reading the vein of the palm of the left hand, and the description thereof is omitted.
  • each finger of the user's right hand, together with the palm 205 of the right hand, is located in a space separated from the vein sensor surface, so that erroneous operations caused by a finger or the palm touching the keyboard 131 or another operation unit can be suppressed.
  • FIG. 62 is a diagram illustrating a state when reading the vein of the index finger of the right hand according to the fifth embodiment.
  • FIG. 62 is a view of the state when the information processing apparatus 700 reads the vein of the right index finger viewed from above.
  • the information processing apparatus 700 includes a display unit 120 and a main body unit 730.
  • a keyboard 131 and a reading unit 742 are provided on the upper surface of the main body 730 of the information processing apparatus 700.
  • the reading unit 742 is on the same top surface of the main body 730 as the keyboard 131 of the information processing apparatus 700, at the front center of the keyboard 131, with each side of the square vein sensor having the cross-shaped guide 742a arranged at a 45° angle to the front and side surfaces of the information processing apparatus 700.
  • When the user causes the reading unit 742 to read the vein of a finger of the hand, as shown in FIG. 62, the user positions the finger from which the vein is read (for example, the index finger 205b of the right hand) tilted 45° to the left with respect to the side surface of the information processing apparatus 700 and parallel to the upper surface of the main body 730 of the information processing apparatus 700.
  • the user opens the fingers of the right hand so that the center line of the index finger 205b of the right hand to be read coincides with the center line of one bar of the cross of the reading unit 742, and positions the finger in a space separated from the vein sensor surface by a certain distance (for example, several centimeters).
  • the user does not need to twist the wrist between the palm 205 of the right hand and the right lower arm 204 when reading the finger vein, as in the case of reading the vein of the palm, and the wrist can be kept almost straight.
  • each finger of the user's right hand is straightened and opened sufficiently, and the four spaces between the fingers of the user's right hand are sufficiently wide. Therefore, there is no horizontal twist in the palm 205 of the right hand, and a correct image of the direction feature portion can be obtained quickly and reliably. Therefore, the right or left hand can be determined quickly and reliably, and biometric information registration and user authentication can be performed quickly and reliably.
  • there is no unreasonable posture of the user's right wrist, of the right lower arm 204 and right upper arm 203 extending from the right wrist, of the elbow between the right lower arm 204 and the right upper arm 203, or of the right shoulder between the right upper arm 203 and the torso 202, so the user's burden can be reduced. In FIG. 62, the case where the vein of the index finger of the user's right hand is read has been described; however, the reading is not limited to this, and the same applies to reading the veins of the fingers of the right hand other than the index finger and the fingers of the left hand, so the description thereof is omitted.
  • each finger of the user's right hand, together with the palm 205 of the right hand, is located in a space separated from the vein sensor surface by a certain distance, so that erroneous operations caused by a finger or the palm touching the keyboard 131 or another operation unit can be suppressed.
  • FIG. 63 and FIG. 64 are diagrams showing detection of the direction feature portion of the fifth embodiment.
  • FIG. 63A shows an acquired image 7421 when the vein of the palm of the right hand is read.
  • FIG. 63B shows an acquired image 7422 when the vein of the palm of the left hand is read.
  • FIG. 64A shows an acquired image 7423 when the finger vein of the right hand is read.
  • FIG. 64B shows an acquired image 7424 when the finger vein of the left hand is read.
  • Acquired images 7421 to 7424 are images acquired by the living body detection unit 142a of the reading unit 742.
  • the acquired images 7421 to 7424 are captured by, for example, an image sensor included in the living body detection unit 142a.
  • the upper side in FIGS. 63 and 64 is the back side when viewed from the front of the information processing apparatus 700
  • the lower side in FIGS. 63 and 64 is the front side when viewed from the front of the information processing apparatus 700.
  • the right side in FIGS. 63 and 64 is the right side when viewed from the front of the information processing apparatus 700
  • the left side in FIGS. 63 and 64 is the left side when viewed from the front of the information processing apparatus 700.
  • in the image acquired by the living body detection unit 142a, a right-hand detection rectangular image region 7420a for detecting the valley portions at the finger bases of the palm of the right hand is set along the upper-left side, a left-hand detection rectangular image region 7420b is set along the upper-right side, a left-finger detection rectangular image region 7420c is set along the lower-left side, and a right-finger detection rectangular image region 7420d for detecting the valley portions at the finger bases of the right hand is set along the lower-right side.
  • the right-hand detection rectangular image region 7420a and the left-hand detection rectangular image region 7420b are provided along two sides that are orthogonal to each other in the acquired image.
  • the right finger detection rectangular image region 7420d and the left finger detection rectangular image region 7420c are provided along two mutually orthogonal sides in the acquired image.
  • the valley portion at the base of the finger functions as a direction feature portion.
  • the image acquisition by the image sensor of the living body detection unit 142a is performed without irradiation with near-infrared light from the light source unit 142c. Therefore, the image acquired here is not a vein image but an image of the appearance of the palm or fingers. The detection processing for the valley portions at the bases of the fingers is performed at the positions where the rectangular image regions are set in this appearance image.
  • in the acquired image 7421, the right-hand detection rectangular image region 7420a contains a direction feature portion 7421a1, which is the valley portions at the bases of the index, middle, ring, and little fingers of the user's right hand. Accordingly, the information processing apparatus 700 can determine that the palm of the user's right hand is located above the vein sensor of the reading unit 742.
  • in the acquired image 7422, the left-hand detection rectangular image region 7420b contains a direction feature portion 7422b1, which is the valley portions at the bases of the index, middle, ring, and little fingers of the user's left hand. Accordingly, the information processing apparatus 700 can determine that the palm of the user's left hand is located above the vein sensor of the reading unit 742.
  • in the acquired image 7423, the right-finger detection rectangular image region 7420d contains the bases of three adjacent fingers among the index, middle, ring, and little fingers of the user's right hand. Accordingly, the information processing apparatus 700 can determine that a finger of the user's right hand is located above the vein sensor of the reading unit 742.
  • in the acquired image 7424, the left-finger detection rectangular image region 7420c contains the bases of three adjacent fingers among the index, middle, ring, and little fingers of the user's left hand. Accordingly, the information processing apparatus 700 can determine that a finger of the user's left hand is located above the vein sensor of the reading unit 742.
  • when the reading unit 742 detects, by a distance sensor (not shown) of the living body detection unit 142a, that the user's palm or finger is positioned at a predetermined height, it acquires an image of the palm or finger and determines, for the acquired image, whether or not a valley portion at the base of the fingers is detected in the right-hand detection rectangular image region 7420a, the left-hand detection rectangular image region 7420b, the left-finger detection rectangular image region 7420c, or the right-finger detection rectangular image region 7420d.
  • based on which of the right-hand detection rectangular image region 7420a, the left-hand detection rectangular image region 7420b, the left-finger detection rectangular image region 7420c, and the right-finger detection rectangular image region 7420d contains a direction feature portion, the reading unit 742 determines whether the acquisition target is a finger or a palm and whether it belongs to the left or right hand (the type of the acquisition target).
  • the acquired image used for detecting the direction feature portion may be an image captured by the imaging unit 142b.
  • the detection of the direction feature portion in the fifth embodiment, that is, the valley portions at the bases of the fingers, may be based on, for example, all or at least some combinations of the valley between the index and middle fingers, the valley between the middle and ring fingers, and the valley between the ring and little fingers being open at a predetermined interval.
  • alternatively, the detection of the valley portions at the bases of the fingers may be realized by identifying the position of the base of each finger and the contour between the finger bases in the palm image acquired by the living body detection unit 142a.
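The classification by region position could be sketched as follows; the mapping follows the region placement described for FIGS. 63 and 64 above, and the `valleys_found` detection results and function name are assumed inputs, not an interface from the text.

```python
# Placement of the four border regions per FIGS. 63 and 64.
REGION_TO_TARGET = {
    "7420a": ("palm", "right"),    # upper-left side
    "7420b": ("palm", "left"),     # upper-right side
    "7420c": ("finger", "left"),   # lower-left side
    "7420d": ("finger", "right"),  # lower-right side
}

def classify_acquisition_target(valleys_found: dict[str, bool]):
    """Return (palm-or-finger, left-or-right) for the region containing a
    direction feature portion, else None."""
    for region_id, target in REGION_TO_TARGET.items():
        if valleys_found.get(region_id):
            return target
    return None
```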
  • FIG. 65 is a flowchart showing the biometric feature information acquisition / registration process of the fifth embodiment.
  • the biometric feature information acquisition / registration process determines whether the living body to be registered is a finger or a palm and the left and right of the hand, accepts the user's input of the finger type in the case of a finger, and generates and registers biometric feature information indicating the palm type (left or right hand) or the finger type (left or right hand and finger type) and the vein characteristics.
  • the biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a finger or palm vein. In the following, the process illustrated in FIG. 65 will be described in order of step number.
  • the information acquisition unit 711 determines that a palm or a finger is placed at a predetermined height on the reading unit 742, based on the detection result of the distance sensor included in the living body detection unit 142a of the reading unit 742. If it is determined that a palm or a finger is placed, the information acquisition unit 711 detects a valley portion at the base of the fingers, which is a direction feature portion, from an image captured by the image sensor provided in the living body detection unit 142a, and determines the position of the detected direction feature portion. In the fifth embodiment, the direction feature portion is used to determine whether the acquisition target is a palm or a finger and the left and right of the hand. In the case of a finger, the contour of the finger to be registered may be detected and the direction of the finger determined from the detected contour, thereby preventing erroneous detection of finger placement.
  • Step S252 If the result of determination in step S251 is that the acquisition target is a finger, the information acquisition unit 711 displays a message window that requests the user to input the finger type, and accepts the user's input of the type of the finger to be acquired.
  • the type determination unit 712 determines the type of acquisition target (whether it is a finger or a palm and the left and right of the hand) based on the position of the direction feature portion determined in step S251.
  • Step S254 The information acquisition unit 711 causes the imaging unit 142b of the reading unit 742 to capture an image, and acquires a biological image in which a palm or finger vein is reflected. It should be noted that the processing order of steps S253 and S254 may be reversed or simultaneous when the palm and finger vein reading operations are the same.
  • Step S255 The information generation unit 713 extracts features of the biological information based on the biological image acquired in step S254.
  • the information generation unit 713 generates biometric information indicating the feature extraction result.
  • the information generation unit 713 includes the type determined in step S253, the type of the finger received in step S252 in the case of a finger, the biological information generated in step S255, and the user ID. Generate feature information.
  • the information generation unit 713 stores the biometric feature information generated in step S256 in the biometric feature information storage unit 741b. Thereby, the biometric feature information is registered in the biometric feature information storage unit 741b. Thereafter, the process ends.
  • FIG. 66 is a flowchart showing a biometric feature information authentication process according to the fifth embodiment.
  • the biometric feature information authentication process determines whether the living body to be authenticated is a finger or a palm and the left and right of the hand, generates collation biometric feature information indicating the palm type or the finger type and the vein characteristics, and verifies it against biometric feature information registered in advance.
  • the collation biometric feature information is data having the same configuration as the biometric feature information, and indicates the features of the user's biometric subject (in the fifth embodiment, a finger vein or a palm vein).
  • the biometric feature information authentication process is executed, for example, when the user authenticates with a finger vein or a palm vein. In the following, the process illustrated in FIG. 66 will be described in order of step number.
  • the information acquisition unit 711 determines that a palm or a finger is placed at a predetermined height on the reading unit 742, based on the detection result by the distance sensor provided in the living body detection unit 142a of the reading unit 742. If it is determined that a palm or a finger is placed, the information acquisition unit 711 detects a valley portion at the base of the fingers, which is a direction feature portion, from an image captured by the image sensor provided in the living body detection unit 142a, and determines the position of the detected direction feature portion. In the case of a finger, the contour of the finger to be authenticated may be detected and the direction of the finger determined from the detected contour, thereby preventing erroneous detection of finger placement.
  • the type determination unit 712 determines the type of the acquisition target (whether it is a finger or a palm and the left and right of the hand) based on the position of the direction feature portion determined in step S261.
  • Step S263 The information acquisition unit 711 causes the imaging unit 142b of the reading unit 742 to capture an image and acquires a biological image in which a palm or finger vein is reflected. Note that the processing order of steps S262 and S263 may be reversed or simultaneous when the palm and finger vein reading operations are the same.
  • Step S264 The information generation unit 713 extracts features of the biological information based on the biological image acquired in Step S263.
  • the information generation unit 713 generates biometric information indicating the feature extraction result.
  • the information generation unit 713 generates verification biometric feature information including the type determined in step S262 and the biometric information generated in step S264.
  • the collation unit 714 executes biometric feature information collation processing (described later in FIG. 67) using the collation biometric feature information generated in step S265. Thereafter, the process ends.
  • FIG. 67 is a flowchart showing the biometric feature information matching process according to the fifth embodiment.
  • the biometric feature information matching process is a process of matching verification biometric feature information to be authenticated with biometric feature information registered in advance.
  • the biometric feature information matching process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 67 will be described in order of step number.
  • the collation unit 714 obtains collation biometric feature information generated by the biometric feature information authentication process.
  • the collation unit 714 refers to the biometric feature information storage unit 741b and extracts biometric feature information whose type matches that of the collation biometric feature information (whether it is a finger or a palm and the left or right of the hand) acquired in step S271.
  • alternatively, the matching unit 714 may extract, from the biometric feature information storage unit 741b, biometric feature information in which both the type and the user ID match those of the matching biometric feature information.
  • the collation unit 714 selects one piece of the biometric feature information extracted in step S272, and collates the biometric information included in each of the selected biometric feature information and the collation biometric feature information.
  • Step S274 The collation unit 714 determines whether or not the collation between the selected biometric feature information and the collation biometric feature information is successful as a result of the collation in step S273. If the verification is successful (step S274: YES), the process proceeds to step S275. On the other hand, if the verification fails (step S274: NO), the process proceeds to step S276.
  • Step S275 The collation unit 714 executes a predetermined process when the authentication is successful. Thereafter, the process returns.
  • the collation unit 714 determines whether all the biometric feature information extracted in Step S272 has been selected in Step S273. If all have been selected (YES in step S276), the process proceeds to step S277. On the other hand, if there is an unselected one (step S276 NO), the process proceeds to step S273.
  • Step S277 The collation unit 714 executes predetermined processing when authentication fails. Thereafter, the process returns.
  • the fifth embodiment as described above also provides the same effects as those of the second and third embodiments.
  • it becomes possible for the user to arbitrarily select and use finger vein biometric information or palm vein biometric information for user authentication.
  • user authentication can be performed by using both biometric information of finger veins and biometric information of palm veins, so that the accuracy of authentication can be improved.
  • FIG. 68 is a block diagram illustrating an information processing apparatus according to the sixth embodiment.
  • An information processing apparatus 800 according to the sixth embodiment includes an information acquisition unit 811, a type determination unit 812, an information generation unit 813, a collation unit 814, and a biometric feature information storage unit 841b.
  • a reading unit 842 is connected to the information acquisition unit 811.
  • the information acquisition unit 811 acquires a biometric image of a person to be authenticated such as a user of the information processing apparatus 800.
  • the information acquisition unit 811 can acquire the direction of the living body in a state where the biological image is acquired.
  • the biological image acquired by the information acquisition unit 811 is image information of a palm vein pattern and image information of one finger vein pattern specified in advance.
  • the direction of the palm (the direction of the hand) is the reverse direction (180 ° rotation) based on the left and right of the hand.
  • the direction of the finger is a reverse direction (180 ° rotation) based on the left and right sides of the finger's hand.
  • the information acquisition unit 811 determines the direction of the living body by determining the position of the direction feature portion in the acquired image.
  • the reading unit 842 is fixed to the upper part of the information processing apparatus 800.
  • the reading unit 842 includes the living body detection unit 142a, the imaging unit 142b, and the light source unit 142c illustrated in FIG.
  • the information acquisition unit 811 determines that the living body is arranged at a predetermined distance from the reading unit 842 and the direction of the living body with respect to the reading unit 842 based on the detection result by the living body detection unit 142a of the reading unit 842.
  • the information acquisition unit 811 determines the direction of the living body by determining the position of the direction feature portion in the living body from the image obtained by the living body detection unit 142a of the reading unit 842.
  • the directions of the left and right hands are opposite to each other (rotated 180°), and each is parallel to the keyboard 131.
  • the direction feature portion is a valley portion at the base of the finger in the palm.
  • the information acquisition unit 811 acquires a biological image obtained by imaging a living body by the imaging unit 142b of the reading unit 842.
  • the type determination unit 812 determines the type of palm biometric information and the type of finger biometric information based on the hand direction acquired by the information acquisition unit 811. The type indicates whether the hand is left or right. In the sixth embodiment, the types of fingers are specified in advance, such as the left and right middle fingers. As will be described in detail later, the type determination unit 812 determines whether the type of generated information is information related to the palm or information related to the finger based on the determination result of the position of the direction feature portion.
  • the information generation unit 813 generates biological information indicating the characteristics of the biological body based on the biological image acquired by the information acquisition unit 811.
  • the information generation unit 813 generates collation biometric feature information including the generated biometric information and the type of biometric information determined by the type determination unit 812. The collation biometric feature information thus indicates the biometric information and type of the user to be collated for authentication.
  • the information generation unit 813 generates biometric feature information including, for example, the generated biometric information, the type of biometric information determined by the type determination unit 812, and identification information that identifies the individual corresponding to the biometric information, and stores it in the biometric feature information storage unit 841b.
  • in the biometric feature information storage unit 841b, biometric feature information indicating the biometric information and type of a user who has normal authority is registered in advance and used for authentication.
  • the information generation unit 813 stores the generated biometric feature information in the biometric feature information storage unit 841b.
  • the verification unit 814 performs authentication using the verification biometric feature information generated by the information generation unit 813.
  • the collation unit 814 extracts biometric feature information whose type matches the collation biometric feature information, and collates the palm and the finger each, based on the biometric information of the collation biometric feature information and the extracted biometric feature information. Thereby, the information processing apparatus 800 performs biometric authentication of the user based on the collation result. Since collation is performed by limiting the collation targets to those of the same type, an increase in the time and load required for the collation process can be suppressed.
  • the biometric feature information storage unit 841b stores biometric feature information indicating the biometric information of the palm acquired by the information acquiring unit 811 and the biometric information of the finger and the type of the biometric information determined by the type determining unit 812. Thereby, the biometric information of the palm of the user and the biometric information of the finger are stored in association with the left and right of the hand.
  • FIG. 69 is a diagram illustrating a biometric feature table according to the sixth embodiment.
  • the biometric feature table 841b1 illustrated in FIG. 69 is set in the biometric feature information storage unit 841b included in the information processing apparatus 800 according to the sixth embodiment.
  • the biometric feature table 841b1 is a table for managing biometric feature information used for biometric authentication of the information processing apparatus 800.
  • the biometric feature table 841b1 is provided with “number”, “ID”, “left / right”, and “feature data” as items.
  • the feature data has “hand” and “finger 3” as items.
  • values set in the above items are associated with each other as biometric feature information.
  • the hand item indicates the file name of the biometric information indicating the characteristics of the vein of the palm of the right hand or of the left hand.
  • the finger 3 indicates a file name of biometric information indicating the characteristics of either the middle finger vein of the right hand or the middle finger vein of the left hand. Whether the biometric information of the biometric feature information is right-handed or left-handed is indicated by the left and right items.
  • The biometric feature table 841b1 shown in FIG. 69 is an example; arbitrary items can be set in the biometric feature table.
  • In the sixth embodiment, feature data of the middle finger vein is used, but feature data of another finger's vein may be used instead. (A sketch of how such a table might be represented follows.)
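  • As a minimal sketch, the rows of such a table could be represented as records like the following (Python; the field names and file names are hypothetical, chosen only to mirror the items "number", "ID", "left/right", "hand", and "finger 3" described above):

      # Hypothetical representation of two rows of biometric feature table 841b1.
      # The .dat file names are illustrative placeholders, not actual formats.
      biometric_feature_table = [
          {"number": 1, "id": "user01", "side": "right",
           "feature_data": {"hand": "user01_right_palm.dat",
                            "finger3": "user01_right_middle.dat"}},
          {"number": 2, "id": "user01", "side": "left",
           "feature_data": {"hand": "user01_left_palm.dat",
                            "finger3": "user01_left_middle.dat"}},
      ]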
  • FIG. 70 is a diagram illustrating a state when reading the palm of the right hand and the vein of the middle finger according to the sixth embodiment.
  • FIG. 70 is a diagram of the state when the information processing apparatus 800 reads the palm of the right hand and the vein of the middle finger as viewed from above.
  • the information processing apparatus 800 includes a display unit 120 and a main body unit 830.
  • a keyboard 131 and reading units 8425 and 8426 are provided on the upper surface of the main body 830 of the information processing apparatus 800.
  • the reading units 8425 and 8426 include a living body detection unit 142a, an imaging unit 142b, and a light source unit 142c shown in FIG.
  • The reading units 8425 and 8426 are on the same top surface of the main body 830 as the keyboard 131 of the information processing apparatus 800, arranged side by side in front of the keyboard 131, with each side of their square vein sensors (imaging units 142b) parallel to the front and side surfaces of the apparatus 800.
  • The imaging unit 142b of the reading unit 8425 can read the vein of the middle finger of the right hand or the vein of the palm of the left hand.
  • The imaging unit 142b of the reading unit 8426 can read the vein of the middle finger of the left hand or the vein of the palm of the right hand.
  • FIG. 70 also shows the user's head 201, torso 202, right upper arm 203, right lower arm 204, palm 205 of the right hand, and middle finger 205c of the right hand.
  • Each vein sensor in the reading units 8425 and 8426 in FIG. 70 is arranged so that its main scanning direction makes an angle of 90° with the direction in which the front surface 830a of the main body 830 extends.
  • Each vein sensor may also be arranged so that its main scanning direction forms some other intersecting angle with, for example, the direction in which the front surface 830a of the main body 830 extends, the arrangement direction of the operation keys on the keyboard 131 (the longitudinal direction of the keyboard 131), or the main scanning direction of the LCD 121.
  • When the user causes the reading unit 8425 or 8426 to read the palm vein and the vein of the finger specified in advance, the user positions the palm to be read (for example, the palm 205 of the right hand) and the finger (for example, the middle finger 205c of the right hand) parallel to the front surface of the information processing apparatus 800 and parallel to the upper surface of the main body 830, as shown in FIG. 70.
  • For the right hand, the user positions the hand so that the center of the palm 205 coincides with the center of the reading unit 8426 and the center line of the middle finger 205c coincides with the horizontal center line of the reading unit 8425.
  • With the fingers of the right hand open, the user positions the palm 205 so that fingers other than the middle finger 205c to be read do not cover the vicinity of the horizontal center line of the reading unit 8425.
  • The user holds the palm 205 and middle finger 205c of the right hand in the space separated from the vein sensor surfaces of the reading units 8426 and 8425 by a certain distance (for example, several centimeters). The wrist between the palm 205 and the right lower arm 204 need not be bent at the time of reading and can be kept almost straight.
  • In this posture, each finger of the user's right hand is straightened and sufficiently opened, and the four bases between the fingers are sufficiently spaced. There is therefore no horizontal twist in the palm 205 of the right hand, and a correct image can be obtained quickly and reliably. Accordingly, correct features can be detected quickly and reliably, and registration of biometric information and user authentication can likewise be performed quickly and reliably.
  • When the bases between the fingers are detected in a detection rectangular area (not shown) set on the left side of the image acquired by the image sensor of the living body detection unit 142a in the reading unit 8426, over which the palm 205 of the right hand is held, it can be determined quickly and reliably that the right hand has been placed.
  • This detection rectangular area is set on the left side of the acquired image, for example as in the right-hand detection rectangular image area 7420a shown in FIGS.
  • There is no unreasonable posture at the user's right wrist, along the right lower arm 204 and right upper arm 203 extending from the wrist, at the elbow between the right lower arm 204 and right upper arm 203, or at the right shoulder between the right upper arm 203 and the torso 202, so the user's burden can be reduced.
  • Similarly, for the left hand, the user positions the hand so that the center of the palm coincides with the center of the reading unit 8425 and the center line of the middle finger coincides with the horizontal center line of the reading unit 8426.
  • When the bases between the fingers are detected in a detection rectangular area (not shown) set on the right side of the image acquired by the image sensor of the living body detection unit 142a in the reading unit 8425, it is determined that the left hand has been placed.
  • This detection rectangular area is set on the right side of the acquired image, for example as in the left-hand detection rectangular image area 7420b shown in FIGS.
  • FIG. 71 is a flowchart illustrating biometric feature information acquisition registration processing according to the sixth embodiment.
  • The biometric feature information acquisition/registration process determines whether the palm and finger of the living body to be registered belong to the left or right hand, and generates and registers biometric feature information indicating the type (left or right hand) and the vein features of the palm and finger.
  • the biometric feature information acquisition / registration process is executed, for example, when the user registers biometric feature information of a finger and a palm vein.
  • the process illustrated in FIG. 71 will be described in order of step number.
  • Step S281: The information acquisition unit 811 determines, based on the detection results of the living body detection units 142a of the reading units 8425 and 8426, that a palm or finger is placed at the predetermined height above the reading units. If so, the information acquisition unit 811 detects the valley at the base of the fingers, which is the direction feature portion, from the image captured by the image sensor of each living body detection unit 142a, and determines the position of the detected direction feature portion.
  • Since the finger to be read is specified in advance as a predetermined finger (for example, the middle finger), the direction feature portion is used here to determine whether the acquisition target is the left or right hand.
  • The information acquisition unit 811 determines that the hand to be read is the right hand when the direction feature portion is detected in the detection rectangular area set on the left side (not shown) of the image acquired by the reading unit 8426. Conversely, when the direction feature portion is detected in the detection rectangular area set on the right side (not shown) of the image acquired by the reading unit 8425, it determines that the hand to be read is the left hand.
  • Step S282: The type determination unit 812 determines the type (left or right) of the acquisition target based on the position of the direction feature portion determined in step S281.
  • Step S283: The information acquisition unit 811 causes the imaging units 142b of the reading units 8425 and 8426 to capture images, and acquires a biological image showing the palm and finger veins.
  • Step S284: The information generation unit 813 extracts features of the biometric information from the biological image acquired in step S283.
  • The information generation unit 813 then generates biometric information indicating the feature extraction result.
  • Step S285: The information generation unit 813 generates biometric feature information including the type determined in step S282, the biometric information generated in step S284, and the user's ID.
  • Step S286: The information generation unit 813 stores the biometric feature information generated in step S285 in the biometric feature information storage unit 841b, whereby the biometric feature information is registered. The process then ends. (A sketch of this flow follows.)
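  • The registration flow of steps S281 to S286 can be summarized as in the following sketch (Python; detect_direction_feature, capture_vein_images, and extract_features are hypothetical helper names, not components of the apparatus):

      def acquire_and_register(reading_units, storage, user_id):
          # S281: wait for a palm/finger at the prescribed height, then locate
          # the valley at the finger bases (the direction feature portion).
          feature_position = detect_direction_feature(reading_units)
          # S282: the side on which the feature appears decides left or right.
          side = "right" if feature_position == "left_of_unit_8426" else "left"
          # S283: capture the biological image showing palm and finger veins.
          image = capture_vein_images(reading_units)
          # S284: extract vein features and generate biometric information.
          biometric_info = extract_features(image)
          # S285: bundle the type, the biometric information, and the user ID.
          record = {"side": side, "id": user_id, "info": biometric_info}
          # S286: register the biometric feature information.
          storage.append(record)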
  • FIG. 72 is a flowchart showing a biometric feature information authentication process according to the sixth embodiment.
  • The biometric feature information authentication process determines whether the palm and finger of the living body to be authenticated belong to the left or right hand, generates collation biometric feature information indicating the type (left or right hand) and vein features of the palm and finger, and collates it against biometric feature information registered in advance.
  • The collation biometric feature information is data having the same structure as the biometric feature information and indicates the features of the user's biometric targets (in the sixth embodiment, the finger veins and palm veins).
  • The biometric feature information authentication process is executed, for example, when the user performs authentication with a finger vein and a palm vein. In the following, the process illustrated in FIG. 72 will be described in order of step number.
  • Step S291: The information acquisition unit 811 determines, based on the detection results of the living body detection units 142a of the reading units 8425 and 8426, that a palm or finger is placed at the predetermined height above the reading units. If so, it detects the valley at the base of the fingers, which is the direction feature portion, from the image captured by the image sensor of each living body detection unit 142a, and determines its position.
  • As in the registration process, the finger to be read is specified in advance (for example, the middle finger), and the direction feature portion is used to determine whether the acquisition target is the left or right hand.
  • The information acquisition unit 811 determines that the hand to be read is the right hand when the direction feature portion is detected in the detection rectangular area set on the left side (not shown) of the image acquired by the reading unit 8426, and the left hand when it is detected in the detection rectangular area set on the right side (not shown) of the image acquired by the reading unit 8425.
  • Step S292: The type determination unit 812 determines the type (left or right) of the acquisition target based on the position of the direction feature portion determined in step S291.
  • Step S293: The information acquisition unit 811 causes the imaging units 142b of the reading units 8425 and 8426 to capture images, and acquires a biological image showing the palm and finger veins.
  • Step S294: The information generation unit 813 extracts features of the biometric information from the biological image acquired in step S293.
  • The information generation unit 813 then generates biometric information indicating the feature extraction result.
  • Step S295: The information generation unit 813 generates collation biometric feature information including the type determined in step S292 and the biometric information generated in step S294.
  • Step S296: The collation unit 814 executes the biometric feature information collation process (described later with FIG. 73) using the collation biometric feature information generated in step S295. The process then ends. (A sketch of this flow follows.)
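  • Under the same assumptions as the registration sketch above, steps S291 to S296 differ only in that no user ID is bundled and the result is handed to the collation process sketched later:

      def authenticate(reading_units, storage):
          feature_position = detect_direction_feature(reading_units)             # S291
          side = "right" if feature_position == "left_of_unit_8426" else "left"  # S292
          image = capture_vein_images(reading_units)                             # S293
          biometric_info = extract_features(image)                               # S294
          collation_info = {"side": side, "info": biometric_info}                # S295
          return collate(collation_info, storage)                                # S296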
  • FIG. 73 is a flowchart showing the biometric feature information matching process according to the sixth embodiment.
  • The biometric feature information collation process collates the collation biometric feature information of the person to be authenticated against biometric feature information registered in advance.
  • The biometric feature information collation process is executed, for example, when called by the biometric feature information authentication process. In the following, the process illustrated in FIG. 73 will be described in order of step number.
  • Step S301: The collation unit 814 acquires the collation biometric feature information generated by the biometric feature information authentication process.
  • Step S302: The collation unit 814 refers to the biometric feature information storage unit 841b and extracts the biometric feature information whose type (left or right hand of the palm and finger) matches that of the collation biometric feature information acquired in step S301.
  • Alternatively, the collation unit 814 may extract from the biometric feature information storage unit 841b the biometric feature information that matches both the type and the user ID of the collation biometric feature information.
  • Step S303: The collation unit 814 selects one piece of the biometric feature information extracted in step S302 that has not yet been selected, and collates the biometric information contained in the selected biometric feature information against that of the collation biometric feature information.
  • When biometric feature information and collation biometric feature information are collated, both the palm veins and the finger veins are collated.
  • Step S304: Based on the result of the collation in step S303, the collation unit 814 determines whether the collation between the selected biometric feature information and the collation biometric feature information has succeeded. If the collation succeeds (YES in step S304), the process proceeds to step S305; if it fails (NO in step S304), the process proceeds to step S306. For example, the collation may be judged successful only when both the palm vein and the finger vein match, and unsuccessful otherwise. When the collation results are evaluated numerically or in grades, the individual results may be judged comprehensively.
  • Step S305: The collation unit 814 executes a predetermined process for successful authentication. The process then returns.
  • Step S306: The collation unit 814 determines whether all the biometric feature information extracted in step S302 has been selected in step S303. If all of it has been selected (YES in step S306), the process proceeds to step S307; if any remains unselected (NO in step S306), the process returns to step S303.
  • Step S307: The collation unit 814 executes a predetermined process for failed authentication. The process then returns. (A sketch of this collation loop follows.)
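  • A sketch of the collation loop of steps S301 to S307 (veins_match, on_success, and on_failure are hypothetical stand-ins for the actual vein comparison and the predetermined success/failure processes):

      def collate(collation_info, storage):
          # S301-S302: narrow the candidates to entries of the same type;
          # the user ID can additionally be required, as noted above.
          candidates = [r for r in storage if r["side"] == collation_info["side"]]
          # S303-S306: collate palm and finger veins against each candidate.
          for record in candidates:
              palm_ok = veins_match(record["info"]["palm"],
                                    collation_info["info"]["palm"])
              finger_ok = veins_match(record["info"]["finger"],
                                      collation_info["info"]["finger"])
              if palm_ok and finger_ok:       # e.g. require both to match
                  return on_success(record)   # S305: authentication succeeded
          return on_failure()                 # S307: no candidate matched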
  • In the sixth embodiment, the type of the finger to be read is specified in advance (the middle finger of the right or left hand).
  • However, the embodiment is not limited to this; the information processing apparatus 800 may instead determine the finger type from the direction of the finger itself. Any finger other than the middle finger may also be specified in advance as the reading target.
  • According to the sixth embodiment, placing the hand once yields the combined effects of the second embodiment and the third embodiment.
  • Since user authentication uses the biometric information of both the finger veins and the palm veins, the accuracy of authentication can be improved.
  • In the sixth embodiment, it was determined that the right hand was placed from the image acquired by the image sensor of the reading unit 8426, and that the left hand was placed from the image acquired by the image sensor of the reading unit 8425. As another example, however, the direction of the hand may be determined from only one of the reading units 8425 and 8426.
  • For example, the information acquisition unit 811 sets the detection rectangular area on the left side of the image acquired from the image sensor of the reading unit 8426.
  • When valleys at the bases of the four fingers of the right hand appear in the detection rectangular area as the right-hand direction feature, as in the right-hand detection rectangular image region 7420a of FIG. 63A, the information acquisition unit 811 determines that the right hand is placed.
  • When valleys at the bases of three fingers of the left hand appear instead, as in the left-finger detection rectangular image region 7420c, the information acquisition unit 811 determines that the left hand is placed.
  • When the direction of the hand is determined from only one of the reading units 8425 and 8426, an image sensor needs to be provided in only one of them, which reduces the manufacturing cost.
  • Conversely, the information acquisition unit 811 may always determine the direction of the hand based on the images acquired from both image sensors of the reading units 8425 and 8426. For example, when the valleys at the bases of the four fingers of the right hand are detected in the detection rectangular area set on the left side of the image acquired from the image sensor of the reading unit 8426, and the valleys at the bases of three fingers of the right hand are detected in the detection rectangular area set on the right side of the image acquired from the image sensor of the reading unit 8425, it is determined that the right hand is placed. This increases the accuracy of the hand-direction determination. (A sketch of the single-sensor variant follows.)
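  • The single-sensor variant could look like the following sketch (count_finger_base_valleys is a hypothetical helper that counts valley features inside the left-side detection rectangle, as in the region 7420a of FIG. 63A):

      def determine_hand(image_from_unit_8426):
          # Count finger-base valleys in the detection rectangle on the left
          # side of the acquired image.
          n = count_finger_base_valleys(image_from_unit_8426, region="left")
          if n == 4:
              return "right"  # the palm of the right hand is over unit 8426
          if n == 3:
              return "left"   # left-hand fingers lie over unit 8426
          return None         # hand not yet placed correctly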
  • The seventh embodiment differs from the sixth embodiment in that the information processing apparatus reads the biometric information of the palm vein and the biometric information of the veins of a plurality of specified fingers simultaneously.
  • FIG. 74 is a block diagram illustrating an information processing apparatus according to the seventh embodiment.
  • the information processing apparatus 900 according to the seventh embodiment includes an information acquisition unit 911, a type determination unit 912, an information generation unit 913, a collation unit 914, and a biometric feature information storage unit 941b.
  • a reading unit 942 is connected to the information acquisition unit 911.
  • the information acquisition unit 911 acquires a biometric image of a person to be authenticated such as a user of the information processing apparatus 900.
  • The information acquisition unit 911 can also acquire the direction of the living body at the time the biological image is acquired.
  • the biological image acquired by the information acquisition unit 911 is image information of a palm vein pattern and image information of a plurality of finger vein patterns.
  • For the palm, the direction of the living body (the direction of the hand) is one of two opposite directions (differing by a 180° rotation), depending on whether the hand is the left or the right.
  • For the fingers, the direction combines this 180° rotation with the direction of each individual finger on that hand.
  • The information acquisition unit 911 determines the direction of the living body by determining the position of the direction feature portion in the acquired image.
  • the reading unit 942 is fixed to the upper part of the information processing apparatus 900.
  • the reading unit 942 includes the living body detection unit 142a, the imaging unit 142b, and the light source unit 142c illustrated in FIG.
  • Based on the detection result of the living body detection unit 142a of the reading unit 942, the information acquisition unit 911 determines that the living body is positioned at a predetermined distance from the reading unit 942, and determines the direction of the living body with respect to the reading unit 942.
  • Specifically, the information acquisition unit 911 determines the direction of the living body by determining the position of the direction feature portion in the living body from the image obtained by the living body detection unit 142a of the reading unit 942.
  • Here, the left and right hands face in opposite directions (rotated 180° from each other), and both lie parallel to the keyboard 131.
  • the direction feature portion is a valley portion at the base of the finger in the palm.
  • the information acquisition unit 911 obtains a contour image of a finger from the image obtained by the living body detection unit 142a of the reading unit 942.
  • the information acquisition unit 911 acquires a biological image obtained by imaging a living body by the imaging unit 142b of the reading unit 942.
  • The type determination unit 912 determines the type of the palm and finger biometric information (left or right hand) based on the hand direction acquired by the information acquisition unit 911.
  • The type determination unit 912 also determines the type of each finger from the finger contour images acquired by the information acquisition unit 911. For example, in the case of the right hand viewed from above the reading unit 942, the finger extending from the base to the left of the center of the middle finger is determined to be the index finger, and the finger extending from the base to the right is determined to be the ring finger.
  • Furthermore, the type determination unit 912 determines whether generated information relates to the palm or to a finger based on the determined position of the direction feature portion. (A sketch of the finger-type rule follows.)
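  • A sketch of this finger-type rule, assuming each detected finger contour carries the horizontal position of its base in the acquired image (an assumption made only for illustration):

      def classify_three_fingers(contours, side):
          # Sort the three finger contours left to right by base position.
          ordered = sorted(contours, key=lambda c: c["base_x"])
          # For the right hand viewed from above the reading unit, the finger
          # left of the middle finger is the index finger and the finger to
          # its right is the ring finger; the left hand is the mirror image.
          if side == "right":
              names = ["index", "middle", "ring"]
          else:
              names = ["ring", "middle", "index"]
          return dict(zip(names, ordered))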
  • the information generation unit 913 generates biological information indicating the characteristics of the biological body based on the biological image acquired by the information acquisition unit 911.
  • The information generation unit 913 generates collation biometric feature information that includes the generated biometric information and the type of biometric information determined by the type determination unit 912. This represents the biometric information and type of the user to be collated for authentication.
  • When registering, the information generation unit 913 generates biometric feature information including, for example, the generated biometric information, the type of biometric information determined by the type determination unit 912, and identification information that identifies the individual to whom the biometric information corresponds, and stores it in the biometric feature information storage unit 941b.
  • In the biometric feature information storage unit 941b, biometric feature information indicating the biometric information and type of users who have legitimate authority is registered in advance and used for authentication.
  • Here, the types are the left/right hand type and the finger type.
  • The information generation unit 913 stores the generated biometric feature information in the biometric feature information storage unit 941b.
  • The collation unit 914 performs authentication using the collation biometric feature information generated by the information generation unit 913.
  • Specifically, the collation unit 914 extracts the biometric feature information whose type matches that of the collation biometric feature information, and collates based on the biometric information of the collation biometric feature information and the extracted biometric feature information. The information processing apparatus 900 performs biometric authentication of the user based on the collation result. Since the collation targets are limited to entries of the same type, the increase in time and load required for the collation process can be suppressed.
  • The biometric feature information storage unit 941b stores biometric feature information indicating the palm and finger biometric information acquired by the information acquisition unit 911 and the type of biometric information determined by the type determination unit 912. The user's palm and finger biometric information is thereby stored in association with the left or right hand and the finger types.
  • FIG. 75 is a diagram illustrating a biometric feature table according to the seventh embodiment.
  • the biometric feature table 941b1 illustrated in FIG. 75 is set in the biometric feature information storage unit 941b included in the information processing apparatus 900 according to the seventh embodiment.
  • the biometric feature table 941b1 is a table for managing biometric feature information used for biometric authentication of the information processing apparatus 900.
  • the biometric feature table 941b1 is provided with “number”, “ID”, “left / right”, and “feature data” as items.
  • the feature data includes “hand”, “finger 2”, “finger 3”, and “finger 4” as items.
  • values set in the above items are associated with each other as biometric feature information.
  • The "finger 2" item indicates the file name of the biometric information representing the features of either the right index finger vein or the left index finger vein.
  • The "finger 3" item indicates the file name of the biometric information representing the features of either the right middle finger vein or the left middle finger vein, as in the sixth embodiment.
  • The "finger 4" item indicates the file name of the biometric information representing the features of either the right ring finger vein or the left ring finger vein.
  • The biometric feature table 941b1 shown in FIG. 75 is an example; arbitrary items can be set in the biometric feature table.
  • In the seventh embodiment, feature data of the veins of the index, middle, and ring fingers are used, but feature data of the veins of other fingers, or any combination of these fingers, may be used instead. (A sketch of one such row follows.)
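  • Extending the earlier table sketch, one row of the biometric feature table 941b1 might then carry three finger-vein files per hand (field and file names remain illustrative placeholders):

      # Hypothetical row of biometric feature table 941b1.
      record = {"number": 1, "id": "user01", "side": "right",
                "feature_data": {"hand":    "user01_right_palm.dat",
                                 "finger2": "user01_right_index.dat",
                                 "finger3": "user01_right_middle.dat",
                                 "finger4": "user01_right_ring.dat"}}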
  • FIG. 76 is a diagram illustrating the state in which the palm and finger veins of the right hand are read according to the seventh embodiment.
  • FIG. 76 shows the information processing apparatus 900 reading the palm of the right hand and the veins of the index, middle, and ring fingers, viewed from above.
  • the information processing apparatus 900 includes a display unit 120 and a main body unit 930.
  • a keyboard 131 and reading units 9425 and 9426 are provided on the upper surface of the main body 930 of the information processing apparatus 900.
  • the reading units 9425 and 9426 include a living body detection unit 142a, an imaging unit 142b, and a light source unit 142c shown in FIG.
  • The reading units 9425 and 9426 are on the same top surface of the main body 930 as the keyboard 131 of the information processing apparatus 900, arranged side by side in front of the keyboard 131, with each side of their vertically long rectangular vein sensors (imaging units 142b) parallel to the front and side surfaces of the information processing apparatus 900.
  • The imaging unit 142b of the reading unit 9425 can read the veins of the index, middle, and ring fingers of the right hand, or the palm vein of the left hand.
  • The imaging unit 142b of the reading unit 9426 can read the veins of the index, middle, and ring fingers of the left hand, or the palm vein of the right hand. FIG. 76 also shows the user's head 201, torso 202, right upper arm 203, right lower arm 204, palm 205 of the right hand, thumb 205a, index finger 205b, middle finger 205c, ring finger 205d, and little finger 205e of the right hand.
  • Each vein sensor in the reading units 9425 and 9426 in FIG. 76 is arranged so that its main scanning direction makes an angle of 90° with the direction in which the front surface 930a of the main body 930 extends.
  • Each vein sensor may also be arranged so that its main scanning direction forms some other intersecting angle with, for example, the direction in which the front surface 930a of the main body 930 extends, the arrangement direction of the operation keys on the keyboard 131 (the longitudinal direction of the keyboard 131), or the main scanning direction of the LCD 121.
  • When the user causes the reading units 9425 and 9426 to read the palm vein and the veins of the plurality of fingers specified in advance, the user positions the palm to be read (for example, the palm 205 of the right hand) and the fingers (for example, the index finger 205b, middle finger 205c, and ring finger 205d of the right hand) parallel to the front surface of the information processing apparatus 900 and parallel to the upper surface of the main body 930, as shown in FIG. 76.
  • For the right hand, the user positions the hand so that the center of the palm 205 coincides with the center of the reading unit 9426 and the center line of the middle finger 205c coincides with the horizontal center line of the reading unit 9425.
  • With the fingers of the right hand open, the user positions the hand so that the fingers other than the index finger 205b, middle finger 205c, and ring finger 205d (that is, the thumb 205a and the little finger 205e) are not above the reading range of the reading unit 9425.
  • The user positions the palm 205, index finger 205b, middle finger 205c, and ring finger 205d of the right hand over the reading ranges of the reading units 9426 and 9425, in the space separated from each vein sensor surface by a certain distance (for example, several centimeters).
  • The wrist between the palm 205 of the right hand and the right lower arm 204 need not be bent at the time of reading and can be kept almost straight.
  • In this posture, each finger of the user's right hand is straightened and sufficiently opened, and the four bases between the fingers are sufficiently spaced. There is therefore no horizontal twist in the palm 205 of the right hand, and a correct image can be obtained quickly and reliably. Accordingly, correct features can be detected quickly and reliably, and registration of biometric information and user authentication can likewise be performed quickly and reliably.
  • When the bases between the fingers are detected in a detection rectangular area (not shown) set on the left side of the image acquired by the image sensor of the living body detection unit 142a in the reading unit 9426, over which the palm 205 of the right hand is held, it can be determined quickly and reliably that the right hand has been placed.
  • This detection rectangular area is set on the left side of the acquired image, for example as in the right-hand detection rectangular image area 7420a shown in FIGS.
  • There is no unreasonable posture at the user's right wrist, along the right lower arm 204 and right upper arm 203 extending from the wrist, at the elbow between the right lower arm 204 and right upper arm 203, or at the right shoulder between the right upper arm 203 and the torso 202, so the user's burden can be reduced.
  • Similarly, for the left hand, the user positions the hand so that the center of the palm coincides with the center of the reading unit 9425 and the center line of the middle finger coincides with the horizontal center line of the reading unit 9426.
  • When the bases between the fingers are detected in a detection rectangular area (not shown) set on the right side of the image acquired by the image sensor of the living body detection unit 142a in the reading unit 9425, it is determined that the left hand has been placed.
  • This detection rectangular area is set on the right side of the acquired image, for example as in the left-hand detection rectangular image area 7420b shown in FIGS.
  • As in the sixth embodiment, the information acquisition unit 911 may determine the direction of the hand from only one of the reading units 9425 and 9426, or may always determine it based on the images acquired from both image sensors of the reading units 9425 and 9426.
  • The acquired image used for detecting the direction feature portion may also be an image captured by the imaging unit 142b.
  • In the seventh embodiment, the types of the fingers to be read are specified in advance (the index, middle, and ring fingers of the right or left hand).
  • However, the embodiment is not limited to this; the user may input the types of the fingers to be read, or the information processing apparatus 900 may determine them from the direction of each finger itself.
  • Any combination of fingers other than the index, middle, and ring fingers may also be specified in advance as the reading target.
  • According to the seventh embodiment, the same effects as in the sixth embodiment can be obtained.
  • In addition, since authentication uses the biometric information of a plurality of finger veins together with the biometric information of the palm vein, the accuracy of authentication can be improved further.
  • Furthermore, from the biometric information of a finger vein, it is possible to determine which of the six finger types (three fingers on each of the left and right hands) the finger is, according to the angle of the finger.
  • This eliminates the need for the user to input the finger type among the six, as well as the left or right hand, and reduces the operational burden during authentication and registration.
  • In one-to-N collation, the collation targets can be narrowed down by classifying over the six finger types, so one-to-N collation can be performed at high speed.
  • The eighth embodiment differs from the second embodiment in that the information processing apparatus is an automatic transaction apparatus, namely an automatic teller machine, such as one in a bank, that accepts deposits and pays out money.
  • FIG. 77 is a diagram showing the appearance of the automatic transaction apparatus according to the eighth embodiment.
  • the automatic transaction apparatus 1000 includes an operation screen 1081, a bill input / output unit 1082a, a coin input / output unit 1082b, a passbook receiving unit 1083, a card receiving unit 1084, a receipt issuing unit 1085, a reading unit 1086, and a speaker 1087.
  • The operation screen 1081 has a display screen that shows images indicating the contents of transactions and messages guiding the user of the automatic transaction apparatus 1000, and a touch panel that receives the user's input.
  • The banknote deposit/withdrawal unit 1082a receives and dispenses banknotes when accepting the user's deposits and when paying out the user's money.
  • The coin deposit/withdrawal unit 1082b receives and dispenses coins when accepting the user's deposits and when paying out the user's money.
  • The passbook accepting unit 1083 accepts a passbook when the user makes a deposit, receives a payout, or wishes to update the passbook.
  • The card accepting unit 1084 accepts a cash card or the like when the user uses the apparatus.
  • The receipt issuing unit 1085 issues a receipt recording the transaction details when the user uses the apparatus.
  • The reading unit 1086 reads the biometric information of the vein of the user's palm, generates collation biometric feature information, and collates it against registered biometric feature information to authenticate the user.
  • The reading unit 1086 is provided on the right side of the automatic transaction apparatus 1000 as viewed from the front. The user can therefore position the left and right palms in different directions when having the palm veins read by the automatic transaction apparatus 1000.
  • the biometric feature information may be managed by a server as in the first modification of the second embodiment.
  • the automatic transaction apparatus 1000 performs authentication using the reading result obtained by the reading unit 1086.
  • the speaker 1087 outputs a voice guidance and a warning sound for guiding the transaction status and operation to the user.
  • In the eighth embodiment, the reading unit 1086 is provided on the right side of the automatic transaction apparatus 1000 as viewed from the front, but it is not limited to this position and may be provided on the left side or in the center of the apparatus. The reading unit 1086 reads the vein of the user's palm, but it is not limited to this and may read finger veins, fingerprints, or palm prints, or a combination of these.
  • The automatic transaction apparatus 1000 is not limited to an automatic teller machine and may be any apparatus that performs biometric authentication of its user, such as a vending machine.
  • The automatic transaction apparatus 1000 provides the same effects as the second embodiment.
  • the above processing functions can be realized by a computer.
  • a program describing the processing contents of the functions that the information processing apparatuses 100 to 900 and the automatic transaction apparatus 1000 should have is provided.
  • the program describing the processing contents can be recorded on a computer-readable recording medium.
  • Computer-readable recording media include magnetic storage devices, optical discs, magneto-optical recording media, and semiconductor memories.
  • Magnetic storage devices include hard disk devices (HDD), flexible disks (FD), magnetic tapes, and the like.
  • Examples of the optical disc include a DVD (Digital Versatile Disc), a DVD-RAM, and a CD-ROM / RW (Rewritable).
  • Magneto-optical recording media include MO (Magneto-Optical disk).
  • the computer that executes the program stores, for example, the program recorded on the portable recording medium or the program transferred from the server computer in its own storage device. Then, the computer reads the program from its own storage device and executes processing according to the program. The computer can also read the program directly from the portable recording medium and execute processing according to the program. In addition, each time a program is transferred from a server computer connected via a network, the computer can sequentially execute processing according to the received program.
  • At least part of the processing functions described above can also be realized by an electronic circuit such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or a PLD (Programmable Logic Device).


Abstract

The present invention enables an operator to have hand information read in a natural posture. A sensor (11) reads hand information by scanning the hand in a scanning direction (D1). The sensor (11) is arranged on a casing such that the main scanning direction (D1) intersects the direction (D2) in which the edge (10b) of the casing side facing an operator (20) extends. Thus, by rotating the right hand (21) counterclockwise about the right elbow (22) from the normal operating position, the operator (20) can place the right hand (21) above the sensor (11) so that the right hand (21) lies along the sub-scanning direction of the sensor (11).
PCT/JP2011/077119 2011-11-25 2011-11-25 Information processing device WO2013076858A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2013545731A JP5794310B2 (ja) 2011-11-25 2011-11-25 Information processing device
PCT/JP2011/077119 WO2013076858A1 (fr) 2011-11-25 2011-11-25 Information processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/077119 WO2013076858A1 (fr) 2011-11-25 2011-11-25 Information processing device

Publications (1)

Publication Number Publication Date
WO2013076858A1 true WO2013076858A1 (fr) 2013-05-30

Family

ID=48469336

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/077119 WO2013076858A1 (fr) 2011-11-25 2011-11-25 Information processing device

Country Status (2)

Country Link
JP (1) JP5794310B2 (fr)
WO (1) WO2013076858A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017104061A1 (fr) * 2015-12-18 2017-06-22 Hitachi, Ltd. Biometric authentication device and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006024209A * 2004-07-01 2006-01-26 Hewlett-Packard Development Co Lp Method and apparatus for improving the operability of an electronic device having an integrated fingerprint sensor
JP2006277341A * 2005-03-29 2006-10-12 Fujitsu Ltd Multiple simultaneous biometrics input device and multiple simultaneous biometrics authentication device
JP2007226623A * 2006-02-24 2007-09-06 Murata Mach Ltd Fingerprint input device
JP2009295175A * 2009-07-24 2009-12-17 Mitsumi Electric Co Ltd Fingerprint image forming apparatus and fingerprint image forming method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008136729A * 2006-12-04 2008-06-19 Oki Electric Ind Co Ltd Biological information acquisition sensor, and biological information acquisition device and automatic transaction device having the sensor
JP4748199B2 * 2008-09-30 2011-08-17 Sony Corporation Vein imaging apparatus and vein imaging method

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017104061A1 (fr) * 2015-12-18 2017-06-22 Hitachi, Ltd. Biometric authentication device and system
JPWO2017104061A1 (ja) * 2015-12-18 2018-05-10 Hitachi, Ltd. Biometric authentication device and system
US10460187B2 (en) 2015-12-18 2019-10-29 Hitachi, Ltd. Biometric authentication device and system

Also Published As

Publication number Publication date
JP5794310B2 (ja) 2015-10-14
JPWO2013076858A1 (ja) 2015-04-27

Similar Documents

Publication Publication Date Title
US10489577B2 (en) Identifying one or more users based on typing pattern and/or behavior
EP1461673B1 (fr) Systeme et procede de reconnaissance d'une signature sur le web base sur une carte d'objets
US9965608B2 (en) Biometrics-based authentication method and apparatus
JP5907283B2 (ja) Information processing device, biological part determination program, and biological part determination method
JP5799586B2 (ja) Biometric authentication apparatus, biometric authentication method, and biometric authentication computer program
JP6024141B2 (ja) Biological information processing apparatus, biological information processing method, and biological information processing program
JPWO2012131899A1 (ja) Biometric authentication device, biometric authentication system, and biometric authentication method
US20160048718A1 (en) Enhanced kinematic signature authentication using embedded fingerprint image array
JP2008021074A (ja) Transaction processing system
JP2010055228A (ja) Card processing apparatus and processing method
JP6674683B2 (ja) Authentication processing apparatus and authentication processing method
US11544364B2 (en) Authentication-based on handholding information
JP5794310B2 (ja) Information processing device
JP5720803B2 (ja) Biometric information matching device, biometric information matching program, and biometric information matching method
JP5720802B2 (ja) Biometric information registration device, biometric information acquisition device, biometric information registration program, and biometric information registration method
JP4945505B2 (ja) Biometric information processing device
JP4644689B2 (ja) Electric device and method for controlling electric device
TWI646474B (zh) Fake-biometric filtering device for identity verification systems
US20180157814A1 (en) Personal authentication method and apparatus based on recognition of fingertip gesture and identification of fake pattern
WO2015141007A1 (fr) Système d'enregistrement/authentification biométrique et procédé d'enregistrement/authentification biométrique
TWI688877B (zh) Biometric transaction system
JP5244521B2 (ja) Automatic transaction device
JP2011123729A (ja) Authentication system, human body communication terminal device, and host device
JP2013246567A (ja) Information processing device, authentication method, and program
JP2012194723A (ja) Large-scale biometric authentication system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11876276

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2013545731

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11876276

Country of ref document: EP

Kind code of ref document: A1