WO2013061446A1 - Information processing device, information processing method, and information processing program - Google Patents


Info

Publication number
WO2013061446A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
posture
finger
unit
biological information
Prior art date
Application number
PCT/JP2011/074827
Other languages
French (fr)
Japanese (ja)
Inventor
Hideo Kamata
Akitaka Minagawa
Yasuyuki Higashiura
Kentaro Kasugai
Katsumi Ide
Original Assignee
Fujitsu Frontech Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Frontech Limited
Priority to PCT/JP2011/074827 priority Critical patent/WO2013061446A1/en
Priority to JP2013540579A priority patent/JP5655155B2/en
Publication of WO2013061446A1 publication Critical patent/WO2013061446A1/en

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V10/7515Shifting the patterns to accommodate for positional errors
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1365Matching; Classification

Definitions

  • the present invention relates to an information processing apparatus, an information processing method, and an information processing program.
  • the human body includes biological information that can identify an individual, and some of them are used as information for identifying and authenticating the individual.
  • biometric information that can be used for authentication includes fingerprints, eye retinas and irises, faces, blood vessels, and DNA (Deoxyribo Nucleic Acid).
  • biometric authentication is performed by comparing biometric information (registration template) collected during registration with biometric information acquired during authentication.
  • an increase in the number of patterns for recording biometric information causes problems such as an increase in recording capacity for recording biometric information, a decrease in authentication speed during authentication, and an increase in the acceptance rate of others.
  • the present invention has been made in view of these points, and an object thereof is to provide an information processing apparatus, an information processing method, and an information processing program that can acquire biometric information used for collation while taking individual differences into consideration.
  • the information processing apparatus includes an information acquisition unit, a holding unit, a posture changing unit, an evaluation unit, and a generation unit.
  • the information acquisition unit acquires biological information from the living body.
  • the holding unit holds the living body corresponding to the information acquisition unit.
  • the posture changing unit changes the posture of the living body held by the holding unit.
  • the evaluation unit evaluates the feature amount of the biometric information acquired by the information acquisition unit for each of a plurality of different postures.
  • the generating unit generates collation information used for biometric collation from the biometric information selected based on the evaluation among the plurality of obtained biometric information and the posture information that can specify the posture corresponding to the selected biometric information.
  • in the information processing method executed by a computer, the computer changes the posture of the living body held by the holding unit, acquires biological information from the living body for each of a plurality of different postures, and evaluates the feature amounts of the acquired biological information.
  • the computer then generates verification information used for biometric verification from the biological information selected, based on the evaluation, from among the plurality of pieces of acquired biological information, and from posture information that can specify the posture corresponding to the selected biological information.
  • the information processing program causes a computer to change the posture of the living body held by the holding unit, acquire biological information from the living body for each of a plurality of different postures, and evaluate the feature amounts of the acquired biological information.
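Taken together, the flow in the bullets above (change posture, acquire biometric information per posture, evaluate feature amounts, select the best sample together with its posture) could be sketched as follows. This is only an illustration; the `Sample` structure, the scalar `score`, and `select_verification_info` are assumptions, not the patent's actual evaluation algorithm.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    posture: dict      # posture info, e.g. {"finger_interval_mm": 25, "droop_mm": 5}
    biometric: bytes   # acquired biometric information (placeholder)
    score: float       # evaluated feature amount

def select_verification_info(samples):
    """Select the sample whose feature-amount score is highest and return
    (biometric information, posture information) as verification info."""
    best = max(samples, key=lambda s: s.score)
    return best.biometric, best.posture

samples = [
    Sample({"finger_interval_mm": 20, "droop_mm": 5}, b"img-a", 0.71),
    Sample({"finger_interval_mm": 25, "droop_mm": 5}, b"img-b", 0.92),
    Sample({"finger_interval_mm": 30, "droop_mm": 5}, b"img-c", 0.64),
]
biometric, posture = select_verification_info(samples)
```

Keeping the posture alongside the selected biometric information is what later allows the same posture to be reproduced at authentication time.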
  • FIG. 1 is a diagram illustrating the configuration of the information processing apparatus according to the first embodiment.
  • the information processing apparatus 1 generates verification information registered in advance when authenticating a user using biometric information.
  • the biological information is information that can uniquely identify a user specific to the user's biological body.
  • the biological information includes, for example, a palm vein pattern.
  • the information processing apparatus 1 includes a holding unit 1a, an attitude change unit 1b, an information acquisition unit 1c, an evaluation unit 1d, and a generation unit 1e.
  • the holding unit 1a holds the living body 2 when biological information is acquired from the living body 2.
  • the information acquisition unit 1c acquires biological information from the living body 2 held by the holding unit 1a.
  • the posture changing unit 1b changes the posture of the living body 2 held by the holding unit 1a. Thereby, the information processing apparatus 1 can acquire biological information of various postures from the living body 2.
  • the evaluation unit 1d evaluates the feature amount of the biological information acquired for each of a plurality of different postures by the information acquisition unit 1c.
  • the generation unit 1e selects biometric information for verification based on the evaluation performed by the evaluation unit 1d from among the plurality of acquired biological information.
  • the generation unit 1e generates collation information used for biometric collation from posture information that can specify the posture corresponding to the selected biological information.
  • the posture information is information that can identify the posture changed by the posture changing unit 1b.
  • the information processing apparatus 1 selects biometric information suitable for verification from among the biometric information of various postures acquired from the living body 2. That is, the information processing apparatus 1 can select biometric information suitable for collation even when the appropriate posture differs from individual to individual. The information processing apparatus 1 then generates verification information from the selected biometric information and the corresponding posture information, enabling authentication in the same posture as at registration.
  • FIG. 2 is a diagram illustrating a configuration of the authentication system according to the second embodiment.
  • an information processing system in which the authentication system 3 performs authentication using a palm vein is exemplified.
  • the present invention is not limited to this, and can also be applied to a system that performs authentication at another feature detection part of a living body whose feature amount changes with a change in posture. More specifically, the authentication system 3 is applicable not only to posture changes such as yawing, pitching, and rolling, but also to feature detection sites, such as the palm, whose posture changes are accompanied by shape changes.
  • the authentication system 3 is an information processing system that recognizes the characteristics of a living body to identify and authenticate an individual; for example, it is used to authenticate customers in a banking system or the like.
  • the authentication system 3 includes information processing devices such as a registration device 10, a plurality of automated teller machines 6, and an authentication server 4, connected by a network 8.
  • the authentication server 4 associates and stores identification information for identifying an individual and verification information (template) registered in advance before biometric authentication.
  • the identification information for identifying an individual is a unique ID (IDentification) assigned to a user directly (for example, a user number) or indirectly (for example, an account number).
  • the collation information registered in advance includes biometric information for collation and posture information for collation.
  • the biometric information for verification is feature information obtained by extracting a feature portion from image information using a predetermined feature extraction algorithm, encoded information obtained by encoding image information or feature information, and the like.
  • the verification posture information is information for designating a posture at the time of verification.
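The association described above, from identification information to verification information (biometric information for collation plus posture information for collation), amounts to a keyed template store. A minimal sketch, assuming a simple in-memory dictionary in place of the authentication server's storage:

```python
template_store = {}   # identification info (user ID) -> (biometric info, posture info)

def register_template(user_id, biometric, posture):
    """Associate a user's identification information with verification info."""
    template_store[user_id] = (biometric, posture)

def lookup_template(user_id):
    """Return the registered verification information, or None if absent."""
    return template_store.get(user_id)

register_template("user-0001", b"vein-features",
                  {"finger_interval_mm": 25, "droop_mm": 5})
```

The user ID here stands in for either a direct identifier (user number) or an indirect one (account number), as the text describes.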
  • One or more automated teller machines 6 are installed in ATM (Automated Teller Machine) corners 5 and ATM booths 7 of financial institutions.
  • the automated teller machine 6 is an authentication apparatus that performs biometric authentication when authenticating a user prior to a financial transaction.
  • the automated teller machine 6 includes an IC (Integrated Circuit) card reader / writer 17 and a sensor unit 20.
  • the sensor unit 20 includes an imaging device and takes a vein image of the palm of the user.
  • the automated teller machine 6 authenticates the user by comparing verification information (verification biometric information), looked up from the identification information that the IC card reader / writer 17 reads from the user's IC card (for example, a cash card with a built-in IC chip), against the user's biometric information acquired by the sensor unit 20.
  • the sensor unit 20 holds the posture of the palm of the user based on the verification posture information in the same posture as at the time of template registration.
  • the sensor unit 20 acquires biometric information in the same posture as when the template was registered. That is, the sensor unit 20 is a biological information acquisition device, and the automated teller machine 6 is an authentication device that includes the biological information acquisition device.
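The ATM-side flow just described (look up the template by ID, reproduce the registered posture with the guide, capture, and compare) could look like the following sketch. The `Guide` stub, the callbacks, and the match threshold are illustrative assumptions, not the system's actual interfaces.

```python
class Guide:
    """Stub for the guide device: remembers the posture it was told to set."""
    def __init__(self):
        self.posture = None
    def set_posture(self, posture):
        self.posture = posture

def authenticate(user_id, store, guide, capture, match, threshold=0.8):
    """Look up the registered template, reproduce the registered posture
    with the guide, then capture biometric information and compare it."""
    template = store.get(user_id)
    if template is None:
        return False
    registered_bio, registered_posture = template
    guide.set_posture(registered_posture)  # same posture as at registration
    return match(registered_bio, capture()) >= threshold

store = {"user-0001": (b"vein-features", {"finger_interval_mm": 25})}
guide = Guide()
ok = authenticate("user-0001", store, guide,
                  capture=lambda: b"vein-features",
                  match=lambda a, b: 1.0 if a == b else 0.0)
```

Reproducing the stored posture before capture is the point of recording posture information in the template.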
  • the registration device 10 is a device that is provided at a bank window or the like, and performs user template registration according to instructions or operations of an attendant.
  • the registration device 10 includes a processing device 11, a display 12, and a sensor unit 20, and includes a keyboard 13, a mouse 14, an IC card reader / writer 15 and the like as necessary.
  • the sensor unit 20 has a built-in imaging device, images the palm of the user, and outputs a captured image to the processing device 11.
  • the IC card reader / writer 15 reads and writes information on the IC card 16 of the user.
  • the keyboard 13 and the mouse 14 accept input operations.
  • Template registration (verification information registration)
  • a user who requests template registration inputs identification information (for example, a user ID) for identifying the user using the keyboard 13, mouse 14, or IC card reader / writer 15.
  • the registration apparatus 10 guides the template registration to the user by display using the display 12, and requests input of biometric information for template registration.
  • the user inputs biometric information by holding his hand over the sensor unit 20.
  • the sensor unit 20 acquires a plurality of pieces of biological information while changing the posture of the hand, and selects biological information to be registered from the acquired pieces of biological information.
  • the registration device 10 creates verification information from the selected biological information and the corresponding posture information, and records it in at least one of the storage unit of the processing device 11, the storage unit of the authentication server 4, and the storage unit of the user's IC card 16.
  • the automated teller machine 6 refers to the template in the storage unit of the authentication server 4 or the storage unit of the IC card 16 and collates the input biometric information.
  • FIG. 3 is a diagram illustrating an appearance of the sensor unit according to the second embodiment.
  • FIG. 4 is a diagram illustrating an example of changing the finger interval of the three-finger support unit according to the second embodiment.
  • FIG. 5 is a diagram illustrating an example of a first finger / fifth finger droop amount change of the first finger / fifth finger support unit according to the second embodiment.
  • FIG. 6 is a diagram illustrating an example of changing the wrist support portion position of the wrist support portion according to the second embodiment.
  • the sensor unit 20 includes a guide unit 40 that supports the palm and a sensor 26 that captures the palm.
  • the guide part 40 has a box shape with an open upper surface, and has a concave chamber 24 that expands from the bottom toward the opening.
  • the concave chamber 24 sets the distance between the sensor 26 and the palm to an appropriate position.
  • the concave chamber 24 prevents intrusion of ambient light in the imaging range of the sensor 26 and prevents unnecessary background reflection.
  • the sensor 26 is located on the bottom surface of the concave chamber 24 and faces the opening.
  • the sensor 26 includes an image sensor that captures the palm (for example, a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor), a condenser lens, and a plurality of light emitting elements (LEDs: Light Emitting Diodes) that irradiate the subject and are used to measure the distance to the subject.
  • the guide unit 40 includes a three-finger support unit 21, a first / fifth finger support unit 25, and a wrist support unit 27.
  • the three-finger support portion 21 has two finger separation ribs 22, which determine and support the positions of three fingers (the index finger, middle finger, and ring finger).
  • the positions of the two finger separation ribs 22 can be changed, and the finger interval between the fingers placed on the three-finger support portion 21 can be adjusted.
  • the guide unit 40 can narrow the finger interval of the fingers placed on the three-finger support unit 21 from the finger interval W2 to the finger interval W1.
  • the guide unit 40 can widen the finger interval of the fingers placed on the three-finger support unit 21 from the finger interval W1 to the finger interval W2.
  • the first finger / fifth finger support portions 25 are support members disposed on the left and right of the concave chamber 24, and can support the first finger (thumb) and the fifth finger (little finger), or the thenar eminence at the base of the thumb and the hypothenar eminence at the base of the little finger.
  • the height of the first finger / fifth finger support portion 25 can be changed, so the droop amount (height) of the first finger and the fifth finger placed on it can be adjusted.
  • adjusting the droop amount of the first and fifth fingers compensates for the fact that the thenar eminence at the base of the first finger and the hypothenar eminence at the base of the fifth finger bulge above the center of the palm, and that the amount of this bulge differs between individuals; the adjustment makes the posture appropriate for each individual.
  • the guide unit 40 can decrease the first finger / fifth finger droop amount of the first finger and the fifth finger placed on the first finger / fifth finger support unit 25 from H2 to H1.
  • the guide unit 40 can increase the first finger / fifth finger droop amount of the first finger and the fifth finger placed on the first finger / fifth finger support unit 25 from H1 to H2.
  • the wrist support unit 27 supports the wrist.
  • the wrist support portion 27 can adjust the position of the palm in the front-rear direction (direction along the front-rear axis).
  • the guide part 40 can advance the wrist support part 27, adjusting it from the wrist support position L2 to the wrist support position L1.
  • the guide part 40 can retract the wrist support part 27, adjusting it from the wrist support position L1 to the wrist support position L2.
  • by means of the two finger separation ribs 22, the guide part 40 lets the fingers open naturally when the palm is placed on it, and guides the entire palm to lie horizontally.
  • since the two finger separation ribs 22 clearly indicate the boundary between the palm and the fingers, the guide unit 40 contributes to improved palm contour extraction accuracy. Further, the first finger / fifth finger support unit 25 levels the entire palm, reducing deformation of the vein pattern and contributing to stable authentication accuracy.
  • FIG. 7 is a diagram illustrating a configuration of a sensor unit according to the second embodiment.
  • the sensor unit (biological information acquisition device) 20 includes a sensing unit 30 and a guide unit (guide device) 40.
  • the sensing unit 30 captures a vein image of the palm and transmits the captured data to the processing device 11.
  • the sensing unit 30 includes a storage unit 31, an imaging unit 32, a control unit 33, and a communication unit 34.
  • the control unit 33 comprehensively controls each processing unit.
  • the imaging unit (sensor 26) 32 acquires image information from a living body that is a subject.
  • the storage unit 31 temporarily stores the image information acquired by the imaging unit 32.
  • the communication unit 34 communicates with the processing device 11 and the guide unit 40.
  • the imaging unit 32 photographs near-infrared light reflected from the living body (palm) that is the subject. Hemoglobin in the red blood cells flowing through the veins has lost its oxygen, and this reduced hemoglobin absorbs near-infrared light in the vicinity of 700 nm (nanometers) to 1000 nm. Therefore, when the palm is irradiated with near-infrared light, the portions where veins are present reflect less light, and the positions of the veins can be recognized from the intensity of the reflected near-infrared light. Although the captured image is achromatic, using a specific light source makes it easy to extract feature information.
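Because vein regions absorb near-infrared light and therefore appear darker, a naive way to locate vein candidates in a captured intensity image is simple thresholding. This toy example only illustrates the optical principle described above; it is not the feature extraction algorithm the system uses.

```python
def vein_mask(image, threshold):
    """Mark pixels darker than `threshold` as vein candidates: reduced
    hemoglobin absorbs near-infrared light around 700-1000 nm, so vein
    regions reflect less light and appear darker in the captured image."""
    return [[1 if px < threshold else 0 for px in row] for row in image]

# A tiny synthetic near-infrared reflectance image (bright = skin, dark = vein).
nir_image = [
    [200, 198,  90, 201],
    [199,  85,  88, 197],
]
mask = vein_mask(nir_image, 128)
```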
  • the guide unit 40 changes the posture of the palm and transmits control data that can identify the changed posture to the processing device 11.
  • the guide unit 40 can transmit control data to the processing device 11 via the sensing unit 30.
  • the guide unit 40 includes a communication unit 41, a control unit 42, motors (for example, stepping motors) 44, 45, 46, position sensors 47, 49, 51, and load sensors 48, 50, 52.
  • the control unit 42 comprehensively controls each processing unit.
  • the communication unit 41 communicates with the sensing unit 30.
  • the motor 44 drives the finger separation rib 22.
  • the motor 45 drives the first finger / fifth finger support unit 25.
  • the motor 46 drives the wrist support portion 27.
  • the position sensor 47 detects the position of the finger separation rib 22.
  • the position sensor 47 may detect the position of the finger separation rib 22 based on the driving amount of the motor 44.
  • the position sensor 49 detects the position of the first finger / fifth finger support unit 25.
  • the position sensor 49 may detect the position of the first finger / fifth finger support portion 25 based on the driving amount of the motor 45.
  • the position sensor 51 detects the position of the wrist support portion 27.
  • the position sensor 51 may detect the position of the wrist support portion 27 based on the driving amount of the motor 46.
  • the load sensor 48 detects the load of the motor 44.
  • the load sensor 50 detects the load of the motor 45.
  • the load sensor 52 detects the load of the motor 46.
  • a known mechanism using a cam or the like can be used as the drive mechanism by which the motors 44, 45, and 46 drive their respective parts.
  • the control unit 42 drives the motor 44 according to the position of the finger separation rib 22 detected by the position sensor 47. For the user's safety, the control unit 42 stops driving the motor 44 in response to load detection by the load sensor 48.
  • the control unit 42 drives the motor 45 according to the position of the first finger / fifth finger support unit 25 detected by the position sensor 49. For the user's safety, the control unit 42 stops driving the motor 45 in response to load detection by the load sensor 50.
  • the control unit 42 drives the motor 46 according to the position of the wrist support unit 27 detected by the position sensor 51. For the user's safety, the control unit 42 stops driving the motor 46 in response to load detection by the load sensor 52.
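The position-sensor feedback and load-sensor safety stop described above could be sketched as a simple control loop. `drive_toward`, the sensor callbacks, and the numeric values are illustrative assumptions, not the patent's control algorithm.

```python
def drive_toward(target, read_position, read_load, step, load_limit,
                 tolerance=0.5, max_steps=1000):
    """Drive one guide motor toward `target` using its position sensor,
    stopping immediately if the load sensor exceeds `load_limit` (for
    example, when the user's hand resists the movement)."""
    for _ in range(max_steps):
        pos = read_position()
        if abs(pos - target) <= tolerance:
            return True                 # target position reached
        if read_load() > load_limit:
            return False                # safety stop on load detection
        step(1 if pos < target else -1)
    return False

# Simulated motor and sensors: each step moves the part by 1 mm, no load.
state = {"pos": 20.0}
reached = drive_toward(25.0,
                       read_position=lambda: state["pos"],
                       read_load=lambda: 0.0,
                       step=lambda d: state.update(pos=state["pos"] + d),
                       load_limit=10.0)
```

The same loop applies to each of the three motors (finger separation ribs, first/fifth finger support, wrist support), differing only in which sensors and actuator it is wired to.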
  • FIG. 8 is a diagram illustrating a hardware configuration example of the registration apparatus according to the second embodiment.
  • the registration device 10 includes a processing device 11, a display 12, a keyboard 13, a mouse 14, a sensor unit 20, and an IC card reader / writer 15.
  • the entire processing apparatus 11 is controlled by a CPU (Central Processing Unit) 101.
  • a RAM (Random Access Memory) 102, an HDD (Hard Disk Drive) 103, a communication interface 104, a graphic processing device 105, and an input / output interface 106 are connected to the CPU 101 via a bus 107.
  • the RAM 102 temporarily stores at least part of an OS (Operating System) program and application programs to be executed by the CPU 101.
  • the RAM 102 stores various data necessary for processing by the CPU 101.
  • the HDD 103 stores an OS and application programs.
  • a display 12 is connected to the graphic processing device 105.
  • the graphic processing device 105 displays an image on the screen of the display 12 in accordance with a command from the CPU 101.
  • the input / output interface 106 is connected to the keyboard 13, the mouse 14, the sensor unit 20, and the IC card reader / writer 15.
  • the input / output interface 106 can be connected to a portable recording medium interface that can write information to the portable recording medium 110 and read information from the portable recording medium 110.
  • the input / output interface 106 transmits signals sent from the keyboard 13, mouse 14, sensor unit 20, IC card reader / writer 15, and portable recording medium interface to the CPU 101 via the bus 107.
  • the communication interface 104 is connected to the network 8.
  • the communication interface 104 transmits / receives data to / from other computers (for example, the authentication server 4).
  • the processing device 11 can also be configured to include modules each composed of an FPGA (Field Programmable Gate Array), a DSP (Digital Signal Processor), or the like, or can be configured without the CPU 101.
  • the processing device 11 includes a nonvolatile memory (for example, an EEPROM (Electrically Erasable and Programmable Read Only Memory), flash memory, or a flash-memory-type memory card) that stores the firmware of these modules.
  • firmware can be written to the nonvolatile memory via the portable recording medium 110 or the communication interface 104, so the processing device 11 can update the firmware by rewriting what is stored in the nonvolatile memory.
  • FIG. 9 is a flowchart of template registration processing according to the second embodiment.
  • the template registration process is executed based on, for example, a template registration execution operation by a staff member.
  • Step S11 The processing device 11 notifies the start of template registration.
  • the notification of the start of template registration can be performed using display on the display 12 or sound from a speaker (not shown).
  • the processing apparatus 11 displays “Template registration starts” on the display 12.
  • the processing device 11 executes a posture adjustment range determination process for determining an adjustment range of the finger interval and the first finger / fifth finger droop amount. Details of the posture adjustment range determination processing will be described later with reference to FIG.
  • Step S16 The processing device 11 executes a biometric information extraction process for collation for extracting biometric information for collation. Details of the biometric information extraction process for verification will be described later with reference to FIGS. 14, 17, and 18.
  • the processing device 11 generates verification information including the extracted verification biometric information and verification posture information corresponding to the extracted verification biometric information.
  • the processing apparatus 11 performs template registration of the generated verification information.
  • the template registration of the verification information is performed by recording it in at least one of the storage unit of the processing device 11, the storage unit of the authentication server 4, and the storage unit of the user's IC card 16, after which the template registration process ends.
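The registration steps above (notify, determine the posture adjustment range, extract biometric information over that range, generate verification information, record it) could be sketched as one flow. All names and the score heuristic below are hypothetical placeholders for the steps the text describes.

```python
def template_registration(notify, determine_range, acquire_all, select_best, stores):
    """Sketch of the template registration flow: notify the user, determine
    the posture adjustment range, acquire biometric samples over that range,
    select the best one, and record the verification information in each
    available storage unit."""
    notify("Template registration starts")
    adjustment_range = determine_range()
    samples = acquire_all(adjustment_range)
    biometric, posture = select_best(samples)
    verification_info = {"biometric": biometric, "posture": posture}
    for store in stores:   # e.g. processing device, authentication server, IC card
        store.append(verification_info)
    return verification_info

messages, server_store, card_store = [], [], []
info = template_registration(
    notify=messages.append,
    determine_range=lambda: [20, 25, 30],                  # finger intervals (mm)
    acquire_all=lambda r: [(f"img@{w}", {"finger_interval_mm": w}, w % 7) for w in r],
    select_best=lambda s: max(s, key=lambda t: t[2])[:2],  # highest feature score
    stores=[server_store, card_store],
)
```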
  • initial values may be obtained for each condition, such as gender and age, based on information entered in advance. It is also possible to detect when a hand is placed over the sensor 26, determine the size of the hand from a captured image, and derive an initial value from it.
  • Step S22 The processing device 11 notifies that the posture of the palm is to be adjusted.
  • the notification that the posture adjustment of the palm is to be performed can be performed using display on the display 12 or sound from a speaker (not shown). For example, the processing device 11 displays “Please put your palm” on the display 12.
  • when the processing device 11 determines that the wrist is positioned rearward of the normal position, it proceeds to step S23 and instructs the sensor unit 20 to advance the position of the wrist support portion 27 by a predetermined amount (for example, 5 mm).
  • when the processing device 11 determines that the wrist is positioned forward of the normal position, it proceeds to step S23 and instructs the sensor unit 20 to retract the position of the wrist support portion 27 by a predetermined amount (for example, 5 mm).
  • the processing device 11 ends the wrist support portion position adjustment process.
  • the processing device 11 determines the adjustment range of the finger interval, that is, the adjustment range of the distance between the two finger separation ribs 22, based on the determined form of the living body. Further, the processing device 11 determines the adjustment range of the first finger / fifth finger sag amount, that is, the height adjustment range of the first finger / fifth finger support unit 25 based on the determined form of the living body.
  • the processing device 11 determines an initial adjustment value of the finger interval, that is, of the distance between the two finger separation ribs 22, based on the determined form of the living body. Similarly, it determines an initial adjustment value of the first finger / fifth finger droop amount, that is, of the height of the first finger / fifth finger support unit 25.
  • the processing device 11 sets the initial adjustment value of the finger interval to 25 mm when the palm size is standard, 20 mm when the palm size is small, and 30 mm when the palm size is large. Likewise, it sets the initial adjustment value of the first finger / fifth finger droop amount to 5 mm when the palm size is standard, 3 mm when the palm size is small, and 7 mm when the palm size is large.
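The palm-size-dependent initial values quoted above can be captured in two small lookup functions. The values are taken directly from the text; the function names are illustrative.

```python
def finger_interval_initial_mm(palm_size):
    """Initial adjustment value of the finger interval, by palm size."""
    return {"small": 20, "standard": 25, "large": 30}[palm_size]

def droop_initial_mm(palm_size):
    """Initial adjustment value of the first/fifth finger droop amount."""
    return {"small": 3, "standard": 5, "large": 7}[palm_size]
```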
  • the processing device 11 individually determines the initial adjustment value of the finger interval and the initial adjustment value of the first finger / fifth finger sag amount based on the actual measurement value of the biological information acquired in step S24. Also good.
  • the processing device 11 determines the adjustment unit of the finger interval, that is, of the distance between the two finger separation ribs 22, based on the determined form of the living body. It likewise determines the adjustment unit of the first finger / fifth finger droop amount, that is, of the height of the first finger / fifth finger support unit 25.
  • the processing device 11 sets the adjustment unit of the finger interval to 1 mm when the palm size is standard, 0.5 mm when the palm size is small, and 1.5 mm when the palm size is large. It sets the adjustment unit of the first finger / fifth finger droop amount to 1 mm when the palm size is standard or large, and 0.5 mm when the palm size is small.
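Likewise, the palm-size-dependent adjustment units can be expressed as lookups (values from the text, function names illustrative):

```python
def finger_interval_unit_mm(palm_size):
    """Adjustment unit (step size) of the finger interval, by palm size."""
    return {"small": 0.5, "standard": 1.0, "large": 1.5}[palm_size]

def droop_unit_mm(palm_size):
    """Adjustment unit of the first/fifth finger droop amount, by palm size."""
    return {"small": 0.5, "standard": 1.0, "large": 1.0}[palm_size]
```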
  • FIG. 12 is a flowchart of finger interval change processing according to the second embodiment.
  • the finger interval change process is executed in the template registration process.
  • Step S43 The processing device 11 adds the adjustment unit determined in step S34 of the posture adjustment range determination process and updates the adjustment value. That is, the processing device 11 increases the finger interval by one adjustment unit.
  • Step S46 The processing device 11 instructs the sensor unit 20 to acquire biometric information (palm vein image).
  • the sensor unit 20 responds to the processing device 11 with the acquired biological information and posture information at the time of acquiring the biological information.
  • FIG. 13 is a flowchart of the first finger / fifth finger droop amount changing process of the second embodiment.
  • the first finger / fifth finger droop amount changing process is executed in the template registration process.
  • Step S51 The processing device 11 sets the finger interval and the first finger/fifth finger droop amount to the initial adjustment values determined in step S33 of the posture adjustment range determination process.
  • the sensor unit 20 sets the first finger/fifth finger droop amount to the initial adjustment value based on the instruction from the processing device 11.
  • Step S52 The processing apparatus 11 instructs the sensor unit 20 to acquire biometric information (palm vein image).
  • the sensor unit 20 responds to the processing device 11 with the acquired biological information and posture information at the time of acquiring the biological information.
  • Step S54 The processing device 11 determines whether the adjustment value is within the adjustment range determined in step S32 of the posture adjustment range determination process. When the adjustment value is within the adjustment range, the processing device 11 instructs the sensor unit 20 to update the first finger/fifth finger droop amount, and proceeds to step S52. When the adjustment value exceeds the adjustment range, the processing device 11 proceeds to step S55.
  • Step S55 The processing device 11 sets the initial adjustment value determined in step S33 of the posture adjustment range determination process, instructing the sensor unit 20 to set the first finger/fifth finger droop amount to that initial value.
  • the sensor unit 20 sets the first finger/fifth finger droop amount to the initial adjustment value based on the instruction from the processing device 11.
  • Step S56 The processing device 11 instructs the sensor unit 20 to acquire biometric information (palm vein image).
  • the sensor unit 20 responds to the processing device 11 with the acquired biological information and posture information at the time of acquiring the biological information.
  • Step S57 The processing device 11 updates the adjustment value by subtracting the adjustment unit determined in step S34 of the posture adjustment range determination process. That is, the processing device 11 reduces the first finger/fifth finger droop amount by one adjustment unit.
  • Step S58 The processing device 11 determines whether the adjustment value is within the adjustment range determined in step S32 of the posture adjustment range determination process. When the adjustment value is within the adjustment range, the processing device 11 instructs the sensor unit 20 to update the first finger/fifth finger droop amount, and proceeds to step S56. When the adjustment value exceeds the adjustment range, the processing device 11 ends the first finger/fifth finger droop amount change process.
  • the processing device 11 thus expands the adjustment value from the initial value to the maximum of the adjustment range one adjustment unit at a time, then returns to the initial value and reduces it to the minimum of the adjustment range one adjustment unit at a time. Alternatively, the processing device 11 may return the adjustment value to the initial value each time it is expanded or contracted.
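The sweep described in steps S43 through S58 can be sketched as a generator. This is a minimal illustration; the function name and signature are assumptions, not the patent's API.

```python
def adjustment_sweep(initial, unit, low, high):
    """Yield adjustment values as in steps S43-S58: expand from the initial
    value to the maximum of the range one unit at a time, then return to the
    initial value once and shrink to the minimum one unit at a time."""
    value = initial
    yield value
    while value + unit <= high:   # expanding phase (step S43 / S53)
        value += unit
        yield value
    value = initial               # return to the initial value (step S55)
    while value - unit >= low:    # shrinking phase (step S57)
        value -= unit
        yield value
```

For example, with an initial finger interval of 25 mm, a 1 mm unit, and a 23-27 mm range, the generator visits 25, 26, 27, then 24 and 23.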
  • FIG. 14 is a flowchart of the biometric information extraction process for collation according to the second embodiment.
  • FIG. 15 is a diagram illustrating an example of a finger interval biometric information table according to the second embodiment.
  • FIG. 16 is a diagram illustrating an example of the first finger/fifth finger droop amount biometric information table according to the second embodiment.
  • FIG. 17 is a diagram illustrating an example of an order table according to the second embodiment.
  • FIG. 18 is a diagram illustrating an example of the combined biometric information table according to the second embodiment.
  • the biometric information extraction process for verification is executed in the template registration process.
  • the processing device 11 calculates the feature amounts of the biological information acquired in steps S42 and S46 of the finger interval change process. By combining these feature amounts with the posture information (initial values, adjustment values) acquired in steps S42 and S46, the processing device 11 obtains the finger interval biometric information table 200.
  • the finger interval biometric information table 200 records initial values, adjustment values, and feature quantities at the time of finger interval adjustment in association with each other for a plurality of pieces of biometric information (d000, d001,...) Acquired with different finger intervals.
  • the processing device 11 calculates the feature amounts of the biological information acquired in steps S52 and S56 of the first finger/fifth finger droop amount change process. By combining these feature amounts with the posture information (initial values, adjustment values) acquired in steps S52 and S56, the processing device 11 obtains the first finger/fifth finger droop amount biometric information table 210.
  • the first finger/fifth finger droop amount biometric information table 210 records, for a plurality of pieces of biometric information (d100, d101, ...) acquired with different first finger/fifth finger droop amounts, the initial value, adjustment value, and feature amount at the time of adjustment in association with each other.
  • the feature amount of a palm vein image is evaluated comprehensively, for example by evaluating items such as the number of vein branch points and the amount of veins per unit area.
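As a rough illustration of such a comprehensive evaluation, a score could combine the two items named above. The linear weighting here is an assumption; the text does not fix a formula, and any evaluation method may be adopted.

```python
def evaluate_feature_amount(branch_points, vein_pixels, area_mm2,
                            w_branch=1.0, w_density=1.0):
    """Illustrative composite score: number of vein branch points plus the
    amount of veins per unit area, each weighted. The weights are assumed."""
    return w_branch * branch_points + w_density * (vein_pixels / area_mm2)
```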
  • the evaluation of the feature amount included in the biological information is not limited to the above example, and any evaluation method can be adopted.
  • the processing device 11 refers to the feature amounts in the finger interval biometric information table 200 and records the posture information (initial value, adjustment value) corresponding to each of the top three feature amounts in the order table 220.
  • the processing device 11 refers to the feature amounts in the first finger/fifth finger droop amount biometric information table 210, extracts the top three entries with the largest feature amounts, and records the corresponding posture information (initial value, adjustment value) in the order table 220.
  • the order table 220 thus holds the top three pieces of posture information (initial value, adjustment value) with the largest feature amounts among the plurality of pieces of biometric information acquired with different finger intervals, and the top three pieces of posture information (initial value, adjustment value) with the largest feature amounts among the plurality of pieces of biometric information acquired with different first finger/fifth finger droop amounts.
  • the combination patterns of posture information generated in this way are recorded in the combined biometric information table 230.
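The construction of the order table 220 and of the combination patterns for table 230 can be sketched as follows. The table layouts and names are assumptions: each input table row pairs one piece of posture information with its evaluated feature amount, as in tables 200 and 210.

```python
from itertools import product

def build_combination_patterns(interval_table, droop_table, k=3):
    """interval_table / droop_table: rows of (posture_info, feature_amount).
    Take the top-k postures by feature amount from each table (the order
    table 220 of the text) and pair every finger-interval posture with every
    droop-amount posture, giving the k*k combination patterns recorded in
    the combined biometric information table 230."""
    def top_k(table):
        ranked = sorted(table, key=lambda row: row[1], reverse=True)
        return [posture for posture, _ in ranked[:k]]
    return list(product(top_k(interval_table), top_k(droop_table)))
```

With three postures kept per adjustment element, nine combination patterns result, which is far fewer than sweeping both elements jointly.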
  • the processing apparatus 11 acquires one of the combination patterns of posture information recorded in the combination biometric information table 230.
  • the processing device 11 instructs the sensor unit 20 to update the finger interval and the first finger / fifth finger sag amount with the combination pattern of the acquired posture information.
  • the sensor unit 20 updates the finger interval and the first finger / fifth finger droop amount based on an instruction from the processing device 11.
  • the sensor unit 20 may return to the initial value once before applying an update. Even for the same adjustment value, the posture of the hand may differ depending on whether the value was reached by adjusting in the "+" direction or in the "-" direction. For example, a finger interval of 26 mm reached from 25 mm by +1 mm and one reached from 27 mm by -1 mm may leave the hand in different postures. By returning to the initial value first, the conditions of the posture change can be made uniform, eliminating this effect on the hand posture.
  • Step S66 The processing device 11 instructs the sensor unit 20 to acquire biometric information (palm vein image).
  • the sensor unit 20 responds to the processing device 11 with the acquired biological information and posture information at the time of acquiring the biological information.
  • Step S67 The processing apparatus 11 determines whether or not biometric information has been acquired for all the combination patterns of posture information recorded in the combined biometric information table 230. The processing apparatus 11 proceeds to step S68 when the biological information is acquired for all the posture information combination patterns, and proceeds to step S65 when the biological information is not acquired.
  • the processing device 11 evaluates the feature amounts (v200, v201, ...) of the biometric information (d200, d201, ...) acquired for all the posture information combination patterns recorded in the combined biometric information table 230, and records them in the combined biometric information table 230.
  • the combined biometric information table 230 records, in association with each other, the combinations of three pieces of posture information (initial value, adjustment value) with different finger intervals and three pieces of posture information (initial value, adjustment value) with different first finger/fifth finger droop amounts, together with the biometric information and feature amount acquired for each combination.
  • the processing device 11 refers to the combined biometric information table 230 and extracts the biometric information with the largest feature amount as the biometric information for verification.
  • the processing device 11 ends the verification biometric information extraction process, taking the extracted biometric information as the biometric information for verification and the posture information corresponding to it as the posture information for verification.
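The final selection amounts to taking the row of the combined biometric information table with the largest feature amount. A minimal sketch, with the row layout assumed:

```python
def extract_verification_info(combined_table):
    """combined_table: rows of (posture_combo, biometric_info, feature_amount),
    mirroring table 230. Return the biometric information with the largest
    feature amount and the posture information corresponding to it."""
    posture, bio, _ = max(combined_table, key=lambda row: row[2])
    return bio, posture
```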
  • the posture information may include posture information on the wrist support part position in addition to the posture information on the finger interval and the posture information on the first finger/fifth finger droop amount.
  • the processing device 11 may also take the top three as extraction candidates, acquire biometric information a plurality of times (for example, three times) for each candidate, evaluate the feature amounts, and then extract the biometric information for verification.
  • the processing device 11 can thereby acquire biometric information suitable for use in verification in consideration of individual differences (shape, size, flexibility, and the like of the biometric acquisition site). Further, the processing device 11 generates the order table 220 by extracting, for each of the plurality of posture adjustment elements (finger interval and first finger/fifth finger droop amount), the postures with large feature amounts, and then extracts the posture with the largest feature amount from among the posture combinations in the order table 220. Accordingly, the processing device 11 can reduce the processing time required to generate the verification information even when there are a plurality of posture adjustment elements.
  • although the first finger/fifth finger support part 25 adjusts the supporting heights of the first finger and the fifth finger together, the first finger support part and the fifth finger support part may each be provided with a motor, a position sensor, and a load sensor so as to be adjustable independently.
  • the automatic depositing apparatus 6 acquires a user ID (for example, an account number) from the IC card 16.
  • the automatic depositing apparatus 6 acquires the posture information (posture information on the finger interval and posture information on the first finger/fifth finger droop amount) from the IC card 16.
  • the automatic depositing apparatus 6 instructs the sensor unit 20 to support the palm in a posture corresponding to the obtained posture information.
  • the automatic depositing apparatus 6 may place the sensor unit 20 in a posture that supports the palm at the initial value of each piece of posture information first, and then instruct the user to place the palm.
  • the automatic depositing apparatus 6 updates the posture of the sensor unit 20 supporting the palm with the adjustment value of each piece of posture information. Thereby, the automatic depositing apparatus 6 can improve the reproducibility of the posture at the time the registration device 10 registered the biometric information.
  • the automatic depositing apparatus 6 acquires the biological information by reproducing the posture of the palm at the time the biological information was registered.
  • Step S75 The automatic depositing apparatus 6 collates the biometric information for verification recorded in the IC card 16 with the biometric information acquired from the sensor unit 20.
  • Step S76 The automatic depositing apparatus 6 proceeds to step S77 when the verification of the biometric information acquired from the sensor unit 20 succeeds, and proceeds to step S78 when the verification fails.
  • the automatic depositing apparatus 6 can guide the palm, which is the user's biometric information acquisition site, to the same posture as when the verification information is registered, and can be expected to improve authentication accuracy.
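The authentication-side flow of these steps can be sketched as below. Every interface here (the IC-card dictionary keys, the sensor methods, the matcher function) is a hypothetical stand-in for illustration, not the patent's API.

```python
def authenticate(ic_card, sensor, match_score, threshold=0.9):
    """Hypothetical sketch: read the verification template and posture
    information from the IC card, reproduce the registered posture on the
    sensor unit, acquire biometric information, and collate it against the
    template. Returns True when the match score clears the threshold."""
    template = ic_card["verification_biometric_info"]
    posture = ic_card["verification_posture_info"]
    sensor.set_posture(posture)            # reproduce registration-time posture
    acquired = sensor.acquire_biometric_info()
    return match_score(template, acquired) >= threshold
```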
  • the above processing functions can be realized by a computer.
  • a program describing the processing contents of the functions that each device should have is provided.
  • the program describing the processing contents can be recorded on a computer-readable recording medium (including a portable recording medium). Examples of the computer-readable recording medium include a magnetic recording device, an optical disk, a magneto-optical recording medium, and a semiconductor memory.
  • Examples of the magnetic recording device include a hard disk device (HDD), a flexible disk (FD), and a magnetic tape.
  • Optical disks include DVD (Digital Versatile Disc), DVD-RAM, CD-ROM, CD-R (Recordable) / RW (ReWritable), and the like.
  • Magneto-optical recording media include MO (Magneto-Optical disk).
  • when the program is distributed, for example, portable recording media such as DVDs or CD-ROMs on which the program is recorded are sold. It is also possible to store the program in a storage device of a server computer and transfer the program from the server computer to another computer via a network.
  • the computer that executes the program stores, for example, the program recorded on the portable recording medium or the program transferred from the server computer in its own storage device. Then, the computer reads the program from its own storage device and executes processing according to the program. The computer can also read the program directly from the portable recording medium and execute processing according to the program. Further, each time the program is transferred from the server computer, the computer can sequentially execute processing according to the received program.


Abstract

The present invention generates biometric information used for verification by considering individual differences. An information processing device (1) generates verification information that is pre-registered when authenticating a user by using biometric information. A retention unit (1a) retains a biological body (2) when acquiring biometric information from the biological body (2). An information acquisition unit (1c) acquires biometric information from the retained biological body (2). A position modification unit (1b) modifies the position of the retained biological body (2). As a consequence, the information processing device (1) is able to acquire the biometric information of the biological body (2) in various positions. An evaluation unit (1d) evaluates the feature amount of the sets of biometric information acquired by the information acquisition unit (1c) for each of the various positions. A generation unit (1e) selects, from among the multiple sets of acquired biometric information, the set of biometric information to be used for verification on the basis of the evaluation performed by means of the evaluation unit (1d). The generation unit (1e) generates verification information used for verifying the biological body from a set of position information with which it is possible to identify the position corresponding to the selected biometric information.

Description

Information processing apparatus, information processing method, and information processing program
The present invention relates to an information processing apparatus, an information processing method, and an information processing program.
The human body has biometric information with which an individual can be identified, and some of it is used as information for identifying and authenticating individuals. For example, it is known that biometric information usable for authentication includes fingerprints, the retinas and irises of the eyes, faces, blood vessels, and DNA (Deoxyribo Nucleic Acid).
With the recent progress of biometric authentication technology, various apparatuses have been provided that recognize features of a living body, which is a part of the human body, and perform personal authentication. In biometric authentication, authentication is performed by comparing biometric information collected at registration (a registration template) with biometric information acquired at authentication.
To improve the accuracy of authentication using biometric information, it is desirable to acquire biometric information of consistent quality at each authentication. However, a user to be authenticated does not necessarily assume an appropriate posture at authentication. For this reason, for authentication using palm or finger veins, a positioning device has been proposed that guides the hand to an appropriate position by bringing the tips or bases of the fingers into contact with positioning guides.
There is also a proposal for a biometric information reading system that, when acquiring finger vein patterns for verification, acquires finger vein patterns in a plurality of finger postures by moving a finger rest vertically and horizontally with a drive device.
JP 11-47119 A
JP 2008-250601 A
However, since the shape of the living body differs from individual to individual, the posture in which the features of the living body can best be acquired may also differ from individual to individual, and it is difficult to guide all users uniformly to an optimal posture. In addition, increasing the number of patterns of recorded biometric information causes problems such as an increase in the recording capacity required, a decrease in authentication speed, and an increase in the false acceptance rate.
The present invention has been made in view of these points, and an object thereof is to provide an information processing apparatus, an information processing method, and an information processing program that can acquire biometric information to be used for verification in consideration of individual differences.
In order to solve the above problem, an information processing apparatus includes an information acquisition unit, a holding unit, a posture changing unit, an evaluation unit, and a generation unit. The information acquisition unit acquires biometric information from a living body. The holding unit holds the living body relative to the information acquisition unit. The posture changing unit changes the posture of the living body held by the holding unit. The evaluation unit evaluates feature amounts of the biometric information acquired by the information acquisition unit in each of a plurality of different postures. The generation unit generates verification information to be used for verification of the living body from biometric information selected, based on the evaluation, from among the plurality of pieces of acquired biometric information, and from posture information capable of specifying the posture corresponding to the selected biometric information.
In order to solve the above problem, an information processing method executed by a computer changes the posture of a living body held by a holding unit, acquires biometric information from the living body in each of a plurality of different postures, evaluates feature amounts of the acquired biometric information, and generates verification information to be used for verification of the living body from biometric information selected based on the evaluation from among the plurality of pieces of acquired biometric information, and from posture information capable of specifying the posture corresponding to the selected biometric information.
In order to solve the above problem, an information processing program causes a computer to execute a process of changing the posture of a living body held by a holding unit, acquiring biometric information from the living body in each of a plurality of different postures, evaluating feature amounts of the acquired biometric information, and generating verification information to be used for verification of the living body from biometric information selected based on the evaluation from among the plurality of pieces of acquired biometric information, and from posture information capable of specifying the posture corresponding to the selected biometric information.
According to the above information processing apparatus, information processing method, and information processing program, biometric information to be used for verification can be acquired in consideration of individual differences.
These and other objects, features and advantages of the present invention will become apparent from the following description taken in conjunction with the accompanying drawings which illustrate preferred embodiments by way of example of the present invention.
FIG. 1 is a diagram showing the configuration of the information processing apparatus of the first embodiment.
FIG. 2 is a diagram showing the configuration of the authentication system of the second embodiment.
FIG. 3 is a diagram showing the external appearance of the sensor unit of the second embodiment.
FIG. 4 is a diagram showing an example of changing the finger interval of the three-finger support part of the second embodiment.
FIG. 5 is a diagram showing an example of changing the first finger/fifth finger droop amount of the first finger/fifth finger support part of the second embodiment.
FIG. 6 is a diagram showing an example of changing the wrist support part position of the wrist support part of the second embodiment.
FIG. 7 is a diagram showing the configuration of the sensor unit of the second embodiment.
FIG. 8 is a diagram showing a hardware configuration example of the registration device of the second embodiment.
FIG. 9 is a flowchart of the template registration process of the second embodiment.
FIG. 10 is a flowchart of the wrist support part position adjustment process of the second embodiment.
FIG. 11 is a flowchart of the posture adjustment range determination process of the second embodiment.
FIG. 12 is a flowchart of the finger interval change process of the second embodiment.
FIG. 13 is a flowchart of the first finger/fifth finger droop amount change process of the second embodiment.
FIG. 14 is a flowchart of the biometric information extraction process for verification of the second embodiment.
FIG. 15 is a diagram showing an example of the finger interval biometric information table of the second embodiment.
FIG. 16 is a diagram showing an example of the first finger/fifth finger droop amount biometric information table of the second embodiment.
FIG. 17 is a diagram showing an example of the order table of the second embodiment.
FIG. 18 is a diagram showing an example of the combined biometric information table of the second embodiment.
FIG. 19 is a flowchart of the authentication process of the second embodiment.
Hereinafter, embodiments will be described with reference to the drawings.

[First Embodiment]

First, the information processing apparatus according to the first embodiment will be described with reference to FIG. 1. FIG. 1 is a diagram illustrating the configuration of the information processing apparatus according to the first embodiment.
The information processing apparatus 1 generates verification information to be registered in advance for authenticating a user using biometric information. Biometric information is information unique to the user's body that can uniquely identify the user; an example is the vein pattern of a palm.
The information processing apparatus 1 includes a holding unit 1a, a posture changing unit 1b, an information acquisition unit 1c, an evaluation unit 1d, and a generation unit 1e. The holding unit 1a holds the living body 2 when biometric information is acquired from the living body 2. The information acquisition unit 1c acquires biometric information from the living body 2 held by the holding unit 1a. The posture changing unit 1b changes the posture of the living body 2 held by the holding unit 1a. Thereby, the information processing apparatus 1 can acquire biometric information of the living body 2 in various postures.
The evaluation unit 1d evaluates the feature amounts of the biometric information acquired by the information acquisition unit 1c in each of a plurality of different postures. The generation unit 1e selects the biometric information to be used for verification from among the plurality of pieces of acquired biometric information on the basis of the evaluation performed by the evaluation unit 1d. The generation unit 1e then generates verification information used for verification of the living body from the selected biometric information and posture information capable of specifying the posture corresponding to it. The posture information is information capable of specifying the posture set by the posture changing unit 1b.
Thereby, the information processing apparatus 1 acquires biometric information in various postures from the living body 2 and selects biometric information suited to verification. That is, even when the appropriate posture differs from individual to individual, the information processing apparatus 1 can select biometric information suited to verification. Furthermore, by generating the verification information from the biometric information and the posture information corresponding to it, the information processing apparatus 1 makes it possible to perform authentication in the same posture as at registration.
[Second Embodiment]

Next, an authentication system according to the second embodiment will be described with reference to FIG. 2. FIG. 2 is a diagram illustrating the configuration of the authentication system according to the second embodiment. The second embodiment exemplifies an information processing system in which the authentication system 3 performs authentication using palm veins; however, the invention is not limited to this, and it is also applicable to systems that perform authentication with other feature detection sites of a living body whose feature amounts change with posture. More preferably, the authentication system 3 is applicable to systems that perform authentication with a feature detection site, such as a palm, whose posture changes involve not only yawing, pitching, and rolling but also changes in shape.
 The authentication system 3 is an information processing system that recognizes the characteristics of a living body to identify and authenticate an individual; for example, it authenticates customers in a banking system. The authentication system 3 includes information processing devices such as a registration device 10, a plurality of automated teller machines 6, and an authentication server 4, together with a network 8.
 The authentication server 4 stores identification information for identifying an individual in association with verification information (a template) registered in advance of biometric authentication. The identification information for identifying an individual is a unique ID (IDentification) assigned to the user either directly (for example, a user number) or indirectly (for example, an account number).
 The verification information registered in advance includes biometric information for matching and posture information for matching. The biometric information for matching is, for example, feature information obtained by extracting feature portions from image information with a predetermined feature extraction algorithm, or encoded information obtained by encoding the image information or the feature information. The posture information for matching designates the posture to be used at matching time.
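 As described above, a template pairs biometric data for matching with the posture in which it was captured. As a minimal sketch of that pairing, the template could be modeled as follows; all type and field names here are illustrative assumptions, not terms from the patent:

```python
from dataclasses import dataclass

@dataclass
class PostureInfo:
    """Posture at capture time; field names are illustrative."""
    finger_interval_mm: float   # spacing set by the finger separation ribs
    finger_droop_mm: float      # first/fifth finger droop amount
    wrist_position_mm: float    # wrist support position

@dataclass
class VerificationInfo:
    """A template: biometric data for matching plus its capture posture."""
    user_id: str                # direct (user number) or indirect (account number) ID
    biometric_features: bytes   # encoded feature data extracted from the palm image
    posture: PostureInfo        # posture to reproduce at authentication time
```

 At authentication time, the posture fields would be replayed by the guide mechanism before the biometric fields are matched.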
 One or more automated teller machines 6 are installed in an ATM (Automated Teller Machine) corner 5 inside a financial institution or in an ATM booth 7. The automated teller machine 6 is an authentication device that performs biometric authentication when authenticating a user prior to a financial transaction. The automated teller machine 6 includes an IC (Integrated Circuit) card reader/writer 17 and a sensor unit 20.
 The sensor unit 20 includes an imaging device and captures a vein image of the palm of the user. The automated teller machine 6 authenticates the user from the verification information (biometric information for matching) identified by the identification information that the IC card reader/writer 17 reads from the user's IC card (for example, a cash card with a built-in IC chip), and from the user's biometric information acquired by the sensor unit 20.
 Based on the posture information for matching, the sensor unit 20 holds the posture of the user's palm in the same posture as at template registration, and acquires biometric information in that posture. That is, the sensor unit 20 is a biometric information acquisition device, and the automated teller machine 6 is an authentication device provided with the biometric information acquisition device.
 The registration device 10 is installed at a bank counter or the like and registers a user's template in accordance with the instructions or operations of an attendant. The registration device 10 includes a processing device 11, a display 12, and a sensor unit 20, and, as necessary, a keyboard 13, a mouse 14, an IC card reader/writer 15, and the like. The sensor unit 20 has a built-in imaging device; it images the palm of the user and outputs the captured image to the processing device 11. The IC card reader/writer 15 reads and writes information on the user's IC card 16. The keyboard 13 and the mouse 14 accept input operations.
 Here, template registration (registration of verification information) in the registration device 10 will be described. A user requesting template registration enters identification information for identifying the user (for example, a user ID) via the keyboard 13, the mouse 14, or the IC card reader/writer 15. The registration device 10 guides the user through template registration with a display on the display 12 and requests the input of biometric information for template registration. The user inputs biometric information by holding a hand over the sensor unit 20. The sensor unit 20 acquires a plurality of pieces of biometric information while changing the posture of the hand, and the biometric information to be registered is selected from the acquired pieces.
 The registration device 10 creates verification information from the selected biometric information and the posture information corresponding to it, and records the verification information in at least one of the storage unit of the processing device 11, the storage unit of the authentication server 4, and the storage unit of the user's IC card 16. When performing biometric authentication, the automated teller machine 6 queries the template in the storage unit of the authentication server 4 or the storage unit of the IC card 16 and matches the input biometric information against it.
 Next, the sensor unit of the second embodiment will be described with reference to FIGS. 3 to 6. FIG. 3 is a diagram illustrating the appearance of the sensor unit of the second embodiment. FIG. 4 is a diagram illustrating an example of changing the finger interval of the three-finger support portion of the second embodiment. FIG. 5 is a diagram illustrating an example of changing the first/fifth finger droop amount of the first/fifth finger support portions of the second embodiment. FIG. 6 is a diagram illustrating an example of changing the wrist support position of the wrist support portion of the second embodiment.
 The sensor unit 20 includes a guide portion 40 that supports the palm and a sensor 26 that images the palm. The guide portion 40 is box-shaped with an open top and has a recessed chamber 24 that widens from its bottom toward the opening. The recessed chamber 24 places the palm at an appropriate distance from the sensor 26. The recessed chamber 24 also keeps ambient light out of the imaging range of the sensor 26 and prevents unwanted background from appearing in the image.
 The sensor 26 is located on the bottom surface of the recessed chamber 24, facing the opening. The sensor 26 includes an image sensor that images the palm (for example, a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor), a condenser lens, and a plurality of light-emitting elements (LEDs: Light Emitting Diodes) that illuminate the subject and are used to measure the distance to it.
 The guide portion 40 includes a three-finger support portion 21, first/fifth finger support portions 25, and a wrist support portion 27. The three-finger support portion 21 has two finger separation ribs 22, which position and support three fingers (the index, middle, and ring fingers). The positions of the two finger separation ribs 22 can be changed, so the interval between the fingers placed on the three-finger support portion 21 is adjustable. For example, the guide portion 40 can narrow the finger interval W2 of the fingers placed on the three-finger support portion 21 to a finger interval W1, or widen the finger interval W2 to a finger interval W3.
 The first/fifth finger support portions 25 are support members arranged on the left and right of the recessed chamber 24; they can support the first finger (thumb) and the fifth finger (little finger), or those fingers together with the thenar and hypothenar eminences. The height of the first/fifth finger support portions 25 can be changed, so the droop amount (height) of the first and fifth fingers placed on them is adjustable.
 The droop amounts of the first and fifth fingers are adjusted because the thenar eminence at the base of the first finger and the hypothenar eminence at the base of the fifth finger rise above the central part of the palm, and the amount of this rise differs from person to person; the adjustment therefore makes the posture appropriate for each individual. For example, the guide portion 40 can reduce the first/fifth finger droop amount H2 of the first and fifth fingers placed on the first/fifth finger support portions 25 to a droop amount H1, or increase the droop amount H2 to a droop amount H3.
 The wrist support portion 27 supports the wrist and can adjust the position of the palm in the front-rear direction (the direction along the front-rear axis). For example, the guide portion 40 can advance the wrist support portion 27 from a wrist support position L2 to a wrist support position L1, or retract the wrist support portion 27 from the wrist support position L2 to a wrist support position L3.
 In this way, the two finger separation ribs 22 of the guide portion 40 let the fingers open naturally when the palm is placed on the guide portion 40 and guide the entire palm so that it lies horizontally. Because the two finger separation ribs 22 also make the boundary between the palm and the fingers explicit, the guide portion 40 contributes to improving the accuracy of palm contour extraction. Furthermore, the first/fifth finger support portions 25 keep the entire palm horizontal, reducing deformation of the vein pattern and contributing to stable authentication accuracy.
 Next, the configuration of the sensor unit 20 will be described with reference to FIG. 7. FIG. 7 is a diagram illustrating the configuration of the sensor unit of the second embodiment.
 The sensor unit (biometric information acquisition device) 20 includes a sensing portion 30 and a guide portion (guide device) 40. The sensing portion 30 captures a vein image of the palm and transmits the captured data to the processing device 11. The sensing portion 30 includes a storage unit 31, an imaging unit 32, a control unit 33, and a communication unit 34.
 The control unit 33 controls the processing units in an integrated manner. The imaging unit 32 (sensor 26) acquires image information from the living body serving as the subject. The storage unit 31 temporarily stores the image information acquired by the imaging unit 32. The communication unit 34 communicates with the processing device 11 and the guide portion 40.
 The imaging unit 32 captures near-infrared light reflected from the living body (the palm) serving as the subject. Because the hemoglobin in the red blood cells flowing through the veins has given up its oxygen, this hemoglobin (deoxidized hemoglobin) absorbs near-infrared light in the range of about 700 nm (nanometers) to 1000 nm. Therefore, when the palm is irradiated with near-infrared light, only the portions where veins are present reflect little of it, and the positions of the veins can be recognized from the intensity of the reflected near-infrared light. Using a specific light source makes it easy to extract characteristic information from the image captured by the imaging unit 32, although the captured image is achromatic.
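 Since the text states only that vein regions reflect less near-infrared light, a simple intensity threshold is enough to illustrate the recognition principle. The image representation and the threshold value below are assumptions for illustration, not parameters from the patent:

```python
def vein_mask(nir_image, threshold=0.4):
    """Mark pixels whose near-infrared reflectance is low as vein candidates.

    nir_image is a 2-D list of reflectance values in [0, 1]. Vein pixels
    absorb more near-infrared light, so they reflect less; pixels below
    the (assumed) threshold are flagged as 1, others as 0.
    """
    return [[1 if reflectance < threshold else 0 for reflectance in row]
            for row in nir_image]
```

 A real implementation would operate on the sensor's raw intensity image and apply filtering before thresholding; this sketch shows only the low-reflectance criterion stated in the text.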
 The guide portion 40 changes the posture of the palm and transmits control data from which the changed posture can be identified to the processing device 11. The guide portion 40 can transmit this control data to the processing device 11 via the sensing portion 30.
 The guide portion 40 includes a communication unit 41, a control unit 42, motors (for example, stepping motors) 44, 45, and 46, position sensors 47, 49, and 51, and load sensors 48, 50, and 52. The control unit 42 controls the processing units in an integrated manner. The communication unit 41 communicates with the sensing portion 30.
 The motor 44 drives the finger separation ribs 22, the motor 45 drives the first/fifth finger support portions 25, and the motor 46 drives the wrist support portion 27. The position sensor 47 detects the position of the finger separation ribs 22, the position sensor 49 detects the position of the first/fifth finger support portions 25, and the position sensor 51 detects the position of the wrist support portion 27; each of these position sensors may instead detect its position from the drive amount of the corresponding motor 44, 45, or 46. The load sensor 48 detects the load on the motor 44, the load sensor 50 detects the load on the motor 45, and the load sensor 52 detects the load on the motor 46.
 As the drive mechanisms by which the motors 44, 45, and 46 drive their respective parts, known drive mechanisms using cams or the like can be used.
 The control unit 42 drives the motor 44 according to the position of the finger separation ribs 22 detected by the position sensor 47, and, for the user's safety, stops the motor 44 when the load sensor 48 detects a load. Likewise, the control unit 42 drives the motor 45 according to the position of the first/fifth finger support portions 25 detected by the position sensor 49, stopping the motor 45 when the load sensor 50 detects a load, and drives the motor 46 according to the position of the wrist support portion 27 detected by the position sensor 51, stopping the motor 46 when the load sensor 52 detects a load.
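 The position-feedback drive with a safety stop on load detection can be sketched as a simple control loop shared by all three axes. The sensor callbacks, the load limit, the tolerance, and the step size are hypothetical, not values from the patent:

```python
def drive_axis(read_position_mm, read_load, step_motor, target_mm,
               load_limit=1.0, tolerance_mm=0.5):
    """Drive one guide axis (ribs 22, supports 25, or wrist support 27)
    toward target_mm. Returns True on arrival within tolerance, or
    False if the load sensor trips and the motor is stopped for safety."""
    while abs(read_position_mm() - target_mm) > tolerance_mm:
        if read_load() > load_limit:   # load detected: stop for user safety
            return False
        # step toward the target (direction only; step size is the motor's)
        step_motor(+1 if read_position_mm() < target_mm else -1)
    return True
```

 The control unit 42 would run one such loop per motor/position-sensor/load-sensor triple.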
 Next, an example of the hardware configuration of the registration device 10 will be described with reference to FIG. 8. FIG. 8 is a diagram illustrating an example of the hardware configuration of the registration device of the second embodiment.
 The registration device 10 includes the processing device 11, the display 12, the keyboard 13, the mouse 14, the sensor unit 20, and the IC card reader/writer 15.
 The processing device 11 is controlled as a whole by a CPU (Central Processing Unit) 101. A RAM (Random Access Memory) 102, an HDD (Hard Disk Drive) 103, a communication interface 104, a graphics processor 105, and an input/output interface 106 are connected to the CPU 101 via a bus 107.
 The RAM 102 temporarily stores at least part of the OS (Operating System) programs and application programs to be executed by the CPU 101, as well as various data necessary for processing by the CPU 101. The HDD 103 stores the OS and application programs.
 The display 12 is connected to the graphics processor 105, which displays images on the screen of the display 12 in accordance with commands from the CPU 101.
 The keyboard 13, the mouse 14, the sensor unit 20, and the IC card reader/writer 15 are connected to the input/output interface 106. The input/output interface 106 can also be connected to a portable recording medium interface capable of writing information to and reading information from a portable recording medium 110. The input/output interface 106 transmits the signals sent from the keyboard 13, the mouse 14, the sensor unit 20, the IC card reader/writer 15, and the portable recording medium interface to the CPU 101 via the bus 107.
 The communication interface 104 is connected to the network 8 and exchanges data with other computers (for example, the authentication server 4).
 The hardware configuration described above can realize the processing functions of the present embodiment. The authentication server 4 and the automated teller machine 6 can be realized with a similar hardware configuration.
 The processing device 11 can also be configured to include modules consisting of FPGAs (Field Programmable Gate Arrays), DSPs (Digital Signal Processors), and the like, and can be configured without the CPU 101. In that case, the processing device 11 includes a nonvolatile memory (for example, an EEPROM (Electrically Erasable and Programmable Read Only Memory), a flash memory, or a flash-memory-type memory card) that stores the firmware of the modules. Firmware can be written to the nonvolatile memory via the portable recording medium 110 or the communication interface 104, so the processing device 11 can also update its firmware by rewriting the firmware stored in the nonvolatile memory.
 Next, the template registration process executed by the registration device 10 will be described with reference to FIG. 9. FIG. 9 is a flowchart of the template registration process of the second embodiment. The template registration process is executed, for example, in response to a template-registration operation by an attendant.
 [Step S11] The processing device 11 announces the start of template registration. This announcement can be made with a display on the display 12 or with sound from a speaker (not shown); for example, the processing device 11 displays "Template registration will now start." on the display 12.
 [Step S12] The processing device 11 executes a wrist support position adjustment process that adjusts the wrist support position so that the boundary between the palm and the fingers falls within the imaging range of the sensor 26. The details of the wrist support position adjustment process are described later with reference to FIG. 10.
 [Step S13] The processing device 11 executes a posture adjustment range determination process that determines the adjustment ranges for the finger interval and the first/fifth finger droop amount. The details of the posture adjustment range determination process are described later with reference to FIG. 11.
 [Step S14] The processing device 11 executes a finger interval changing process that changes the finger interval. The details of the finger interval changing process are described later with reference to FIGS. 12 and 15.
 [Step S15] The processing device 11 executes a first/fifth finger droop amount changing process that changes the first/fifth finger droop amount. The details of the first/fifth finger droop amount changing process are described later with reference to FIGS. 13 and 16.
 [Step S16] The processing device 11 executes a matching biometric information extraction process that extracts the biometric information for matching. The details of the matching biometric information extraction process are described later with reference to FIGS. 14, 17, and 18.
 [Step S17] The processing device 11 generates verification information including the extracted biometric information for matching and the posture information for matching that corresponds to it.
 [Step S18] The processing device 11 registers the generated verification information as a template by recording it in at least one of the storage unit of the processing device 11, the storage unit of the authentication server 4, and the storage unit of the user's IC card 16, and then ends the template registration process.
 Next, the wrist support position adjustment process executed by the registration device 10 will be described with reference to FIG. 10. FIG. 10 is a flowchart of the wrist support position adjustment process of the second embodiment. The wrist support position adjustment process is executed within the template registration process.
 [Step S21] The processing device 11 acquires the initial value of the wrist support position. The initial value is set so as to cover a predetermined proportion of the intended users; for example, it is set to 110 mm (millimeters), the distance between the finger-palm boundary and the wrist, a value said to cover 95% of Japanese adults.
 Different initial values may instead be acquired for each condition, such as sex or age, based on information such as sex and age entered in advance. Alternatively, the size of the hand may be judged from an image captured upon detecting that a hand is held over the sensor 26, and the initial value acquired accordingly.
 [Step S22] The processing device 11 announces that the posture of the palm will be adjusted. This announcement can be made with a display on the display 12 or with sound from a speaker (not shown); for example, the processing device 11 displays "Please place your palm." on the display 12.
 [Step S23] The processing device 11 instructs the sensor unit 20 to set the position of the wrist support portion 27 to the initial value of 110 mm, and the sensor unit 20 sets the position of the wrist support portion 27 to that initial value in accordance with the instruction.
 [Step S24] The processing device 11 instructs the sensor unit 20 to acquire biometric information (a captured image of the palm). The sensor unit 20 responds to the processing device 11 with the acquired biometric information and the posture information at the time of acquisition.
 [Step S25] The processing device 11 determines from the captured image of the palm whether the wrist support position is correct. If the wrist is missing from the captured image of the palm, the processing device 11 determines that the wrist is located too far back and is not in the correct position. If an arm appears in the captured image of the palm, the processing device 11 determines that the wrist is located too far forward and is not in the correct position. If the wrist is not missing and no arm appears in the captured image, the processing device 11 determines that the wrist support position is correct.
 When it determines that the wrist is located too far back, the processing device 11 returns to step S23 and instructs the sensor unit 20 to advance the position of the wrist support portion 27 by a predetermined amount (for example, 5 mm). When it determines that the wrist is located too far forward, the processing device 11 returns to step S23 and instructs the sensor unit 20 to retract the position of the wrist support portion 27 by a predetermined amount (for example, 5 mm). When it determines that the wrist support position is correct, the processing device 11 ends the wrist support position adjustment process.
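 The adjustment loop of steps S23 through S25 can be sketched as follows. The `capture` interface, the sign convention used for advancing the support, and the iteration limit are illustrative assumptions:

```python
def adjust_wrist_support(move_support, capture, initial_mm=110, step_mm=5,
                         max_tries=20):
    """Wrist support position adjustment of FIG. 10 (S21-S25).

    capture() returns a dict with the flags 'wrist_missing' (wrist too
    far back in the image) and 'arm_visible' (wrist too far forward).
    The sign of the per-step correction is illustrative only.
    """
    position = initial_mm
    for _ in range(max_tries):
        move_support(position)        # S23: set the wrist support position
        image = capture()             # S24: acquire palm image + posture info
        if image["wrist_missing"]:
            position += step_mm       # wrist too far back: advance the support
        elif image["arm_visible"]:
            position -= step_mm       # wrist too far forward: retract the support
        else:
            return position           # S25: correct position reached
    raise RuntimeError("wrist support position did not converge")
```

 Each pass repeats the capture-and-check cycle until the image shows neither a missing wrist nor a visible arm.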
 Next, the posture adjustment range determination process executed by the registration device 10 will be described with reference to FIG. 11. FIG. 11 is a flowchart of the posture adjustment range determination process of the second embodiment. The posture adjustment range determination process is executed within the template registration process.
 [Step S31] The processing device 11 determines the form of the living body. For example, the form of the living body is determined by classifying the palm, based on predetermined criteria such as palm length and width, as large, small, or standard in size, or as wide or narrow. This determination can be made using the biometric information acquired in step S24 of the wrist support position adjustment process.
 [Step S32] Based on the determined form of the living body, the processing device 11 determines the adjustment range of the finger interval, that is, the adjustment range of the distance between the two finger separation ribs 22. It likewise determines the adjustment range of the first-finger/fifth-finger droop amount, that is, the adjustment range of the height of the first-finger/fifth-finger support 25.
 For example, the processing device 11 sets the finger interval adjustment range to 15 mm to 35 mm for a standard-size palm, 15 mm to 25 mm for a small palm, and 25 mm to 35 mm for a large palm. It sets the first-finger/fifth-finger droop adjustment range to 0 mm to 10 mm for a standard-size palm, 0 mm to 5 mm for a small palm, and 5 mm to 10 mm for a large palm.
 [Step S33] Based on the determined form of the living body, the processing device 11 determines the initial adjustment value of the finger interval, that is, the initial value of the distance between the two finger separation ribs 22. It likewise determines the initial adjustment value of the first-finger/fifth-finger droop amount, that is, the initial value of the height of the first-finger/fifth-finger support 25.
 For example, the processing device 11 sets the initial finger interval to 25 mm for a standard-size palm, 20 mm for a small palm, and 30 mm for a large palm. It sets the initial first-finger/fifth-finger droop amount to 5 mm for a standard-size palm, 3 mm for a small palm, and 7 mm for a large palm.
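 The class-dependent selections of steps S32 and S33 amount to a simple lookup. The tables below merely restate the example values given in the text (in mm); the class names are illustrative labels for the classification of step S31.

```python
# Example values from the text, keyed by the palm-size class of step S31.
FINGER_GAP = {
    'small':    {'range': (15, 25), 'init': 20},
    'standard': {'range': (15, 35), 'init': 25},
    'large':    {'range': (25, 35), 'init': 30},
}
FINGER_SAG = {
    'small':    {'range': (0, 5),  'init': 3},
    'standard': {'range': (0, 10), 'init': 5},
    'large':    {'range': (5, 10), 'init': 7},
}

def decide_posture_params(palm_class):
    """Steps S32-S33: look up the adjustment range and initial value
    for the finger interval and the first/fifth finger droop amount."""
    return FINGER_GAP[palm_class], FINGER_SAG[palm_class]
```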
 Note that the processing device 11 may instead determine the initial finger interval and the initial first-finger/fifth-finger droop amount individually, based on the measured values of the biometric information acquired in step S24.
 [Step S34] Based on the determined form of the living body, the processing device 11 determines the adjustment unit of the finger interval, that is, the adjustment unit of the distance between the two finger separation ribs 22. It likewise determines the adjustment unit of the first-finger/fifth-finger droop amount, that is, the adjustment unit of the height of the first-finger/fifth-finger support 25.
 For example, the processing device 11 sets the finger interval adjustment unit to 1 mm for a standard-size palm, 0.5 mm for a small palm, and 1.5 mm for a large palm. It sets the first-finger/fifth-finger droop adjustment unit to 1 mm for standard-size and large palms, and 0.5 mm for a small palm.
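 Together with the range from step S32, the adjustment unit of step S34 fixes how many capture positions one sweep visits. A small sketch, again using only the example values from the text:

```python
# Example adjustment units from the text (step S34), in mm.
GAP_UNIT = {'small': 0.5, 'standard': 1.0, 'large': 1.5}
SAG_UNIT = {'small': 0.5, 'standard': 1.0, 'large': 1.0}

def sweep_positions(lo, hi, unit):
    """Number of distinct positions visited when stepping from the
    range minimum toward the maximum one adjustment unit at a time."""
    return int((hi - lo) // unit) + 1
```

 For instance, a small palm sweeps the 15 mm to 25 mm interval range in 0.5 mm units (21 positions), while a standard palm sweeps 15 mm to 35 mm in 1 mm units (also 21 positions); the finer unit keeps the capture count comparable across hand sizes.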
 Note that the adjustment unit may instead be a motor drive amount (for example, a number of steps or an amount of rotation) or a gear rotation amount.
 After determining the adjustment units, the processing device 11 ends the posture adjustment range determination process.
 In this way, the registration device 10 determines the adjustment range, initial adjustment value, and adjustment unit based on the form of the living body, so the biometric information can be acquired efficiently.
 Next, the finger interval change process executed by the registration device 10 will be described with reference to FIG. 12. FIG. 12 is a flowchart of the finger interval change process according to the second embodiment. The finger interval change process is executed as part of the template registration process.
 [Step S41] The processing device 11 sets the initial adjustment values determined in step S33 of the posture adjustment range determination process and instructs the sensor unit 20 to set the finger interval and the first-finger/fifth-finger droop amount to those initial values. The sensor unit 20 sets the finger interval to the initial value as instructed by the processing device 11.
 [Step S42] The processing device 11 instructs the sensor unit 20 to acquire biometric information (a palm vein image). The sensor unit 20 returns the acquired biometric information, together with the posture information at the time of acquisition, to the processing device 11.
 [Step S43] The processing device 11 updates the adjustment value by adding the adjustment unit determined in step S34 of the posture adjustment range determination process. That is, the processing device 11 widens the finger interval by one adjustment unit.
 [Step S44] The processing device 11 determines whether the adjustment value is within the adjustment range determined in step S32 of the posture adjustment range determination process. If the adjustment value is within the range, the processing device 11 instructs the sensor unit 20 to update the finger interval and returns to step S42. If the adjustment value exceeds the range, the processing device 11 proceeds to step S45.
 [Step S45] The processing device 11 sets the initial adjustment value determined in step S33 and instructs the sensor unit 20 to return the finger interval to that initial value. The sensor unit 20 sets the finger interval to the initial value as instructed.
 [Step S46] The processing device 11 instructs the sensor unit 20 to acquire biometric information (a palm vein image). The sensor unit 20 returns the acquired biometric information, together with the posture information at the time of acquisition, to the processing device 11.
 [Step S47] The processing device 11 updates the adjustment value by subtracting the adjustment unit determined in step S34. That is, the processing device 11 narrows the finger interval by one adjustment unit.
 [Step S48] The processing device 11 determines whether the adjustment value is within the adjustment range determined in step S32. If the adjustment value is within the range, the processing device 11 instructs the sensor unit 20 to update the finger interval and returns to step S46. If the adjustment value falls outside the range, the processing device 11 ends the finger interval change process.
 In the above, the processing device 11 widens the interval from the initial value one adjustment unit at a time up to the maximum of the adjustment range, returns once to the initial value, and then narrows it one adjustment unit at a time down to the minimum of the adjustment range. Alternatively, it may return to the initial value after each single-unit widening or narrowing.
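 The flow of FIG. 12 (steps S41 through S48) can be sketched as a two-pass sweep. This is a minimal sketch; `capture` is a hypothetical stand-in for the sensor acquisition of steps S42 and S46.

```python
def sweep_capture(init, lo, hi, unit, capture):
    """Widen the finger interval from the initial value to the range
    maximum (S41-S44), return to the initial value, then narrow it to
    the range minimum (S45-S48), capturing biometric data at each
    setting. Returns (posture value, sample) pairs."""
    samples = []
    value = init
    while lo <= value <= hi:       # upward pass
        samples.append((value, capture(value)))
        value += unit
    value = init                   # return to the initial value
    while lo <= value <= hi:       # downward pass
        samples.append((value, capture(value)))
        value -= unit
    return samples
```

 Note that, as in the flowchart, the initial value itself is captured twice, once in each pass.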
 Next, the first-finger/fifth-finger droop amount change process executed by the registration device 10 will be described with reference to FIG. 13. FIG. 13 is a flowchart of the first-finger/fifth-finger droop amount change process according to the second embodiment. This process is executed as part of the template registration process.
 [Step S51] The processing device 11 sets the initial adjustment values determined in step S33 of the posture adjustment range determination process and instructs the sensor unit 20 to set the finger interval and the first-finger/fifth-finger droop amount to those initial values. The sensor unit 20 sets the first-finger/fifth-finger droop amount to the initial value as instructed by the processing device 11.
 [Step S52] The processing device 11 instructs the sensor unit 20 to acquire biometric information (a palm vein image). The sensor unit 20 returns the acquired biometric information, together with the posture information at the time of acquisition, to the processing device 11.
 [Step S53] The processing device 11 updates the adjustment value by adding the adjustment unit determined in step S34. That is, the processing device 11 increases the first-finger/fifth-finger droop amount by one adjustment unit.
 [Step S54] The processing device 11 determines whether the adjustment value is within the adjustment range determined in step S32. If the adjustment value is within the range, the processing device 11 instructs the sensor unit 20 to update the first-finger/fifth-finger droop amount and returns to step S52. If the adjustment value exceeds the range, the processing device 11 proceeds to step S55.
 [Step S55] The processing device 11 sets the initial adjustment value determined in step S33 and instructs the sensor unit 20 to return the first-finger/fifth-finger droop amount to that initial value. The sensor unit 20 sets the droop amount to the initial value as instructed.
 [Step S56] The processing device 11 instructs the sensor unit 20 to acquire biometric information (a palm vein image). The sensor unit 20 returns the acquired biometric information, together with the posture information at the time of acquisition, to the processing device 11.
 [Step S57] The processing device 11 updates the adjustment value by subtracting the adjustment unit determined in step S34. That is, the processing device 11 decreases the first-finger/fifth-finger droop amount by one adjustment unit.
 [Step S58] The processing device 11 determines whether the adjustment value is within the adjustment range determined in step S32. If the adjustment value is within the range, the processing device 11 instructs the sensor unit 20 to update the first-finger/fifth-finger droop amount and returns to step S56. If the adjustment value falls outside the range, the processing device 11 ends the first-finger/fifth-finger droop amount change process.
 As with the finger interval, the processing device 11 increases the droop amount from the initial value one adjustment unit at a time up to the maximum of the adjustment range, returns once to the initial value, and then decreases it one adjustment unit at a time down to the minimum. Alternatively, it may return to the initial value after each single-unit change.
 Next, the matching biometric information extraction process executed by the registration device 10 will be described with reference to FIGS. 14 to 18. FIG. 14 is a flowchart of the matching biometric information extraction process according to the second embodiment. FIG. 15 shows an example of the finger interval biometric information table of the second embodiment. FIG. 16 shows an example of the first-finger/fifth-finger droop amount biometric information table of the second embodiment. FIG. 17 shows an example of the ranking table of the second embodiment. FIG. 18 shows an example of the combined biometric information table of the second embodiment. The matching biometric information extraction process is executed as part of the template registration process.
 [Step S61] The processing device 11 calculates the feature amount of each piece of biometric information acquired in steps S42 and S46 of the finger interval change process. Together with the posture information (initial value, adjustment value) acquired in those steps, this yields the finger interval biometric information table 200, which records, for each of the pieces of biometric information (d000, d001, ...) acquired at different finger intervals, the associated initial value, adjustment value, and feature amount.
 Similarly, the processing device 11 calculates the feature amount of each piece of biometric information acquired in steps S52 and S56 of the first-finger/fifth-finger droop amount change process. Together with the posture information (initial value, adjustment value) acquired in those steps, this yields the first-finger/fifth-finger droop amount biometric information table 210, which records, for each of the pieces of biometric information (d100, d101, ...) acquired at different droop amounts, the associated initial value, adjustment value, and feature amount.
 The feature amount of a palm vein image (vein pattern) is evaluated comprehensively, for example by scoring individual items such as vein branch points and the amount of veins per unit area, and then weighting and combining the item scores. The evaluation of the feature amount contained in the biometric information is not limited to this example; any evaluation method may be adopted.
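 One possible shape of the weighted item evaluation described here. The item set and the weight values are illustrative assumptions, not taken from the source.

```python
def feature_score(branch_points, vein_density, w_branch=0.6, w_density=0.4):
    """Comprehensive evaluation as a weighted sum of per-item scores
    (vein branch points and amount of veins per unit area). The
    weights are assumed for illustration only."""
    return w_branch * branch_points + w_density * vein_density
```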
 [Step S62] The processing device 11 refers to the feature amounts in the finger interval biometric information table 200 and extracts the three posture information entries (initial value, adjustment value) with the largest feature amounts. The extracted posture information is recorded in the ranking table 220.
 For example, if the three largest feature amounts recorded in the finger interval biometric information table 200 are, in descending order, v003, v004, and v002, the processing device 11 records the posture information (initial value, adjustment value) corresponding to each of these feature amounts in the ranking table 220.
 [Step S63] The processing device 11 refers to the feature amounts in the first-finger/fifth-finger droop amount biometric information table 210 and extracts the three posture information entries (initial value, adjustment value) with the largest feature amounts. The extracted posture information is recorded in the ranking table 220.
 For example, if the three largest feature amounts recorded in the first-finger/fifth-finger droop amount biometric information table 210 are, in descending order, v103, v102, and v101, the processing device 11 records the corresponding posture information in the ranking table 220. In this way, the ranking table 220 holds the top three posture information entries from the biometric information acquired at different finger intervals and the top three posture information entries from the biometric information acquired at different droop amounts.
 [Step S64] The processing device 11 generates combination patterns from the extracted posture information. For example, when three posture information entries have been extracted from the finger interval biometric information table 200 and three from the first-finger/fifth-finger droop amount biometric information table 210, the processing device 11 generates nine (3 x 3) combination patterns.
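 Steps S62 through S64 amount to two top-3 selections followed by a Cartesian product. A minimal sketch, with each table modeled as a mapping from posture information to feature amount:

```python
import itertools

def top_postures(table, k=3):
    """Steps S62-S63: the k posture entries with the largest
    feature amounts, in descending order."""
    return sorted(table, key=table.get, reverse=True)[:k]

def combination_patterns(gap_table, sag_table, k=3):
    """Step S64: combine the top-k finger interval postures with the
    top-k droop amount postures (3 x 3 = 9 patterns in the example)."""
    return list(itertools.product(top_postures(gap_table, k),
                                  top_postures(sag_table, k)))
```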
 The combination patterns generated in this way are recorded in the combined biometric information table 230.
 [Step S65] The processing device 11 takes one of the combination patterns recorded in the combined biometric information table 230 and instructs the sensor unit 20 to update the finger interval and the first-finger/fifth-finger droop amount to that combination. The sensor unit 20 updates the finger interval and the droop amount as instructed.
 When updating the finger interval and the first-finger/fifth-finger droop amount, the sensor unit 20 may first return to the initial values and then apply the update. This removes the influence that the direction of approach has on the hand posture: even for the same adjustment value, the posture may differ depending on whether the value was reached by a positive or a negative adjustment. For example, a value of 26 mm can be reached by adjusting +1 mm from 25 mm or -1 mm from 27 mm, and the resulting hand posture may differ; returning once to the initial value equalizes the conditions under which the posture changes.
 [Step S66] The processing device 11 instructs the sensor unit 20 to acquire biometric information (a palm vein image). The sensor unit 20 returns the acquired biometric information, together with the posture information at the time of acquisition, to the processing device 11.
 [Step S67] The processing device 11 determines whether biometric information has been acquired for all of the combination patterns recorded in the combined biometric information table 230. If so, the processing device 11 proceeds to step S68; otherwise, it returns to step S65.
 [Step S68] The processing device 11 evaluates the feature amounts (v200, v201, ...) of the biometric information (d200, d201, ...) acquired for all of the combination patterns and records them in the combined biometric information table 230. The combined biometric information table 230 thus records, for each combination of the three finger interval posture entries (initial value, adjustment value) and the three droop amount posture entries (initial value, adjustment value), the associated biometric information and feature amount.
 The processing device 11 then refers to the combined biometric information table 230 and extracts the biometric information with the largest feature amount as the matching biometric information. The processing device 11 ends the matching biometric information extraction process with the extracted biometric information and its corresponding posture information serving as the matching biometric information and matching posture information. The posture information may include the posture information of the wrist support position in addition to that of the finger interval and the first-finger/fifth-finger droop amount.
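 The final selection in step S68 is an argmax over the combination table. A sketch, modeling the table as a mapping from posture combination to a (sample, feature amount) pair:

```python
def extract_matching_template(combined):
    """Step S68: keep the biometric sample with the largest feature
    amount, together with the posture information that produced it."""
    posture = max(combined, key=lambda p: combined[p][1])
    sample, score = combined[posture]
    return sample, posture, score
```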
 Alternatively, the processing device 11 may treat the top three entries as extraction candidates, acquire biometric information for each candidate several more times (for example, three times), evaluate the feature amounts, and only then extract the matching biometric information.
 In this way, the processing device 11 can acquire biometric information well suited for matching while accounting for individual differences (the shape, size, flexibility, and so on of the body part being captured).
 Furthermore, for the multiple posture adjustment elements (finger interval and first-finger/fifth-finger droop amount), the processing device 11 extracts the postures with large feature amounts for each element to build the ranking table 220, and then extracts the posture with the largest feature amount from among the combinations in the ranking table 220. As a result, even with multiple posture adjustment elements, the processing device 11 can reduce the processing time required to generate the matching information.
 Although the first-finger/fifth-finger support 25 described above adjusts the supporting heights of the first finger and the fifth finger together, the first-finger support and the fifth-finger support may each be provided with their own motor, position sensor, and load sensor so that they can be adjusted independently.
 Next, the authentication process executed by the automatic teller machine 6, which serves as the authentication device, will be described with reference to FIG. 19. FIG. 19 is a flowchart of the authentication process according to the second embodiment. The authentication process is executed, for example, when the IC card reader/writer 17 of the automatic teller machine 6 accepts the IC card 16, before a service requiring personal authentication is provided.
 [Step S71] The automatic teller machine 6 acquires a user ID (for example, an account number) from the IC card 16.
 [Step S72] The automatic teller machine 6 acquires posture information (the finger interval posture information and the first-finger/fifth-finger droop amount posture information) from the IC card 16.
 [Step S73] The automatic teller machine 6 instructs the sensor unit 20 to support the palm in the posture corresponding to the acquired posture information. At this point, the automatic teller machine 6 may first set the sensor unit 20 to the posture given by the initial values of the posture information and then prompt the user to place the palm. After the user places the palm, the automatic teller machine 6 updates the sensor unit 20 to the supporting posture given by the adjustment values. In this way, the automatic teller machine 6 can improve the reproducibility of the posture at the time the registration device 10 registered the biometric information.
 [Step S74] The automatic teller machine 6 acquires biometric information while reproducing the palm posture used when the biometric information was registered.
 [Step S75] The automatic teller machine 6 matches the biometric information acquired from the sensor unit 20 against the matching biometric information recorded on the IC card 16.
 [Step S76] If the matching of the biometric information acquired from the sensor unit 20 succeeds, the automatic teller machine 6 proceeds to step S77; if the matching fails, it proceeds to step S78.
 [Step S77] The automatic teller machine 6 sets the authentication result to "OK" and ends the authentication process.
 [Step S78] The automatic teller machine 6 sets the authentication result to "NG" and ends the authentication process.
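As a minimal illustration, steps S71 through S78 above can be sketched as follows. This is not the patented implementation: the IC card layout, the sensor-unit interface, the similarity measure, and the match threshold are all assumptions made for the sketch, since the document does not define a concrete API.

```python
def authenticate(ic_card, sensor_unit, threshold=0.9):
    """Sketch of steps S71-S78: read the ID and posture information
    from the card, reproduce the registered posture, capture the
    biometric, and match it against the stored template."""
    user_id = ic_card["user_id"]                       # S71 (identifies the account)
    posture = ic_card["posture_info"]                  # S72
    # S73: guide the palm with the initial values first, then apply
    # the stored adjustment values once the palm has been placed.
    sensor_unit.set_posture(posture["initial"])
    sensor_unit.set_posture(posture["adjusted"])
    captured = sensor_unit.capture()                   # S74
    score = similarity(ic_card["template"], captured)  # S75
    return "OK" if score >= threshold else "NG"        # S76-S78


def similarity(template, captured):
    # Toy matcher: fraction of identical feature values (assumption).
    hits = sum(1 for a, b in zip(template, captured) if a == b)
    return hits / max(len(template), 1)


class FakeSensorUnit:
    """Stand-in for sensor unit 20 with movable palm guides."""
    def __init__(self, reading):
        self.reading = reading
        self.posture = None

    def set_posture(self, posture):
        self.posture = posture

    def capture(self):
        return self.reading
```

A real deployment would replace `FakeSensorUnit` and `similarity` with the palm-vein sensor driver and the vendor's matching engine.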
 In this way, the automatic teller machine 6 can guide the palm, the site from which the user's biometric information is acquired, into the same posture as when the verification information was registered, so an improvement in authentication accuracy can be expected.
 The processing functions described above can be realized by a computer. In that case, a program describing the processing content of the functions that each apparatus should have is provided, and executing the program on a computer realizes those processing functions on the computer. The program describing the processing content can be recorded on a computer-readable recording medium (including a portable recording medium). Computer-readable recording media include magnetic recording devices, optical discs, magneto-optical recording media, and semiconductor memories. Magnetic recording devices include hard disk drives (HDD), flexible disks (FD), and magnetic tapes. Optical discs include DVDs (Digital Versatile Discs), DVD-RAM, CD-ROM, and CD-R (Recordable)/RW (ReWritable). Magneto-optical recording media include MO (Magneto-Optical) disks.
 When the program is distributed, for example, portable recording media such as DVDs or CD-ROMs on which the program is recorded are sold. The program may also be stored in a storage device of a server computer and transferred from the server computer to another computer over a network.
 A computer that executes the program stores, for example, the program recorded on the portable recording medium, or the program transferred from the server computer, in its own storage device. The computer then reads the program from its own storage device and executes processing according to the program. The computer may also read the program directly from the portable recording medium and execute processing according to it, or execute processing according to the received program each time the program is transferred from the server computer.
 Various modifications may be made to the embodiments described above without departing from their gist.
 The foregoing merely illustrates the principle of the present invention. Many further modifications and variations will be apparent to those skilled in the art, and the present invention is not limited to the exact configurations and applications shown and described above; all corresponding modifications and equivalents are regarded as falling within the scope of the present invention as defined by the appended claims and their equivalents.
 1 Information processing apparatus
 1a Holding unit
 1b Posture changing unit
 1c Information acquisition unit
 1d Evaluation unit
 1e Generation unit
 2 Living body

Claims (8)

  1.  An information processing apparatus comprising:
     an information acquisition unit that acquires biometric information from a living body;
     a holding unit that holds the living body in correspondence with the information acquisition unit;
     a posture changing unit that changes a posture of the living body held by the holding unit;
     an evaluation unit that evaluates a feature quantity of biometric information acquired by the information acquisition unit for each of a plurality of different postures; and
     a generation unit that generates verification information, used for matching of the living body, from biometric information selected on the basis of the evaluation from among the plurality of pieces of acquired biometric information and from posture information capable of specifying the posture corresponding to the selected biometric information.
  2.  The information processing apparatus according to claim 1, wherein the posture changing unit changes the posture over a predetermined posture adjustment range in predetermined adjustment units.
  3.  The information processing apparatus according to claim 2, wherein the posture changing unit sets an initial value within the posture adjustment range, changes the posture in the adjustment units in either the positive direction or the negative direction, returns once to the initial value when the posture leaves the posture adjustment range, and then changes the posture in the adjustment units in the other of the positive and negative directions.
  4.  The information processing apparatus according to claim 2, wherein the posture changing unit sets an initial value within the posture adjustment range and returns once to the initial value each time the posture is changed.
  5.  The information processing apparatus according to any one of claims 1 to 4, wherein:
     the posture changing unit comprises:
     a first posture changing unit capable of changing a posture of a first part of the living body; and
     a second posture changing unit capable of changing a posture of a second part of the living body different from the first part; and
     before selecting the biometric information to be included in the verification information, the generation unit generates combination patterns of one or more first-part postures, selected on the basis of an evaluation of a plurality of pieces of biometric information acquired by changing the posture of the first part with the posture of the second part fixed, and one or more second-part postures, selected on the basis of an evaluation of a plurality of pieces of biometric information acquired by changing the posture of the second part with the posture of the first part fixed, and treats biometric information acquired in the postures given by the combination patterns as selection candidates for the biometric information to be included in the verification information.
  6.  The information processing apparatus according to any one of claims 1 to 4, further comprising a registration unit that registers the verification information on a verification information recording medium for each person to be verified.
  7.  An information processing method executed by a computer, the method comprising:
     changing a posture of a living body held by a holding unit;
     acquiring biometric information from the living body for each of a plurality of different postures;
     evaluating a feature quantity of the acquired biometric information; and
     generating verification information, used for matching of the living body, from biometric information selected on the basis of the evaluation from among the plurality of pieces of acquired biometric information and from posture information capable of specifying the posture corresponding to the selected biometric information.
  8.  An information processing program that causes a computer to execute a process comprising:
     changing a posture of a living body held by a holding unit;
     acquiring biometric information from the living body for each of a plurality of different postures;
     evaluating a feature quantity of the acquired biometric information; and
     generating verification information, used for matching of the living body, from biometric information selected on the basis of the evaluation from among the plurality of pieces of acquired biometric information and from posture information capable of specifying the posture corresponding to the selected biometric information.
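One illustrative reading of the claimed registration flow is sketched below: sweep the posture adjustment range per claims 2 and 3, keep a few promising per-part postures, form their combination patterns per claim 5, and retain the best-scoring capture together with the posture that produced it per claims 1 and 7. All function and parameter names, the scoring rule, and the top-N selection are assumptions for the sketch, not the patented implementation; `capture` and `evaluate` stand in for the sensor and the feature-quantity evaluator.

```python
from itertools import product


def sweep(initial, step, lo, hi):
    """Claim-3 style sweep: from the initial value, step in the positive
    direction until the adjustment range [lo, hi] is exceeded, return
    once to the initial value, then step in the negative direction."""
    values, v = [], initial
    while lo <= v <= hi:   # positive direction
        values.append(v)
        v += step
    v = initial            # return once to the initial value
    values.append(v)
    v -= step
    while lo <= v <= hi:   # negative direction
        values.append(v)
        v -= step
    return values


def top_postures(scored, top_n):
    # scored: list of (posture, score); keep the top_n postures.
    ranked = sorted(scored, key=lambda t: t[1], reverse=True)
    return [p for p, _ in ranked[:top_n]]


def register(capture, evaluate, rng1, rng2, top_n=2):
    """Claim-5 style candidate generation followed by the claim-1/7
    selection. rng1/rng2 are (initial, step, lo, hi) tuples for the
    first and second parts (e.g. finger spacing and finger drooping)."""
    # Vary the first part with the second part fixed at its initial value.
    best1 = top_postures(
        [(p1, evaluate(capture(p1, rng2[0]))) for p1 in sweep(*rng1)], top_n)
    # Vary the second part with the first part fixed at its initial value.
    best2 = top_postures(
        [(p2, evaluate(capture(rng1[0], p2))) for p2 in sweep(*rng2)], top_n)
    # Combination patterns of the selected per-part postures.
    candidates = list(product(best1, best2))
    # Keep the best-scoring capture with the posture that produced it.
    scored = [((p1, p2), capture(p1, p2)) for p1, p2 in candidates]
    posture, info = max(scored, key=lambda t: evaluate(t[1]))
    return {"verification_info": info, "posture_info": posture}
```

Recording the returned `posture_info` alongside the template is what lets a verification device (claim 6's recording medium, the IC card in the second embodiment) reproduce the registration posture later.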
PCT/JP2011/074827 2011-10-27 2011-10-27 Information processing device, information processing method, and information processing program WO2013061446A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2011/074827 WO2013061446A1 (en) 2011-10-27 2011-10-27 Information processing device, information processing method, and information processing program
JP2013540579A JP5655155B2 (en) 2011-10-27 2011-10-27 Information processing apparatus, information processing method, and information processing program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/074827 WO2013061446A1 (en) 2011-10-27 2011-10-27 Information processing device, information processing method, and information processing program

Publications (1)

Publication Number Publication Date
WO2013061446A1 true WO2013061446A1 (en) 2013-05-02

Family

ID=48167311

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/074827 WO2013061446A1 (en) 2011-10-27 2011-10-27 Information processing device, information processing method, and information processing program

Country Status (2)

Country Link
JP (1) JP5655155B2 (en)
WO (1) WO2013061446A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS634381A (en) * 1986-06-24 1988-01-09 Mitsubishi Electric Corp Fingerprint collating device
JPH09108204A (en) * 1995-10-24 1997-04-28 Kdk Corp Measurement position reproduction method, measurement position reproduction device, and body fluid component concentration measuring device using it
JP2007156936A (en) * 2005-12-07 2007-06-21 Hitachi Ltd Biometric information verification system
JP2007159610A (en) * 2005-12-09 2007-06-28 Matsushita Electric Ind Co Ltd Registration device, authentication device, registration authentication device, registration method, authentication method, registration program, and authentication program
JP2008250601A (en) * 2007-03-30 2008-10-16 Hitachi Omron Terminal Solutions Corp Biological information reading device and biological information reading system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4207717B2 (en) * 2003-08-26 2009-01-14 株式会社日立製作所 Personal authentication device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016154512A1 (en) * 2015-03-25 2016-09-29 Gojo Industries, Inc. Dispenser dosing based on hand size
US10219656B2 (en) 2015-03-25 2019-03-05 Gojo Industries, Inc. Dispenser dosing based on hand size

Also Published As

Publication number Publication date
JP5655155B2 (en) 2015-01-14
JPWO2013061446A1 (en) 2015-04-02


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11874732

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2013540579

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11874732

Country of ref document: EP

Kind code of ref document: A1