WO2013061446A1 - Information processing device, information processing method, and information processing program - Google Patents

Info

Publication number
WO2013061446A1
WO2013061446A1 (PCT/JP2011/074827)
Authority
WO
WIPO (PCT)
Prior art keywords
information
posture
finger
unit
biological information
Prior art date
Application number
PCT/JP2011/074827
Other languages
English (en)
Japanese (ja)
Inventor
英夫 鎌田
彰孝 皆川
東浦 康之
健太郎 鎹
克美 井出
Original Assignee
Fujitsu Frontech Limited (富士通フロンテック株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Frontech Limited (富士通フロンテック株式会社)
Priority to PCT/JP2011/074827 priority Critical patent/WO2013061446A1/fr
Priority to JP2013540579A priority patent/JP5655155B2/ja
Publication of WO2013061446A1 publication Critical patent/WO2013061446A1/fr

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V10/7515Shifting the patterns to accommodate for positional errors
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1365Matching; Classification

Definitions

  • the present invention relates to an information processing apparatus, an information processing method, and an information processing program.
  • the human body includes biological information that can identify an individual, and some of them are used as information for identifying and authenticating the individual.
  • biometric information that can be used for authentication includes fingerprints, eye retinas and irises, faces, blood vessels, and DNA (Deoxyribo Nucleic Acid).
  • biometric authentication is performed by comparing biometric information (registration template) collected during registration with biometric information acquired during authentication.
  • an increase in the number of patterns of recorded biometric information causes problems such as an increase in the storage capacity needed for recording the biometric information, a decrease in authentication speed, and an increase in the false acceptance rate.
  • the present invention has been made in view of these points, and an object thereof is to provide an information processing apparatus, an information processing method, and an information processing program that can acquire, in consideration of individual differences, the biometric information used for verification.
  • the information processing apparatus includes an information acquisition unit, a holding unit, a posture changing unit, an evaluation unit, and a generation unit.
  • the information acquisition unit acquires biological information from the living body.
  • the holding unit holds the living body corresponding to the information acquisition unit.
  • the posture changing unit changes the posture of the living body held by the holding unit.
  • the evaluation unit evaluates the feature amounts of the biometric information acquired by the information acquisition unit for each of a plurality of different postures.
  • the generating unit generates collation information used for biometric collation from the biometric information selected based on the evaluation among the plurality of obtained biometric information and the posture information that can specify the posture corresponding to the selected biometric information.
  • in the information processing method executed by a computer, the posture of a living body held by a holding unit is changed, biological information is acquired from the living body for each of a plurality of different postures, and the feature amounts of the acquired biological information are evaluated.
  • verification information used for biometric verification is then generated from the biological information selected, based on the evaluation, from among the plurality of acquired pieces of biological information, together with the posture information that can specify the posture corresponding to the selected biological information.
  • similarly, the information processing program causes a computer to change the posture of the living body held by the holding unit, acquire biological information from the living body for each of a plurality of different postures, evaluate the feature amounts of the acquired biological information, and generate verification information.
  • FIG. 1 is a diagram illustrating the configuration of the information processing apparatus according to the first embodiment.
  • the information processing apparatus 1 generates the verification information that is registered in advance for use when authenticating a user with biometric information.
  • biological information is information, unique to the user's body, that can uniquely identify the user.
  • the biological information includes, for example, a palm vein pattern.
  • the information processing apparatus 1 includes a holding unit 1a, an attitude change unit 1b, an information acquisition unit 1c, an evaluation unit 1d, and a generation unit 1e.
  • the holding unit 1a holds the living body 2 when biological information is acquired from the living body 2.
  • the information acquisition unit 1c acquires biological information from the living body 2 held by the holding unit 1a.
  • the posture changing unit 1b changes the posture of the living body 2 held by the holding unit 1a. Thereby, the information processing apparatus 1 can acquire biological information of various postures from the living body 2.
  • the evaluation unit 1d evaluates the feature amount of the biological information acquired for each of a plurality of different postures by the information acquisition unit 1c.
  • the generation unit 1e selects biometric information for verification based on the evaluation performed by the evaluation unit 1d from among the plurality of acquired biological information.
  • the generation unit 1e generates verification information used for biometric verification from the selected biological information and the posture information that can specify the posture corresponding to it.
  • the posture information is information that can identify the posture changed by the posture changing unit 1b.
  • the information processing apparatus 1 selects, from the biometric information acquired in various postures from the living body 2, the biometric information best suited for verification. That is, the information processing apparatus 1 can select suitable biometric information even when the appropriate posture differs from individual to individual. The information processing apparatus 1 then generates verification information from the selected biometric information and its corresponding posture information, enabling authentication in the same posture as at registration.
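The selection described in the bullets above can be sketched as follows. This is an illustrative outline only, not the patent's actual algorithm; the tuple layout, the scoring of feature amounts, and all names are assumptions:

```python
def select_verification_info(samples):
    """Pick the biometric sample whose feature-amount score is highest.

    `samples` is a list of (posture_info, biometric_info, score) tuples,
    one per posture tried by the posture changing unit (hypothetical layout).
    """
    posture, biometric, _ = max(samples, key=lambda s: s[2])
    # Verification info pairs the selected biometric data with the posture
    # that produced it, so authentication can later reproduce that posture.
    return {"posture": posture, "biometric": biometric}
```

Pairing the posture with the biometric data is the key design point: without it, the authentication device could not reproduce the registration-time posture.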
  • FIG. 2 is a diagram illustrating a configuration of the authentication system according to the second embodiment.
  • the authentication system 3 is exemplified as an information processing system that performs authentication using palm veins.
  • the present invention is not limited to this, and can also be applied to systems that perform authentication on other feature detection sites of a living body whose feature amounts change with posture. More preferably, the authentication system 3 is applicable not only to posture changes such as yawing, pitching, and rolling, but also to feature detection sites, such as the palm, whose posture changes are accompanied by shape changes.
  • the authentication system 3 is one of information processing systems that recognize the characteristics of a living body to identify and authenticate an individual. For example, it is used by a bank system or the like to authenticate customers.
  • the authentication system 3 includes information processing devices such as a registration device 10, a plurality of automated teller machines 6, and an authentication server 4, connected via a network 8.
  • the authentication server 4 associates and stores identification information for identifying an individual and verification information (template) registered in advance before biometric authentication.
  • the identification information for identifying an individual is a unique ID (IDentification) assigned to a user directly (for example, a user number) or indirectly (for example, an account number).
  • the collation information registered in advance includes biometric information for collation and posture information for collation.
  • the biometric information for verification is feature information obtained by extracting a feature portion from image information using a predetermined feature extraction algorithm, encoded information obtained by encoding image information or feature information, and the like.
  • the verification posture information is information for designating a posture at the time of verification.
  • One or more automated teller machines 6 are installed in ATM (Automated Teller Machine) corners 5 and ATM booths 7 of financial institutions.
  • the automated teller machine 6 is an authentication apparatus that performs biometric authentication when authenticating a user prior to a financial transaction.
  • the automated teller machine 6 includes an IC (Integrated Circuit) card reader / writer 17 and a sensor unit 20.
  • the sensor unit 20 includes an imaging device and takes a vein image of the palm of the user.
  • the automated teller machine 6 authenticates the user from the verification information (biometric information for verification) specified by the identification information that the IC card reader/writer 17 reads from the user's IC card (for example, a cash card with a built-in IC chip), and from the user's biometric information acquired by the sensor unit 20.
  • based on the verification posture information, the sensor unit 20 holds the user's palm in the same posture as at the time of template registration.
  • the sensor unit 20 thus acquires biometric information in the same posture as when the template was registered. That is, the sensor unit 20 is a biological information acquisition device, and the automated teller machine 6 is an authentication device that includes it.
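The ATM-side flow described above (look up the template from the card's identification information, reproduce the registered posture, then compare) might be outlined as below. Everything here is a hypothetical stand-in: `template_store`, `acquire_biometric`, `match_score`, and the threshold are not from the patent text:

```python
def authenticate(template_store, user_id, acquire_biometric, match_score, threshold=0.8):
    """Sketch of the verification flow: the template's posture information
    drives the guide unit before biometric capture."""
    template = template_store[user_id]                 # verification info from registration
    sample = acquire_biometric(template["posture"])    # hold the palm in the registered posture
    return match_score(template["biometric"], sample) >= threshold
```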
  • the registration device 10 is a device that is provided at a bank window or the like, and performs user template registration according to instructions or operations of an attendant.
  • the registration device 10 includes a processing device 11, a display 12, and a sensor unit 20, and includes a keyboard 13, a mouse 14, an IC card reader / writer 15 and the like as necessary.
  • the sensor unit 20 has a built-in imaging device, images the palm of the user, and outputs a captured image to the processing device 11.
  • the IC card reader / writer 15 reads and writes information on the IC card 16 of the user.
  • the keyboard 13 and the mouse 14 accept input operations.
  • template registration (registration of verification information) proceeds as follows.
  • a user who requests template registration inputs identification information (for example, a user ID) for identifying the user using the keyboard 13, mouse 14, or IC card reader / writer 15.
  • the registration apparatus 10 guides the template registration to the user by display using the display 12, and requests input of biometric information for template registration.
  • the user inputs biometric information by holding his hand over the sensor unit 20.
  • the sensor unit 20 acquires a plurality of pieces of biological information while changing the posture of the hand, and selects biological information to be registered from the acquired pieces of biological information.
  • the registration device 10 creates verification information from the selected biological information and its corresponding posture information, and records it in at least one of the storage unit of the processing device 11, the storage unit of the authentication server 4, and the storage unit of the user's IC card 16.
  • the automated teller machine 6 refers to the template in the storage unit of the authentication server 4 or the storage unit of the IC card 16 and collates the input biometric information.
  • FIG. 3 is a diagram illustrating an appearance of the sensor unit according to the second embodiment.
  • FIG. 4 is a diagram illustrating an example of changing the finger interval of the three-finger support unit according to the second embodiment.
  • FIG. 5 is a diagram illustrating an example of a first finger / fifth finger droop amount change of the first finger / fifth finger support unit according to the second embodiment.
  • FIG. 6 is a diagram illustrating an example of changing the wrist support portion position of the wrist support portion according to the second embodiment.
  • the sensor unit 20 includes a guide unit 40 that supports the palm and a sensor 26 that captures the palm.
  • the guide part 40 has a box shape with an open upper surface, and has a concave chamber 24 that expands from the bottom toward the opening.
  • the concave chamber 24 sets the distance between the sensor 26 and the palm to an appropriate position.
  • the concave chamber 24 prevents intrusion of ambient light in the imaging range of the sensor 26 and prevents unnecessary background reflection.
  • the sensor 26 is located on the bottom surface of the concave chamber 24 and faces the opening.
  • the sensor 26 includes an image sensor that captures the palm (for example, a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor), a condenser lens, and a plurality of light emitting elements (LED: Light Emitting Diode) that irradiate the subject.
  • the guide unit 40 includes a three-finger support unit 21, a first / fifth finger support unit 25, and a wrist support unit 27.
  • the three-finger support portion 21 has two finger separation ribs 22, which determine and support the positions of three fingers (index finger, middle finger, ring finger).
  • the positions of the two finger separation ribs 22 can be changed, and the finger interval between the fingers placed on the three-finger support portion 21 can be adjusted.
  • the guide unit 40 can reduce the finger interval W2 between the fingers placed on the three-finger support unit 21 to the finger interval W1.
  • the guide unit 40 can also widen the finger interval W1 between the fingers placed on the three-finger support unit 21 to the finger interval W2.
  • the first finger/fifth finger support portion 25 consists of support members disposed on the left and right sides of the concave chamber 24, and can support the first finger and the fifth finger, or the thenar eminence (base of the thumb) and the hypothenar eminence (base of the little finger).
  • the height of the first finger/fifth finger support portion 25 can be changed, so that the droop amount (height) of the first finger and the fifth finger placed on it can be adjusted.
  • adjustment of the droop amount of the first and fifth fingers is provided because the thenar eminence at the base of the first finger and the hypothenar eminence at the base of the fifth finger rise above the center of the palm, and the amount of this bulge differs between individuals; the adjustment therefore makes the posture appropriate for each individual.
  • the guide unit 40 can reduce the first finger/fifth finger droop amount of the fingers placed on the first finger/fifth finger support unit 25 from H2 to H1.
  • conversely, the guide unit 40 can increase the first finger/fifth finger droop amount of the fingers placed on the first finger/fifth finger support unit 25 from H1 to H2.
  • the wrist support unit 27 supports the wrist.
  • the wrist support portion 27 can adjust the position of the palm in the front-rear direction (direction along the front-rear axis).
  • the guide unit 40 can advance the wrist support portion 27 to adjust from the wrist support position L2 to the wrist support position L1.
  • the guide unit 40 can also retract the wrist support portion 27 to adjust from the wrist support position L1 to the wrist support position L2.
  • with the two finger separation ribs 22, the guide unit 40 lets the fingers open naturally when the palm is placed on it, and guides the entire palm so that it lies horizontally.
  • since the two finger separation ribs 22 clearly delimit the boundary between the palm and the fingers, the guide unit 40 contributes to improved palm contour extraction accuracy. Further, the first finger/fifth finger support unit 25 levels the entire palm, reducing deformation of the vein pattern and contributing to stable authentication accuracy.
  • FIG. 7 is a diagram illustrating a configuration of a sensor unit according to the second embodiment.
  • the sensor unit (biological information acquisition device) 20 includes a sensing unit 30 and a guide unit (guide device) 40.
  • the sensing unit 30 captures a vein image of the palm and transmits the captured data to the processing device 11.
  • the sensing unit 30 includes a storage unit 31, an imaging unit 32, a control unit 33, and a communication unit 34.
  • the control unit 33 comprehensively controls each processing unit.
  • the imaging unit (sensor 26) 32 acquires image information from a living body that is a subject.
  • the storage unit 31 temporarily stores the image information acquired by the imaging unit 32.
  • the communication unit 34 communicates with the processing device 11 and the guide unit 40.
  • the imaging unit 32 photographs near-infrared light reflected from the living body (palm) that is the subject. Hemoglobin in the red blood cells flowing through the veins has lost its oxygen, and this reduced hemoglobin absorbs near-infrared light in the range of about 700 nm to 1000 nm. Therefore, when near-infrared light is applied to the palm, less light is reflected where veins are present, and the positions of the veins can be recognized from the intensity of the reflected near-infrared light. Because a specific light source is used, the image captured by the imaging unit 32 is achromatic, but feature information is easy to extract from it.
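The physical principle above (vein pixels reflect less near-infrared light) suggests a simple intensity threshold as a toy illustration. This is not the patent's extraction algorithm; the 0 to 255 intensity scale and the threshold value are assumptions:

```python
def vein_mask(image, threshold=90):
    """Mark pixels whose reflected near-IR intensity is low as vein (1).

    Reduced hemoglobin absorbs near-IR light (~700-1000 nm), so veins
    appear darker. `image` is a 2D list of intensities in 0-255.
    """
    return [[1 if px < threshold else 0 for px in row] for row in image]
```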
  • the guide unit 40 changes the posture of the palm and transmits control data that can identify the changed posture to the processing device 11.
  • the guide unit 40 can transmit control data to the processing device 11 via the sensing unit 30.
  • the guide unit 40 includes a communication unit 41, a control unit 42, motors (for example, stepping motors) 44, 45, 46, position sensors 47, 49, 51, and load sensors 48, 50, 52.
  • the control unit 42 comprehensively controls each processing unit.
  • the communication unit 41 communicates with the sensing unit 30.
  • the motor 44 drives the finger separation rib 22.
  • the motor 45 drives the first finger / fifth finger support unit 25.
  • the motor 46 drives the wrist support portion 27.
  • the position sensor 47 detects the position of the finger separation rib 22.
  • the position sensor 47 may detect the position of the finger separation rib 22 based on the driving amount of the motor 44.
  • the position sensor 49 detects the position of the first finger / fifth finger support unit 25.
  • the position sensor 49 may detect the position of the first finger/fifth finger support portion 25 based on the driving amount of the motor 45.
  • the position sensor 51 detects the position of the wrist support portion 27.
  • the position sensor 51 may detect the position of the wrist support portion 27 based on the driving amount of the motor 46.
  • the load sensor 48 detects the load of the motor 44.
  • the load sensor 50 detects the load of the motor 45.
  • the load sensor 52 detects the load of the motor 46.
  • a known drive mechanism using a cam or the like can be used as the mechanism by which the motors 44, 45, and 46 drive their respective parts.
  • the control unit 42 drives the motor 44 according to the position of the finger separation rib 22 detected by the position sensor 47. Further, for the user's safety, the control unit 42 stops driving the motor 44 when the load sensor 48 detects an excessive load.
  • the control unit 42 drives the motor 45 according to the position of the first finger/fifth finger support unit 25 detected by the position sensor 49. Further, for the user's safety, the control unit 42 stops driving the motor 45 when the load sensor 50 detects an excessive load.
  • the control unit 42 drives the motor 46 according to the position of the wrist support unit 27 detected by the position sensor 51. Further, for the user's safety, the control unit 42 stops driving the motor 46 when the load sensor 52 detects an excessive load.
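The position-driven control with a load-based safety stop described in these bullets can be sketched as a simple loop. All hardware callables, the load limit, and the step size here are hypothetical stand-ins, not taken from the patent:

```python
def drive_to(target, read_position, read_load, step_motor, load_limit=100, max_steps=1000):
    """Step a motor toward a target position; abort if the load sensor trips."""
    for _ in range(max_steps):
        pos = read_position()
        if pos == target:
            return "reached"
        if read_load() >= load_limit:
            return "stopped_on_load"   # user safety: stop on resistance
        step_motor(1 if target > pos else -1)
    return "timeout"
```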
  • FIG. 8 is a diagram illustrating a hardware configuration example of the registration apparatus according to the second embodiment.
  • the registration device 10 includes a processing device 11, a display 12, a keyboard 13, a mouse 14, a sensor unit 20, and an IC card reader / writer 15.
  • the entire processing apparatus 11 is controlled by a CPU (Central Processing Unit) 101.
  • a RAM (Random Access Memory) 102, an HDD (Hard Disk Drive) 103, a communication interface 104, a graphic processing device 105, and an input/output interface 106 are connected to the CPU 101 via a bus 107.
  • the RAM 102 temporarily stores at least part of an OS (Operating System) program and application programs to be executed by the CPU 101.
  • the RAM 102 stores various data necessary for processing by the CPU 101.
  • the HDD 103 stores an OS and application programs.
  • a display 12 is connected to the graphic processing device 105.
  • the graphic processing device 105 displays an image on the screen of the display 12 in accordance with a command from the CPU 101.
  • the input / output interface 106 is connected to the keyboard 13, the mouse 14, the sensor unit 20, and the IC card reader / writer 15.
  • the input / output interface 106 can be connected to a portable recording medium interface that can write information to the portable recording medium 110 and read information from the portable recording medium 110.
  • the input / output interface 106 transmits signals sent from the keyboard 13, mouse 14, sensor unit 20, IC card reader / writer 15, and portable recording medium interface to the CPU 101 via the bus 107.
  • the communication interface 104 is connected to the network 8.
  • the communication interface 104 transmits / receives data to / from other computers (for example, the authentication server 4).
  • the processing device 11 can also be configured to include modules each composed of an FPGA (Field Programmable Gate Array), a DSP (Digital Signal Processor), or the like, or can be configured without the CPU 101.
  • the processing device 11 includes nonvolatile memory (for example, an EEPROM (Electrically Erasable and Programmable Read Only Memory), flash memory, or a flash-memory-type memory card) that stores the firmware for each module.
  • firmware can be written to the nonvolatile memory via the portable recording medium 110 or the communication interface 104. The processing device 11 can thus update firmware by rewriting the firmware stored in the nonvolatile memory.
  • FIG. 9 is a flowchart of template registration processing according to the second embodiment.
  • the template registration process is executed based on, for example, a template registration execution operation by a staff member.
  • Step S11 The processing device 11 notifies the start of template registration.
  • the notification of the start of template registration can be performed using display on the display 12 or sound from a speaker (not shown).
  • the processing apparatus 11 displays “Template registration starts” on the display 12.
  • the processing device 11 executes a posture adjustment range determination process for determining an adjustment range of the finger interval and the first finger / fifth finger droop amount. Details of the posture adjustment range determination processing will be described later with reference to FIG.
  • Step S16 The processing device 11 executes a biometric information extraction process for collation for extracting biometric information for collation. Details of the biometric information extraction process for verification will be described later with reference to FIGS. 14, 17, and 18.
  • the processing device 11 generates verification information including the extracted verification biometric information and verification posture information corresponding to the extracted verification biometric information.
  • the processing apparatus 11 performs template registration of the generated verification information.
  • the template registration is performed by recording the verification information in at least one of the storage unit of the processing device 11, the storage unit of the authentication server 4, and the storage unit of the user's IC card 16, after which the template registration process ends.
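The registration flow of steps S11 through S16 can be summarized in a short sketch. All parameters are placeholders for the processing described above (notification, posture adjustment range determination, biometric information extraction for verification, and storage), not an actual API:

```python
def template_registration(notify, determine_posture_range, extract_verification_info, stores):
    """Sketch of the template registration process (steps S11-S16)."""
    notify("Template registration starts")                      # step S11
    posture_range = determine_posture_range()                   # posture adjustment range determination
    biometric, posture = extract_verification_info(posture_range)  # step S16
    verification_info = {"biometric": biometric, "posture": posture}
    for store in stores:   # processing device, authentication server, and/or IC card
        store(verification_info)
    return verification_info
```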
  • initial values may be obtained for each condition, such as gender and age, based on information entered in advance. It is also possible to detect that a hand has been placed over the sensor 26, determine the size of the hand from the captured image, and obtain initial values from that.
  • Step S22 The processing device 11 notifies that the posture of the palm is to be adjusted.
  • the notification that the posture adjustment of the palm is to be performed can be performed using display on the display 12 or sound from a speaker (not shown). For example, the processing device 11 displays “Please put your palm” on the display 12.
  • when the processing device 11 determines that the wrist is positioned too far back and is not in the normal position, it proceeds to step S23 and instructs the sensor unit 20 to advance the wrist support portion 27 by a predetermined amount (for example, 5 mm).
  • conversely, when the wrist is positioned too far forward, the processing device 11 proceeds to step S23 and instructs the sensor unit 20 to retract the wrist support portion 27 by a predetermined amount (for example, 5 mm).
  • the processing device 11 then ends the wrist support portion position adjustment process.
  • the processing device 11 determines the adjustment range of the finger interval, that is, the adjustment range of the distance between the two finger separation ribs 22, based on the determined form of the living body. Further, the processing device 11 determines the adjustment range of the first finger / fifth finger sag amount, that is, the height adjustment range of the first finger / fifth finger support unit 25 based on the determined form of the living body.
  • the processing device 11 determines an initial adjustment value of the finger interval, that is, an initial adjustment value of the distance between the two finger separation ribs 22, based on the determined form of the living body. Further, the processing device 11 determines an initial adjustment value of the first finger / fifth finger sag amount, that is, an initial adjustment value of the height of the first finger / fifth finger support unit 25, based on the determined form of the living body. To do.
  • the processing device 11 determines the initial finger interval adjustment value as 25 mm when the palm size is standard, 20 mm when the palm size is small, and 30 mm when the palm size is large. Similarly, it determines the initial first finger/fifth finger droop amount as 5 mm when the palm size is standard, 3 mm when small, and 7 mm when large.
  • alternatively, the processing device 11 may determine the initial finger interval value and the initial first finger/fifth finger droop amount individually, based on actual measurements of the biological information acquired in step S24.
  • the processing device 11 determines the adjustment unit of the finger interval, that is, the adjustment unit of the distance between the two finger separation ribs 22, based on the determined form of the living body. Likewise, it determines the adjustment unit of the first finger/fifth finger droop amount, that is, the adjustment unit of the height of the first finger/fifth finger support unit 25.
  • the processing device 11 determines the finger interval adjustment unit as 1 mm when the palm size is standard, 0.5 mm when small, and 1.5 mm when large. It determines the first finger/fifth finger droop amount adjustment unit as 1 mm when the palm size is standard or large, and 0.5 mm when small.
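The initial values and adjustment units given above can be collected into a single lookup table (all values in mm). The function name and tuple layout are illustrative; the numbers are those stated in the text:

```python
def posture_adjustment_params(palm_size):
    """Return (finger-interval initial, droop initial,
    finger-interval unit, droop unit) in mm, keyed by palm size."""
    table = {
        "small":    (20, 3, 0.5, 0.5),
        "standard": (25, 5, 1.0, 1.0),
        "large":    (30, 7, 1.5, 1.0),   # droop unit is 1 mm for large, per the text
    }
    return table[palm_size]
```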
  • FIG. 12 is a flowchart of finger interval change processing according to the second embodiment.
  • the finger interval change process is executed in the template registration process.
  • Step S43 The processing device 11 adds the adjustment unit determined in step S34 of the posture adjustment range determination process and updates the adjustment value. That is, the processing device 11 increases the finger interval by one adjustment unit.
  • Step S46 The processing device 11 instructs the sensor unit 20 to acquire biometric information (palm vein image).
  • the sensor unit 20 responds to the processing device 11 with the acquired biological information and posture information at the time of acquiring the biological information.
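The acquisition loop described by steps S43 and S46 can be sketched as follows. The range bound and the `acquire_biometric` callback (standing in for the sensor unit 20) are illustrative assumptions.

```python
# Hedged sketch of the finger interval change loop: starting from the
# initial value, the interval is widened one adjustment unit at a time,
# acquiring biometric information at each setting until the range is exceeded.
def sweep_finger_interval(init_mm, unit_mm, max_mm, acquire_biometric):
    samples = []                    # (adjustment value, biometric info) pairs
    value = init_mm
    while value <= max_mm:          # adjustment value still within range?
        samples.append((value, acquire_biometric(value)))
        value += unit_mm            # widen by one adjustment unit (step S43)
    return samples

samples = sweep_finger_interval(25.0, 1.0, 28.0, lambda v: f"img@{v}")
print([v for v, _ in samples])  # [25.0, 26.0, 27.0, 28.0]
```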
  • FIG. 13 is a flowchart of the first finger / fifth finger droop amount changing process of the second embodiment.
  • the first finger / fifth finger droop amount changing process is executed in the template registration process.
  • Step S51 The processing device 11 sets the initial adjustment value determined in step S33 of the posture adjustment range determination process, setting the finger interval and the first finger/fifth finger droop amount to their initial adjustment values.
  • the sensor unit 20 sets the first finger/fifth finger droop amount to the initial adjustment value based on the instruction from the processing device 11.
  • Step S52 The processing apparatus 11 instructs the sensor unit 20 to acquire biometric information (palm vein image).
  • the sensor unit 20 responds to the processing device 11 with the acquired biological information and posture information at the time of acquiring the biological information.
  • Step S54 The processing device 11 determines whether the adjustment value is within the adjustment range determined in step S32 of the posture adjustment range determination process. If it is, the processing device 11 instructs the sensor unit 20 to update the first finger/fifth finger droop amount and proceeds to step S52. If the adjustment value exceeds the adjustment range, the processing device 11 proceeds to step S55.
  • Step S55 The processing device 11 sets the initial adjustment value determined in step S33 of the posture adjustment range determination process and instructs the sensor unit 20 to set the first finger/fifth finger droop amount to that initial value.
  • the sensor unit 20 sets the first finger/fifth finger droop amount to the initial adjustment value based on the instruction from the processing device 11.
  • Step S56 The processing device 11 instructs the sensor unit 20 to acquire biometric information (palm vein image).
  • the sensor unit 20 responds to the processing device 11 with the acquired biological information and posture information at the time of acquiring the biological information.
  • Step S57 The processing device 11 updates the adjustment value by subtracting the adjustment unit determined in step S34 of the posture adjustment range determination process. That is, the processing device 11 reduces the first finger/fifth finger droop amount by one adjustment unit.
  • Step S58 The processing device 11 determines whether the adjustment value is within the adjustment range determined in step S32 of the posture adjustment range determination process. If it is, the processing device 11 instructs the sensor unit 20 to update the first finger/fifth finger droop amount and proceeds to step S56. If the adjustment value exceeds the adjustment range, the processing device 11 ends the first finger/fifth finger droop amount change process.
  • the processing device 11 may thus expand the droop amount from the initial adjustment value to the maximum of the adjustment range one adjustment unit at a time, return to the initial adjustment value, and then contract it to the minimum of the adjustment range one adjustment unit at a time. Alternatively, it may return to the initial value each time it expands or contracts.
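Taken together, steps S51 to S58 describe a two-phase sweep: upward from the initial value to the range maximum, a reset to the initial value, then downward to the range minimum. A minimal sketch, with a hypothetical `acquire` callback standing in for the sensor unit 20:

```python
# Sketch of the two-phase first finger/fifth finger droop amount sweep.
def sweep_droop(init_mm, unit_mm, min_mm, max_mm, acquire):
    samples = []
    value = init_mm
    while value <= max_mm:                       # expand one unit at a time
        samples.append((value, acquire(value)))  # acquire (steps S52/S53)
        value += unit_mm
    value = init_mm                              # S55: return to initial value
    samples.append((value, acquire(value)))      # S56: acquire at initial value
    value -= unit_mm                             # S57: contract one unit
    while value >= min_mm:                       # S58: still within range?
        samples.append((value, acquire(value)))
        value -= unit_mm
    return samples

vals = [v for v, _ in sweep_droop(5.0, 1.0, 3.0, 7.0, lambda v: None)]
print(vals)  # [5.0, 6.0, 7.0, 5.0, 4.0, 3.0]
```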
  • FIG. 14 is a flowchart of the biometric information extraction process for collation according to the second embodiment.
  • FIG. 15 is a diagram illustrating an example of a finger interval biometric information table according to the second embodiment.
  • FIG. 16 is a diagram illustrating an example of a first finger/fifth finger droop amount biometric information table according to the second embodiment.
  • FIG. 17 is a diagram illustrating an example of an order table according to the second embodiment.
  • FIG. 18 is a diagram illustrating an example of the combined biometric information table according to the second embodiment.
  • the biometric information extraction process for verification is executed in the template registration process.
  • the processing device 11 calculates the feature amounts of the biological information acquired in steps S42 and S46 of the finger interval change process. The processing device 11 thereby obtains the finger interval biometric information table 200, which combines the feature amounts with the posture information (initial values, adjustment values) acquired in those steps.
  • the finger interval biometric information table 200 records, for a plurality of pieces of biometric information (d000, d001, ...) acquired with different finger intervals, the initial value, adjustment value, and feature amount at the time of finger interval adjustment in association with each other.
  • the processing device 11 likewise calculates the feature amounts of the biological information acquired in steps S52 and S56 of the first finger/fifth finger droop amount change process. The processing device 11 thereby obtains the first finger/fifth finger droop amount biometric information table 210, which combines the feature amounts with the posture information (initial values, adjustment values) acquired in those steps.
  • the first finger/fifth finger droop amount biometric information table 210 records, for a plurality of pieces of biometric information (d100, d101, ...) acquired with different first finger/fifth finger droop amounts, the initial value, adjustment value, and feature amount at the time of adjustment in association with each other.
  • the features of the palm vein image are evaluated comprehensively, for example, by evaluating items such as the number of vein branch points and the amount of veins per unit area.
  • the evaluation of the feature amount included in the biological information is not limited to the above example, and any evaluation method can be adopted.
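As one hedged illustration of such an evaluation: the specification only names the items (vein branch points, amount of veins per unit area), so the weighting and combination below are assumptions.

```python
# Illustrative comprehensive feature amount: a weighted sum of the number of
# vein branch points and the vein density (amount of veins per unit area).
def evaluate_feature_amount(branch_points: int, vein_pixels: int,
                            area_px: int, w_branch: float = 10.0) -> float:
    """Combine the per-item scores into one comprehensive feature amount."""
    vein_density = vein_pixels / area_px      # amount of veins per unit area
    return w_branch * branch_points + vein_density

print(evaluate_feature_amount(branch_points=12, vein_pixels=5000, area_px=10000))
# 120.5
```

Any other scoring, as the text notes, could be substituted without changing the surrounding procedure.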
  • the processing device 11 refers to the feature amounts in the finger interval biometric information table 200 and extracts the top three posture information entries (initial value, adjustment value) with the largest feature amounts.
  • the processing device 11 likewise refers to the feature amounts in the first finger/fifth finger droop amount biometric information table 210 and extracts the top three posture information entries (initial value, adjustment value) with the largest feature amounts.
  • the extracted posture information (initial value, adjustment value) is recorded in the order table 220.
  • in this way, the processing device 11 records in the order table 220 the posture information (initial value, adjustment value) corresponding to each of these feature amounts.
  • the order table 220 therefore holds the top three posture information entries (initial value, adjustment value) with the largest feature amounts among the plurality of pieces of biometric information acquired with different finger intervals, and the top three posture information entries (initial value, adjustment value) with the largest feature amounts among the plurality of pieces of biometric information acquired with different first finger/fifth finger droop amounts.
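The construction of the order table 220 can be sketched as a top-three selection per adjustment element. The tuple layout of the table rows is an assumption for illustration.

```python
# Sketch: for one adjustment element, take the top three posture entries
# (initial value, adjustment value) ranked by feature amount.
def top3_postures(table):
    """table: list of (initial_value, adjustment_value, feature_amount)."""
    ranked = sorted(table, key=lambda row: row[2], reverse=True)
    return [(init, adj) for init, adj, _ in ranked[:3]]

finger_interval_table = [(25, 25, 40.0), (25, 26, 55.0), (25, 27, 48.0), (25, 28, 30.0)]
print(top3_postures(finger_interval_table))  # [(25, 26), (25, 27), (25, 25)]
```

Running the same function over the droop amount table yields the second half of the order table 220.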
  • the posture information combination patterns generated in this way are recorded in the combined biometric information table 230.
  • the processing device 11 acquires one of the posture information combination patterns recorded in the combined biometric information table 230.
  • the processing device 11 instructs the sensor unit 20 to update the finger interval and the first finger/fifth finger droop amount to the acquired combination pattern of posture information.
  • the sensor unit 20 updates the finger interval and the first finger/fifth finger droop amount based on the instruction from the processing device 11.
  • the sensor unit 20 may first return to the initial values once before applying the update.
  • the posture of the hand can differ depending on whether the adjustment value is reached from the "+" direction or from the "−" direction. For example, even for the same adjustment value of 26 mm, the hand posture may differ depending on whether 26 mm is reached by adding 1 mm from 25 mm or by subtracting 1 mm from 27 mm. By returning to the initial value once, the conditions under which the posture changes can be made uniform, eliminating the effect of the approach direction on the hand posture.
  • Step S66 The processing device 11 instructs the sensor unit 20 to acquire biometric information (palm vein image).
  • the sensor unit 20 responds to the processing device 11 with the acquired biological information and posture information at the time of acquiring the biological information.
  • Step S67 The processing device 11 determines whether biometric information has been acquired for all the posture information combination patterns recorded in the combined biometric information table 230. If it has, the processing device 11 proceeds to step S68; otherwise, it proceeds to step S65.
  • the processing device 11 evaluates the feature amounts (v200, v201, ...) of the biometric information (d200, d201, ...) acquired for all the posture information combination patterns recorded in the combined biometric information table 230, and records them in the combined biometric information table 230.
  • the combined biometric information table 230 records the combination patterns of the three posture information entries (initial value, adjustment value) with different finger intervals and the three posture information entries (initial value, adjustment value) with different first finger/fifth finger droop amounts, in association with the acquired biometric information and its feature amount.
  • the processing device 11 refers to the combined biometric information table 230 and extracts the biometric information with the largest feature amount as the biometric information for verification.
  • the processing device 11 ends the verification biometric information extraction process, taking the extracted biometric information as the biometric information for verification and the posture information corresponding to it as the posture information for verification.
  • the posture information may include posture information on the wrist support part position in addition to the posture information on the finger interval and on the first finger/fifth finger droop amount.
  • alternatively, the processing device 11 may take the top three as extraction candidates, acquire biometric information a plurality of times (for example, three times) for each candidate, evaluate the feature amounts, and then extract the biometric information for verification.
  • the processing device 11 can thus acquire biometric information suited to collation while taking individual differences (the shape, size, flexibility, etc. of the biometric acquisition site) into account. Further, the processing device 11 generates the order table 220 by extracting, for each of the plural posture adjustment elements (the finger interval and the first finger/fifth finger droop amount), the postures with large feature amounts, and then extracts the posture with the largest feature amount from the posture combinations in the order table 220. Accordingly, the processing device 11 can reduce the processing time required to generate the verification information even when there are plural posture adjustment elements.
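The combined extraction over the 3×3 posture combinations (steps S65 to S68 and the selection of the best entry) might look like the following sketch, where `acquire_and_score` is a hypothetical callback that drives the sensor unit 20 and evaluates the image.

```python
# Sketch: try each combination pattern of posture information, score the
# acquired biometric information, and keep the best as verification info.
from itertools import product

def extract_verification_info(interval_top3, droop_top3, acquire_and_score):
    best = None
    for interval, droop in product(interval_top3, droop_top3):  # 9 patterns
        biometric, feature = acquire_and_score(interval, droop)
        if best is None or feature > best[2]:
            best = ((interval, droop), biometric, feature)
    posture, biometric, _ = best
    return biometric, posture   # verification biometric info + posture info

info, posture = extract_verification_info(
    [26, 27, 25], [5, 6, 4],
    lambda i, d: (f"img@{i},{d}", i * 10 + d))
print(posture)  # (27, 6)
```

This mirrors how restricting the search to two top-three lists keeps the number of acquisitions at nine instead of scanning every posture pair.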
  • although the first finger/fifth finger support part 25 adjusts the height supporting the first finger and the fifth finger together, a separate first finger support part and fifth finger support part, each provided with its own motor, position sensor, and load sensor, may be used so that the two heights can be adjusted independently.
  • the automatic teller machine 6 acquires a user ID (for example, an account number) from the IC card 16.
  • the automatic teller machine 6 acquires posture information (posture information on the finger interval and posture information on the first finger/fifth finger droop amount) from the IC card 16.
  • the automatic teller machine 6 instructs the sensor unit 20 to support the palm in a posture corresponding to the acquired posture information.
  • the automatic teller machine 6 may first place the sensor unit 20 in a posture that supports the palm with the initial value of each posture information and then instruct the user to place the palm.
  • the automatic teller machine 6 then updates the posture of the sensor unit 20 supporting the palm with the adjustment value of each posture information. In this way, the automatic teller machine 6 can improve the reproducibility of the posture used when the registration device 10 registered the biometric information.
  • the automatic teller machine 6 thus acquires the biometric information while reproducing the posture of the palm at the time the biometric information was registered. Step S75 The automatic teller machine 6 collates the biometric information for verification recorded in the IC card 16 with the biometric information acquired from the sensor unit 20.
  • Step S76 The automatic teller machine 6 proceeds to step S77 when the verification of the biometric information acquired from the sensor unit 20 succeeds, and to step S78 when it fails.
  • in this way, the automatic teller machine 6 can guide the palm, the user's biometric information acquisition site, into the same posture as when the verification information was registered, and an improvement in authentication accuracy can be expected.
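The verification flow at the automatic teller machine can be sketched end to end as follows. All names (`set_posture`, `acquire`, `match_score`) and the 0.9 threshold are illustrative assumptions, not part of the specification.

```python
# Hedged sketch: posture information read from the IC card is replayed on
# the sensor unit before acquiring the palm image, then the stored
# verification information is collated against the acquired image.
def authenticate(ic_card, sensor_unit, match_score, threshold=0.9):
    user_id = ic_card["user_id"]                       # e.g. account number
    posture = ic_card["posture"]                       # registered posture info
    sensor_unit.set_posture(posture["finger_interval"],
                            posture["droop_amount"])   # reproduce the posture
    acquired = sensor_unit.acquire()                   # palm vein image
    ok = match_score(ic_card["verification_info"], acquired) >= threshold
    return user_id, ok                                 # (ID, verified?)
```

Reproducing the registered posture before acquisition is what lets the stored template and the live image be compared under matching conditions.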
  • the above processing functions can be realized by a computer.
  • a program describing the processing contents of the functions that each device should have is provided.
  • the program describing the processing contents can be recorded on a computer-readable recording medium (including a portable recording medium). Examples of the computer-readable recording medium include a magnetic recording device, an optical disk, a magneto-optical recording medium, and a semiconductor memory.
  • Examples of the magnetic recording device include a hard disk device (HDD), a flexible disk (FD), and a magnetic tape.
  • Optical disks include DVD (Digital Versatile Disc), DVD-RAM, CD-ROM, CD-R (Recordable) / RW (ReWritable), and the like.
  • Magneto-optical recording media include MO (Magneto-Optical disk).
  • when the program is distributed, a portable recording medium such as a DVD or CD-ROM on which the program is recorded is sold, for example. It is also possible to store the program in a storage device of a server computer and transfer the program from the server computer to another computer via a network.
  • the computer that executes the program stores, for example, the program recorded on the portable recording medium or the program transferred from the server computer in its own storage device. Then, the computer reads the program from its own storage device and executes processing according to the program. The computer can also read the program directly from the portable recording medium and execute processing according to the program. Further, each time the program is transferred from the server computer, the computer can sequentially execute processing according to the received program.


Abstract

The present invention generates biometric information used for verification while taking individual differences into account. An information processing device (1) generates verification information that is registered in advance for authenticating a user by means of biometric information. A securing unit (1a) secures a living body (2) when acquiring biometric information specific to that living body (2). An information acquisition unit (1c) acquires the biometric information of the secured living body (2). A posture changing unit (1b) changes the posture of the secured living body (2). The information processing device (1) can therefore acquire the biometric information of the living body (2) in a variety of postures. An evaluation unit (1d) evaluates the feature amounts of the sets of biometric information acquired by the information acquisition unit (1c) for each posture. A generation unit (1e) selects, from among the multiple sets of acquired biometric information, the biometric information to be used for verification on the basis of the evaluation by the evaluation unit (1d). The generation unit (1e) generates verification information used for verifying the living body from the selected biometric information and the posture information identifying the corresponding posture.
PCT/JP2011/074827 2011-10-27 2011-10-27 Information processing device, information processing method, and information processing program WO2013061446A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2011/074827 WO2013061446A1 (fr) Information processing device, information processing method, and information processing program
JP2013540579A JP5655155B2 (ja) Information processing device, information processing method, and information processing program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/074827 WO2013061446A1 (fr) Information processing device, information processing method, and information processing program

Publications (1)

Publication Number Publication Date
WO2013061446A1 true WO2013061446A1 (fr) 2013-05-02

Family

ID=48167311

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/074827 WO2013061446A1 (fr) Information processing device, information processing method, and information processing program

Country Status (2)

Country Link
JP (1) JP5655155B2 (fr)
WO (1) WO2013061446A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS634381A (ja) * 1986-06-24 1988-01-09 Mitsubishi Electric Corp 指紋照合装置
JPH09108204A (ja) * 1995-10-24 1997-04-28 Kdk Corp 測定位置再現方法および測定位置再現装置並びにそれを使用した体液成分濃度測定装置
JP2007156936A (ja) * 2005-12-07 2007-06-21 Hitachi Ltd 生体情報照合システム
JP2007159610A (ja) * 2005-12-09 2007-06-28 Matsushita Electric Ind Co Ltd 登録装置、認証装置、登録認証装置、登録方法、認証方法、登録プログラムおよび認証プログラム
JP2008250601A (ja) * 2007-03-30 2008-10-16 Hitachi Omron Terminal Solutions Corp 生体情報読取装置および生体情報読取システム

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4207717B2 (ja) * 2003-08-26 2009-01-14 株式会社日立製作所 個人認証装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016154512A1 (fr) * 2015-03-25 2016-09-29 Gojo Industries, Inc. Dispositif de dosage de distributeur basé sur la taille de la main
US10219656B2 (en) 2015-03-25 2019-03-05 Gojo Industries, Inc. Dispenser dosing based on hand size

Also Published As

Publication number Publication date
JP5655155B2 (ja) 2015-01-14
JPWO2013061446A1 (ja) 2015-04-02

Similar Documents

Publication Publication Date Title
JP5671607B2 (ja) Biometric authentication device, biometric authentication system, and biometric authentication method
JP5622928B2 (ja) Verification device, verification program, and verification method
JP5509335B2 (ja) Registration program, registration device, and registration method
JP5681786B2 (ja) Biometric information acquisition device and biometric information acquisition method
EP2677490B1 (fr) Authentication device, program, and method
US20060080254A1 (en) Individual authentication method, individual authentication device, and program for same
TW202011315A (zh) Resource transfer method, device, and system
JP2007249556A (ja) Personal authentication system, method, and program using biometric information
US20130170717A1 (en) Authentication apparatus, authentication program, and method of authentication
JP2013171325A (ja) Verification target determination device, verification target determination program, and verification target determination method
TW202038133A (zh) Device and method for rapidly locating the iris using deep learning
CN101587547B (zh) Vein pattern registration device and vein pattern registration method
JP2014180435A (ja) Biometric information input device, biometric information input program, and biometric information input method
JP5655155B2 (ja) Information processing device, information processing method, and information processing program
WO2013046365A1 (fr) Guidance device, biometric information acquisition device, and registration device
JP2013148988A (ja) Biometric information processing device, biometric information processing program, and biometric information processing method
JP2014026585A (ja) Biometric information input device, living body support state determination method, and living body support state determination program
JP5685272B2 (ja) Authentication device, authentication program, and authentication method
TW202107383A (zh) Identity verification device and method based on dynamic images
JP2018136897A (ja) Authentication device and authentication method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11874732

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2013540579

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11874732

Country of ref document: EP

Kind code of ref document: A1