JP6431044B2 - Biometric authentication device, biometric authentication method, and program - Google Patents

Biometric authentication device, biometric authentication method, and program

Info

Publication number
JP6431044B2
Authority
JP
Japan
Prior art keywords
feature
value corresponding
luminance value
omnidirectional
directional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2016509684A
Other languages
Japanese (ja)
Other versions
JPWO2015145589A1 (en)
Inventor
鈴木 智晴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Frontech Ltd
Original Assignee
Fujitsu Frontech Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Frontech Ltd filed Critical Fujitsu Frontech Ltd
Publication of JPWO2015145589A1
Application granted
Publication of JP6431044B2
Legal status: Active

Classifications

    • G06V40/12 Fingerprints or palmprints
    • G06V40/1347 Preprocessing; Feature extraction
    • G06V40/1359 Extracting features related to ridge properties; Determining the fingerprint type, e.g. whorl or loop
    • G06V40/1365 Matching; Classification
    • G06V40/1388 Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger, using image processing
    • G06V40/14 Vascular patterns
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • H04L9/3231 Biological data, e.g. fingerprint, voice or retina

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Collating Specific Patterns (AREA)

Description

Embodiments of the present disclosure relate to biometric authentication technology.

In an existing biometric authentication device, a subject is judged to be the registered person when the biometric information extracted from a captured image matches biometric information registered in advance. This biometric information contains features representing, for example, palm prints and veins. When authentication is performed using vein features, the palm-print features must be separated from the captured image so that, as far as possible, only vein features remain. Known methods for separating palm-print features include optical separation using, for example, a polarizing filter, and separation based on imaging at multiple wavelengths.

A. Ross, A. K. Jain, and J. Reisman, "A hybrid fingerprint matcher", Pattern Recognition, vol. 36, no. 7, pp. 1661-1673, 2003.

However, when a method that physically separates palm-print features cannot be applied, biometric authentication must be performed on biometric information that still contains them. Because palm-print features are far less diverse than vein features, the more palm-print features the biometric information contains, the higher the false acceptance rate (FAR) becomes. Moreover, when the palm print is strongly pigmented with melanin, its features are extracted more readily than vein features, raising the false acceptance rate even further.

An object of the embodiments of the present disclosure is to provide a biometric authentication device, a biometric authentication method, and a program capable of suppressing an increase in the false acceptance rate even when a method that physically separates palm-print features from the image cannot be applied.

A biometric authentication device according to an embodiment of the present disclosure includes: a filter that extracts a plurality of directional features from an input image; a per-direction directional-feature normalization processing unit that normalizes the plurality of directional features extracted by the filter so that luminance values corresponding to a palm print, which tend to be larger than luminance values corresponding to veins, become smaller than the luminance values corresponding to veins; an omnidirectional-feature generation unit that generates an omnidirectional feature from the plurality of directional features output by the per-direction directional-feature normalization processing unit; a matching processing unit that obtains the similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; and a determination unit that determines, using the similarity, whether the subject is the registered person.

In a biometric authentication method according to an embodiment of the present disclosure, a computer extracts a plurality of directional features from an input image; normalizes the extracted directional features so that luminance values corresponding to a palm print, which tend to be larger than luminance values corresponding to veins, become smaller than the luminance values corresponding to veins; generates an omnidirectional feature from the normalized directional features; obtains the similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; and determines, using the similarity, whether the subject is the registered person.

A program according to an embodiment of the present disclosure causes a computer to execute a process of: extracting a plurality of directional features from an input image; normalizing the extracted directional features so that luminance values corresponding to a palm print, which tend to be larger than luminance values corresponding to veins, become smaller than the luminance values corresponding to veins; generating an omnidirectional feature from the normalized directional features; obtaining the similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; and determining, using the similarity, whether the subject is the registered person.

According to the embodiments of the present disclosure, an increase in the false acceptance rate can be suppressed even when a method that physically separates palm-print features from the image cannot be applied.

FIG. 1 is a diagram illustrating an example of a biometric authentication device according to an embodiment of the present disclosure.
FIG. 2 is a flowchart illustrating a biometric authentication method according to an embodiment of the present disclosure.
FIG. 3 is a diagram illustrating an example of a feature extraction unit according to an embodiment of the present disclosure.
FIG. 4 is a diagram illustrating an example of a matching processing unit according to an embodiment of the present disclosure.
FIG. 5 is a diagram illustrating an example of the hardware of the biometric authentication device.

FIG. 1 is a diagram illustrating an example of a biometric authentication device according to an embodiment of the present disclosure. The biometric authentication device 1 illustrated in FIG. 1 includes an image acquisition unit 2, a region specifying unit 3, a feature extraction unit 4, a matching processing unit 5, a score determination unit 6 (determination unit), and a storage unit 7.

FIG. 2 is a flowchart illustrating a biometric authentication method according to an embodiment of the present disclosure. First, the image acquisition unit 2 acquires an image of the subject's hand (S1). For example, the image acquisition unit 2 is an imaging device that captures an image of the subject's hand using a single-chip image sensor with RGB color filters in a Bayer arrangement.

Next, the region specifying unit 3 specifies a palm region (ROI, Region Of Interest) corresponding to the subject's palm in the image acquired by the image acquisition unit 2 (S2).

Next, the feature extraction unit 4 extracts an omnidirectional feature from the image f of the palm region specified by the region specifying unit 3 (S3). Here, a filter process S is called omnidirectional if, for any angle θ, inserting an image rotation transform Tθ before S and its inverse transform Tθ⁻¹ after S leaves the result essentially the same as applying S alone. Expressed symbolically, S is omnidirectional if S(f) = Tθ⁻¹(S(Tθ(f))) for an arbitrary angle θ.

Next, the matching processing unit 5 obtains the similarity between the omnidirectional feature extracted by the feature extraction unit 4 and a registered omnidirectional feature that has been registered in advance and stored in the storage unit 7 (S4).

Next, the score determination unit 6 determines, from the similarity obtained by the matching processing unit 5, whether the subject is the registered person (S5).

FIG. 3 is a diagram illustrating an example of the feature extraction unit 4 according to an embodiment of the present disclosure. The feature extraction unit 4 shown in FIG. 3 includes a filter 41, a per-direction directional-feature normalization processing unit 42, a point-wise maximum selection unit 43, a binarization unit 44, and a thinning unit 45.

The filter 41 applies Gabor filtering to the input palm-region image f in each of eight directions θ (0°, 22.5°, 45°, 67.5°, 90°, 112.5°, 135°, 157.5°) and obtains each filter response (luminance value) as a directional feature gθ (g0°, g22.5°, g45°, g67.5°, g90°, g112.5°, g135°, g157.5°). The number of directions θ used in the filtering is not limited to eight, as long as there are two or more. The filtering is also not limited to Gabor filtering, as long as it produces a strong response to linear dark regions oriented along each direction θ in the image f.
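
The following sketch illustrates how the directional filtering of filter 41 might be implemented. It assumes OpenCV Gabor kernels; the kernel size, sigma, wavelength, aspect ratio, and phase below are illustrative choices and are not specified in the patent.

```python
import numpy as np
import cv2

# Eight filter orientations, as in the embodiment described above.
THETAS_DEG = (0.0, 22.5, 45.0, 67.5, 90.0, 112.5, 135.0, 157.5)

def directional_features(f):
    """Return one Gabor response g_theta per direction for the palm-region image f."""
    f32 = f.astype(np.float32)
    responses = {}
    for theta_deg in THETAS_DEG:
        # ksize, sigma, lambd, gamma, psi are assumed values; psi = pi makes
        # dark line-like structures (veins, palm-print creases) respond positively.
        kernel = cv2.getGaborKernel((21, 21), 4.0, np.deg2rad(theta_deg),
                                    10.0, 0.5, np.pi)
        responses[theta_deg] = cv2.filter2D(f32, cv2.CV_32F, kernel)
    return responses
```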

The per-direction directional-feature normalization processing unit 42 normalizes each directional feature gθ extracted by the filter 41.
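
The excerpt states the goal of this normalization (palm-print responses, which tend to be stronger, must end up weaker than vein responses) but not the formula. Below is a minimal sketch assuming a simple per-direction z-score over the palm region; the actual normalization used by unit 42 may differ.

```python
import numpy as np

def normalize_per_direction(responses):
    """Normalize each directional feature g_theta independently (assumed z-score)."""
    normalized = {}
    for theta_deg, g in responses.items():
        normalized[theta_deg] = (g - g.mean()) / (g.std() + 1e-8)  # epsilon avoids division by zero
    return normalized
```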

The point-wise maximum selection unit 43 generates the omnidirectional feature g from the directional features gθ output by the per-direction directional-feature normalization processing unit 42. For example, as shown in Equation 1, for each pixel it outputs the maximum directional feature maxθ{gθ(i, j)} among the directional features gθ(i, j) output by the normalization processing unit 42 as the omnidirectional feature g(i, j). Here, i denotes the horizontal position and j the vertical position on the two-dimensional coordinate system onto which the pixels of the palm region are mapped.

g(i, j) = maxθ{gθ(i, j)}  (Equation 1)
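
Equation 1 expressed as code: for every pixel (i, j), keep the largest normalized directional response over all directions.

```python
import numpy as np

def pointwise_maximum(normalized):
    """g(i, j) = max over theta of g_theta(i, j), over the stacked directional features."""
    stack = np.stack(list(normalized.values()), axis=0)  # shape (n_directions, H, W)
    return stack.max(axis=0)
```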

As shown in Equation 2, the binarization unit 44 outputs 1 as the omnidirectional plane feature b(i, j) when the omnidirectional feature g(i, j) output by the point-wise maximum selection unit 43 is positive, and outputs 0 as b(i, j) otherwise. The omnidirectional plane feature b obtained here is stored in the storage unit 7.

b(i, j) = 1 if g(i, j) > 0; b(i, j) = 0 otherwise  (Equation 2)

Although the binarization unit 44 described above binarizes by simple thresholding at the constant 0, a more sophisticated adaptive-thresholding method may be used instead.
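
A sketch of the binarization of Equation 2, together with the adaptive-thresholding variant mentioned above; the block size and offset passed to cv2.adaptiveThreshold are assumed values, not taken from the patent.

```python
import numpy as np
import cv2

def binarize(g, adaptive=False):
    """Return the omnidirectional plane feature b as a 0/1 image."""
    if not adaptive:
        return (g > 0).astype(np.uint8)                  # Equation 2: b = 1 where g > 0
    # Adaptive variant: rescale to 8 bits, then threshold against a local mean.
    g8 = cv2.normalize(g, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    b = cv2.adaptiveThreshold(g8, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                              cv2.THRESH_BINARY, 31, 0)  # blockSize and C are assumptions
    return (b // 255).astype(np.uint8)
```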

As shown in Equation 3, the thinning unit 45 obtains an omnidirectional line feature LF by applying thinning (skeletonizing) to the omnidirectional plane feature b. Here, skel denotes the thinning process. The omnidirectional line feature LF obtained here is stored in the storage unit 7. A line feature is an image made up of lines.

LF = skel(b)  (Equation 3)
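
Equation 3 expressed as code. scikit-image's skeletonize is used here as a stand-in for skel(); the patent does not prescribe a particular thinning algorithm.

```python
import numpy as np
from skimage.morphology import skeletonize

def line_feature(b):
    """LF = skel(b): thin the 0/1 plane feature b down to one-pixel-wide lines."""
    return skeletonize(b.astype(bool)).astype(np.uint8)
```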

As shown in FIG. 4, the matching processing unit 5 of FIG. 1 obtains the similarity score between the omnidirectional line feature LF output by the thinning unit 45 and stored in the storage unit 7, and the registered omnidirectional line feature TLF registered in advance and stored in the storage unit 7.

The score determination unit 6 of FIG. 1 determines that the subject is the registered person when the similarity score is equal to or greater than a threshold.
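
The excerpt does not define how the similarity score between LF and the registered TLF is computed. The sketch below assumes a Dice-style overlap after a small dilation (to tolerate a pixel or two of misalignment) and an assumed decision threshold; both are illustrative choices, not the patented matching method.

```python
import numpy as np
import cv2

def similarity_score(lf, tlf):
    """Assumed score: symmetric overlap of the two 0/1 skeleton images after dilation."""
    kernel = np.ones((3, 3), np.uint8)
    lf_d, tlf_d = cv2.dilate(lf, kernel), cv2.dilate(tlf, kernel)
    overlap = np.logical_and(lf, tlf_d).sum() + np.logical_and(tlf, lf_d).sum()
    total = lf.sum() + tlf.sum()
    return overlap / total if total else 0.0

def is_genuine(lf, tlf, threshold=0.6):   # threshold value is an assumption
    return similarity_score(lf, tlf) >= threshold
```

Chaining the sketches above (directional_features, normalize_per_direction, pointwise_maximum, binarize, line_feature, similarity_score) reproduces, under the stated assumptions, the S3 to S5 flow of FIG. 2 for a given palm ROI.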

Luminance values corresponding to a palm print tend to be larger than luminance values corresponding to veins. Therefore, when each directional feature gθ extracted by the filter 41 is normalized, the luminance values corresponding to the palm print become smaller than those corresponding to veins in each gθ. The omnidirectional line feature LF generated from the normalized directional features gθ is thus less affected by the palm print. Because the identity determination can be made while suppressing the influence of the palm print, authentication accuracy improves. In other words, with the biometric authentication device 1 of the embodiment of the present disclosure, an increase in the false acceptance rate can be suppressed even when a method that physically separates palm-print features from the image cannot be applied. In particular, when melanin is strongly deposited in the subject's palm and the directional features gθ contain many palm-print components, the influence of the palm print on the omnidirectional line feature LF is suppressed during the identity determination, so the false acceptance rate can be reduced.

Moreover, when each directional feature gθ extracted by the filter 41 is normalized, the luminance values corresponding to veins become larger than those corresponding to the palm print in each gθ. The omnidirectional line feature LF generated from the normalized directional features gθ therefore emphasizes the veins. Since the identity determination is made with emphasis on veins, which are far more diverse than palm prints, the false rejection rate can be reduced.

FIG. 5 is a diagram illustrating an example of the hardware configuring the biometric authentication device 1 according to an embodiment of the present disclosure.

As shown in FIG. 5, the hardware configuring the biometric authentication device 1 includes a control unit 1201, a storage unit 1202, a recording medium reading device 1203, an input/output interface 1204, and a communication interface 1205, which are connected to one another by a bus 1206. The hardware configuring the biometric authentication device 1 may also be realized using a cloud or the like.

The control unit 1201 may be, for example, a Central Processing Unit (CPU), a multi-core CPU, or a programmable device (such as a Field Programmable Gate Array (FPGA) or a Programmable Logic Device (PLD)), and corresponds to the region specifying unit 3, the feature extraction unit 4, the matching processing unit 5, and the score determination unit 6 shown in FIG. 1.

The storage unit 1202 corresponds to the storage unit 7 shown in FIG. 1 and may be, for example, a memory such as a Read Only Memory (ROM) or Random Access Memory (RAM), or a hard disk. The storage unit 1202 may be used as a work area at the time of execution. Another storage unit may also be provided outside the biometric authentication device 1.

Under the control of the control unit 1201, the recording medium reading device 1203 reads data recorded on the recording medium 1207 and writes data to the recording medium 1207. The removable recording medium 1207 is a computer-readable non-transitory recording medium such as a magnetic recording device, an optical disc, a magneto-optical recording medium, or a semiconductor memory. The magnetic recording device may be, for example, a hard disk drive (HDD). The optical disc may be, for example, a Digital Versatile Disc (DVD), a DVD-RAM, a Compact Disc Read Only Memory (CD-ROM), or a CD-R (Recordable)/RW (ReWritable). The magneto-optical recording medium may be, for example, a Magneto-Optical disk (MO). The storage unit 1202 is also a non-transitory recording medium.

The input/output interface 1204 is connected to the input/output unit 1208, sends information entered by the user through the input/output unit 1208 to the control unit 1201 via the bus 1206, and sends information received from the control unit 1201 to the input/output unit 1208 via the bus 1206.

The input/output unit 1208 corresponds to the image acquisition unit 2 shown in FIG. 1 and may be, for example, an imaging device. The input/output unit 1208 may also be, for example, a keyboard, a pointing device (such as a mouse), a touch panel, a Cathode Ray Tube (CRT) display, or a printer.

The communication interface 1205 is an interface for establishing a Local Area Network (LAN) connection or an Internet connection. It may also be used, as necessary, as an interface for LAN, Internet, or wireless connections with other computers.

The various processing functions performed by the biometric authentication device 1 are realized by using a computer having such hardware. In this case, the processing functions described above (for example, the region specifying unit 3, the feature extraction unit 4, the matching processing unit 5, and the score determination unit 6) are realized on the computer by executing a program that describes the contents of those processing functions. The program describing the contents of the various processing functions can be stored in the storage unit 1202 or on the recording medium 1207.

When the program is distributed, a recording medium 1207 such as a DVD or CD-ROM on which the program is recorded is sold, for example. The program may also be recorded in a storage device of a server computer and transferred from the server computer to another computer via a network.

A computer that executes the program stores, for example, the program recorded on the recording medium 1207, or the program transferred from the server computer, in the storage unit 1202. The computer then reads the program from the storage unit 1202 and executes processing in accordance with it. The computer may also read the program directly from the recording medium 1207 and execute processing in accordance with it, or execute processing in accordance with the received program each time a program is transferred from the server computer.

In the embodiments of the present disclosure, a biometric authentication device that performs authentication using palm veins has been described as an example, but the feature detection site is not limited to the palm and may be any other feature detection site of a living body.

For example, other feature detection sites of a living body are not limited to veins, and may be a blood vessel image, a body pattern, a fingerprint or palm print, the sole of a foot, fingers or toes, the back of a hand or foot, a wrist, an arm, and so on.

When veins are used for authentication, any site of the living body where veins can be observed may serve as the feature detection site.

Any feature detection site from which biometric information can be identified is advantageous for authentication; for example, in the case of a palm or a face, the site can be identified from the acquired image. The embodiments described above may be modified in various ways without departing from their gist. Furthermore, numerous variations and modifications of the embodiments described above will be apparent to those skilled in the art, and the embodiments are not limited to the exact configurations and applications described.

DESCRIPTION OF SYMBOLS
1 Biometric authentication device
2 Image acquisition unit
3 Region specifying unit
4 Feature extraction unit
5 Matching processing unit
6 Score determination unit
7 Storage unit
41 Filter
42 Per-direction directional-feature normalization processing unit
43 Point-wise maximum selection unit
44 Binarization unit
45 Thinning unit

Claims (3)

1. A biometric authentication device comprising:
a filter that extracts a plurality of directional features from an input image;
a per-direction directional-feature normalization processing unit that normalizes the plurality of directional features extracted by the filter so that luminance values corresponding to a palm print, which tend to be larger than luminance values corresponding to veins, become smaller than the luminance values corresponding to veins;
an omnidirectional-feature generation unit that generates an omnidirectional feature from the plurality of directional features output by the per-direction directional-feature normalization processing unit;
a matching processing unit that obtains a similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; and
a determination unit that determines, using the similarity, whether a subject is the registered person.

2. A biometric authentication method in which a computer:
extracts a plurality of directional features from an input image;
normalizes the extracted plurality of directional features so that luminance values corresponding to a palm print, which tend to be larger than luminance values corresponding to veins, become smaller than the luminance values corresponding to veins;
generates an omnidirectional feature from the plurality of normalized directional features;
obtains a similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; and
determines, using the similarity, whether a subject is the registered person.

3. A program for causing a computer to execute a process comprising:
extracting a plurality of directional features from an input image;
normalizing the extracted plurality of directional features so that luminance values corresponding to a palm print, which tend to be larger than luminance values corresponding to veins, become smaller than the luminance values corresponding to veins;
generating an omnidirectional feature from the plurality of normalized directional features;
obtaining a similarity between the omnidirectional feature and a registered omnidirectional feature stored in a storage unit; and
determining, using the similarity, whether a subject is the registered person.
JP2016509684A 2014-03-25 2014-03-25 Biometric authentication device, biometric authentication method, and program Active JP6431044B2 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/058384 WO2015145589A1 (en) 2014-03-25 2014-03-25 Biometric authentication device, biometric authentication method, and program

Publications (2)

Publication Number Publication Date
JPWO2015145589A1 JPWO2015145589A1 (en) 2017-04-13
JP6431044B2 true JP6431044B2 (en) 2018-11-28

Family

ID=54194187

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2016509684A Active JP6431044B2 (en) 2014-03-25 2014-03-25 Biometric authentication device, biometric authentication method, and program

Country Status (4)

Country Link
US (1) US10019616B2 (en)
EP (1) EP3125195B1 (en)
JP (1) JP6431044B2 (en)
WO (1) WO2015145589A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106878023A (en) * 2017-02-22 2017-06-20 福建升腾资讯有限公司 A kind of method and system that cloud desktop is logined based on fin- ger vein authentication

Family Cites Families (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6282304B1 (en) * 1999-05-14 2001-08-28 Biolink Technologies International, Inc. Biometric system for biometric input, comparison, authentication and access control and method therefor
JP2000358025A (en) * 1999-06-15 2000-12-26 Nec Corp Information processing method, information processor and recording medium storing information processing program
US7072525B1 (en) 2001-02-16 2006-07-04 Yesvideo, Inc. Adaptive filtering of visual image using auxiliary image information
WO2002096181A2 (en) * 2001-05-25 2002-12-05 Biometric Informatics Technology, Inc. Fingerprint recognition system
US7636455B2 (en) 2002-06-04 2009-12-22 Raytheon Company Digital image edge detection and road network tracking method and system
CN1238809C (en) 2002-09-04 2006-01-25 长春鸿达光电子与生物统计识别技术有限公司 Fingerprint identification method as well as fingerprint controlling method and system
HK1062117A2 (en) * 2002-09-25 2004-09-17 Univ Hong Kong Polytechnic Method of palm print identification using geometry, line and/or texture features
US20040057606A1 (en) 2002-09-25 2004-03-25 The Hong Kong Polytechnic University Apparatus for capturing a palmprint image
US7496214B2 (en) * 2002-09-25 2009-02-24 The Hong Kong Polytechnic University Method of palm print identification
JPWO2005008753A1 (en) 2003-05-23 2006-11-16 株式会社ニコン Template creation method and apparatus, pattern detection method, position detection method and apparatus, exposure method and apparatus, device manufacturing method, and template creation program
JP2005149455A (en) 2003-10-21 2005-06-09 Sharp Corp Image collating apparatus, image collating method, image collating program and computer readable recording medium stored with image collating program
US20050281438A1 (en) 2004-06-21 2005-12-22 Zhang David D Palm print identification using palm line orientation
US7664326B2 (en) 2004-07-09 2010-02-16 Aloka Co., Ltd Method and apparatus of image processing to detect and enhance edges
US7359555B2 (en) 2004-10-08 2008-04-15 Mitsubishi Electric Research Laboratories, Inc. Detecting roads in aerial images using feature-based classifiers
KR100752640B1 (en) * 2005-01-05 2007-08-29 삼성전자주식회사 Method and apparatus for segmenting fingerprint region using directional gradient filters
US20070036400A1 (en) * 2005-03-28 2007-02-15 Sanyo Electric Co., Ltd. User authentication using biometric information
JP4932177B2 (en) * 2005-04-19 2012-05-16 グローリー株式会社 Coin classification device and coin classification method
JP4871144B2 (en) 2006-01-13 2012-02-08 株式会社東芝 Image processing apparatus, method, and program
JP4937607B2 (en) 2006-03-14 2012-05-23 富士通株式会社 Biometric authentication method and biometric authentication device
US20080298642A1 (en) * 2006-11-03 2008-12-04 Snowflake Technologies Corporation Method and apparatus for extraction and matching of biometric detail
JP2008152530A (en) 2006-12-18 2008-07-03 Sony Corp Face recognition device, face recognition method, gabor filter applied device, and computer program
JP4611427B2 (en) * 2007-01-24 2011-01-12 富士通株式会社 Image reading apparatus, image reading program, and image reading method
US8285010B2 (en) * 2007-03-21 2012-10-09 Lumidigm, Inc. Biometrics based on locally consistent features
US20090185746A1 (en) 2008-01-22 2009-07-23 The University Of Western Australia Image recognition
JP5061988B2 (en) * 2008-03-25 2012-10-31 日本電気株式会社 Ridge direction extraction device, ridge direction extraction program, and ridge direction extraction method
JP5031641B2 (en) * 2008-03-31 2012-09-19 富士通株式会社 Pattern alignment method, verification method, and verification device
US8265347B2 (en) 2008-04-24 2012-09-11 The Hong Kong Polytechnic University Method and system for personal identification using 3D palmprint imaging
JP4997178B2 (en) * 2008-06-10 2012-08-08 学校法人中部大学 Object detection device
WO2010044250A1 (en) * 2008-10-15 2010-04-22 日本電気株式会社 Pattern check device and pattern check method
US20100158329A1 (en) * 2008-12-19 2010-06-24 Shajil Asokan Thaniyath Elegant Solutions for Fingerprint Image Enhancement
EP2495699B1 (en) 2009-10-30 2019-07-10 Fujitsu Frontech Limited Biometric information registration method, biometric authentication method, and biometric authentication device
JP5870922B2 (en) * 2010-08-12 2016-03-01 日本電気株式会社 Image processing apparatus, image processing method, and image processing program
JP5500024B2 (en) * 2010-09-27 2014-05-21 富士通株式会社 Image recognition method, apparatus, and program
US20120108973A1 (en) 2010-11-01 2012-05-03 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus and ultrasonic image processing apparatus
WO2012078114A1 (en) * 2010-12-09 2012-06-14 Nanyang Technological University Method and an apparatus for determining vein patterns from a colour image
US20120194662A1 (en) 2011-01-28 2012-08-02 The Hong Kong Polytechnic University Method and system for multispectral palmprint verification
JP2012256272A (en) * 2011-06-10 2012-12-27 Seiko Epson Corp Biological body identifying device and biological body identifying method
US20130004028A1 (en) * 2011-06-28 2013-01-03 Jones Michael J Method for Filtering Using Block-Gabor Filters for Determining Descriptors for Images
JP5915664B2 (en) * 2011-12-15 2016-05-11 富士通株式会社 Vein authentication method and vein authentication apparatus
SG11201405394PA (en) 2012-03-16 2014-11-27 Universal Robot Kabushiki Kaisha Personal authentication method and personal authentication device
JP6024141B2 (en) 2012-03-23 2016-11-09 富士通株式会社 Biological information processing apparatus, biological information processing method, and biological information processing program
JP6129309B2 (en) 2012-07-12 2017-05-17 デュアル・アパーチャー・インターナショナル・カンパニー・リミテッド Gesture based user interface
US8953854B2 (en) 2012-08-08 2015-02-10 The Hong Kong Polytechnic University Contactless 3D biometric feature identification system and method thereof
JP5971089B2 (en) 2012-11-14 2016-08-17 富士通株式会社 Biological information correction apparatus, biological information correction method, and biological information correction computer program
JP6116291B2 (en) 2013-02-27 2017-04-19 オリンパス株式会社 Image processing apparatus, image processing method, and image processing program
WO2014142171A1 (en) * 2013-03-13 2014-09-18 富士通フロンテック株式会社 Image processing device, image processing method, and program
US9141872B2 (en) 2013-09-11 2015-09-22 Digitalglobe, Inc. Automated and scalable object and feature extraction from imagery
US10599932B2 (en) * 2014-06-09 2020-03-24 Lawrence Livermore National Security, Llc Personal electronic device for performing multimodal imaging for non-contact identification of multiple biometric traits

Also Published As

Publication number Publication date
EP3125195A1 (en) 2017-02-01
EP3125195B1 (en) 2020-03-11
WO2015145589A1 (en) 2015-10-01
EP3125195A4 (en) 2017-02-22
JPWO2015145589A1 (en) 2017-04-13
US20170004348A1 (en) 2017-01-05
US10019616B2 (en) 2018-07-10

Similar Documents

Publication Publication Date Title
Priesnitz et al. An overview of touchless 2D fingerprint recognition
JP6528608B2 (en) Diagnostic device, learning processing method in diagnostic device, and program
JP6553976B2 (en) Authentication apparatus, authentication method and recording medium
US20160162673A1 (en) Technologies for learning body part geometry for use in biometric authentication
KR20190094352A (en) System and method for performing fingerprint based user authentication using a captured image using a mobile device
Kumar Can we use minor finger knuckle images to identify humans?
JP2011159035A (en) Biometric authentication apparatus, biometric authentication method and program
KR102369412B1 (en) Device and method to recognize iris
JP6648639B2 (en) Biological information processing apparatus, biological information processing method, and biological information processing program
EP3217659B1 (en) Image processing apparatus, image processing method, and program
US10019619B2 (en) Biometrics authentication device and biometrics authentication method
JP6069581B2 (en) Biometric authentication device, biometric authentication method, and program
JP6629150B2 (en) Palm detection device, palm print authentication device, palm detection method, and program
JP6431044B2 (en) Biometric authentication device, biometric authentication method, and program
JP6117988B2 (en) Biometric authentication device, biometric authentication method, and program
CN109409322B (en) Living body detection method and device, face recognition method and face detection system
JP2010277196A (en) Information processing apparatus and method, and program
JP6242726B2 (en) Biometric information registration method, biometric authentication method, biometric information registration device, biometric authentication device, and program
KR102028049B1 (en) Method for authorizing pet using vein pattern of ear

Legal Events

A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131), effective date 20161220
A521 Request for written amendment filed (JAPANESE INTERMEDIATE CODE: A523), effective date 20170217
A02 Decision of refusal (JAPANESE INTERMEDIATE CODE: A02), effective date 20170314
A521 Request for written amendment filed (JAPANESE INTERMEDIATE CODE: A523), effective date 20170613
A911 Transfer to examiner for re-examination before appeal (zenchi) (JAPANESE INTERMEDIATE CODE: A911), effective date 20170622
A912 Re-examination (zenchi) completed and case transferred to appeal board (JAPANESE INTERMEDIATE CODE: A912), effective date 20170707
A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61), effective date 20181101
R150 Certificate of patent or registration of utility model (Ref document number: 6431044; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150)