TWI531985B - Palm biometric method - Google Patents

Palm biometric method

Info

Publication number
TWI531985B
TWI531985B TW100103407A
Authority
TW
Taiwan
Prior art keywords
point
palm
processor
points
tested
Prior art date
Application number
TW100103407A
Other languages
Chinese (zh)
Other versions
TW201232428A (en)
Inventor
Yang-Zhen-Sen Ou
Jing-Tai Jiang
Rong-Qing Wu
Zhong-Yan Cai
Original Assignee
Univ Ishou
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Univ Ishou
Priority to TW100103407A
Publication of TW201232428A
Application granted
Publication of TWI531985B


Landscapes

  • Collating Specific Patterns (AREA)
  • Image Analysis (AREA)

Description

Palm biometric method

The present invention relates to a palm biometric method, and more particularly to a palm biometric method that can effectively recognize a palm image under test input in any direction and at any angle.

Because of advantages such as low cost and a high recognition rate, palm recognition technology has gradually become one of the mainstream technologies for biometric identity verification.

In "A biometric identification system based on eigenpalm and eigenfinger features", proposed by Ribaric, S. et al. in 2009 in IEEE Transactions on Pattern Analysis and Machine Intelligence, the prior art obtains relevant feature values from a plurality of sub-images of the palm under test and its fingers, and then performs biometric identification with those feature values. However, this prior art imposes specific requirements on the detected angle of the palm under test and on variations in background brightness, so misjudgments easily arise in practical applications.

In "Palm line extraction and matching for personal authentication", proposed by Wu, X., Zhang et al. in 2006 in IEEE Transactions on Systems, Man, and Cybernetics, the prior art uses a chain-coding technique to obtain a plurality of palm lines for biometric identification. However, because the chain-coding algorithm is too sensitive, it is often difficult to obtain suitable palm lines.

In addition, in "Hierarchical identification of palmprint using line-based Hough transform", proposed by Li, F., Leung et al. in 2006 in Proceedings of the 18th Conference on Pattern Recognition, and in "A hierarchical palmprint identification method using hand geometry and grayscale distribution features", proposed by Wu, J., Qiu et al. in 2006 in Proceedings of the 18th Conference on Pattern Recognition, both prior arts use hand geometry and gray-scale distribution features for biometric identification. However, for a palm image under test that is not placed vertically (i.e., a palm image with a rotation angle), these prior arts cannot correct the rotation angle to obtain the correct relative position of the palm.

Therefore, in "An adaptive biometric system based on palm texture feature and LVQ neural network", proposed by Ouyang, C.-S. et al. in 2006 in Proceedings of the 21st International Conference on Industrial, Engineering & Other Applications of Applied Intelligent Systems, each sub-image of the palm under test is converted into a corresponding feature vector according to the texture descriptor of a local fuzzy pattern (LFP), and those feature vectors are then used to detect the rotation angle and position of the palm under test. Although, compared with the prior arts described above, this overcomes the problems of requiring a specific angle and background brightness, of over-sensitive algorithms, and of rotation angles, when this prior art scans the palm image under test a misjudgment may still occur if the rotation angle of the palm exceeds 50 degrees, so that a legitimate user may be unable to log in because the palm recognition system cannot interpret that user's palm image.

In addition, "An improved neural-network-based biometric system with rotation detection mechanism", proposed by Ouyang, C.-S. et al. in 2010 at the International Conference on Machine Learning and Cybernetics (ICMLC), accepts palm images scanned at an arbitrary rotation angle. However, to reduce the complexity of the recognition algorithm it lowers the resolution of the palm image under test, and after the resulting sub-images are normalized into histograms, information such as the length and width of every finger sub-image is lost. Therefore, although this prior art improves the accuracy of locating the palm under test, users with similar palm features may be confused because part of the information in the corresponding sub-images is missing, raising the false-judgment rate of the palm recognition system: the system may mistake a legitimate user for an illegitimate user, or an illegitimate user for a legitimate user.

Therefore, finding a method that both improves the recognizability of the palm image under test and effectively reduces the time required for recognition, so that the recognition accuracy and efficiency of a palm recognition system are effectively improved, is a goal worthy of continued research.

Therefore, an object of the present invention is to provide a palm biometric method, adapted to receive, with a processor, a palm image under test corresponding to a user's palm, comprising the following steps: configuring the processor to process the palm image under test to obtain a set of boundary pixels; configuring the processor to detect the rotation angle of the palm under test; configuring the processor to search, in order along the direction of the palm under test, the variation between the y value of each boundary pixel coordinate and the y values of the preceding and following boundary pixel coordinates, and to set a plurality of feature points, the feature points including a plurality of valley points, fingertip points, and target points; configuring the processor to obtain a plurality of sub-images according to the feature points, including the following sub-steps: configuring the processor to compute, according to a fingertip point and its two corresponding valley points, or a fingertip point with a corresponding target point and a valley point, the midpoint corresponding to each fingertip point, then to obtain five interval points along the line connecting each midpoint with its corresponding fingertip point, each interval point yielding two corresponding crossing points on the set of boundary pixels, the crossing points together with the fingertip point, the midpoint, and the valley points, or the valley point and the target point, yielding a corresponding first sub-image; configuring the processor to form a first set line segment from two target points and, from the square having the first set line segment as one side, obtain a specific point, and then obtain a second sub-image according to the specific point, the target points, and the corresponding valley points; configuring the processor to normalize the brightness of the sub-images; and configuring the processor to convert each sub-image into a corresponding feature vector; and configuring the processor to obtain and store, according to the feature vectors, the biometric representative model corresponding to the user.

The above and other technical contents, features, and effects of the present invention will be clearly presented in the following detailed description of a preferred embodiment with reference to the drawings.

According to the related art in "An adaptive biometric system based on palm texture feature and LVQ neural network", proposed by Ouyang, C.-S. et al. in 2006 in Proceedings of the 21st International Conference on Industrial, Engineering & Other Applications of Applied Intelligent Systems, a palm biometric method comprises the following procedures: a registration procedure and an identification-and-verification procedure.

The registration procedure includes receiving a set of palm feature vectors of a legitimate user and, after processing them with a learning vector quantization (LVQ2) algorithm, obtaining the representative prototype corresponding to that legitimate user; the representative prototypes are stored in a registered-user database, each representative prototype having six feature vectors.

The verification procedure includes receiving a set of palm feature vectors of an unknown user and comparing them with the six feature vectors of every representative prototype stored in the registered-user database. If there exists a representative prototype whose six feature vectors are equal to the unknown user's palm feature vectors, the unknown user is regarded as a legitimate user and the login request is accepted; otherwise, the unknown user is regarded as an illegitimate user and the login request is rejected.
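The comparison in the verification procedure can be sketched as follows. This is a minimal sketch only: the text says the vectors must be "equal", so the tolerance-based distance test, the threshold value, and all names here are assumptions.

```python
def verify(unknown_vectors, database, dist_threshold=0.1):
    """Compare an unknown user's six palm feature vectors against every
    stored representative prototype (hypothetical data layout)."""
    def dist(u, v):
        # Euclidean distance; a tolerance stands in for exact equality
        return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5
    for user, prototype in database.items():
        if all(dist(u, p) <= dist_threshold
               for u, p in zip(unknown_vectors, prototype)):
            return user      # accepted as this legitimate user
    return None              # rejected as an illegitimate user
```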

Therefore, the present invention focuses on how to effectively obtain a user's set of palm feature vectors, either to convert them into the corresponding representative prototype and store it, or to compare them with the six feature vectors of every representative prototype in the registered-user database to determine whether the unknown user is a legitimate user.

Referring to FIG. 1, a preferred embodiment of the palm biometric method 9 of the present invention is adapted to receive, with a processor, a palm image under test corresponding to a user's palm, and includes the following steps: step 91 configures the processor to process the palm image under test; step 92 configures the processor to detect the rotation angle of the palm under test; step 93 configures the processor to search for all feature points of the palm under test; and step 94 configures the processor to obtain the feature vectors of the palm under test according to those feature points.

Each of steps 91–94 is described below. Referring to FIG. 2, step 91 processes the palm image under test and has the following sub-steps. Sub-step 911 configures the processor to set a threshold of gray level T and, according to this threshold, convert the palm image under test into a binary image, where T is set as shown in equation (F.1):

T = (1/(M×N)) Σ_{x=0}^{M−1} Σ_{y=0}^{N−1} g(x, y) .....(F.1)

M and N denote that the resolution of the palm image under test is M×N, and g(x, y) denotes the pixel value at coordinates (x, y) of that image.

The conversion into the binary image proceeds as follows: when the processor determines that the pixel value g(x, y) of the pixel at coordinates (x, y) of the palm image under test is greater than the threshold T, the processor sets that pixel value to 255; otherwise, the pixel value is set to 0. Sub-step 912 configures the processor to remove the noise points in the binary image with a 3×3 median filter. Sub-step 913 configures the processor to perform edge detection in the binary image with a Laplacian filter, thereby obtaining a set of boundary pixels, each boundary pixel being a pixel on the palm contour in the palm image under test. Sub-step 914 configures the processor to mark the set of boundary pixels obtained in sub-step 913 in the order (M−1, N−1), (M−1, N−2), ..., (M−1, 0), (M−2, N−1), (M−2, N−2), ..., (0, 0).
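The preprocessing of sub-steps 911–914 can be sketched as follows. This is a minimal NumPy sketch; the function names are assumptions, and the mean-gray-level form of the threshold T is an assumption, since the formula body of (F.1) is not reproduced in the text.

```python
import numpy as np

def binarize(img):
    """Sub-step 911: threshold at the mean gray level (assumed form of F.1)."""
    T = img.mean()                       # T = (1/(M*N)) * sum of g(x, y)
    return np.where(img > T, 255, 0).astype(np.uint8)

def median_3x3(img):
    """Sub-step 912: 3x3 median filter to remove isolated noise points."""
    padded = np.pad(img, 1, mode="edge")
    stacked = np.stack([padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                        for dy in range(3) for dx in range(3)])
    return np.median(stacked, axis=0).astype(np.uint8)

def laplacian_edges(img):
    """Sub-step 913: Laplacian edge detection; nonzero response marks the
    contour. Sub-step 914: order the boundary pixels from (M-1, N-1) down
    to (0, 0) (the (x, y) interpretation of that order is an assumption)."""
    k = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]])
    padded = np.pad(img.astype(int), 1, mode="edge")
    out = sum(k[dy, dx] * padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
              for dy in range(3) for dx in range(3))
    ys, xs = np.nonzero(out)
    order = np.lexsort((-ys, -xs))       # descending x, then descending y
    return list(zip(xs[order], ys[order]))
```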

Referring jointly to FIGS. 3 and 4, step 92 detects the palm rotation angle and has the following sub-steps. Sub-step 921 configures the processor to set a plurality of horizontal line segments crossing the palm image under test, and to judge that a horizontal line segment has a crossing point with the palm contour in the palm image under test whenever a coordinate value on that segment equals the coordinate value of a pixel in the set of boundary pixels. Sub-step 922 configures the processor to select a horizontal line segment having eight crossing points P1–P8 with the palm contour as a first reference line segment. Sub-step 923 configures the processor to take the center point PM1 of the crossing points P2 and P3 and, according to PM1, select a corresponding reference point Pref from the set of boundary pixels. Sub-step 924 configures the processor to determine the direction of the palm under test from the coordinates of Pref and the crossing point P2: when the y coordinate of Pref is greater than the y coordinate of P2, the fingertips of the palm under test point in the −y direction; conversely, when the y coordinate of Pref is smaller than the y coordinate of P2, the fingertips point in the +y direction. Sub-step 925 configures the processor to set a first marker point according to the crossing points P3 and P4: when the fingertips point in the +y direction, the midpoint PM2 of P3 and P4 is computed, then, among the boundary pixels whose x coordinates lie between P3 and P4 and whose y coordinates are greater than the y coordinate of PM2, the distance of each pixel to PM2 is computed, and the pixel with the maximum distance is the first marker point; similarly, when the fingertips point in the −y direction, the pixel with the maximum distance among the boundary pixels whose x coordinates lie between P3 and P4 and whose y coordinates are smaller than the y coordinate of PM2 is the first marker point. Likewise, the processor is configured to set a second marker point according to the crossing points P5 and P6 and their midpoint PM3. Sub-step 926 configures the processor to compute the slope between the first marker point and the midpoint PM2 as a first slope and the slope between the second marker point and the midpoint PM3 as a second slope, and then compute the average of the first and second slopes as an average slope. Sub-step 927 configures the processor to obtain the corresponding rotation angle from the average slope (the arctangent of the slope); for example, when the average slope is 1, the corresponding rotation angle is 45 degrees, and so on.

Referring jointly to FIGS. 5 and 6, step 93 searches for the palm feature points and has the following sub-steps. Sub-step 931 configures the processor to search, in order along the direction of the palm under test, the variation between the y value of each boundary pixel and the y values of the preceding and following boundary pixels. When the fingertips point in the +y direction, a target boundary pixel Sn whose y value is smaller than the y values of both the preceding boundary pixel Sn−1 and the following boundary pixel Sn+1 is defined as a valley point, while a target boundary pixel Sm whose y value is greater than the y values of both Sm−1 and Sm+1 is defined as a fingertip point; five fingertip points T1–T5 and four valley points B1–B4 are thereby finally obtained, in which the second fingertip point T2 is the first marker point and the third fingertip point T3 is the second marker point. Conversely, when the fingertips point in the −y direction, a target boundary pixel Sn whose y value is smaller than those of both Sn−1 and Sn+1 is defined as a fingertip point, and a target boundary pixel Sm whose y value is greater than those of both Sm−1 and Sm+1 is defined as a valley point. Sub-step 932 configures the processor to find, according to the distance between the first fingertip point T1 and the first valley point B1, the boundary pixel in the set that lies at an equal distance, and to define it as a first target point K1; similarly, the processor is configured to find the boundary pixel at an equal distance according to the fourth fingertip point T4 and the third valley point B3 and to define it as a second target point K2, and to find the boundary pixel at an equal distance according to the fifth fingertip point T5 and the fourth valley point B4 and to define it as a third target point K3.

Therefore, after this step is completed, twelve palm feature points are obtained in total: five fingertip points T1–T5, four valley points B1–B4, and three target points K1–K3.
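The local-extrema search of sub-step 931 can be sketched as follows. A minimal sketch over an ordered boundary list; the tuple representation and names are assumptions.

```python
def find_feature_points(boundary, tips_up=True):
    """Sub-step 931: fingertips and valley points are local y-extrema of the
    ordered boundary; for a +y palm direction a local maximum in y is a
    fingertip and a local minimum is a valley point (swapped for -y)."""
    tips, valleys = [], []
    for n in range(1, len(boundary) - 1):
        y_prev, y, y_next = boundary[n - 1][1], boundary[n][1], boundary[n + 1][1]
        if y > y_prev and y > y_next:
            (tips if tips_up else valleys).append(boundary[n])
        elif y < y_prev and y < y_next:
            (valleys if tips_up else tips).append(boundary[n])
    return tips, valleys
```

On a real contour this yields the five fingertip points and four valley points described above.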

Referring to FIG. 7, step 94 obtains the feature vectors of the palm under test and has the following sub-steps. Sub-step 941 configures the processor to obtain six sub-images, including five finger sub-images and one palm sub-image: the processor obtains a first finger sub-image from the first target point K1, the first valley point B1, and the first fingertip point T1; a second finger sub-image from the first and second valley points B1, B2 and the second fingertip point T2; a third finger sub-image from the second and third valley points B2, B3 and the third fingertip point T3; a fourth finger sub-image from the second target point K2, the third valley point B3, and the fourth fingertip point T4; and a fifth finger sub-image from the third target point K3, the fourth valley point B4, and the fifth fingertip point T5. Referring to FIG. 8, the first finger sub-image is set as follows: the processor computes the midpoint m1 of the first target point K1 and the first valley point B1 and, on the line segment whose endpoints are the midpoint m1 and the first fingertip point T1, sets interval points m2, m3, m4, m5, and m6 at one sixth, two sixths, three sixths, four sixths, and five sixths of the segment length, respectively; each interval point m2–m6 is then extended to cross the contour of the first finger at a pair of crossing points (F1, F2), (F3, F4), (F5, F6), (F7, F8), and (F9, F10). Therefore, by following in order the first target point K1, the crossing points F1, F3, F5, F7, F9, the first fingertip point T1, the crossing points F10, F8, F6, F4, F2, the first valley point B1, and the midpoint m1, a closed first finger sub-image is obtained.
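The interval points m2–m6 of FIG. 8 sit at i/6 of the segment from m1 to T1. A minimal sketch; the names are assumptions:

```python
def interval_points(m1, t1, k=6):
    """Points m2..m6 at i/6 (i = 1..5) of the segment from midpoint m1
    to fingertip T1, by linear interpolation."""
    return [(m1[0] + (t1[0] - m1[0]) * i / k,
             m1[1] + (t1[1] - m1[1]) * i / k) for i in range(1, k)]
```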

The remaining finger sub-images are obtained in the same manner as the first finger sub-image and are not described further here.

In addition, referring to FIG. 9, the palm sub-image is set as follows: the processor obtains a first set line segment from the first target point K1 and the second target point K2, then forms a square having the first set line segment as one side to obtain a specific point S1 connected to the first target point; finally, the processor is configured to follow, in order, the specific point S1, the first target point K1, the first valley point B1, the third valley point B3, the second target point K2, and the third target point K3 to obtain a closed palm sub-image.
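The specific point S1 can be obtained as the corner of the square erected on the segment K1–K2 that is adjacent to K1. A minimal sketch; which of the two perpendicular directions is meant depends on the palm orientation, so the sign chosen here is an assumption:

```python
def specific_point(k1, k2):
    """Corner of the square built on segment K1-K2, adjacent to K1,
    obtained by rotating the K1->K2 vector 90 degrees about K1."""
    dx, dy = k2[0] - k1[0], k2[1] - k1[1]
    return (k1[0] - dy, k1[1] + dx)
```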

Therefore, referring to FIG. 10, the corresponding five finger sub-images and one palm sub-image are obtained.

Referring back to FIG. 7, step 942 configures the processor to normalize the brightness of the sub-images by histogram equalization, so that the brightness of each sub-image becomes more uniform. Step 943 configures the processor to convert each sub-image of the palm under test into a corresponding feature vector according to the texture descriptor of a local fuzzy pattern (LFP), as follows. Suppose P(x, y) is a pixel of a sub-image with luminance value I(x, y), and let the set of luminance values of the P1 reference blocks in its neighborhood N1 be {I(xi, yi) | 1 ≤ i ≤ P1}. First, compute the membership values

μi(1) = 1/(1 + exp{−α(I(xi, yi) − I(x, y))}), μi(0) = 1 − μi(1),

where μi(1) and μi(0) are fuzzy membership functions expressing the degree to which the luminance difference between the reference pixel P(xi, yi) and the pixel P(x, y) is encoded as 1 and as 0, respectively. Next, compute the vector H(x, y):

H(x, y) = (ω1, ω2, ..., ωQ) .....(F.2)

where ωj = μ1(b1) × μ2(b2) × ... × μP1(bP1), with b1b2...bP1 being the binary digits of the j-th of the Q = 2^P1 − 1 possible binary combinations, so that ωj expresses the degree to which the pattern is encoded as b1b2...bP1. H(x, y) is then normalized to obtain the normalized vector H′(x, y). Finally, the H′(x, y) of all pixels in the sub-image are summed and normalized to obtain the feature vector corresponding to that sub-image.
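The LFP conversion of step 943 at a single pixel can be sketched as follows. A minimal sketch: the value of α, the neighborhood layout, and the pattern enumeration order are assumptions not fixed by the text.

```python
import math

def lfp_histogram(I, x, y, neighbors, alpha=1.0):
    """Fuzzy memberships mu_i(1) of each neighbor's luminance difference,
    combined into degree values omega_j for every nonzero binary pattern,
    then normalized into H'(x, y)."""
    mu1 = [1.0 / (1.0 + math.exp(-alpha * (I[ny][nx] - I[y][x])))
           for nx, ny in neighbors]
    P1 = len(neighbors)
    H = []
    for j in range(1, 2 ** P1):          # Q = 2**P1 - 1 patterns, as in the text
        bits = [(j >> i) & 1 for i in range(P1)]
        w = 1.0
        for m, b in zip(mu1, bits):
            w *= m if b else (1.0 - m)   # mu_i(1) for bit 1, mu_i(0) for bit 0
        H.append(w)
    s = sum(H)
    return [w / s for w in H]            # normalized H'(x, y)
```

Summing these normalized vectors over all pixels of a sub-image and normalizing again gives the sub-image's feature vector.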

In summary, the present invention can effectively recognize a palm image under test input at any position and any angle, and by setting the sub-images precisely, correspondingly more accurate feature vectors are obtained, lowering the false-judgment rate of a palm recognition system; hence the object of the present invention is indeed achieved.

The foregoing is merely a preferred embodiment of the present invention and shall not limit the scope of implementation of the invention; simple equivalent changes and modifications made according to the claims and the description of the invention all remain within the scope covered by the patent of this invention.

9 ... Palm biometric method

91–94 ... Steps

911–914 ... Sub-steps

921–927 ... Sub-steps

931–932 ... Sub-steps

941–943 ... Sub-steps

Figure 1 is a flow chart of a preferred embodiment of the present invention;

Figure 2 is a flow chart of the sub-steps of processing the palm image under test;

Figure 3 is a flow chart of the sub-steps of detecting the palm rotation angle;

Figure 4 is a schematic diagram of an example of detecting the palm rotation angle;

Figure 5 is a flow chart of the sub-steps of searching for the palm feature points;

Figure 6 is a schematic diagram of the feature points of the palm under test;

Figure 7 is a flow chart of the sub-steps of obtaining the feature vectors of the palm under test;

Figure 8 is a schematic diagram of setting a finger sub-image;

Figure 9 is a schematic diagram of setting a palm sub-image; and

Figure 10 is a schematic diagram of the sub-images obtained in the preferred embodiment.


Claims (4)

1. A palm biometric method, adapted for a processor to receive a palm image to be tested corresponding to a user's palm, comprising the following steps: configuring the processor to process the palm image to be tested so as to obtain a set of boundary pixels; configuring the processor to detect a rotation angle of the palm to be tested, including the following substeps: configuring the processor to select a corresponding reference point from the set of boundary pixels according to the center point of two intersection points on a first reference line segment; configuring the processor to determine the direction of the palm to be tested according to the coordinates of the reference point and of the intersection points; configuring the processor to set a first marker point and a second marker point according to respective pairs of adjacent intersection points, calculate the corresponding midpoints, then calculate a first slope and a second slope from the midpoints and the first and second marker points, and obtain an average slope as the mean of the first and second slopes; and configuring the processor to obtain a corresponding rotation angle from the average slope; configuring the processor to search in order, according to the direction of the palm to be tested, how the y-value of each boundary pixel coordinate in the set of boundary pixels changes relative to the y-values of the preceding and following boundary pixels, and to set a plurality of feature points, the feature points including a plurality of finger-valley points, fingertip points, and target points; configuring the processor to obtain a plurality of sub-images according to the feature points, including the following substeps: configuring the processor to calculate, for each fingertip point, a corresponding midpoint according to the fingertip point and its two corresponding finger-valley points, or the fingertip point together with a corresponding target point and finger-valley point, then obtain five interval points on the line connecting the midpoint and the corresponding fingertip point, each interval point yielding two corresponding crossing points on the set of boundary pixels, the crossing points together with the fingertip point, the midpoint, and the finger-valley points, or the finger-valley point and the target point, yielding a corresponding first sub-image; configuring the processor to form a first set line segment from two target points, form a square having the first set line segment as one side so as to obtain a specific point, and then obtain a second sub-image according to the specific point, the target points, and the corresponding finger-valley points; configuring the processor to normalize the brightness of the sub-images; and configuring the processor to convert each sub-image into a corresponding feature vector; and configuring the processor to obtain and store, according to the feature vectors, a biometric representative model corresponding to the user.

2. The palm biometric method according to claim 1, wherein, in the step of detecting the rotation angle of the palm to be tested, when the y-coordinate of the reference point is greater than the y-coordinates of the intersection points, the direction of the palm to be tested is determined to be with the fingertips pointing in the −y direction, and when the y-coordinate of the reference point is smaller than the y-coordinates of the intersection points, the direction of the palm to be tested is with the fingertips pointing in the +y direction.

3. The palm biometric method according to claim 1, wherein, in the step of detecting the rotation angle of the palm to be tested, the first reference line segment is a horizontal line segment having eight intersection points with the palm image to be tested.

4. The palm biometric method according to claim 1, wherein the step of searching for all feature points of the palm to be tested further comprises: configuring the processor to set as a target point a boundary pixel in the set of boundary pixels whose distance equals the distance between a fingertip point and the corresponding finger-valley point.
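The rotation-angle substep of claim 1 — a slope from each marker point to its corresponding midpoint, then an angle from the average of the two slopes — can be sketched as follows (the point coordinates are hypothetical, and the patent does not fix a formula beyond the average slope, so atan of that slope is used here):

```python
import math

def rotation_angle(marker1, midpoint1, marker2, midpoint2):
    """Estimate the palm rotation angle (in degrees) from two
    marker-point / midpoint pairs.

    Each slope is taken along the line from a marker point to its
    corresponding midpoint; the rotation angle is derived from the
    average of the two slopes.  Points are (x, y) tuples; vertical
    lines are left unhandled in this sketch.
    """
    slope1 = (midpoint1[1] - marker1[1]) / (midpoint1[0] - marker1[0])
    slope2 = (midpoint2[1] - marker2[1]) / (midpoint2[0] - marker2[0])
    average_slope = (slope1 + slope2) / 2.0
    return math.degrees(math.atan(average_slope))
```

For example, two pairs both lying on 45° lines give an angle of 45°, while pairs on horizontal lines give 0°, i.e. an unrotated palm.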
TW100103407A 2011-01-28 2011-01-28 Palm biometric method TWI531985B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW100103407A TWI531985B (en) 2011-01-28 2011-01-28 Palm biometric method


Publications (2)

Publication Number Publication Date
TW201232428A TW201232428A (en) 2012-08-01
TWI531985B true TWI531985B (en) 2016-05-01

Family

ID=47069601


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI658411B (en) * 2018-02-26 2019-05-01 關鍵禾芯科技股份有限公司 Non-directional finger palm print recognition method and non-directional finger palm print data establishment method
CN110765857A (en) * 2019-09-12 2020-02-07 敦泰电子(深圳)有限公司 Fingerprint identification method, chip and electronic device



Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees