CN108280428B - Composite finger and palm print verification system - Google Patents

Composite finger and palm print verification system

Info

Publication number
CN108280428B
Authority
CN
China
Prior art keywords
finger
image
palm
point
hand
Prior art date
Legal status
Expired - Fee Related
Application number
CN201810069348.4A
Other languages
Chinese (zh)
Other versions
CN108280428A (en)
Inventor
Chen Bo (陈波)
Current Assignee
Shenzhen Wanbo Intelligent Control Technology Co., Ltd
Original Assignee
Shenzhen Wanbo Intelligent Control Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Wanbo Intelligent Control Technology Co Ltd filed Critical Shenzhen Wanbo Intelligent Control Technology Co Ltd
Priority to CN201810069348.4A
Publication of CN108280428A
Application granted
Publication of CN108280428B
Legal status: Expired - Fee Related

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1347 Preprocessing; Feature extraction
    • G06V40/13 Sensors therefor
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/30 Noise filtering

Abstract

In order to improve the identifiability of palm print collection, the invention provides a composite fingerprint and palm print verification system comprising a first correction processing unit, a first intermediate image acquisition unit, a second correction processing unit, a second intermediate image acquisition unit, and a palm print image acquisition unit.

Description

Composite finger and palm print verification system
Technical Field
The invention belongs to the field of biological characteristic collection, and particularly relates to a composite fingerprint and palm print verification system.
Background
With the increasing diversification of integrated-circuit functions, devices that were once specialty products, or simply unavailable, have become increasingly common; the palm print scanner, like the fingerprint scanner, is one example. Palm print recognition systems, once scarce or even unheard of in the traditional consumer market, have become increasingly popular among ordinary users concerned with access control and identification thanks to the advent of integrated-circuit palm print scanners, and their field of application is no longer limited to government and security agencies. These devices are used to ensure that only authorized users can access a computer system or database, and their size has been reduced to fit into a portable computer.
Against the background of increasingly mature network technology, networked applications of the palm print scanner have emerged. However, owing to network transmission and to the fact that remote fingerprint scanner users are not guided by professional personnel, the collected images are often of poor quality and excessively noisy.
Disclosure of Invention
In order to improve the identifiability of online fingerprint acquisition, the invention provides a composite fingerprint and palm print verification system comprising a finger and palm print remote video information acquisition subsystem and a finger and palm print verification subsystem, wherein the acquisition subsystem is used for acquiring the fingerprints and palm prints of a person to be identified from video frames, and the verification subsystem is used for verifying the person's identity according to the fingerprints and palm prints.
Further, the fingerprint and palm print remote video information acquisition subsystem comprises:
the first correction processing unit is used for carrying out first gray correction processing, with an initial gray level, on the collected first-frame hand image to obtain a finger image, the hand image comprising a palm image and a finger image corresponding respectively to the left hand and the right hand;
the first intermediate image acquisition unit is used for acquiring a first intermediate image representing a fingerprint area corresponding to each finger by using the physiological characteristics of the finger lines;
the second correction processing unit is used for carrying out second gray correction processing, with a second gray level, on the collected second-frame hand image to obtain a finger image, the hand image comprising a palm image and a finger image corresponding respectively to the left hand and the right hand;
the second intermediate image acquisition unit is used for acquiring a second intermediate image representing a fingerprint area corresponding to each finger by using the physiological characteristics of the finger lines;
and the palm print image acquisition unit is used for carrying out noise reduction processing on the first intermediate image and the second intermediate image to obtain a noise-reduced palm print image.
Further, the first correction processing unit includes:
a first average gray calculating unit, used for calculating the average gray value of the left-hand and right-hand palm images from the first frame image: A_grey = (A_left_palm + A_right_palm) / 2;
a first finger specifying subunit, used for designating, in the palm image and according to finger shape and length, the image of the shortest finger as the image corresponding to the thumb, the image of the second-shortest finger as the image corresponding to the little finger, the finger image adjacent to the thumb as the image corresponding to the index finger, the finger image adjacent to the little finger as the image corresponding to the ring finger, and the remaining finger image as the image corresponding to the middle finger;
a first finger-root fork point determining subunit, used for determining the positions in the palm image where adjacent fingers fork at their roots as the finger-root fork points;
a first segmentation subunit, used for processing the left-hand image as follows: connecting all the finger-root fork points; extending the line through the fork point between the middle finger and the ring finger and the fork point between the ring finger and the little finger until it meets the palm contour on the little-finger side, and taking that intersection as the first point; extending the line through the fork point between the middle finger and the index finger and the fork point between the index finger and the thumb until it meets the palm contour on the thumb side, and taking that intersection as the second point; and removing the left-hand palm image from the left-hand image according to the finger-root fork points, the first point and the second point, to obtain a left-hand finger image;
a second segmentation subunit, used for processing the right-hand image as follows: connecting all the finger-root fork points; extending the line through the fork point between the middle finger and the ring finger and the fork point between the ring finger and the little finger until it meets the palm contour on the little-finger side, and taking that intersection as the third point; extending the line through the fork point between the middle finger and the index finger and the fork point between the index finger and the thumb until it meets the palm contour on the thumb side, and taking that intersection as the fourth point; and removing the right-hand palm image from the right-hand image according to the finger-root fork points, the third point and the fourth point, to obtain a right-hand finger image;
a first gray average calculating subunit, used for calculating the gray mean of the neighborhood pixels centered on each of the first, second, third and fourth points with radius r, and forming a 1 × 4 matrix M from the 4 gray means;
a first finger gray calculating subunit, used for taking the tip regions of the left-hand image as the images corresponding to the fingertips of the left hand and the tip regions of the right-hand image as the images corresponding to the fingertips of the right hand, calculating the gray mean of the neighborhood centered on each fingertip position with radius r, and forming a 10 × 1 matrix N from the 10 gray means;
a first matrix calculating subunit, used for calculating the eigenvalue A' and the eigenvector a of the matrix obtained from N × M;
a first coordinate system establishing subunit, used for establishing a plane rectangular coordinate system of the left-hand finger image with the first point as the origin, the coordinates of the second point being (d_a, d_b), and establishing a plane rectangular coordinate system of the right-hand finger image with the fourth point as the origin, the coordinates of the third point being (d'_a, d'_b);
a first cross correction coefficient calculating subunit, used for calculating, for each pixel in the left-hand and right-hand finger images, the cross correction coefficient α = A' × (1 - A_grey) × (1 - x × e^(d'_a/d_a)) / (1 - y × e^(d_b/d'_b)), thereby obtaining gray-corrected left-hand and right-hand finger images, where x and y are the abscissa and ordinate of the pixel in the respective left-hand or right-hand coordinate system.
Further, the first intermediate image acquisition unit includes:
for a given finger, searching along the direction from the fingertip toward the joint between the finger and the palm, among the creases running perpendicular to that finger's direction of extension, for the thickest crease, and taking that crease as the boundary; the region from the boundary to the corresponding fingertip is taken as the first intermediate image of that finger.
Further, the second correction processing unit includes:
a background light gray adjusting unit, used for adjusting the background light gray level of the collected hand image to A_grey / 2;
a second average gray calculating unit, used for calculating the average gray value of the left-hand and right-hand palm images from the second frame image: A_grey2 = (A_left_palm2 + A_right_palm2) / 2, where A_left_palm2 and A_right_palm2 are respectively the gray values of the left-hand palm image and the right-hand palm image in the second frame image;
a second finger specifying subunit, used for designating, in the palm image and according to finger shape and length, the image of the shortest finger as the image corresponding to the thumb, the image of the second-shortest finger as the image corresponding to the little finger, the finger image adjacent to the thumb as the image corresponding to the index finger, the finger image adjacent to the little finger as the image corresponding to the ring finger, and the remaining finger image as the image corresponding to the middle finger;
a second finger-root fork point determining subunit, used for determining the positions in the palm image where adjacent fingers fork at their roots as the finger-root fork points;
a third segmentation subunit, used for processing the left-hand image as follows: connecting all the finger-root fork points; extending the line through the fork point between the middle finger and the ring finger and the fork point between the ring finger and the little finger until it meets the palm contour on the little-finger side, and taking that intersection as the first point; extending the line through the fork point between the middle finger and the index finger and the fork point between the index finger and the thumb until it meets the palm contour on the thumb side, and taking that intersection as the second point; and removing the left-hand palm image from the left-hand image according to the finger-root fork points, the first point and the second point, to obtain a left-hand finger image;
a fourth segmentation subunit, used for processing the right-hand image as follows: connecting all the finger-root fork points; extending the line through the fork point between the middle finger and the ring finger and the fork point between the ring finger and the little finger until it meets the palm contour on the little-finger side, and taking that intersection as the third point; extending the line through the fork point between the middle finger and the index finger and the fork point between the index finger and the thumb until it meets the palm contour on the thumb side, and taking that intersection as the fourth point; and removing the right-hand palm image from the right-hand image according to the finger-root fork points, the third point and the fourth point, to obtain a right-hand finger image;
a second gray average calculating subunit, used for calculating the gray mean of the neighborhood pixels centered on each of the first, second, third and fourth points with radius R, and forming a 1 × 4 matrix M from the 4 gray means;
a second finger gray calculating subunit, used for taking the tip regions of the left-hand image as the images corresponding to the fingertips of the left hand and the tip regions of the right-hand image as the images corresponding to the fingertips of the right hand, calculating the gray mean of the neighborhood centered on each fingertip position with radius R, and forming a 10 × 1 matrix N from the 10 gray means;
a second matrix calculating subunit, used for calculating the eigenvalue A'' and the eigenvector b of the matrix obtained from N × M;
a second coordinate system establishing subunit, used for establishing a plane rectangular coordinate system of the left-hand finger image with the second point as the origin, the coordinates of the first point being (f_a, f_b), and establishing a plane rectangular coordinate system of the right-hand finger image with the third point as the origin, the coordinates of the fourth point being (f'_a, f'_b);
a second cross correction coefficient calculating subunit, used for calculating, for each pixel in the left-hand and right-hand finger images, the cross correction coefficient α = A'' × (1 - A_grey2) × (1 - x × lg(f'_a/f_a)) / (1 - y × lg(f_b/f'_b)), thereby obtaining gray-corrected left-hand and right-hand finger images, where x and y are the abscissa and ordinate of the pixel in the respective left-hand or right-hand coordinate system.
Further, the second intermediate image acquisition unit includes:
for a given finger, searching along the direction from the fingertip toward the joint between the finger and the palm, among the creases running perpendicular to that finger's direction of extension, for the finest crease, and taking that crease as the boundary; the region from the boundary to the corresponding fingertip is taken as the second intermediate image of that finger.
Further, the palm print image acquiring unit includes:
a first area calculating subunit, used for calculating the left-hand palm area B1 and the right-hand palm area B2 in the first frame, the palm-finger boundary of each hand being taken as the lines connecting its finger-root fork points;
a second area calculating subunit, used for calculating the left-hand palm area B'1 and the right-hand palm area B'2 in the second frame, the palm-finger boundary of each hand likewise being taken as the lines connecting its finger-root fork points;
a finger area calculating subunit, used for accumulating, over the fingers of the left and right hands, the areas of the first intermediate images and of the second intermediate images respectively, to obtain the area sum C1 of the first intermediate images and the area sum C2 of the second intermediate images;
a reference matrix calculating subunit, used for calculating the product of the eigenvector a and the eigenvector b to obtain a matrix E;
a palm area factor calculating subunit, used for calculating the palm area factors of the left hand and the right hand respectively: p_left_palm = ln(r × (B1 / (2 × B'1))), p_right_palm = ln(R × (B2 / (2 × B'2)));
An image noise filtering subunit, configured to filter the palm image, where a filtering factor β is:
(the formula for the filtering factor β is given as an image in the original publication)
and for performing exponential filtering, using an image noise filter whose filtering parameter is the filtering factor β, on the intermediate images corresponding to each finger of the first-frame and second-frame hand images.
Furthermore, the value of r ranges from 0.02 to 0.1, and the value of R ranges from 0.08 to 0.3.
Further, R is 3 times r.
Further, the initial gray scale is RGB (255, 255, 255).
The technical scheme of the invention has the following advantages:
By actively changing the background gray level during online fingerprint acquisition and creatively applying a gray-level balancing technique to the fingers and palms of the left and right hands, the similarity between fingers and palms is exploited to reduce the influence of frame jitter and of the network-transmission signal-to-noise ratio on the quality of the collected fingerprint images. In tests, the recognition rate improved by more than 40% over prior online acquisition techniques.
Drawings
Fig. 1 shows a block diagram of the components of the composite fingerprint and palm print verification system according to the present invention.
Fig. 2 shows a block diagram of the components of the finger-palm print remote video information acquisition subsystem according to the present invention.
Detailed Description
As shown in fig. 1, the composite fingerprint and palm print verification system of the present invention comprises a finger and palm print remote video information acquisition subsystem and a finger and palm print verification subsystem, wherein the acquisition subsystem is used for acquiring the fingerprints and palm prints of a person to be identified from video frames, and the verification subsystem is used for verifying the person's identity according to the fingerprints and palm prints.
Preferably, as shown in fig. 2, the fingerprint and palm print remote video information acquisition subsystem includes: the first correction processing unit, used for carrying out first gray correction processing, with an initial gray level, on the collected first-frame hand image to obtain a finger image, the hand image comprising a palm image and a finger image corresponding respectively to the left hand and the right hand;
the first intermediate image acquisition unit is used for acquiring a first intermediate image representing a fingerprint area corresponding to each finger by using the physiological characteristics of the finger lines;
the second correction processing unit is used for carrying out second gray correction processing, with a second gray level, on the collected second-frame hand image to obtain a finger image, the hand image comprising a palm image and a finger image corresponding respectively to the left hand and the right hand;
the second intermediate image acquisition unit is used for acquiring a second intermediate image representing a fingerprint area corresponding to each finger by using the physiological characteristics of the finger lines;
and the palm print image acquisition unit is used for carrying out noise reduction processing on the first intermediate image and the second intermediate image to obtain a noise-reduced palm print image.
Preferably, the first correction processing unit includes:
a first average gray calculating unit, used for calculating the average gray value of the left-hand and right-hand palm images from the first frame image: A_grey = (A_left_palm + A_right_palm) / 2;
a first finger specifying subunit, used for designating, in the palm image and according to finger shape and length, the image of the shortest finger as the image corresponding to the thumb, the image of the second-shortest finger as the image corresponding to the little finger, the finger image adjacent to the thumb as the image corresponding to the index finger, the finger image adjacent to the little finger as the image corresponding to the ring finger, and the remaining finger image as the image corresponding to the middle finger;
a first finger-root fork point determining subunit, used for determining the positions in the palm image where adjacent fingers fork at their roots as the finger-root fork points;
a first segmentation subunit, used for processing the left-hand image as follows: connecting all the finger-root fork points; extending the line through the fork point between the middle finger and the ring finger and the fork point between the ring finger and the little finger until it meets the palm contour on the little-finger side, and taking that intersection as the first point; extending the line through the fork point between the middle finger and the index finger and the fork point between the index finger and the thumb until it meets the palm contour on the thumb side, and taking that intersection as the second point; and removing the left-hand palm image from the left-hand image according to the finger-root fork points, the first point and the second point, to obtain a left-hand finger image;
a second segmentation subunit, used for processing the right-hand image as follows: connecting all the finger-root fork points; extending the line through the fork point between the middle finger and the ring finger and the fork point between the ring finger and the little finger until it meets the palm contour on the little-finger side, and taking that intersection as the third point; extending the line through the fork point between the middle finger and the index finger and the fork point between the index finger and the thumb until it meets the palm contour on the thumb side, and taking that intersection as the fourth point; and removing the right-hand palm image from the right-hand image according to the finger-root fork points, the third point and the fourth point, to obtain a right-hand finger image;
a first gray average calculating subunit, used for calculating the gray mean of the neighborhood pixels centered on each of the first, second, third and fourth points with radius r, and forming a 1 × 4 matrix M from the 4 gray means;
a first finger gray calculating subunit, used for taking the tip regions of the left-hand image as the images corresponding to the fingertips of the left hand and the tip regions of the right-hand image as the images corresponding to the fingertips of the right hand, calculating the gray mean of the neighborhood centered on each fingertip position with radius r, and forming a 10 × 1 matrix N from the 10 gray means;
a first matrix calculating subunit, used for calculating the eigenvalue A' and the eigenvector a of the matrix obtained from N × M;
a first coordinate system establishing subunit, used for establishing a plane rectangular coordinate system of the left-hand finger image with the first point as the origin, the coordinates of the second point being (d_a, d_b), and establishing a plane rectangular coordinate system of the right-hand finger image with the fourth point as the origin, the coordinates of the third point being (d'_a, d'_b);
a first cross correction coefficient calculating subunit, used for calculating, for each pixel in the left-hand and right-hand finger images, the cross correction coefficient α = A' × (1 - A_grey) × (1 - x × e^(d'_a/d_a)) / (1 - y × e^(d_b/d'_b)), thereby obtaining gray-corrected left-hand and right-hand finger images, where x and y are the abscissa and ordinate of the pixel in the respective left-hand or right-hand coordinate system.
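The finger-root fork points and the first to fourth points described above are geometric quantities; the patent states what they are but not how to locate them. The sketch below shows one possible way, using convexity defects of the hand contour for the fork points and a simple ray walk for the contour intersections. The techniques, function names and thresholds are illustrative assumptions, not the patented method.

```python
import cv2
import numpy as np

def finger_root_fork_points(hand_mask, min_depth_px=20.0):
    """Candidate finger-root fork points, taken as the deepest points of the
    convexity defects of the largest contour in a binary hand mask.
    This is an illustrative heuristic, not the patent's prescribed method."""
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    contour = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(contour, returnPoints=False)
    defects = cv2.convexityDefects(contour, hull)
    points = []
    if defects is not None:
        for start_i, end_i, far_i, depth in defects[:, 0]:
            if depth / 256.0 > min_depth_px:          # depth is in 1/256-pixel units
                points.append(tuple(contour[far_i][0]))  # valley between two fingers
    return points

def extend_to_contour(fork_a, fork_b, contour_pts, step=1.0, max_steps=5000):
    """Walk from fork_b along the direction fork_a -> fork_b and return the first
    contour point reached; contour_pts is an (N, 2) array of the palm contour.
    A simple sketch of 'intersection of the extended connecting line with the
    palm contour'."""
    a = np.asarray(fork_a, dtype=float)
    b = np.asarray(fork_b, dtype=float)
    direction = (b - a) / np.linalg.norm(b - a)
    pos = b.copy()
    for _ in range(max_steps):
        pos += step * direction
        dist = np.linalg.norm(contour_pts - pos, axis=1)
        if dist.min() <= step:                         # ray has reached the contour
            return tuple(contour_pts[dist.argmin()])
    return None

# e.g. first point of the left hand:
# first_point = extend_to_contour(fork_middle_ring, fork_ring_little, palm_contour)
```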
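The gray statistics and the correction coefficient of the first correction processing unit can be sketched as follows. All names are illustrative; reading the flattened published formula as α = A' × (1 - A_grey) × (1 - x × e^(d'_a/d_a)) / (1 - y × e^(d_b/d'_b)), treating (x, y) as the pixel's column/row index in the hand's own coordinate system, and replacing the "eigenvalue/eigenvector" of the non-square 10 × 4 product N × M with its leading singular value and vector are all assumptions, since the text does not fix these details.

```python
import numpy as np

def neighborhood_mean(gray, center, radius):
    """Mean gray value over the disc of the given radius centred on center=(x, y)
    in a single-channel image."""
    h, w = gray.shape
    yy, xx = np.ogrid[:h, :w]
    mask = (xx - center[0]) ** 2 + (yy - center[1]) ** 2 <= radius ** 2
    return float(gray[mask].mean())

def build_M_and_N(gray, corner_points, fingertip_points, r):
    """corner_points: the first to fourth points (4 entries);
    fingertip_points: the ten fingertip positions of both hands."""
    M = np.array([[neighborhood_mean(gray, p, r) for p in corner_points]])     # 1 x 4
    N = np.array([[neighborhood_mean(gray, p, r)] for p in fingertip_points])  # 10 x 1
    return M, N

def dominant_value_and_vector(N, M):
    """Return a scalar A' and a 10-element vector a for the product N @ M (10 x 4).
    The leading singular value/vector is substituted here for the patent's
    'eigenvalue'/'eigenvector', which is an assumption, not the patented step."""
    U, s, _ = np.linalg.svd(N @ M, full_matrices=False)
    return s[0], U[:, 0]

def first_correction_map(shape, A_prime, A_grey, d_a, d_b, d_a_p, d_b_p):
    """alpha = A' * (1 - A_grey) * (1 - x * e^(d'_a/d_a)) / (1 - y * e^(d_b/d'_b)),
    evaluated over an image of the given (height, width)."""
    h, w = shape
    yy, xx = np.mgrid[:h, :w].astype(float)
    num = 1.0 - xx * np.exp(d_a_p / d_a)
    den = 1.0 - yy * np.exp(d_b / d_b_p)
    return A_prime * (1.0 - A_grey) * num / np.where(den == 0.0, np.nan, den)

# how the map might be applied (an assumption; the patent only says the corrected
# finger images are "obtained" from alpha):
# corrected = first_correction_map(img.shape, A_p, A_grey, da, db, dap, dbp) * img
```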
Preferably, the first intermediate image acquisition unit includes:
for a given finger, searching along the direction from the fingertip toward the joint between the finger and the palm, among the creases running perpendicular to that finger's direction of extension, for the thickest crease, and taking that crease as the boundary; the region from the boundary to the corresponding fingertip is taken as the first intermediate image of that finger.
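A minimal sketch of the crease search; the darkness-based row score is an assumed stand-in for the patent's "thickness of the lines perpendicular to the finger", and the same helper also serves the second intermediate image acquisition unit described later by picking the thinnest crease instead.

```python
import numpy as np

def fingerprint_region(finger_gray, pick="thickest"):
    """finger_gray: grayscale crop of one finger, oriented with the fingertip at
    row 0 and the finger-palm joint at the last row.  Each row is scored by its
    mean darkness (a heuristic proxy for crease thickness); pick='thickest'
    yields the first intermediate image, pick='thinnest' the second."""
    row_score = finger_gray.mean(axis=1).max() - finger_gray.mean(axis=1)
    idx = int(np.argmax(row_score) if pick == "thickest" else np.argmin(row_score))
    return finger_gray[:idx + 1]    # region from the boundary crease up to the fingertip
```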
Preferably, the second correction processing unit includes:
a background light gray adjusting unit, used for adjusting the background light gray level of the collected hand image to A_grey / 2;
a second average gray calculating unit, used for calculating the average gray value of the left-hand and right-hand palm images from the second frame image: A_grey2 = (A_left_palm2 + A_right_palm2) / 2, where A_left_palm2 and A_right_palm2 are respectively the gray values of the left-hand palm image and the right-hand palm image in the second frame image;
a second finger specifying subunit, used for designating, in the palm image and according to finger shape and length, the image of the shortest finger as the image corresponding to the thumb, the image of the second-shortest finger as the image corresponding to the little finger, the finger image adjacent to the thumb as the image corresponding to the index finger, the finger image adjacent to the little finger as the image corresponding to the ring finger, and the remaining finger image as the image corresponding to the middle finger;
a second finger-root fork point determining subunit, used for determining the positions in the palm image where adjacent fingers fork at their roots as the finger-root fork points;
a third segmentation subunit, used for processing the left-hand image as follows: connecting all the finger-root fork points; extending the line through the fork point between the middle finger and the ring finger and the fork point between the ring finger and the little finger until it meets the palm contour on the little-finger side, and taking that intersection as the first point; extending the line through the fork point between the middle finger and the index finger and the fork point between the index finger and the thumb until it meets the palm contour on the thumb side, and taking that intersection as the second point; and removing the left-hand palm image from the left-hand image according to the finger-root fork points, the first point and the second point, to obtain a left-hand finger image;
a fourth segmentation subunit, used for processing the right-hand image as follows: connecting all the finger-root fork points; extending the line through the fork point between the middle finger and the ring finger and the fork point between the ring finger and the little finger until it meets the palm contour on the little-finger side, and taking that intersection as the third point; extending the line through the fork point between the middle finger and the index finger and the fork point between the index finger and the thumb until it meets the palm contour on the thumb side, and taking that intersection as the fourth point; and removing the right-hand palm image from the right-hand image according to the finger-root fork points, the third point and the fourth point, to obtain a right-hand finger image;
a second gray average calculating subunit, used for calculating the gray mean of the neighborhood pixels centered on each of the first, second, third and fourth points with radius R, and forming a 1 × 4 matrix M from the 4 gray means;
a second finger gray calculating subunit, used for taking the tip regions of the left-hand image as the images corresponding to the fingertips of the left hand and the tip regions of the right-hand image as the images corresponding to the fingertips of the right hand, calculating the gray mean of the neighborhood centered on each fingertip position with radius R, and forming a 10 × 1 matrix N from the 10 gray means;
a second matrix calculating subunit, used for calculating the eigenvalue A'' and the eigenvector b of the matrix obtained from N × M;
a second coordinate system establishing subunit, used for establishing a plane rectangular coordinate system of the left-hand finger image with the second point as the origin, the coordinates of the first point being (f_a, f_b), and establishing a plane rectangular coordinate system of the right-hand finger image with the third point as the origin, the coordinates of the fourth point being (f'_a, f'_b);
a second cross correction coefficient calculating subunit, used for calculating, for each pixel in the left-hand and right-hand finger images, the cross correction coefficient α = A'' × (1 - A_grey2) × (1 - x × lg(f'_a/f_a)) / (1 - y × lg(f_b/f'_b)), thereby obtaining gray-corrected left-hand and right-hand finger images, where x and y are the abscissa and ordinate of the pixel in the respective left-hand or right-hand coordinate system.
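The second-frame coefficient differs from the first-frame sketch above only in its constants and in using a base-10 logarithm in place of the exponential; the grouping lg(f'_a/f_a) is again a reading of the flattened published formula and is an assumption.

```python
import numpy as np

def second_correction_map(shape, A_dprime, A_grey2, f_a, f_b, f_a_p, f_b_p):
    """alpha = A'' * (1 - A_grey2) * (1 - x * lg(f'_a/f_a)) / (1 - y * lg(f_b/f'_b)),
    with lg read as the base-10 logarithm."""
    h, w = shape
    yy, xx = np.mgrid[:h, :w].astype(float)
    num = 1.0 - xx * np.log10(f_a_p / f_a)
    den = 1.0 - yy * np.log10(f_b / f_b_p)
    return A_dprime * (1.0 - A_grey2) * num / np.where(den == 0.0, np.nan, den)
```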
Preferably, the second intermediate image acquiring unit includes:
for a given finger, searching along the direction from the fingertip toward the joint between the finger and the palm, among the creases running perpendicular to that finger's direction of extension, for the finest crease, and taking that crease as the boundary; the region from the boundary to the corresponding fingertip is taken as the second intermediate image of that finger.
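Using the helper sketched earlier, the second intermediate image would be obtained by picking the thinnest crease instead of the thickest:

```python
# assuming fingerprint_region() from the earlier sketch
second_intermediate = fingerprint_region(finger_gray, pick="thinnest")
```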
Preferably, the palm print image acquiring unit includes:
a first area calculating subunit, used for calculating the left-hand palm area B1 and the right-hand palm area B2 in the first frame, the palm-finger boundary of each hand being taken as the lines connecting its finger-root fork points;
a second area calculating subunit, used for calculating the left-hand palm area B'1 and the right-hand palm area B'2 in the second frame, the palm-finger boundary of each hand likewise being taken as the lines connecting its finger-root fork points;
a finger area calculating subunit, used for accumulating, over the fingers of the left and right hands, the areas of the first intermediate images and of the second intermediate images respectively, to obtain the area sum C1 of the first intermediate images and the area sum C2 of the second intermediate images;
a reference matrix calculating subunit, used for calculating the product of the eigenvector a and the eigenvector b to obtain a matrix E;
a palm area factor calculating subunit, used for calculating the palm area factors of the left hand and the right hand respectively: p_left_palm = ln(r × (B1 / (2 × B'1))), p_right_palm = ln(R × (B2 / (2 × B'2)));
An image noise filtering subunit, configured to filter the palm image, where a filtering factor β is:
(the formula for the filtering factor β is given as an image in the original publication)
and for performing exponential filtering, using an image noise filter whose filtering parameter is the filtering factor β, on the intermediate images corresponding to each finger of the first-frame and second-frame hand images.
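Matrix E and the palm-area factors are straightforward to transcribe; reading the product of eigenvector a and eigenvector b as an outer product of two 10-element vectors is an assumption, since the text does not state the type of product.

```python
import numpy as np

def reference_matrix(a, b):
    """E = a x b, read here as the outer product of two 10-element vectors (10 x 10)."""
    return np.outer(np.asarray(a, dtype=float), np.asarray(b, dtype=float))

def palm_area_factors(B1, B2, B1_p, B2_p, r, R):
    """p_left_palm  = ln(r * (B1 / (2 * B'1)))
       p_right_palm = ln(R * (B2 / (2 * B'2)))
    B1/B2: first-frame palm areas; B1_p/B2_p: second-frame palm areas."""
    return np.log(r * B1 / (2.0 * B1_p)), np.log(R * B2 / (2.0 * B2_p))
```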
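The filtering factor β is published only as an image, so it is taken as an input below; the filter itself is sketched as a first-order exponential blend of the per-finger intermediate images from the two frames, which is one common reading of "exponential filtering" and is an assumption, not the patent's exact filter.

```python
import numpy as np

def exponential_filter(intermediate_frame1, intermediate_frame2, beta):
    """Blend the intermediate images of one finger from the two frames with an
    exponential-smoothing weight beta in (0, 1)."""
    a = intermediate_frame1.astype(float)
    b = intermediate_frame2.astype(float)
    return beta * a + (1.0 - beta) * b
```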
Preferably, the value of r ranges from 0.02 to 0.1, and the value of R ranges from 0.08 to 0.3.
Preferably, R is 3 times r.
Preferably, the initial gray scale is RGB (255, 255, 255).
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (6)

1. A composite fingerprint and palm print verification system, characterized by comprising: a finger and palm print remote video information acquisition subsystem and a finger and palm print verification subsystem, wherein the finger and palm print remote video information acquisition subsystem is used for acquiring the fingerprints and palm prints of a person to be identified from video frames, and the finger and palm print verification subsystem is used for verifying the person's identity according to the fingerprints and palm prints;
the fingerprint and palm print remote video information acquisition subsystem comprises:
the first correction processing unit is used for carrying out first gray correction processing, with an initial gray level, on the collected first-frame hand image to obtain a finger image, the hand image comprising a palm image and a finger image corresponding respectively to the left hand and the right hand;
the first intermediate image acquisition unit is used for acquiring a first intermediate image representing a fingerprint area corresponding to each finger by using the physiological characteristics of the finger lines;
the second correction processing unit is used for carrying out second gray correction processing, with a second gray level, on the collected second-frame hand image to obtain a finger image, the second-frame hand image comprising a palm image and a finger image corresponding respectively to the left hand and the right hand;
the second intermediate image acquisition unit is used for acquiring a second intermediate image representing a fingerprint area corresponding to each finger by using the physiological characteristics of the finger lines;
the palm print image acquisition unit is used for carrying out noise reduction processing on the first intermediate image and the second intermediate image to obtain a noise-reduced palm print image;
characterized in that the first correction processing unit includes:
a first average gray calculating unit, used for calculating the average gray value of the left-hand and right-hand palm images from the first frame image: A_grey = (A_left_palm + A_right_palm) / 2;
a first finger specifying subunit, used for designating, in the palm image and according to finger shape and length, the image of the shortest finger as the image corresponding to the thumb, the image of the second-shortest finger as the image corresponding to the little finger, the finger image adjacent to the thumb as the image corresponding to the index finger, the finger image adjacent to the little finger as the image corresponding to the ring finger, and the remaining finger image as the image corresponding to the middle finger;
a first finger-root fork point determining subunit, used for determining the positions in the palm image where adjacent fingers fork at their roots as the finger-root fork points;
a first segmentation subunit, used for processing the left-hand image as follows: connecting all the finger-root fork points; extending the line through the fork point between the middle finger and the ring finger and the fork point between the ring finger and the little finger until it meets the palm contour on the little-finger side, and taking that intersection as the first point; extending the line through the fork point between the middle finger and the index finger and the fork point between the index finger and the thumb until it meets the palm contour on the thumb side, and taking that intersection as the second point; and removing the left-hand palm image from the left-hand image according to the finger-root fork points, the first point and the second point, to obtain a left-hand finger image;
a second segmentation subunit, used for processing the right-hand image as follows: connecting all the finger-root fork points; extending the line through the fork point between the middle finger and the ring finger and the fork point between the ring finger and the little finger until it meets the palm contour on the little-finger side, and taking that intersection as the third point; extending the line through the fork point between the middle finger and the index finger and the fork point between the index finger and the thumb until it meets the palm contour on the thumb side, and taking that intersection as the fourth point; and removing the right-hand palm image from the right-hand image according to the finger-root fork points, the third point and the fourth point, to obtain a right-hand finger image;
a first gray average calculating subunit, used for calculating the gray mean of the neighborhood pixels centered on each of the first, second, third and fourth points with radius r, and forming a 1 × 4 matrix M from the 4 gray means;
a first finger gray calculating subunit, used for taking the tip regions of the left-hand image as the images corresponding to the fingertips of the left hand and the tip regions of the right-hand image as the images corresponding to the fingertips of the right hand, calculating the gray mean of the neighborhood centered on each fingertip position with radius r, and forming a 10 × 1 matrix N from the 10 gray means;
a first matrix calculating subunit, used for calculating the eigenvalue A' and the eigenvector a of the matrix obtained from N × M;
a first coordinate system establishing subunit, used for establishing a plane rectangular coordinate system of the left-hand finger image with the first point as the origin, the coordinates of the second point being (d_a, d_b), and establishing a plane rectangular coordinate system of the right-hand finger image with the fourth point as the origin, the coordinates of the third point being (d'_a, d'_b);
a first cross correction coefficient calculating subunit, used for calculating, for each pixel in the left-hand finger image and the right-hand finger image, the cross correction coefficient α = A' × (1 - A_grey) × (1 - x × e^(d'_a/d_a)) / (1 - y × e^(d_b/d'_b)), thereby obtaining gray-corrected left-hand and right-hand finger images, where x and y are the abscissa and ordinate of the pixel in the respective left-hand or right-hand coordinate system;
the first intermediate image acquisition unit includes:
for a given finger, searching along the direction from the fingertip toward the joint between the finger and the palm, among the creases running perpendicular to that finger's direction of extension, for the thickest crease, and taking that crease as the boundary; the region from the boundary to the corresponding fingertip is taken as the first intermediate image of that finger;
the second correction processing unit includes:
a background light gray adjusting unit, used for adjusting the background light gray level of the collected hand image to A_grey / 2;
a second average gray calculating unit, used for calculating the average gray value of the left-hand and right-hand palm images from the second frame image: A_grey2 = (A_left_palm2 + A_right_palm2) / 2;
a second finger specifying subunit, used for designating, in the palm image and according to finger shape and length, the image of the shortest finger as the image corresponding to the thumb, the image of the second-shortest finger as the image corresponding to the little finger, the finger image adjacent to the thumb as the image corresponding to the index finger, the finger image adjacent to the little finger as the image corresponding to the ring finger, and the remaining finger image as the image corresponding to the middle finger;
a second finger-root fork point determining subunit, used for determining the positions in the palm image where adjacent fingers fork at their roots as the finger-root fork points;
a third segmentation subunit, used for processing the left-hand image as follows: connecting all the finger-root fork points; extending the line through the fork point between the middle finger and the ring finger and the fork point between the ring finger and the little finger until it meets the palm contour on the little-finger side, and taking that intersection as the first point; extending the line through the fork point between the middle finger and the index finger and the fork point between the index finger and the thumb until it meets the palm contour on the thumb side, and taking that intersection as the second point; and removing the left-hand palm image from the left-hand image according to the finger-root fork points, the first point and the second point, to obtain a left-hand finger image;
a fourth segmentation subunit, used for processing the right-hand image as follows: connecting all the finger-root fork points; extending the line through the fork point between the middle finger and the ring finger and the fork point between the ring finger and the little finger until it meets the palm contour on the little-finger side, and taking that intersection as the third point; extending the line through the fork point between the middle finger and the index finger and the fork point between the index finger and the thumb until it meets the palm contour on the thumb side, and taking that intersection as the fourth point; and removing the right-hand palm image from the right-hand image according to the finger-root fork points, the third point and the fourth point, to obtain a right-hand finger image;
a second gray average calculating subunit, used for calculating the gray mean of the neighborhood pixels centered on each of the first, second, third and fourth points with radius R, and forming a 1 × 4 matrix M from the 4 gray means;
a second finger gray calculating subunit, used for taking the tip regions of the left-hand image as the images corresponding to the fingertips of the left hand and the tip regions of the right-hand image as the images corresponding to the fingertips of the right hand, calculating the gray mean of the neighborhood centered on each fingertip position with radius R, and forming a 10 × 1 matrix N from the 10 gray means;
a second matrix calculating subunit, used for calculating the eigenvalue A'' and the eigenvector b of the matrix obtained from N × M;
a second coordinate system establishing subunit, used for establishing a plane rectangular coordinate system of the left-hand finger image with the second point as the origin, the coordinates of the first point being (f_a, f_b), and establishing a plane rectangular coordinate system of the right-hand finger image with the third point as the origin, the coordinates of the fourth point being (f'_a, f'_b);
a second cross correction coefficient calculating subunit, used for calculating, for each pixel in the left-hand and right-hand finger images, the cross correction coefficient α = A'' × (1 - A_grey2) × (1 - x × lg(f'_a/f_a)) / (1 - y × lg(f_b/f'_b)), thereby obtaining gray-corrected left-hand and right-hand finger images, where x and y are the abscissa and ordinate of the pixel in the respective left-hand or right-hand coordinate system.
2. The composite fingerprint and palm print verification system according to claim 1, wherein the second intermediate image acquisition unit comprises:
for a given finger, searching along the direction from the fingertip toward the joint between the finger and the palm, among the creases running perpendicular to that finger's direction of extension, for the finest crease, and taking that crease as the boundary; the region from the boundary to the corresponding fingertip is taken as the second intermediate image of that finger.
3. The composite fingerprint and palm print verification system according to claim 1, wherein the palm print image acquisition unit comprises:
a first area calculating subunit, used for calculating the left-hand palm area B1 and the right-hand palm area B2 in the first frame, the palm-finger boundary of each hand being taken as the lines connecting its finger-root fork points;
a second area calculating subunit, used for calculating the left-hand palm area B'1 and the right-hand palm area B'2 in the second frame, the palm-finger boundary of each hand likewise being taken as the lines connecting its finger-root fork points;
a finger area calculating subunit, used for accumulating, over the fingers of the left and right hands, the areas of the first intermediate images and of the second intermediate images respectively, to obtain the area sum C1 of the first intermediate images and the area sum C2 of the second intermediate images;
a reference matrix calculating subunit, used for calculating the product of the eigenvector a and the eigenvector b to obtain a matrix E;
a palm area factor calculating subunit, used for calculating the palm area factors of the left hand and the right hand respectively: p_left_palm = ln(r × (B1 / (2 × B'1))), p_right_palm = ln(R × (B2 / (2 × B'2)));
An image noise filtering subunit, configured to filter the palm image, where a filtering factor β is:
(the formula for the filtering factor β is given as an image in the original publication)
and for performing exponential filtering, using an image noise filter whose filtering parameter is the filtering factor β, on the intermediate images corresponding to each finger of the first-frame and second-frame hand images.
4. The composite fingerprint and palm print verification system of claim 1, wherein the value of r ranges from 0.02 to 0.1, and the value of R ranges from 0.08 to 0.3.
5. The composite fingerprint and palm print verification system of claim 1, wherein R is 3 times r.
6. The composite fingerprint and palm print verification system of claim 1, wherein said initial gray level is RGB (255, 255, 255).
CN201810069348.4A 2018-01-24 2018-01-24 Composite finger and palm print verification system Expired - Fee Related CN108280428B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810069348.4A CN108280428B (en) 2018-01-24 2018-01-24 Composite finger and palm print verification system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810069348.4A CN108280428B (en) 2018-01-24 2018-01-24 Composite finger and palm print verification system

Publications (2)

Publication Number Publication Date
CN108280428A CN108280428A (en) 2018-07-13
CN108280428B true CN108280428B (en) 2020-08-04

Family

ID=62804900

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810069348.4A Expired - Fee Related CN108280428B (en) 2018-01-24 2018-01-24 Composite finger and palm print verification system

Country Status (1)

Country Link
CN (1) CN108280428B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103761466A (en) * 2014-02-14 2014-04-30 上海云享科技有限公司 Method and device for identity authentication
CN103955674A (en) * 2014-04-30 2014-07-30 广东瑞德智能科技股份有限公司 Palm print image acquisition device and palm print image positioning and segmenting method
CN106650703A (en) * 2017-01-06 2017-05-10 厦门中控生物识别信息技术有限公司 Palm anti-counterfeiting method and apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9659205B2 (en) * 2014-06-09 2017-05-23 Lawrence Livermore National Security, Llc Multimodal imaging system and method for non-contact identification of multiple biometric traits


Also Published As

Publication number Publication date
CN108280428A (en) 2018-07-13

Similar Documents

Publication Publication Date Title
CN108230383B (en) Hand three-dimensional data determination method and device and electronic equipment
KR102596897B1 (en) Method of motion vector and feature vector based fake face detection and apparatus for the same
Wang et al. AIPNet: Image-to-image single image dehazing with atmospheric illumination prior
AU2015317344B2 (en) Mobility empowered biometric appliance a tool for real-time verification of identity through fingerprints
EP3153943A1 (en) Air gesture input method and apparatus
CN106529414A (en) Method for realizing result authentication through image comparison
JP2005202562A (en) Image processing method, image processor and image processing program
CN109871845B (en) Certificate image extraction method and terminal equipment
CN106056064A (en) Face recognition method and face recognition device
CN108875623B (en) Face recognition method based on image feature fusion contrast technology
WO2022135574A1 (en) Skin color detection method and apparatus, and mobile terminal and storage medium
WO2019210707A1 (en) Image sharpness evaluation method, device and electronic device
JP4901229B2 (en) Red-eye detection method, apparatus, and program
CN110458792A (en) Method and device for evaluating quality of face image
CN110728201A (en) Image processing method and device for fingerprint identification
CN105631285A (en) Biological feature identity recognition method and apparatus
CN108280428B (en) Composite finger and palm print verification system
WO2018121552A1 (en) Palmprint data based service processing method, apparatus and program, and medium
CN108256528B (en) Finger and palm print security system
CN108154141B (en) Biological parameter identification system using finger veins
CN108073916B (en) Palm print information collecting device
CN112565674A (en) Exhibition hall central control system capable of realizing remote video monitoring and control
CN109886320B (en) Human femoral X-ray intelligent recognition method and system
Liu et al. Illumination compensation and feedback of illumination feature in face detection
Fathy et al. Benchmarking of pre-processing methods employed in facial image analysis

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200630

Address after: 6 / F, Wanxun building, no.6, Keji Beisan Road, North District, hi tech Industrial Park, Xili street, Nanshan District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen Wanbo Intelligent Control Technology Co., Ltd

Address before: 610000, No. 20, No. 3, No. 10, 88, five road, hi tech Zone, Chengdu, Sichuan

Applicant before: SICHUAN ZHENG'ANTONG TECHNOLOGY Co.,Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200804

Termination date: 20210124

CF01 Termination of patent right due to non-payment of annual fee