CN111460915A - Light weight neural network-based finger vein verification method and system - Google Patents

Light weight neural network-based finger vein verification method and system

Info

Publication number
CN111460915A
CN111460915A CN202010174412.2A
Authority
CN
China
Prior art keywords
neural network
finger vein
light weight
image
random
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010174412.2A
Other languages
Chinese (zh)
Other versions
CN111460915B (en)
Inventor
胡永健
郑浩聪
王宇飞
刘琲贝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Sino Singapore International Joint Research Institute
Original Assignee
South China University of Technology SCUT
Sino Singapore International Joint Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT, Sino Singapore International Joint Research Institute filed Critical South China University of Technology SCUT
Priority to CN202010174412.2A priority Critical patent/CN111460915B/en
Publication of CN111460915A publication Critical patent/CN111460915A/en
Application granted granted Critical
Publication of CN111460915B publication Critical patent/CN111460915B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/14Vascular patterns
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Multimedia (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention discloses a finger vein verification method and system based on a lightweight neural network. The method comprises the following steps: extracting a region of interest of the finger vein image; constructing a lightweight neural network for finger vein feature extraction; randomly selecting several classes of fingers in each training batch, and randomly selecting several region-of-interest images from each class of finger to construct a batch of images; performing real-time data amplification on the batch of images and inputting them into the lightweight neural network; constructing a classification component for classifying finger vein features, and feeding the output features of the lightweight neural network into the classification component; constructing a batch hard loss function and a label-smoothing-regularized cross entropy loss function to obtain an overall loss function; and extracting features with the trained lightweight neural network, calculating the cosine similarity between features, and outputting the finger vein verification result. The method captures the fine-grained features of finger vein images more effectively while reducing the storage size and computational cost of the model.

Description

Light weight neural network-based finger vein verification method and system
Technical Field
The invention relates to the technical field of finger vein identification, in particular to a finger vein verification method and system based on a lightweight neural network.
Background
Finger vein recognition is a biometric authentication method that uses vein images beneath the surface of the finger skin for identification. It faces many challenges: captured finger vein images have low contrast and are easily affected by uneven illumination, temperature variation, 2D and 3D finger rotation, noise, shadow, and light fluctuation, so even carefully hand-designed finger vein image enhancement, feature description, and matching methods still have difficulty classifying low-quality finger vein images correctly.
In related research, Fan et al. use a two-channel network that combines the two images to be verified into a single two-channel image, and propose a mini-ROI (minimum region of interest) method that locates the most similar small ROI block in order to eliminate the influence of finger displacement, but extra time is spent extracting the mini-ROI. Hu et al. modify the VGGFace-Net network structure, use Large-Margin Softmax Loss as the loss function, extract features with spatial information through the network, and perform template matching on those features to handle misaligned finger vein images; however, the backbone network is large, the number of network weights is high, and considerable storage capacity and computing resources are required.
Although these methods improve finger vein verification performance to some extent, they are computationally time-consuming, their generalization ability is not well improved, and they are prone to overfitting, which reduces their practicality and application value.
Disclosure of Invention
In order to overcome the defects and shortcomings in the prior art, the invention provides a finger vein verification method and a finger vein verification system based on a lightweight neural network.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention provides a finger vein verification method based on a lightweight neural network, which comprises the following steps:
extracting a region of interest of the finger vein image;
constructing a light weight neural network for finger vein feature extraction;
randomly selecting a plurality of types of fingers in each training batch, and randomly selecting a plurality of images of the region of interest in each type of fingers to construct batch images;
performing real-time data amplification on the batch of images and inputting the batch of images into the lightweight neural network;
constructing a classification component for classifying finger vein features, and inputting output features of the light weight neural network into the classification component;
constructing a batch hard loss function L_BH, constructing a cross entropy loss function L_LS after label smoothing regularization, and obtaining the overall loss function L as:
L = L_BH + L_LS;
training a light weight neural network, updating the weight of the light weight neural network according to the loss value, and storing the weight of the current light weight neural network after training;
and extracting features with the weight-updated lightweight neural network, calculating the cosine similarity between features, and outputting the finger vein verification result.
As a preferred technical solution, the extracting of the region of interest of the finger vein image specifically includes:
detecting the upper and lower edges of the image with an extended Prewitt edge detection operator, the masks for detecting the upper and lower edges being denoted Mask_u and Mask_d respectively (the specific mask coefficients are given as equation images in the original publication);
and fitting the central lines of the upper edge and the lower edge by adopting a least square method, performing rotation correction on the finger vein image according to the deflection angle of the central line, finally extracting a circumscribed rectangle of the edge line after rotation as an image of the region of interest, and unifying the size of the image of the region of interest by adopting bilinear interpolation.
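For illustration, the following is a minimal Python sketch of this ROI pipeline using OpenCV and NumPy. It is a sketch under assumptions, not the patented implementation: the coefficients of the extended Prewitt masks Mask_u and Mask_d appear only as equation images in the original, so a plain 3×3 horizontal-edge Prewitt kernel is used as a stand-in, and the 128×64 output size follows the embodiment.

```python
import cv2
import numpy as np

def extract_roi(img, out_size=(128, 64)):
    """Sketch of the ROI pipeline: edge detection, midline fit by least
    squares, rotation correction, bounding crop, bilinear resize."""
    gray = img if img.ndim == 2 else cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape

    # Stand-in for the extended Prewitt masks Mask_u / Mask_d (their
    # coefficients appear only as images in the patent): a plain 3x3
    # horizontal-edge Prewitt kernel.
    prewitt_y = np.array([[-1, -1, -1],
                          [ 0,  0,  0],
                          [ 1,  1,  1]], dtype=np.float32)
    grad = cv2.filter2D(gray.astype(np.float32), -1, prewitt_y)

    # Upper finger edge in the top half, lower finger edge in the bottom half.
    upper = np.argmax(grad[: h // 2, :], axis=0)
    lower = h // 2 + np.argmin(grad[h // 2:, :], axis=0)

    # Least-squares fit of the finger midline; correct the deflection angle.
    mid = (upper + lower) / 2.0
    slope, _ = np.polyfit(np.arange(w), mid, 1)
    angle = np.degrees(np.arctan(slope))
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    rotated = cv2.warpAffine(gray, M, (w, h))

    # Crop between the detected edge extents (re-detection after rotation is
    # omitted in this sketch) and resize with bilinear interpolation.
    roi = rotated[max(int(upper.min()), 0): min(int(lower.max()), h), :]
    return cv2.resize(roi, out_size, interpolation=cv2.INTER_LINEAR)
```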
As a preferred technical solution, in constructing the lightweight neural network for finger vein feature extraction, the neural network ShuffleNetV2 with its first max-pooling layer removed is used as the lightweight neural network.
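A sketch of this backbone choice, assuming the torchvision implementation of ShuffleNetV2 (the patent only names the architecture): the first max-pooling layer and the ImageNet classification head are replaced by identity mappings so that the network outputs a pooled feature vector F. Newer torchvision releases use a weights= argument instead of pretrained=.

```python
import torch
import torch.nn as nn
import torchvision

class LightweightBackbone(nn.Module):
    """ShuffleNetV2 with its first max-pooling layer removed, used as a
    finger vein feature extractor (sketch based on torchvision)."""
    def __init__(self, pretrained=True):
        super().__init__()
        net = torchvision.models.shufflenet_v2_x1_0(pretrained=pretrained)
        net.maxpool = nn.Identity()  # keep a larger feature map for fine-grained veins
        net.fc = nn.Identity()       # drop the ImageNet classifier; output pooled features
        self.net = net

    def forward(self, x):
        return self.net(x)           # feature vector F (1024-d for the x1.0 variant)

backbone = LightweightBackbone(pretrained=True)
# Grayscale 128x64 ROIs can be replicated to 3 channels to match the stem.
features = backbone(torch.randn(4, 3, 64, 128))
print(features.shape)                # torch.Size([4, 1024])
```

With the stride-2 max-pooling layer removed, the feature maps entering the later stages are twice as large in each spatial dimension, which is the stated motivation for this modification.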
As a preferred technical solution, the batch of images is input to the lightweight neural network after real-time data amplification, and the real-time data amplification includes the steps of random brightness change, random cropping, random rotation, and random erasing;
the random brightness change step adjusts the brightness of the finger vein region-of-interest image by a random factor;
the random cropping step randomly crops a fixed-size cropping box within a specified range of the image;
the random rotation step rotates the image by a random angle within a set angle threshold, and the rotated images are unified in size by bilinear interpolation;
and the random erasing step places an erasing block at a random position of the image.
As a preferred technical solution, the batch hard loss function L_BH is constructed with the specific formula:

L_BH = Σ_{y=1..s} Σ_{a=1..k} [ m + max_{p=1..k} D(F_a^y, F_p^y) − min_{y'≠y, n=1..k} D(F_a^y, F_n^{y'}) ]_+

where F denotes the output features of the lightweight neural network, F_x^y denotes the feature of the x-th image of the y-th class of finger, s and k denote the numbers of finger classes and of images per class in the batch, the subscripts a, p, and n denote the anchor, positive, and negative sample points of the triplet respectively, m denotes the minimum margin between positive and negative sample pairs, D denotes a distance metric function, and [·]_+ = max(·, 0).
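A PyTorch sketch of this batch hard loss, assuming Euclidean distance for D and the hinge [·]_+, with the margin m = 0.3 used in the embodiment; the function and variable names are illustrative.

```python
import torch

def batch_hard_loss(features, labels, margin=0.3):
    """Batch hard triplet loss: for every anchor, use the farthest same-class
    feature and the nearest different-class feature in the batch (sketch)."""
    dist = torch.cdist(features, features, p=2)                 # D as Euclidean distance
    same = labels.unsqueeze(0) == labels.unsqueeze(1)           # (b, b) same-class mask
    eye = torch.eye(len(labels), dtype=torch.bool, device=features.device)

    # Hardest positive: maximum distance over same-class pairs (excluding self).
    hardest_pos = dist.masked_fill(~same | eye, float('-inf')).max(dim=1).values
    # Hardest negative: minimum distance over different-class pairs.
    hardest_neg = dist.masked_fill(same, float('inf')).min(dim=1).values

    # Hinge [.]_+; averaged over anchors (the formula above sums, which
    # differs only by a constant factor).
    return torch.relu(margin + hardest_pos - hardest_neg).mean()
```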
As a preferred technical solution, the label-smoothing-regularized cross entropy loss function L_LS is constructed with the specific formula:

L_LS = − Σ_{v=1}^{M} l_v · log(q_v)

where l denotes the label-smoothed true probability distribution, q denotes the predicted probability distribution, M denotes the total number of classes in the finger vein training set, l_v denotes the true probability of the v-th class, q_v denotes the predicted probability of the v-th class, r denotes the true label of the image, and ε denotes a small constant that spreads part of the probability mass of the true class r over the other classes.
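A sketch of this loss in PyTorch. The exact smoothed distribution l appears only as an equation image in the original; the common label-smoothing form (probability 1 − (M−1)ε/M for the true class r and ε/M for every other class) is assumed here, which is consistent with the role of ε described above.

```python
import torch
import torch.nn.functional as F

def label_smoothing_ce(logits, targets, epsilon=0.1):
    """Cross entropy against a label-smoothed target distribution (sketch).
    logits: (b, M) classifier outputs; targets: (b,) integer finger labels."""
    M = logits.size(1)
    log_q = F.log_softmax(logits, dim=1)                  # log of predicted distribution q
    # Assumed smoothed distribution l: eps/M for every class,
    # 1 - (M-1)*eps/M for the true class r.
    l = torch.full_like(log_q, epsilon / M)
    l.scatter_(1, targets.unsqueeze(1), 1.0 - (M - 1) * epsilon / M)
    return -(l * log_q).sum(dim=1).mean()                 # L_LS = -sum_v l_v * log(q_v)
```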
As a preferred technical solution, in training the lightweight neural network, the number of training epochs is set to e = 500, an Adam optimizer is used as the training optimizer, and the learning rate lr(t), where t denotes the iteration round, follows a gradual warmup strategy (the specific piecewise schedule is given as an equation image in the original publication).
As a preferred technical solution, at the initial stage of training, the lightweight neural network is initialized with network weights pre-trained on the ImageNet dataset.
The invention also provides a finger vein verification system based on the lightweight neural network, which comprises: the system comprises a region-of-interest extraction module, a light weight neural network construction module, a batch image construction module, a real-time data amplification module, a classification component construction module, a loss function construction module, a light weight neural network training module and a finger vein verification module;
the interested region extraction module is used for extracting an interested region of the finger vein image;
the light weight neural network construction module is used for constructing a light weight neural network, and the light weight neural network is used for finger vein feature extraction;
the batch image construction module is used for randomly selecting a plurality of types of fingers in each training batch and randomly selecting a plurality of interested area images in each type of fingers to construct a batch image;
the real-time data amplification module is used for performing real-time data amplification on the batch of images and then inputting the batch of images into the lightweight neural network;
the classification component construction module is used for constructing a classification component, the classification component is used for classifying the finger vein features, and the output features of the light weight neural network are input into the classification component;
the loss function construction module is used for constructing the batch hard loss function L_BH and the label-smoothing-regularized cross entropy loss function L_LS, and obtaining the overall loss function L = L_BH + L_LS;
the lightweight neural network training module is used for training the lightweight neural network, updating the weights of the lightweight neural network according to the loss value, and saving the current weights of the lightweight neural network after training;
the finger vein verification module is used for extracting the features of the light weight neural network after the weight is updated, calculating the cosine similarity between the features and outputting a finger vein verification result.
As a preferred technical solution, the real-time data amplification module comprises a random brightness change unit, a random cropping unit, a random rotation unit, and a random erasing unit;
the random brightness change unit adjusts the brightness of the interested region of the finger vein image by adopting a random factor;
the random cropping unit randomly crops a fixed-size cropping box within a specified range of the image;
the random rotation unit is used for carrying out random angle rotation on the image within a set angle threshold value, and unifying the size of the image after the random angle rotation by adopting bilinear interpolation;
the random erasing unit is used for setting an erasing block to randomly erase each position of the image.
Compared with the prior art, the invention has the following advantages and beneficial effects:
(1) The method combines the label-smoothing-regularized cross entropy loss function and the batch hard loss function as the training objective. On one hand, the batch hard loss function relieves the model's demand for training data; on the other hand, the label-smoothing-regularized cross entropy loss accelerates model convergence and reduces overfitting, ensuring both the accuracy and the training speed of the model.
(2) The method uses ShuffleNetV2 with its first max-pooling layer removed as the backbone network, which outputs a larger feature map, reduces the loss of fine-grained information caused by the low resolution of finger vein images, and ensures that the network extracts fine-grained finger vein features well.
(3) The invention adopts a new finger vein image amplification scheme that combines random brightness, random cropping, random rotation, and random erasing, improving model robustness, alleviating the difficulty of training with scarce finger vein data, and improving the verification accuracy of the finger vein model.
Drawings
FIG. 1 is a training flowchart of a light weight neural network-based finger vein authentication method according to the present embodiment;
FIG. 2 is a test flowchart of a finger vein verification method based on a lightweight neural network according to the embodiment;
FIG. 3 is a schematic diagram of a region of interest according to the present embodiment;
fig. 4 is a schematic diagram of the overall model structure of the finger vein authentication method based on the lightweight neural network according to the embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In this embodiment, three databases are used: SDUMLA-FV, FV-USM, and MMCBNU. The SDUMLA-FV dataset, collected by Shandong University, contains finger vein images from 636 fingers of 106 subjects; 6 images were collected from each of the left and right index, middle, and ring fingers, giving 3816 finger vein images in total. The FV-USM database, collected by Universiti Sains Malaysia, consists of images of the left and right index and middle fingers of 123 subjects (83 male and 40 female, aged 20 to 52); the images were collected in two separate sessions, with 6 images per finger per session. MMCBNU, collected by Chonbuk National University, consists of the left and right index, middle, and ring fingers of 100 subjects, with 10 images per finger. Detailed information on the finger vein databases is given in Table 1. This embodiment is implemented mainly on the PyTorch 1.1.0 framework; the experiments were run on an NVIDIA TITAN Xp GPU under Ubuntu 18.04.
Table 1. Detailed information of the finger vein databases

                         SDUMLA-FV    FV-USM     MMCBNU
Number of people         106          123        100
Fingers per person       6            4          6
Images per finger        6            12         10
Image size               320×240      640×480    640×480
Total number of images   3816         5904       6000
As shown in fig. 1 and fig. 2, and in combination with fig. 3, the present embodiment provides a finger vein authentication method based on a lightweight neural network, including the following steps:
s1, extracting an interested area of the finger vein image as an object to be processed;
In this embodiment, region-of-interest images are extracted from all images in the above databases to form the embodiment database. For FV-USM, the images are first rotated 90° counterclockwise so that the upper and lower finger edges lie horizontally. Region-of-interest extraction detects the upper and lower edges of the image with an extended Prewitt edge detection operator, the masks for the upper and lower edges being denoted Mask_u and Mask_d respectively (the mask coefficients are given as equation images in the original publication). The center line of the upper and lower edges is then fitted by the least squares method, the finger vein image is rotation-corrected according to the deflection angle of the center line, and finally the circumscribed rectangle of the rotated edge lines is extracted as the region-of-interest image, which is resized to 128×64 by bilinear interpolation. As shown in fig. 3, extracting the circumscribed rectangle of the finger vein image helps locate the finger veins and reduces the loss of finger vein information;
In this embodiment, the extracted region-of-interest database is divided into a training set and a test set; part of the test set forms an enrolled template image library and the remainder forms a query image library. Specifically, for a database with T classes of fingers and g images per finger, T/2 finger classes are randomly selected as the training set and the remaining T − T/2 classes form the test set; half of the images of each test-set finger are randomly selected as enrolled images and the other half as query images. A query image paired with a matching enrolled image forms a positive sample pair, and a query image paired with a non-matching enrolled image forms a negative sample pair (the numbers of positive and negative pairs are given as equation images in the original publication);
s2, constructing a light weight neural network N for finger vein feature extraction;
In this embodiment, the lightweight neural network is obtained by modifying ShuffleNetV2: the first max-pooling layer is removed, and the resulting network is used as the improved neural network N;
s3, initially loading network weight trained on an ImageNet data set in advance in a training stage by a light weight neural network N, constructing a sampling module for each batch in the training stage, randomly selecting S types of fingers each time, randomly selecting k interested region images by each type of fingers to form batch images with b being S × k, carrying out real-time data amplification on the batch images, and taking the finger vein images subjected to real-time amplification as input of the light weight neural network N, wherein S is 32, k is 4, and b is 128;
In this embodiment, the real-time data amplification in the training stage includes random brightness variation, random cropping, random rotation, and random erasing. First, the brightness of the region-of-interest image is scaled by a random factor in the range 0.7–1.3. Then a fixed-size cropping box I_e is placed randomly within a specified range of the image: assuming the original size is W_o × H_o, the cropping box keeps the original aspect ratio and its size is 0.9 times the original, i.e. 0.9·W_o × 0.9·H_o; its top-left corner (x_e, y_e) is taken randomly with x_e in [0, 0.1·W_o] and y_e in [0, 0.1·H_o], so the cropping region is I_e = (x_e, y_e, x_e + 0.9·W_o, y_e + 0.9·H_o). To address finger rotation, the image is rotated by a random angle in [−3°, +3°], and the rotated image is resized to 128×64 by bilinear interpolation. Finally, to address partial loss of finger texture, the image is amplified by random erasing: the erasing block appears at a random position in the image, its height and width are randomly 0.1–0.5 of the image height and width, and its pixel value is randomly 0 or 255. This data amplification improves the generalization ability of the model;
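A sketch of this real-time amplification pipeline with the parameters stated above (brightness factor 0.7–1.3, a 0.9·W_o × 0.9·H_o crop box, rotation within ±3°, resizing to 128×64, and an erasing block of 0.1–0.5 of the image size filled with 0 or 255); OpenCV and NumPy are assumed, and applying the erasing step unconditionally is a simplification.

```python
import random
import cv2
import numpy as np

def amplify(roi):
    """Real-time data amplification sketch using the embodiment's parameters."""
    h, w = roi.shape[:2]

    # Random brightness: scale by a factor drawn from [0.7, 1.3].
    out = np.clip(roi.astype(np.float32) * random.uniform(0.7, 1.3),
                  0, 255).astype(np.uint8)

    # Random crop: a 0.9*Wo x 0.9*Ho box, top-left corner in [0, 0.1*Wo] x [0, 0.1*Ho].
    cw, ch = int(0.9 * w), int(0.9 * h)
    xe, ye = random.randint(0, w - cw), random.randint(0, h - ch)
    out = out[ye:ye + ch, xe:xe + cw]

    # Random rotation within [-3, +3] degrees, then bilinear resize back to 128x64.
    ch2, cw2 = out.shape[:2]
    M = cv2.getRotationMatrix2D((cw2 / 2, ch2 / 2), random.uniform(-3, 3), 1.0)
    out = cv2.warpAffine(out, M, (cw2, ch2))
    out = cv2.resize(out, (128, 64), interpolation=cv2.INTER_LINEAR)

    # Random erasing: block sides roughly 0.1-0.5 of the image size, filled with 0 or 255.
    eh, ew = random.randint(6, 32), random.randint(12, 64)   # ~0.1-0.5 of 64 and 128
    ey, ex = random.randint(0, 64 - eh), random.randint(0, 128 - ew)
    out[ey:ey + eh, ex:ex + ew] = random.choice([0, 255])
    return out
```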
s4, in the training stage, constructing a component C for classifying the finger vein features, taking the output features F of the light weight neural network N as the input of the classification component C, and constructing a batch hard loss function
Figure BDA0002410285060000091
Comprises the following steps:
Figure BDA0002410285060000092
wherein the content of the first and second substances,
Figure BDA0002410285060000093
representing the characteristics of the x image of the y finger, subscripts a, p and n respectively represent an anchor point, a positive sample point and a negative sample point of a triple, m represents the minimum interval between a positive sample pair and a negative sample pair, D is a distance measurement function, m takes a value of 0.3, D adopts an Euclidean distance, the function finds out a same sample which is farthest from other characteristics in a batch and a different sample which is closest to other characteristics in the batch for any sample in the batch, constructs the triple and then calculates the triple loss of the whole batch;
The features F are batch-normalized to obtain features E; E passes through a fully connected layer and a softmax (normalized exponential) function to obtain the predicted probability distribution q, and the label-smoothing-regularized cross entropy loss function L_LS is constructed as:

L_LS = − Σ_{v=1}^{M} l_v · log(q_v)

where M is the total number of classes in the finger vein training set, l_v denotes the true probability of the v-th class, q_v denotes the predicted probability of the v-th class, r is the true label of the image, and ε is a small constant used to reduce the model's confidence on the training set when computing the classification loss; ε is 0.1 in this embodiment.
Substituting the features F into L_BH, and the true probability distribution l and the predicted probability distribution q into L_LS, the loss function of the whole system is obtained as L = L_BH + L_LS. With the minimization of L as the objective, the model is trained with the back-propagation (BP) algorithm: the network weights are updated according to the loss value using the back-propagation mechanism of the convolutional neural network, training stops when the number of training rounds reaches the specified value e, and the weights of the current neural network are saved;
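A sketch of one training step for S4, reusing the backbone and loss sketches given earlier and assuming the overall loss is the plain sum L_BH + L_LS as reconstructed above; the classification component C is modelled as batch normalization followed by a fully connected layer over the M training classes, and the value of M below is illustrative.

```python
import torch
import torch.nn as nn

# Classification component C (sketch): batch normalization of F to obtain E,
# then a fully connected layer over the M training finger classes.
M = 318                                    # illustrative, e.g. 636/2 training classes
classifier = nn.Sequential(nn.BatchNorm1d(1024), nn.Linear(1024, M))
optimizer = torch.optim.Adam(list(backbone.parameters()) + list(classifier.parameters()),
                             lr=3.5e-3)

def train_step(images, labels):
    features = backbone(images)            # F from the lightweight network N
    logits = classifier(features)          # class scores; softmax is inside the CE loss
    loss = batch_hard_loss(features, labels) + label_smoothing_ce(logits, labels)
    optimizer.zero_grad()
    loss.backward()                        # back-propagation of the overall loss
    optimizer.step()                       # weight update from the loss value
    return loss.item()
```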
As shown in fig. 4, in this embodiment the specified number of training rounds e is 500, the optimizer is Adam, and the learning rate lr(t), where t denotes the iteration round, follows a gradual warmup strategy with an initial learning rate of 3.5e-3 (the specific piecewise schedule is given as an equation image in the original publication);
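The exact piecewise warmup schedule is given only as an equation image in the original; the sketch below shows a generic gradual warmup reaching the stated initial learning rate of 3.5e-3, with the warmup length and later decay points chosen purely for illustration.

```python
def warmup_lr(t, base_lr=3.5e-3, warmup_epochs=10):
    """Illustrative gradual-warmup schedule; the patent's actual breakpoints
    are given only as an equation image, so the values below are assumptions."""
    if t <= warmup_epochs:
        return base_lr * t / warmup_epochs     # linear warmup to the initial rate
    if t <= 300:
        return base_lr
    if t <= 400:
        return base_lr * 0.1                   # illustrative step decays
    return base_lr * 0.01

# Applied once per epoch before training, e.g.:
# for group in optimizer.param_groups:
#     group["lr"] = warmup_lr(epoch)
```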
s5, an inference stage, namely extracting the characteristics E of the samples by using the lightweight neural network N loaded with the storage weight, and measuring the similarity among the samples through the cosine similarity among the characteristics;
in this embodiment, the registered image library and the query image library are used as input, features of all the registered image libraries and the query image library are extracted, each image of the query image library and each image of the corresponding registered template construct a positive sample pair, each image of the registered image of the non-corresponding template construct a negative sample pair, cosine similarity between corresponding features of all the positive sample pairs is calculated, cosine similarity between corresponding features of all the negative sample pairs is calculated, and equal error Rate (EER Rate, EER) is calculated according to the above two.
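A sketch of the verification measures used in S5: cosine similarity between feature vectors, and the equal error rate computed from the positive-pair and negative-pair score sets; the simple threshold sweep used to locate the EER is an approximation.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def equal_error_rate(pos_scores, neg_scores):
    """EER sketch: sweep a threshold over all scores and return the operating
    point where false match rate and false non-match rate are closest."""
    pos, neg = np.asarray(pos_scores), np.asarray(neg_scores)
    best_gap, eer = 2.0, 0.5
    for thr in np.sort(np.concatenate([pos, neg])):
        fnmr = np.mean(pos < thr)     # genuine (positive) pairs rejected
        fmr = np.mean(neg >= thr)     # impostor (negative) pairs accepted
        if abs(fmr - fnmr) < best_gap:
            best_gap, eer = abs(fmr - fnmr), (fmr + fnmr) / 2
    return eer
```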
This embodiment alleviates the deep model's demand for training data by constructing more triplet samples, uses the cross entropy loss as a global constraint to accelerate network convergence, and reduces model overfitting through label smoothing regularization.
In this embodiment, tests are performed on the three databases SDUMLA-FV, FV-USM, and MMCBNU, and the equal error rate (EER) is used to evaluate verification performance. The EER is the error rate at which the false match rate equals the false non-match rate; the smaller the EER, the better the verification performance of the vein algorithm.
To demonstrate the effectiveness of this embodiment, the method is compared with the Selective-Network method proposed in "A novel finger vein verification system based on two-stream convolutional network learning" and the FV-Net method proposed in "FV-Net: learning a finger vein feature representation based on a CNN". Verification performance is shown in Table 2 below and computational complexity in Table 3 below.
Table 2. Comparison of equal error rates (%) of the different methods (table contents provided as an image in the original publication)
Table 3. Comparison of computational complexity of the different methods (table contents provided as an image in the original publication)
As can be seen from the experimental results, this embodiment outperforms the existing methods in verification performance and achieves the best results in computation, number of weights, and feature extraction time; only the feature matching time is slightly slower than that of Selective-Network. The experimental results demonstrate the effectiveness of the method.
The present embodiment also provides a finger vein authentication system based on a lightweight neural network, including: the system comprises a region-of-interest extraction module, a light weight neural network construction module, a batch image construction module, a real-time data amplification module, a classification component construction module, a loss function construction module, a light weight neural network training module and a finger vein verification module;
In this embodiment, the region-of-interest extraction module is used to extract the region of interest of the finger vein image; the lightweight neural network construction module is used to construct the lightweight neural network, which is used for finger vein feature extraction; the batch image construction module randomly selects several classes of fingers in each training batch and randomly selects several region-of-interest images from each class to construct a batch of images; the real-time data amplification module performs real-time data amplification on the batch of images before they are input to the lightweight neural network; the classification component construction module constructs a classification component for classifying the finger vein features, and the output features of the lightweight neural network are input to the classification component; the loss function construction module constructs the batch hard loss function L_BH and the label-smoothing-regularized cross entropy loss function L_LS to obtain the overall loss function L = L_BH + L_LS.
The lightweight neural network training module is used to train the lightweight neural network, and the weights of the current lightweight neural network are saved after training; the finger vein verification module uses the weight-updated lightweight neural network to extract features, calculates the cosine similarity between features, and outputs the finger vein verification result.
In this embodiment, the real-time data amplification module includes a random brightness variation unit, a random cropping unit, a random rotation unit, and a random erasing unit;
the random brightness variation unit adjusts the brightness of the finger vein region-of-interest image by a random factor; the random cropping unit randomly crops a fixed-size cropping box within a specified range of the image; the random rotation unit rotates the image by a random angle within a set angle threshold and unifies the size of the rotated image by bilinear interpolation; and the random erasing unit places an erasing block at a random position of the image.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments, and any other changes, modifications, substitutions, combinations, and simplifications which do not depart from the spirit and principle of the present invention should be construed as equivalents thereof, and all such changes, modifications, substitutions, combinations, and simplifications are intended to be included in the scope of the present invention.

Claims (10)

1. A finger vein authentication method based on a lightweight neural network is characterized by comprising the following steps:
extracting a region of interest of the finger vein image;
constructing a light weight neural network for finger vein feature extraction;
randomly selecting a plurality of types of fingers in each training batch, and randomly selecting a plurality of images of the region of interest in each type of fingers to construct batch images;
performing real-time data amplification on the batch of images and inputting the batch of images into the lightweight neural network;
constructing a classification component for classifying finger vein features, and inputting output features of the light weight neural network into the classification component;
constructing a batch hard loss function L_BH, constructing a cross entropy loss function L_LS after label smoothing regularization, and obtaining the overall loss function L as:
L = L_BH + L_LS;
training a light weight neural network, updating the weight of the light weight neural network according to the loss value, and storing the weight of the current light weight neural network after training;
and extracting features with the weight-updated lightweight neural network, calculating the cosine similarity between features, and outputting the finger vein verification result.
2. The method for verifying the finger vein based on the lightweight neural network as claimed in claim 1, wherein the specific steps of extracting the region of interest of the finger vein image comprise:
detecting the upper and lower edges of the image with an extended Prewitt edge detection operator, the masks for detecting the upper and lower edges being denoted Mask_u and Mask_d respectively (the specific mask coefficients are given as equation images in the original publication);
and fitting the central lines of the upper edge and the lower edge by adopting a least square method, performing rotation correction on the finger vein image according to the deflection angle of the central line, finally extracting a circumscribed rectangle of the edge line after rotation as an image of the region of interest, and unifying the size of the image of the region of interest by adopting bilinear interpolation.
3. The finger vein authentication method based on the lightweight neural network is characterized in that, in constructing the lightweight neural network for finger vein feature extraction, the neural network ShuffleNetV2 with its first max-pooling layer removed is used as the lightweight neural network.
4. The finger vein authentication method based on the lightweight neural network is characterized in that the batch of images is input to the lightweight neural network after real-time data amplification, and the real-time data amplification includes the steps of random brightness change, random cropping, random rotation, and random erasing;
the random brightness change step adjusts the brightness of the finger vein region-of-interest image by a random factor;
the random cropping step randomly crops a fixed-size cropping box within a specified range of the image;
the random rotation step rotates the image by a random angle within a set angle threshold, and the rotated images are unified in size by bilinear interpolation;
and the random erasing step places an erasing block at a random position of the image.
5. The lightweight-neural-network-based finger vein authentication method of claim 1, wherein the batch hard loss function L_BH is constructed with the specific formula:

L_BH = Σ_{y=1..s} Σ_{a=1..k} [ m + max_{p=1..k} D(F_a^y, F_p^y) − min_{y'≠y, n=1..k} D(F_a^y, F_n^{y'}) ]_+

wherein F denotes the output features of the lightweight neural network, F_x^y denotes the feature of the x-th image of the y-th class of finger, s and k denote the numbers of finger classes and of images per class in the batch, the subscripts a, p, and n denote the anchor, positive, and negative sample points of the triplet respectively, m denotes the minimum margin between positive and negative sample pairs, and D denotes a distance metric function.
6. The lightweight-neural-network-based finger vein verification method according to claim 1, wherein the label-smoothing-regularized cross entropy loss function L_LS is constructed with the specific formula:

L_LS = − Σ_{v=1}^{M} l_v · log(q_v)

wherein l denotes the label-smoothed true probability distribution, q denotes the predicted probability distribution, M denotes the total number of classes in the finger vein training set, l_v denotes the true probability of the v-th class, q_v denotes the predicted probability of the v-th class, r denotes the true label of the image, and ε denotes a constant.
7. The finger vein verification method based on the lightweight neural network according to claim 1, wherein, in training the lightweight neural network, the number of training epochs is set to e = 500, an Adam optimizer is used as the training optimizer, and the learning rate lr(t), where t denotes the iteration round, adopts a gradual warmup strategy (the specific piecewise schedule is given as an equation image in the original publication).
8. The lightweight-neural-network-based finger vein verification method according to any one of claims 1-7, wherein, at the initial stage of training, the lightweight neural network is initialized with network weights pre-trained on the ImageNet dataset.
9. A finger vein authentication system based on a lightweight neural network, comprising: the system comprises a region-of-interest extraction module, a light weight neural network construction module, a batch image construction module, a real-time data amplification module, a classification component construction module, a loss function construction module, a light weight neural network training module and a finger vein verification module;
the interested region extraction module is used for extracting an interested region of the finger vein image;
the light weight neural network construction module is used for constructing a light weight neural network, and the light weight neural network is used for finger vein feature extraction;
the batch image construction module is used for randomly selecting a plurality of types of fingers in each training batch and randomly selecting a plurality of interested area images in each type of fingers to construct a batch image;
the real-time data amplification module is used for performing real-time data amplification on the batch of images and then inputting the batch of images into the lightweight neural network;
the classification component construction module is used for constructing a classification component, the classification component is used for classifying the finger vein features, and the output features of the light weight neural network are input into the classification component;
the loss function construction module is used for constructing the batch hard loss function L_BH and the label-smoothing-regularized cross entropy loss function L_LS, and obtaining the overall loss function L = L_BH + L_LS;
the lightweight neural network training module is used for training the lightweight neural network, and the weights of the current lightweight neural network are saved after training;
the finger vein verification module is used for extracting the features of the light weight neural network after the weight is updated, calculating the cosine similarity between the features and outputting a finger vein verification result.
10. The finger vein verification system based on the lightweight neural network of claim 9, wherein the real-time data amplification module comprises a random brightness variation unit, a random cropping unit, a random rotation unit, and a random erasing unit;
the random brightness variation unit adjusts the brightness of the finger vein region-of-interest image by a random factor;
the random cropping unit randomly crops a fixed-size cropping box within a specified range of the image;
the random rotation unit rotates the image by a random angle within a set angle threshold and unifies the size of the rotated image by bilinear interpolation;
and the random erasing unit places an erasing block at a random position of the image.
CN202010174412.2A 2020-03-13 2020-03-13 Light weight neural network-based finger vein verification method and system Active CN111460915B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010174412.2A CN111460915B (en) 2020-03-13 2020-03-13 Light weight neural network-based finger vein verification method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010174412.2A CN111460915B (en) 2020-03-13 2020-03-13 Light weight neural network-based finger vein verification method and system

Publications (2)

Publication Number Publication Date
CN111460915A true CN111460915A (en) 2020-07-28
CN111460915B CN111460915B (en) 2023-04-18

Family

ID=71685871

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010174412.2A Active CN111460915B (en) 2020-03-13 2020-03-13 Light weight neural network-based finger vein verification method and system

Country Status (1)

Country Link
CN (1) CN111460915B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112116012A (en) * 2020-09-23 2020-12-22 大连海事大学 Finger vein instant registration and identification method and system based on deep learning
CN113012132A (en) * 2021-03-22 2021-06-22 平安科技(深圳)有限公司 Image similarity determining method and device, computing equipment and storage medium
CN113076927A (en) * 2021-04-25 2021-07-06 华南理工大学 Finger vein identification method and system based on multi-source domain migration
CN113221911A (en) * 2021-04-09 2021-08-06 华南理工大学 Vehicle weight identification method and system based on dual attention mechanism
CN114821201A (en) * 2022-06-28 2022-07-29 江苏广坤铝业有限公司 Hydraulic corner impacting machine for aluminum processing and using method thereof
CN117496562A (en) * 2024-01-02 2024-02-02 深圳大学 Finger vein recognition method and device based on FV-MViT and related medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110263659A (en) * 2019-05-27 2019-09-20 南京航空航天大学 A kind of finger vein identification method and system based on triple loss and lightweight network

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110263659A (en) * 2019-05-27 2019-09-20 南京航空航天大学 A kind of finger vein identification method and system based on triple loss and lightweight network

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张丽萍; 李卫军; 宁欣; 董肖莉; 刘文杰: "A finger vein recognition method based on the combination of 2DHOL features and (2D)^2 FPCA" *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112116012A (en) * 2020-09-23 2020-12-22 大连海事大学 Finger vein instant registration and identification method and system based on deep learning
CN112116012B (en) * 2020-09-23 2024-03-19 大连海事大学 Finger vein instant registration and identification method and system based on deep learning
CN113012132A (en) * 2021-03-22 2021-06-22 平安科技(深圳)有限公司 Image similarity determining method and device, computing equipment and storage medium
CN113012132B (en) * 2021-03-22 2023-08-25 平安科技(深圳)有限公司 Image similarity determination method and device, computing equipment and storage medium
CN113221911A (en) * 2021-04-09 2021-08-06 华南理工大学 Vehicle weight identification method and system based on dual attention mechanism
CN113076927A (en) * 2021-04-25 2021-07-06 华南理工大学 Finger vein identification method and system based on multi-source domain migration
CN114821201A (en) * 2022-06-28 2022-07-29 江苏广坤铝业有限公司 Hydraulic corner impacting machine for aluminum processing and using method thereof
CN114821201B (en) * 2022-06-28 2022-09-20 江苏广坤铝业有限公司 Hydraulic corner impacting machine for aluminum processing and using method thereof
CN117496562A (en) * 2024-01-02 2024-02-02 深圳大学 Finger vein recognition method and device based on FV-MViT and related medium
CN117496562B (en) * 2024-01-02 2024-03-29 深圳大学 Finger vein recognition method and device based on FV-MViT and related medium

Also Published As

Publication number Publication date
CN111460915B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
CN111460915B (en) Light weight neural network-based finger vein verification method and system
US10891511B1 (en) Human hairstyle generation method based on multi-feature retrieval and deformation
CN108388896B (en) License plate identification method based on dynamic time sequence convolution neural network
CN107610087B (en) Tongue coating automatic segmentation method based on deep learning
CN110473196B (en) Abdomen CT image target organ registration method based on deep learning
Zhu et al. Discriminative 3D morphable model fitting
WO2019071976A1 (en) Panoramic image saliency detection method based on regional growth and eye movement model
CN110069989B (en) Face image processing method and device and computer readable storage medium
CN110909636B (en) Face recognition method based on non-uniform distribution
WO2015131468A1 (en) Method and system for estimating fingerprint pose
CN103971122B (en) Three-dimensional face based on depth image describes method
CN103886335B (en) Classification of Polarimetric SAR Image method based on Fuzzy particle swarm artificial and scattering entropy
CN107862680B (en) Target tracking optimization method based on correlation filter
CN107609571B (en) Adaptive target tracking method based on LARK features
CN109934258B (en) Image retrieval method based on feature weighting and region integration
CN110516533A (en) A kind of pedestrian based on depth measure discrimination method again
CN113947814A (en) Cross-visual angle gait recognition method based on space-time information enhancement and multi-scale saliency feature extraction
CN103745197A (en) Detection method of license plate and device thereof
CN115457277A (en) Intelligent pavement disease identification and detection method and system
CN109840529B (en) Image matching method based on local sensitivity confidence evaluation
CN110969101A (en) Face detection and tracking method based on HOG and feature descriptor
CN112926592B (en) Trademark retrieval method and device based on improved Fast algorithm
Yi et al. Illumination normalization of face image based on illuminant direction estimation and improved retinex
CN112489089B (en) Airborne ground moving target identification and tracking method for micro fixed wing unmanned aerial vehicle
CN109977892B (en) Ship detection method based on local saliency features and CNN-SVM

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant