CN114780769B - Personnel checking method based on bloom filter - Google Patents

Personnel checking method based on bloom filter

Info

Publication number
CN114780769B
CN114780769B (application CN202210662860.6A)
Authority
CN
China
Prior art keywords
image
iris
fingerprint
rectangular
calculating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210662860.6A
Other languages
Chinese (zh)
Other versions
CN114780769A (en)
Inventor
谢丹
张纪周
牛曦恺
陈文波
毛群飞
Current Assignee
Hangzhou Hezhong Data Technology Co ltd
Original Assignee
Hangzhou Hezhong Data Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Hezhong Data Technology Co ltd filed Critical Hangzhou Hezhong Data Technology Co ltd
Priority to CN202210662860.6A
Publication of CN114780769A
Application granted
Publication of CN114780769B
Active legal status
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 - Information retrieval of still image data
    • G06F16/53 - Querying
    • G06F16/532 - Query formulation, e.g. graphical querying
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 - Information retrieval of still image data
    • G06F16/55 - Clustering; Classification
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 - Information retrieval of still image data
    • G06F16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583 - Retrieval using metadata automatically derived from the content

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Mathematical Physics (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention discloses a personnel checking method based on a bloom filter. Detail feature regions of fingerprint images, face images and iris images are defined in a gridded manner, and the distance sum value D or the quantity accumulated value N of a fingerprint image, the ratio sum value R of a face image, and the area sum ratio S of an iris image are calculated to characterize detail feature differences. Each person under the same crowd classification is divided into a corresponding data subset according to these differences, so that the divided data subsets are more scientific. Using the correspondence between the bit-array points of the bloom filter and deviation-degree ranges, the deviation degrees of D, N, R and S from each of the first to fourth image sets are calculated, and the image set corresponding to a point can be matched quickly according to the range into which a deviation degree falls. This increases the acquisition speed of the image sets that serve as the basis for personnel checking and matching, and thereby further increases the speed of personnel checking. Preset matching rules check and match layer by layer, ensuring matching accuracy while retaining matching speed.

Description

Personnel checking method based on bloom filter
Technical Field
The invention relates to the technical field of personnel checking, in particular to a personnel checking method based on a bloom filter.
Background
At present, the common information-based means of personnel verification are mainly fingerprint identification, face identification and iris identification. Verification generally compares the fingerprint, face or iris information of the person to be verified with the fingerprint/face/iris information of persons in a database, and outputs the comparison result when the comparison succeeds. However, the database integrates the identity features of all crowds of different ages, sexes and so on, so identity verification for specific persons, especially for specific batches of persons, requires a long comparison time. To solve this technical problem, the prior art generally classifies persons by defining data subsets: when a person belonging to a given class is verified, his or her identity features are first compared with those of all persons in the corresponding data subset, which speeds up the comparison. However, simply incorporating the identity features of persons identified as the same class into the same data subset does not take into account the detail differences in identity features between different persons; such a division is not scientific enough and directly affects both comparison speed and comparison accuracy.
Disclosure of Invention
The invention provides a bloom-filter-based personnel checking method, aiming to improve the identity comparison speed and the comparison accuracy of personnel checking.
In order to achieve the purpose, the invention adopts the following technical scheme:
The personnel checking method based on the bloom filter comprises the following steps:
S1, calculating the distance sum value D or the quantity accumulated value N of each fingerprint image; calculating the ratio sum value R of each face image; and calculating the area sum ratio S of each iris image;
S2, assigning each fingerprint image to the first image set corresponding to the interval into which its D falls, or to the second image set corresponding to the interval into which its N falls; assigning each face image to the third image set corresponding to the interval into which its R falls; and assigning each iris image to the fourth image set corresponding to the interval into which its S falls;
S3, based on preset deviation-degree ranges, mapping each of the first to fourth image sets, as elements of a bloom filter, to corresponding points of a bit array;
S4, calculating the average D̄ of D over the fingerprint images in each first image set; the average N̄ of N over the fingerprint images in each second image set; the average R̄ of R over the face images in each third image set; and the average S̄ of S over the iris images in each fourth image set;
S5, acquiring the fingerprint image, face image and iris image of the person to be checked, and then calculating the distance sum value D′ or the quantity accumulated value N′ of the fingerprint image, the ratio sum value R′ of the face image, and the area sum ratio S′ of the iris image of the person to be checked;
S6, calculating the deviation degrees of D′, N′, R′ and S′ from each of the first to fourth image sets;
S7, matching the image set corresponding to a bit-array point according to the correspondence between that point and the deviation-degree range into which a deviation degree falls;
S8, performing personnel checking and matching of the acquired fingerprint image, face image and iris image of the person to be checked against each image set matched in step S7 according to a preset matching rule, and outputting the checking result.
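Steps S3 and S7 can be pictured with a small sketch. The code below is a minimal illustration, not the patented implementation: the range boundaries, the `point_for_deviation` helper and the `sets_at_point` table are all assumed names and values, and a production bloom filter would additionally hash its elements rather than rely on a single range lookup.

```python
from bisect import bisect_right

# Hypothetical deviation-degree range boundaries (step S3): range i covers
# [bounds[i], bounds[i+1]), and each range owns one point of the bit array.
bounds = [0.0, 0.5, 1.0, 2.0, 4.0]

def point_for_deviation(dev, bounds=bounds):
    """Return the bit-array point whose deviation-degree range contains dev."""
    i = bisect_right(bounds, dev) - 1
    if i < 0 or i >= len(bounds) - 1:
        raise ValueError("deviation degree outside all preset ranges")
    return i

bit_array = [0] * (len(bounds) - 1)
sets_at_point = {}  # point -> image sets mapped there (step S3)

# Map a (hypothetical) first image set whose average deviation is 0.7:
p = point_for_deviation(0.7)
bit_array[p] = 1
sets_at_point[p] = ["first image set #3"]

# Step S7: a query whose deviation degree falls in the same range
# immediately retrieves the image set stored at that point.
matched = sets_at_point[point_for_deviation(0.9)]
```

A query deviation of 0.9 falls into the same [0.5, 1.0) range as 0.7, so it lands on the same point and retrieves the same image set without scanning all sets.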
Preferably, in step S1, the distance sum value D of each fingerprint image is calculated by the following method steps:
A1, for the width and the height of each fingerprint image under the same crowd classification, framed in a rectangular frame-selection manner, equally dividing each side into a number of segments at equal intervals;
A2, from each bisection point, drawing a line perpendicular to the side on which it lies to the opposite bisection point on the opposite side, so as to divide the fingerprint image into a number of rectangular sub-blocks; taking the rectangular sub-block at the top-left corner of the image as the starting sub-block, and numbering the rectangular sub-blocks in a spiral order, circling counterclockwise inward;
A3, filtering out the rectangular sub-blocks of the fingerprint image that carry no fingerprint information and those fully loaded with fingerprint information, where "fully loaded" means that the pixels representing fingerprint information touch every side of the sub-block to which they belong;
A4, for each rectangular sub-block remaining after step A3, searching for the boundary pixel points of the fingerprint information within it;
A5, calculating the distance from each boundary pixel point to the top-left vertex of its rectangular sub-block, and summing the distances according to the following formula (1) to obtain the distance value d_i of each rectangular sub-block:

d_i = Σ_{j=1}^{n_i} l_{i,j}    (1)

In formula (1), d_i denotes the distance value of the i-th rectangular sub-block in the fingerprint image; l_{i,j} denotes the distance from the j-th boundary pixel point in the i-th rectangular sub-block to the top-left vertex of that sub-block; and n_i denotes the number of boundary pixel points in the i-th rectangular sub-block;
A6, calculating the distance sum value D over all rectangular sub-blocks remaining after the filtering of step A3 by the following formula (2):

D = Σ_{i=1}^{m} w_i · d_i    (2)

In formula (2), w_i denotes the weight of d_i in the calculation of D, and m denotes the number of rectangular sub-blocks remaining after the filtering of step A3.
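Formulas (1) and (2) can be sketched as follows. `subblock_distance` and `distance_sum` are hypothetical helper names, and the boundary pixels and weights in the usage lines are illustrative inputs, not values from the patent.

```python
import math

def subblock_distance(boundary_pixels, left_vertex):
    """Formula (1): d_i, the sum of distances from each boundary pixel
    point of the fingerprint information to the sub-block's top-left vertex."""
    x0, y0 = left_vertex
    return sum(math.hypot(x - x0, y - y0) for x, y in boundary_pixels)

def distance_sum(d_values, weights):
    """Formula (2): D = sum_i w_i * d_i over the m rectangular sub-blocks
    that survive the filtering of step A3."""
    return sum(w * d for w, d in zip(weights, d_values))

# Two boundary pixels at (3, 4) and (0, 5) relative to the vertex (0, 0)
# each lie at distance 5, so d_i = 10 for this sub-block.
d1 = subblock_distance([(3, 4), (0, 5)], (0, 0))
D = distance_sum([d1, 2.0], [0.5, 1.0])
```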
Preferably, in step S1, the quantity accumulated value N of each fingerprint image is calculated by the following method steps:
B1, for each fingerprint image and the standard fingerprint image, calculating the difference in distance value Δd_i between the two rectangular sub-blocks having the same rank number by the following formula (3):

Δd_i = |d_i − d_i^s|    (3)

In formula (3), d_i denotes the distance value of the i-th rectangular sub-block of the fingerprint image participating in the calculation of Δd_i, and d_i^s denotes the distance value of the rectangular sub-block of the standard fingerprint image that has the same rank number i;
B2, listing as quantity-accumulation objects the rectangular sub-blocks of the fingerprint image whose Δd_i is smaller than the difference threshold, and accumulating, for each fingerprint image, the number of rectangular sub-blocks meeting this condition to obtain the quantity accumulated value N associated with that fingerprint image.
Preferably, the distance value d_i^s of each rectangular sub-block in the standard fingerprint image is calculated by the following formula (4):

d_i^s = (1/K) · Σ_{k=1}^{K} d_{k,i}    (4)

In formula (4), d_{k,i} denotes the distance value of the rectangular sub-block with the same rank number i in the k-th fingerprint image of the first image set or the second image set, and K denotes the number of fingerprint images stored in the first image set or the second image set.
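A compact reading of formulas (3) and (4) together with step B2, using assumed function names; each fingerprint image is represented simply as the list of its per-sub-block distance values, aligned by rank number.

```python
def standard_values(images):
    """Formula (4): d_i^s is the mean distance value of the sub-block with
    rank i over the K fingerprint images stored in the image set."""
    K = len(images)
    return [sum(img[i] for img in images) / K for i in range(len(images[0]))]

def quantity_accumulated(image, standard, threshold):
    """Formula (3) and step B2: count the sub-blocks whose distance-value
    difference |d_i - d_i^s| from the standard fingerprint image stays
    below the difference threshold; the count is N."""
    return sum(1 for d, ds in zip(image, standard) if abs(d - ds) < threshold)
```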
Preferably, in step S1, the ratio sum value R of each face image is calculated by the following method steps:
C1, photographing the face of each person under the same crowd classification at a fixed distance and a fixed angle, to obtain a face image of the same size for each person;
C2, for the width and the height of each face image, equally dividing each side into a number of segments at equal intervals;
C3, from each bisection point, drawing a line perpendicular to the side on which it lies to the opposite bisection point on the opposite side, so as to divide the face image into a number of rectangular blocks; taking the rectangular block at the top-left corner of the image as the starting block, and numbering the rectangular blocks in a spiral order, circling counterclockwise inward;
C4, filtering out the rectangular blocks that carry no face information and those fully loaded with face information, where "fully loaded" means that the pixels representing face information touch every side of the rectangular block to which they belong;
C5, searching for the face pixels in each rectangular block remaining after step C4, and calculating the ratio r_i of the number of face pixels found in the i-th rectangular block to the total number of face pixels in the face image;
C6, calculating the ratio sum value R over all rectangular blocks remaining after the filtering of step C4 by the following formula (5):

R = Σ_{i=1}^{m′} w_i · r_i    (5)

In formula (5), w_i denotes the weight of r_i in the calculation of R, and m′ denotes the number of rectangular blocks of the face image participating in the calculation of R.
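Formula (5) reduces to a weighted sum of per-block pixel shares. The sketch below assumes each block's face-pixel count has already been obtained by the search of step C5; the function name and inputs are illustrative.

```python
def ratio_sum(block_face_pixels, total_face_pixels, weights):
    """Formula (5): R = sum_i w_i * r_i, where r_i is the share of the
    image's face pixels that falls inside rectangular block i (step C5)."""
    return sum(w * n / total_face_pixels
               for w, n in zip(weights, block_face_pixels))
```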
Preferably, in step S1, the area sum ratio S of each iris image is calculated by the following method steps:
D1, photographing an eye image of each person under the same crowd classification at a fixed distance and a fixed angle, and framing an iris image out of each eye image in a rectangular frame-selection manner;
D2, bisecting each side of the width and the height of each iris image, and connecting the bisection points of non-opposite sides to obtain an inscribed quadrilateral;
D3, calculating the areas of the inscribed quadrilateral and of the rectangular frame of the iris image, recorded as S_q and S_r respectively;
D4, bisecting each side of the inscribed quadrilateral, and then drawing a line from each bisection point P_e, perpendicular to its side, to the iris boundary of the iris image, marking the connected point as vertex V_e, where P_e denotes the bisection point of the e-th side of the inscribed quadrilateral and V_e the vertex connected from P_e to the iris boundary;
D5, connecting vertex V_e to the two end points of the e-th side of the inscribed quadrilateral to obtain a triangle, recorded as T_e; triangle T_e separates the iris region outside it into two arc-shaped iris regions, recorded as U_{e,1} and U_{e,2} respectively;
D6, calculating the area of triangle T_e, recorded as S(T_e);
D7, taking the two waists of triangle T_e as new sides, bisecting each waist, drawing a line from the bisection point, perpendicular to the waist, to the iris boundary of the arc-shaped iris region to obtain a connection vertex, and connecting that vertex to the two end points of the waist of triangle T_e to obtain a triangle, recorded as T_{e,q}, q = 1 or 2;
D8, calculating the area of triangle T_{e,q}, recorded as S(T_{e,q});
D9, taking the two waists of triangle T_{e,q} as new sides and, by the method described in steps D7 to D8, continuing to construct triangles and calculate their areas until the preset number of triangle-construction rounds is reached, and then calculating the iris area S_a of the iris image by the following formula (6):

S_a = S_q + Σ_e Σ_t S(T_{e,t})    (6)

In formula (6), the outer sum runs over the four sides e of the inscribed quadrilateral of the iris image (taking the central point of the iris image as the origin of an XY-axis coordinate system), and the inner sum runs over every triangle T_{e,t} constructed toward the iris boundary from side e in steps D5 to D9, including the triangles obtained in each round of waist bisection applied to the arc-shaped iris regions U_{e,1} and U_{e,2};
D10, calculating the area sum ratio S of the iris image by the following formula (7):

S = S_a / S_r    (7)
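The construction of steps D2 to D10 amounts to approximating the iris area by an inscribed quadrilateral plus ever finer boundary triangles, then dividing by the frame area. The sketch below uses the shoelace formula for the polygon areas; `iris_sum_ratio` and its inputs are assumptions for illustration, not the patent's exact procedure.

```python
def polygon_area(pts):
    """Shoelace area of a simple polygon given as (x, y) vertices."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

def iris_sum_ratio(quad, triangles, frame_w, frame_h):
    """Formulas (6) and (7): S_a is the inscribed quadrilateral's area S_q
    plus the areas of all triangles built toward the iris boundary
    (steps D4 to D9); S is then S_a divided by the frame area S_r."""
    s_a = polygon_area(quad) + sum(polygon_area(t) for t in triangles)
    return s_a / (frame_w * frame_h)
```

For a unit-square quadrilateral plus one boundary triangle of area 0.25 inside a 2 by 2 frame, the ratio is (1 + 0.25) / 4.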
Preferably, in step S6, the deviation degrees of D′, N′, R′ and S′ from each of the first to fourth image sets are calculated by the following formula (8):

δ_x = |x′ − x̄|    (8)

In formula (8), x′ stands for D′, N′, R′ or S′, and x̄ for the corresponding average D̄, N̄, R̄ or S̄ stored for an image set; δ_D denotes the deviation degree of D′ from each first image set, δ_N the deviation degree of N′ from each second image set, δ_R the deviation degree of R′ from each third image set, and δ_S the deviation degree of S′ from each fourth image set.
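A hedged sketch of step S6, assuming the deviation degree is the absolute difference between a query statistic and an image set's stored average; the function name is illustrative.

```python
def deviations(query, set_means):
    """Deviation degree of one query statistic (D', N', R' or S') from each
    image set, taken here as the absolute difference from the set's mean."""
    return [abs(query - m) for m in set_means]
```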
Preferably, in step S8, the personnel checking and matching is performed according to a matching rule expressed by the following method steps:
E1, judging whether a first image set or a second image set has been matched as the checking basis for the person to be checked;
if yes, proceeding to step E2;
if not, jumping to step E3;
E2, comparing the fingerprint image of the person to be checked with each fingerprint image in the matched first image set or second image set;
if the comparison succeeds, outputting the fingerprint comparison result and terminating the personnel checking process;
if the comparison fails, jumping to step E3;
E3, judging whether a third image set has been matched as the checking basis for the person to be checked;
if yes, proceeding to step E4;
if not, jumping to step E5;
E4, comparing the face image of the person to be checked with each face image in the matched third image set;
if the comparison succeeds, outputting the face comparison result and terminating the personnel checking process;
if the comparison fails, jumping to step E5;
E5, judging whether a fourth image set has been matched as the checking basis for the person to be checked;
if yes, proceeding to step E6;
if not, jumping to step E7;
E6, comparing the iris image of the person to be checked with each iris image in the matched fourth image set;
if the comparison succeeds, outputting the iris comparison result and terminating the personnel checking process;
if the comparison fails, jumping to step E7;
E7, comparing the fingerprint image of the person to be checked with the fingerprint images of all first and second image sets, the face image with the face images of all third image sets, and the iris image with the iris images of all fourth image sets;
if any comparison succeeds, outputting the comparison result and terminating the personnel checking process; otherwise, outputting a comparison-failure result.
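The layered rule E1 to E7 is essentially a cascade with a global fallback. The sketch below assumes a black-box `compare(a, b)` matcher returning True on a successful comparison; the dictionary keys and the return convention are illustrative, not from the patent.

```python
def check_person(person, matched_sets, compare):
    """Steps E1-E7: try fingerprint, then face, then iris against the image
    sets matched in step S7; if every layer fails, fall back to comparing
    against all sets (E7). Returns the feature that matched, or None."""
    order = [("fingerprint", "fingerprint_set"),
             ("face", "face_set"),
             ("iris", "iris_set")]
    for feature, key in order:                      # E1-E6: layer by layer
        for candidate in matched_sets.get(key, []):
            if compare(person[feature], candidate):
                return feature                      # output result, stop
    for feature, _ in order:                        # E7: global fallback
        for candidate in matched_sets.get("all_" + feature, []):
            if compare(person[feature], candidate):
                return feature
    return None                                     # comparison failed
```

With equality as the matcher, a person whose face image appears in the matched third image set is reported at the face layer even though the fingerprint layer failed.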
The invention has the following beneficial effects:
1. The detail feature regions of fingerprint images, face images and iris images are defined in a gridded manner, and D, N, R and S are calculated to characterize detail feature differences; each person under the same crowd classification is divided according to these differences into a corresponding data subset (i.e., an image set), so the division of the data subsets is more scientific.
2. Using the correspondence between the bit-array points of the bloom filter and the deviation-degree ranges, the deviation degrees of D′, N′, R′ and S′ from each of the first to fourth image sets are calculated, and the image set corresponding to a point can be matched quickly according to the range into which a deviation degree falls; this increases the acquisition speed of the image sets that serve as the basis for personnel checking and matching, and thereby further increases the speed of personnel checking.
3. Using the preset matching rules, identity matching of the person to be checked is performed layer by layer; as soon as one layer matches successfully, the matching result is output and the subsequent matching process terminates, which ensures matching accuracy while retaining matching speed.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required to be used in the embodiments of the present invention will be briefly described below. It is obvious that the drawings described below are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
FIG. 1 is a diagram illustrating implementation steps of a bloom filter-based people checking method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a fingerprint image being equally divided into a number of rectangular sub-blocks and the rectangular sub-blocks being sorted;
FIG. 3 is a schematic diagram of calculating the distance between the boundary pixel point and the left vertex of the rectangular sub-block in the fingerprint image;
FIG. 4 is a schematic diagram of calculating the area sum ratio S of an iris image.
Detailed Description
The technical scheme of the invention is further explained by the specific implementation mode in combination with the attached drawings.
Wherein the showings are for the purpose of illustration only and are shown by way of illustration only and not in actual form, and are not to be construed as limiting the present patent; to better illustrate the embodiments of the present invention, some parts of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product; it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
The same or similar reference numerals in the drawings of the embodiments of the present invention correspond to the same or similar components; in the description of the present invention, it should be understood that if the terms "upper", "lower", "left", "right", "inner", "outer", etc. are used for indicating the orientation or positional relationship based on the orientation or positional relationship shown in the drawings, it is only for convenience of description and simplification of description, but it is not indicated or implied that the referred device or element must have a specific orientation, be constructed in a specific orientation and be operated, and therefore, the terms describing the positional relationship in the drawings are only used for illustrative purposes and are not to be construed as limitations of the present patent, and the specific meanings of the terms may be understood by those skilled in the art according to specific situations.
In the description of the present invention, unless otherwise explicitly specified or limited, the term "connected" or the like, if appearing to indicate a connection relationship between components, is to be understood broadly, for example, as being either fixedly connected, detachably connected, or integrated; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be connected through any combination of two or more members or structures. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
The personnel checking method based on the bloom filter provided by the embodiment of the invention is described with reference to FIG. 1, and comprises the following steps:
S1, calculating the distance sum value D or the quantity accumulated value N of each fingerprint image; calculating the ratio sum value R of each face image; and calculating the area sum ratio S of each iris image.
Distance value and value of each fingerprint image
Figure 75451DEST_PATH_IMAGE106
The method is calculated by the following method steps:
a1, equally dividing each edge into a plurality of segments at equal intervals for the width and the height of each fingerprint image under the same crowd classification (such as male teenagers with age groups of 13-17) framed in a rectangular frame selection mode, and referring to fig. 2 for an example of equal division;
a2, starting from each bisector, connecting lines to the opposite bisector of the opposite side in a manner of being perpendicular to the side where the starting point is located, so as to disperse the fingerprint image into a plurality of rectangular sub-blocks, taking the rectangular sub-block at the left top corner of the image as an initial sub-block of a standard sequence (the sub-block with the sequence of '1' in the figure 2 represents the initial sub-block), and marking each rectangular sub-block according to a convolution sequence in a manner of circling the standard sequence counterclockwise inwards;
a3, filtering out the fingerprint information (which is the pixels carrying the fingerprint information and not representing the fingerprint information in the sub-blocks, such as the rectangular sub-blocks with the sequence of "1", "2", "12", "18", "29" in FIG. 2) and the rectangular sub-blocks with full fingerprint information (which is the rectangular sub-blocks with the sequence of "8", "36", "22", "24" in FIG. 2) which indicate that the pixels with full fingerprint information touch each side of the sub-blocks);
a4, for each of the rectangular sub-blocks filtered in step A3, searching boundary pixels of the fingerprint information in each rectangular sub-block (as indicated by marks P1 and P2 in fig. 3), where there are many existing methods for searching boundary pixels, such as identifying a point where the fingerprint information inside the rectangular sub-block is interrupted as a boundary pixel;
A5, calculating the distance (as represented by L1 and L2 in FIG. 2) between each boundary pixel point and the left vertex (mark P0 in FIG. 2) of the rectangular sub-block in which it lies, and summing the distances according to the following formula (1) to obtain the distance value corresponding to each rectangular sub-block:

d_i = Σ_{j=1}^{n_i} l_{i,j}    (1)

in formula (1), d_i represents the distance value of the i-th rectangular sub-block in the fingerprint image; l_{i,j} represents the distance between the j-th boundary pixel point of the i-th rectangular sub-block and the left vertex of that sub-block; n_i represents the number of boundary pixel points in the i-th rectangular sub-block;
A6, calculating the distance-value sum S of all the rectangular sub-blocks remaining after the filtering of step A3 by the following formula (2):

S = Σ_{i=1}^{m} w_i · d_i    (2)

in formula (2), w_i represents the weight occupied by d_i in the calculation of S; m represents the number of rectangular sub-blocks remaining after the filtering of step A3.
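As a concrete illustration, steps A1-A6 can be sketched roughly as follows. The binary image representation, the 4x4 block size, the simplified boundary test, and the uniform weights w_i are all assumptions of this sketch, not values fixed by the method:

```python
# Rough sketch of steps A1-A6 (assumptions: binary image, 4x4 blocks, a pixel is a
# boundary pixel if any 4-neighbour is background, uniform weights w_i).
from math import hypot

def distance_value_sum(img, block=4, weights=None):
    h, w = len(img), len(img[0])
    d = []  # d_i for each sub-block kept after the step-A3 filtering
    for top in range(0, h, block):
        for left in range(0, w, block):
            bottom, right = min(top + block, h) - 1, min(left + block, w) - 1
            cells = [(r, c) for r in range(top, bottom + 1)
                     for c in range(left, right + 1) if img[r][c]]
            if not cells:
                continue  # A3: sub-block carries no fingerprint information
            rows = {r for r, _ in cells}
            cols = {c for _, c in cells}
            if top in rows and bottom in rows and left in cols and right in cols:
                continue  # A3: sub-block fully loaded (pixels touch every side)

            # A4 (simplified): ridge pixels bordering the background are boundary pixels
            def is_boundary(r, c):
                return any(not (0 <= r + dr < h and 0 <= c + dc < w)
                           or not img[r + dr][c + dc]
                           for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)))

            pts = [(r, c) for r, c in cells if is_boundary(r, c)]
            # A5, formula (1): d_i = sum of distances l_ij to the sub-block vertex P0
            d.append(sum(hypot(r - top, c - left) for r, c in pts))
    if not d:
        return 0.0
    w_i = weights or [1.0 / len(d)] * len(d)  # assumed uniform weights
    return sum(wi * di for wi, di in zip(w_i, d))  # A6, formula (2)
```

A ridge-skeleton extraction would normally precede this; the sketch only shows the block bookkeeping of formulas (1) and (2).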
It should be noted that the fingerprint information in the rectangular sub-blocks remaining after the filtering of step A3 does not influence the accuracy of the fingerprint comparison result to the same degree; for example, the sub-block numbered "4" in FIG. 2 carries richer fingerprint features than the sub-block numbered "3", so a larger weight can be given to the sub-block numbered "4" when the distance-value sum S is calculated. In the process of dividing the fingerprint images into image sets, considering the different degrees to which different fingerprint edge details influence fingerprint recognition accuracy helps refine the division of the fingerprint image sets, which in turn improves the speed and accuracy of subsequent fingerprint comparison.
The method of dividing a fingerprint image into the corresponding first image set is: first determine the distance-value sum S of the fingerprint image, and then divide the fingerprint image into the corresponding first image set according to the preset correspondence between sum intervals and first image sets. For example, if a preset sum interval 100-150 corresponds to first image set 1 and the S of a fingerprint image is 120, the value falls exactly into that interval and the fingerprint image is divided into first image set 1. For another example, if the S of a fingerprint image is 180, it falls into the preset sum interval 150-200; that interval corresponds to first image set 2, so the fingerprint image is divided into first image set 2.
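The interval-based division described above can be sketched as follows, using the example intervals from the text (the half-open interval convention is an assumption of the sketch):

```python
# Divide a fingerprint image into a first image set according to the sum interval
# its distance-value sum S falls into (interval bounds from the text's example).
INTERVALS = [((100, 150), "first image set 1"),
             ((150, 200), "first image set 2")]

def assign_first_image_set(s):
    for (lo, hi), label in INTERVALS:
        if lo <= s < hi:  # assumed half-open so each S maps to at most one set
            return label
    return None  # S lies outside every preset sum interval
```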
This embodiment provides two different fingerprint image clustering methods: one divides according to the distance-value sum S, and the other divides according to the quantity accumulation value N. The method of dividing according to N comprises the following steps:
B1, for each fingerprint image and a standard fingerprint image whose distance-value sum S has been calculated, calculating the distance-value difference of each pair of rectangular sub-blocks having the same sequence number by the following formula (3):

Δd_i = | d_i − d'_i |    (3)

in formula (3), d_i represents the distance value of the i-th rectangular sub-block of the fingerprint image participating in the calculation of S; d'_i represents the distance value of the rectangular sub-block, in the standard fingerprint image, that has the same sequence number as the i-th rectangular sub-block of the fingerprint image;
It should be noted here that, although the persons to whom the fingerprint images belong are restricted to the same crowd classification when d_i and d'_i are calculated for each image (for example, a group of male teenagers aged 13-18), so as to reduce as far as possible the influence that differing fingerprint sizes exert on the numbering of the rectangular sub-blocks, it is still difficult to guarantee that two rectangular sub-blocks at corresponding positions in the fingerprint image and the standard fingerprint image receive the same sequence number. For example, the sub-block numbered 4 in FIG. 2 may correspond to the sub-block numbered 5 in the standard fingerprint image; since the two sequence numbers differ (one is 4, the other is 5), the two sub-blocks cannot be matched when Δd_i is calculated by the rule of step B1. To solve this problem, each fingerprint image is first enlarged to the same size and then framed with a rectangular frame of identical width and height whose frame-selection center point coincides with the center point of the fingerprint image, thereby ensuring that every fingerprint image has the same number of rectangular sub-blocks and that sub-blocks at corresponding positions have the same sequence number.
The d'_i value of each rectangular sub-block in the standard fingerprint image is calculated by the following formula (4):

d'_i = ( Σ_{k=1}^{K} d_{k,i} ) / K    (4)

in formula (4), d_{k,i} represents the distance value of the rectangular sub-block, in the k-th fingerprint image of the first image set or the second image set, that has the same sequence number as the corresponding i-th rectangular sub-block; K represents the number of fingerprint images stored in the first image set or the second image set.

Since the d'_i value of each rectangular sub-block in the standard fingerprint image is the average of the distance values of the corresponding rectangular sub-blocks of all the fingerprint images in the first image set or the second image set, every rectangular sub-block in the standard fingerprint image has a corresponding rectangular sub-block in each fingerprint image.
B2, listing the rectangular sub-blocks of the fingerprint image whose Δd_i is smaller than a difference threshold as quantity accumulation objects (the smaller Δd_i is, the higher the similarity of the two rectangular sub-blocks at corresponding positions), and accumulating the number of rectangular sub-blocks of each fingerprint image that satisfy the quantity accumulation condition, so as to obtain the quantity accumulation value N associated with each fingerprint image.
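Steps B1-B2, together with the per-position averaging of formula (4), can be sketched as follows (the distance values below are toy numbers, and the function names are illustrative):

```python
# Formula (4): the standard fingerprint's d'_i is the per-position average of the
# distance values of all fingerprint images in the first or second image set.
def standard_fingerprint(distance_lists):
    return [sum(col) / len(col) for col in zip(*distance_lists)]

# Steps B1-B2: count sub-blocks whose distance-value difference |d_i - d'_i|
# (formula (3)) stays below the difference threshold -> quantity accumulation value N.
def quantity_accumulation(d, d_std, threshold):
    return sum(1 for di, si in zip(d, d_std) if abs(di - si) < threshold)
```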
The ratio sum R of each face image is calculated by the following method steps:
c1, shooting the face of each person under the same crowd classification at a fixed distance and a fixed angle to obtain the face image of each person with the same size;
c2, equally dividing each side into a plurality of sections in an equally-spaced mode for the width and the height of each face image;
c3, starting from each bisector, connecting lines to the opposite bisectors of the opposite sides in a mode of being perpendicular to the side where the starting point is located, so as to disperse the face image into a plurality of rectangular blocks, using the rectangular block at the left top corner of the image as a starting block for marking, and marking each rectangular block according to a circling sequence in a mode of circling the rectangular blocks anticlockwise inwards to mark;
c4, filtering out rectangular blocks which do not carry face information and are fully loaded with the face information in the face image, wherein the fully loaded face information means that pixels representing the face information touch each edge of the rectangular block to which the pixels belong;
the human face image discretization method adopted in the steps C2-C4 is the same as the discretization method of the fingerprint image recorded in the steps A1-A3, and therefore, the description is omitted.
C5, searching the face pixels in each rectangular block filtered by the step C4, and calculating the number of the searched face pixels and the number of the face pixels in the face image
Figure 183703DEST_PATH_IMAGE143
The ratio of the number of pixels in each rectangular block is recorded as
Figure 378055DEST_PATH_IMAGE144
(ii) a For example, the number of pixels characterizing a face in a rectangular block is 100, soThe first of genus
Figure 624360DEST_PATH_IMAGE143
A total of 200 pixels in each rectangular block, then
Figure 42703DEST_PATH_IMAGE145
C6, calculating the ratio and value of all the rectangular blocks remaining after the filtering of step C4 by the following formula (5)
Figure 803986DEST_PATH_IMAGE146
Figure 822536DEST_PATH_IMAGE147
In the formula (5), the first and second groups,
Figure 923348DEST_PATH_IMAGE148
to represent
Figure 778171DEST_PATH_IMAGE149
In the calculation of
Figure 698854DEST_PATH_IMAGE150
The weight occupied in (c);
Figure 297325DEST_PATH_IMAGE051
representing participation in face images
Figure 252643DEST_PATH_IMAGE150
The number of the rectangular blocks calculated.
Similarly, in calculating
Figure 9859DEST_PATH_IMAGE150
Introduction of
Figure 417838DEST_PATH_IMAGE151
The influence degree of different edge areas of the human face on the human face recognition result is considered to be different,for example, the cheekbone position and chin position in the edge area of the human face generally have a larger influence on the result of face recognition than the face position and forehead position.
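Steps C5-C6 amount to a weighted sum of per-block face-pixel ratios; a minimal sketch (uniform weights assumed):

```python
# C5: r_i = face pixels / total pixels per remaining rectangular block;
# C6, formula (5): R = sum of w_i * r_i over the remaining blocks.
def ratio_sum(blocks, weights=None):
    # blocks: (face_pixel_count, total_pixel_count) per block kept after step C4
    r = [face / total for face, total in blocks]
    w = weights or [1.0 / len(r)] * len(r)  # assumed uniform weights
    return sum(wi * ri for wi, ri in zip(w, r))
```

With the text's example block (100 face pixels out of 200) and a second block of 50 out of 200, R is the weighted mean of 0.5 and 0.25.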
The area sum ratio A of each iris image is calculated by the following method steps:

D1, shooting the eye image of each person under the same crowd classification at a fixed distance and a fixed angle, and selecting an iris image from each eye image in a rectangular frame selection mode; the framed iris image is shown in FIG. 4, in which the circle is the iris and the circumscribed rectangle is the rectangular frame that frames the iris;

D2, bisecting each side of the width and the height of each iris image, and connecting the bisection points of non-opposite sides to one another to obtain a space quadrilateral (indicated by the reference sign Q1 in FIG. 4);

D3, calculating the areas of the space quadrilateral and of the rectangular frame of the iris image, recorded as S1 and S2 respectively;

D4, bisecting each side of the space quadrilateral (such as the side labeled "q1" shown in FIG. 4), then drawing, from each bisection point e_h, a connection line perpendicular to that side to the iris boundary of the iris image (such as the iris boundary indicated by "R1" in FIG. 4), and recording the connection point as vertex p_h, where e_h represents the bisection point of the h-th side of the space quadrilateral and p_h represents the vertex connected from the bisection point e_h to the iris boundary;

D5, connecting vertex p_h to the two end points of the h-th side of the space quadrilateral to obtain a triangle (such as the triangle denoted by the reference sign "U1" in FIG. 4), recorded as T_h; triangle T_h separates the iris region lying outside the space quadrilateral into two arc-shaped iris regions, recorded as area1 (as represented by the reference sign area1 in FIG. 4) and area2 (indicated by the reference sign area2 in FIG. 4);

D6, calculating the area of triangle T_h, recorded as S_h;

D7, bisecting each of the two waists of triangle T_h (the two sides other than the bisected side of the space quadrilateral), drawing a connection line from each bisection point, perpendicular to the waist, to the iris boundary of the arc-shaped iris region to obtain a connection vertex, and connecting the connection vertex to the two end points of the waist to obtain a triangle (such as the triangle marked "U11" in FIG. 4), recorded as T_{h,q}; since each triangle separates off two discrete arc-shaped iris regions, q = 1 or 2;

D8, calculating the area of triangle T_{h,q}, recorded as S_{h,q};

D9, bisecting the two waists of triangle T_{h,q}, drawing connection lines and obtaining triangles by the method of steps D7-D8, and calculating the triangle areas, until a preset number of triangle constructions is reached; then calculating the area of the iris image by the following formula (6):

A_iris = S1 + Σ_{v=1}^{4} Σ_{g=1}^{G} Σ_{t=1}^{M_g} S_{v,g,t}    (6)

in formula (6), v indexes the bisecting blocks obtained by equally dividing the iris image with the center point of the iris image as the origin of an XY-axis coordinate system (an arbitrary bisecting block is the region framed by the bold solid line and indicated by the reference sign "V1" in FIG. 4); G represents the number of times edge bisection is performed on the arc-shaped iris region area1 or the arc-shaped iris region area2; M_g represents the number of triangles obtained at the g-th round of edge bisection; S_{v,g,t} represents the area of the t-th triangle obtained at the g-th round of edge bisection within the v-th bisecting block;

D10, calculating the area sum ratio of the iris image by the following formula (7):

A = A_iris / S2    (7)

that is, the ratio of the accumulated iris area to the area S2 of the rectangular frame.
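For an idealized circular iris inside a tight square frame, the triangle construction of steps D2-D9 is the classic inscribed-polygon refinement, and the sum ratio of formula (7) converges to π/4. A sketch under that idealization follows; the closed-form per-round area increment is a property of the circle, not a formula taken from the text:

```python
# Idealized sketch of D2-D10 for a circular iris of radius r: start from the
# inscribed square (the space quadrilateral, S1 = 2*r*r) and, each round, add one
# boundary triangle per polygon edge (doubling the edge count), then divide the
# accumulated area by the frame area S2 = 4*r*r.
from math import pi, sin

def iris_sum_ratio(r, rounds):
    n, area = 4, 2.0 * r * r  # S1: inscribed square of the circle
    for _ in range(rounds):
        # area gained by the n new boundary triangles of this round:
        # inscribed 2n-gon area minus inscribed n-gon area
        area += n * 0.5 * r * r * (2.0 * sin(pi / n) - sin(2.0 * pi / n))
        n *= 2
    return area / (4.0 * r * r)  # formula (7): accumulated area over frame area
```

With no construction rounds the ratio is the square-to-frame ratio 0.5, and it climbs toward π/4 ≈ 0.785 as rounds increase, which illustrates why a small preset number of triangle constructions already captures most of the iris area.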
Referring to FIG. 1, after the distance-value sum S or the quantity accumulation value N of each fingerprint image, the ratio sum R of each face image, and the area sum ratio A of each iris image have been calculated, the bloom-filter-based personnel checking method provided by this embodiment proceeds to the following steps:

S2, dividing each fingerprint image into the first image set corresponding to the sum interval into which its S falls, or dividing each fingerprint image into the second image set corresponding to the number interval into which its N falls; dividing each face image into the third image set corresponding to the sum interval into which its R falls; and dividing each iris image into the fourth image set corresponding to the ratio interval into which its A falls. That is, the application realizes the image set division of the fingerprint images, the face images, and the iris images through the correspondence between the interval into which S or N falls and the corresponding first or second image set, the correspondence between the interval into which R falls and the corresponding third image set, and the correspondence between the interval into which A falls and the corresponding fourth image set. This brand-new division mode considers the edge detail characteristics of each fingerprint image, face image, and iris image while leaving the core identification characteristics out of the basis for image set division, and thus improves the division speed of the data set while preserving division accuracy.

S3, based on a preset deviation degree range, mapping each of the first to fourth image sets, as elements of a bloom filter, to corresponding points of a bit array. The technical core of the bloom filter is that an element is mapped to points of a bit array, and whether the element is in the data set is known by checking whether those points are "1" or "0". The application utilizes this characteristic of the bloom filter to form a correspondence between each bit-array point and a deviation degree range, so that after the deviation degrees of S, N, R, and A from each of the first to fourth image sets are calculated, the image set corresponding to the relevant point can be quickly matched according to the deviation degree range into which the deviation degree falls; this speeds up the acquisition of the image sets serving as the matching basis for personnel checking, and further speeds up personnel checking;
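A minimal sketch of the S3 mapping follows; the bit-array size, the single toy hash function (standing in for the bloom filter's hash functions), and the deviation-degree ranges are all assumptions of the sketch:

```python
# Each image set (the bloom filter element) is hashed to a bit-array point; the
# point is paired with a deviation-degree range so that a computed deviation can
# be routed straight to its image set.
BITS = 64
bit_array = [0] * BITS
point_to_set = {}

def toy_hash(set_id):
    return sum(ord(ch) for ch in set_id) % BITS  # stand-in hash (assumption)

def register(set_id, dev_range):
    point = toy_hash(set_id)
    bit_array[point] = 1  # membership bit, as in a bloom filter
    point_to_set[point] = (dev_range, set_id)

def match(deviation):
    for point, ((lo, hi), set_id) in point_to_set.items():
        if bit_array[point] and lo <= deviation < hi:
            return set_id
    return None
```

A production bloom filter would use several independent hash functions and tolerate false positives; the sketch only shows the point-to-range correspondence the text describes.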
In steps S4-S8 of the bloom-filter-based personnel checking method provided by the present application, the calculation method of the personnel deviation degree, and the method of quickly matching the image set corresponding to a point according to the deviation degree range into which the deviation degree falls, based on the correspondence between deviation degree ranges and bit-array points, are described in detail. Referring specifically to FIG. 1, the method comprises:

S4, calculating the average value S_avg of the distance-value sums S of the fingerprint images in each first image set; calculating the average value N_avg of the quantity accumulation values N of the fingerprint images in each second image set; calculating the average value R_avg of the ratio sums R of the face images in each third image set; and calculating the average value A_avg of the area sum ratios A of the iris images in each fourth image set;

S5, acquiring the fingerprint image, face image, and iris image of the person to be checked, and then calculating the distance-value sum S' or quantity accumulation value N' of the fingerprint image of the person to be checked, the ratio sum R' of the face image of the person to be checked, and the area sum ratio A' of the iris image of the person to be checked;

S6, calculating the deviation degrees of S', N', R', and A' from each of the first to fourth image sets by the following formula (8):

Dev_1 = | S' − S_avg |,  Dev_2 = | N' − N_avg |,  Dev_3 = | R' − R_avg |,  Dev_4 = | A' − A_avg |    (8)

in formula (8), Dev_1 represents the deviation degree of S' from each first image set; Dev_2 represents the deviation degree of N' from each second image set; Dev_3 represents the deviation degree of R' from each third image set; Dev_4 represents the deviation degree of A' from each fourth image set.
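Steps S4-S6 reduce to comparing the person's value against each set's average; a sketch with toy values, where the deviation degree is taken as the absolute difference from the set average:

```python
# S4: compute each image set's average value.
def set_average(values):
    return sum(values) / len(values)

# S6: deviation degree of the person's value from each set, taken here as the
# absolute difference from the set average.
def deviation_degrees(probe_value, image_sets):
    return {name: abs(probe_value - set_average(vals))
            for name, vals in image_sets.items()}
```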
S7, matching an image set corresponding to the point in the position array according to the corresponding relation between the deviation degree range in which the deviation degree falls and the point;
s8, according to a preset personnel checking method, the acquired fingerprint image, face image and iris image of the personnel to be checked are matched with each image set matched in the step S7 according to a preset matching rule, and a checking result is output, wherein the specific checking method comprises the following steps:
e1, judging whether the first image set or the second image set is matched as the checking basis of the person to be checked,
if yes, go to step E2;
if not, jumping to step E3;
e2, fingerprint comparison is carried out on the fingerprint image of the person to be checked and each fingerprint image in the first image set or the second image set,
if the comparison is successful, outputting a fingerprint comparison result and terminating the personnel checking process;
if the comparison fails, jumping to step E3;
e3, judging whether a third image set serving as the checking basis of the person to be checked is matched,
if yes, go to step E4;
if not, jumping to step E5;
e4, comparing the face image of the person to be checked with each face in the matched third image set,
if the comparison is successful, outputting a face comparison result and terminating the personnel checking process;
if the comparison fails, jumping to step E5;
e5, judging whether the fourth image set serving as the checking basis of the person to be checked is matched,
if yes, go to step E6;
if not, jumping to step E7;
e6, comparing the iris image of the person to be checked with each iris image in the fourth image set,
if the comparison is successful, outputting an iris comparison result and terminating the personnel checking process;
if the comparison fails, jumping to step E7;
e7, comparing the fingerprint image of the person to be checked with the fingerprints of all the first image sets or the second image sets, comparing the face image of the person to be checked with the faces of all the third image sets, comparing the iris image of the person to be checked with the irises of all the fourth image sets,
if any item is successfully compared, outputting a comparison result and terminating the personnel checking process, otherwise, outputting a comparison failure result.
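The cascade of steps E1-E7 can be sketched as follows; the comparator and the set containers are stand-ins (assumptions of the sketch), not the patented fingerprint, face, and iris matchers:

```python
# E1-E7 cascade: try the matched first/second set by fingerprint, then the matched
# third set by face, then the matched fourth set by iris; fall back to comparing
# against every stored set before reporting failure.
def check_person(probe, matched_sets, all_sets, compare):
    order = ("fingerprint", "face", "iris")
    for kind in order:  # E1-E6: per-modality comparison against the matched set
        refs = matched_sets.get(kind)
        if refs and any(compare(probe[kind], ref) for ref in refs):
            return f"{kind} comparison succeeded"
    for kind in order:  # E7: fall back to every stored image set
        for refs in all_sets.get(kind, []):
            if any(compare(probe[kind], ref) for ref in refs):
                return f"{kind} comparison succeeded"
    return "comparison failed"
```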
It should be understood that the above-described embodiments are merely preferred embodiments of the invention and the technical principles applied thereto. It will be understood by those skilled in the art that various modifications, equivalents, changes, and the like can be made to the present invention. However, such variations are within the scope of the invention as long as they do not depart from the spirit of the invention. In addition, certain terminology used in the description and claims of the present application is not limiting, but is used for convenience only.

Claims (4)

1. A personnel checking method based on a bloom filter, characterized by comprising the following steps:

S1, calculating the distance-value sum S or the quantity accumulation value N of each fingerprint image; calculating the ratio sum R of each face image; and calculating the area sum ratio A of each iris image;

S2, dividing each fingerprint image into the first image set corresponding to the sum interval into which its S falls, or dividing each fingerprint image into the second image set corresponding to the number interval into which its N falls; dividing each face image into the third image set corresponding to the sum interval into which its R falls; and dividing each iris image into the fourth image set corresponding to the ratio interval into which its A falls;

S3, based on a preset deviation degree range, mapping each of the first to fourth image sets, as elements of a bloom filter, to corresponding points of a bit array;

S4, calculating the average value S_avg of the S values of the fingerprint images in each first image set; calculating the average value N_avg of the N values of the fingerprint images in each second image set; calculating the average value R_avg of the R values of the face images in each third image set; and calculating the average value A_avg of the A values of the iris images in each fourth image set;

S5, acquiring the fingerprint image, the face image, and the iris image of a person to be checked, and then calculating the distance-value sum S' or the quantity accumulation value N' of the fingerprint image of the person to be checked, the ratio sum R' of the face image of the person to be checked, and the area sum ratio A' of the iris image of the person to be checked;

S6, calculating the deviation degrees of S', N', R', and A' from each of the first to fourth image sets;

S7, matching the image set corresponding to the point in the bit array according to the correspondence between the deviation degree range into which the deviation degree falls and the point;

S8, according to a preset personnel checking method, performing personnel checking matching between the acquired fingerprint image, face image, and iris image of the person to be checked and each image set matched in step S7 according to a preset matching rule, and outputting a checking result;
in step S1, the distance-value sum S of each fingerprint image is calculated by the following method steps:

A1, for the width and the height of each fingerprint image under the same crowd classification framed in a rectangular frame selection mode, equally dividing each side into a plurality of segments at equal intervals;

A2, starting from each bisection point, drawing connection lines, perpendicular to the side on which the starting point lies, to the corresponding bisection points of the opposite side, so as to discretize the fingerprint image into a plurality of rectangular sub-blocks; taking the rectangular sub-block at the top left corner of the image as the initial sub-block of the sequence numbering, and assigning each rectangular sub-block a sequence number by spiraling counterclockwise inwards from the initial sub-block;

A3, filtering out the rectangular sub-blocks of the fingerprint image that carry no fingerprint information and the rectangular sub-blocks fully loaded with fingerprint information, where fully loaded fingerprint information means that the pixels representing fingerprint information touch every side of the rectangular sub-block to which they belong;

A4, for each rectangular sub-block remaining after the filtering of step A3, searching the boundary pixel points of the fingerprint information in the sub-block;

A5, calculating the distance between each boundary pixel point and the left vertex of the rectangular sub-block in which it lies, and summing the distances according to the following formula (1) to obtain the distance value corresponding to each rectangular sub-block:

d_i = Σ_{j=1}^{n_i} l_{i,j}    (1)

in formula (1), d_i represents the distance value of the i-th rectangular sub-block in the fingerprint image; l_{i,j} represents the distance between the j-th boundary pixel point of the i-th rectangular sub-block and the left vertex of that sub-block; n_i represents the number of boundary pixel points in the i-th rectangular sub-block;

A6, calculating the distance-value sum S of all the rectangular sub-blocks remaining after the filtering of step A3 by the following formula (2):

S = Σ_{i=1}^{m} w_i · d_i    (2)

in formula (2), w_i represents the weight occupied by d_i in the calculation of S; m represents the number of rectangular sub-blocks remaining after the filtering of step A3;
in step S1, each of the fingerprint images
Figure 471396DEST_PATH_IMAGE051
The method is calculated by the following method steps:
b1 calculated for each of the fingerprint image and the standard fingerprint image
Figure 955467DEST_PATH_IMAGE053
And the two rectangular sub-blocks having the same rank number calculate the difference in distance value by the following formula (3)
Figure 644200DEST_PATH_IMAGE055
Figure 456298DEST_PATH_IMAGE057
In the formula (3), the first and second groups,
Figure 223266DEST_PATH_IMAGE059
indicating participation
Figure 125625DEST_PATH_IMAGE061
A calculated number one of the fingerprint images
Figure DEST_PATH_IMAGE063
A distance value of each of the rectangular sub-blocks;
Figure DEST_PATH_IMAGE065
indicating participation
Figure DEST_PATH_IMAGE067
The calculated second one of the standard fingerprint image and the fingerprint image
Figure 176802DEST_PATH_IMAGE063
The first sub-blocks of the rectangle have the same rank number
Figure DEST_PATH_IMAGE069
A distance value of each of the rectangular sub-blocks;
B2, listing the rectangular sub-blocks of the fingerprint image whose difference Δd_j is smaller than the difference threshold as counting objects, and accumulating the number of rectangular sub-blocks meeting this counting condition in each fingerprint image to obtain the count value C associated with each fingerprint image;
In step S1, each of the face images
Figure DEST_PATH_IMAGE073
The method is calculated by the following method steps:
C1, photographing the face of each person under the same crowd classification at a fixed distance and a fixed angle to obtain face images of identical size;
C2, for the width and the height of each face image, equally dividing each edge into a plurality of segments at equal intervals;
C3, from each bisection point, drawing a line perpendicular to the edge on which it lies to the opposite bisection point on the opposite edge, so as to partition the face image into a plurality of rectangular blocks; taking the rectangular block at the top-left corner of the image as the starting block, and labeling each rectangular block in order by circling counterclockwise inward;
C4, filtering out the rectangular blocks in the face image that carry no face information and the rectangular blocks that are fully loaded with face information, where "fully loaded" means that the pixels representing face information touch every edge of the rectangular block to which they belong;
C5, searching the face pixels in each rectangular block remaining after the filtering of step C4, and calculating the ratio r of the number of face pixels found in each rectangular block to the total number P of face pixels in the face image;
C6, calculating the ratio and value of all the rectangular blocks remaining after the filtering of step C4 by the following formula (5)
Figure DEST_PATH_IMAGE079
Figure DEST_PATH_IMAGE081
In the formula (5), the first and second groups of the chemical reaction materials are selected from the group consisting of,
Figure DEST_PATH_IMAGE083
to represent
Figure DEST_PATH_IMAGE085
In the calculation of
Figure DEST_PATH_IMAGE087
The weight occupied by (c);
Figure DEST_PATH_IMAGE089
representing participation in the face image
Figure 104853DEST_PATH_IMAGE087
The number of the rectangular blocks is calculated;
In step S1, the value V associated with each of the iris images is calculated by the following method steps:
d1, shooting eye images at a fixed distance and a fixed angle for each person under the same crowd classification, and framing out an iris image from each eye image in a rectangular frame selection mode;
D2, bisecting each side (the widths and the heights) of each iris image, and connecting the bisection points that are not opposite to each other to obtain the space quadrangle;
D3, calculating the areas of the space quadrangle and of the rectangular frame of the iris image, denoted S_q and S_r respectively;
D4, halving each side of the space quadrilateral, and then dividing each side from each halving point
Figure DEST_PATH_IMAGE097
Starting from a line connecting to the iris boundary of the iris image in a mode of being vertical to the edge, and marking the connected point as a vertex
Figure DEST_PATH_IMAGE099
Figure DEST_PATH_IMAGE101
Figure DEST_PATH_IMAGE103
Respectively represent the first on the space quadrangle
Figure DEST_PATH_IMAGE105
Bisector point of the sides, and from the bisector point
Figure DEST_PATH_IMAGE107
Vertex connecting to iris boundary;
D5, connecting the vertex V_i to the two end points of the i-th side of the space quadrangle to obtain a triangle, denoted T_i; the triangle T_i separates the iris region outside it into two arc-shaped iris regions, denoted A_i1 and A_i2 respectively;
D6, calculating triangles
Figure 453607DEST_PATH_IMAGE111
Area of (d) is marked as
Figure DEST_PATH_IMAGE117
D7 in the form of the triangle
Figure 61568DEST_PATH_IMAGE111
The two waists are
Figure 725768DEST_PATH_IMAGE105
And the edges are used for equally dividing each waist, connecting lines from the equal division points to the iris boundary of the arc iris region in a mode of being vertical to the waist to obtain connection vertexes, and connecting the connection vertexes to the triangle
Figure 393554DEST_PATH_IMAGE111
Two end points of the waist are connected to obtain a triangle, and the triangle is marked as
Figure DEST_PATH_IMAGE119
Figure DEST_PATH_IMAGE121
D8, calculating triangles
Figure 905438DEST_PATH_IMAGE119
Area of (d) is marked as
Figure DEST_PATH_IMAGE123
D9 in the form of the triangle
Figure 933568DEST_PATH_IMAGE119
The two waists are
Figure 117687DEST_PATH_IMAGE105
And (3) connecting the lines to obtain triangles and calculating the areas of the triangles by the method described in the steps D7-D8 until reaching the preset number of times of triangle construction, and then calculating the area of the iris image by the following formula (6)
Figure DEST_PATH_IMAGE125
Figure DEST_PATH_IMAGE126
In the formula (6), the first and second groups,
Figure DEST_PATH_IMAGE128
representing that the first division of any division block of the iris image is performed by taking the central point of the iris image as the origin of an XY axis coordinate system
Figure DEST_PATH_IMAGE130
Next to
Figure DEST_PATH_IMAGE132
Dividing the edges equally;
Figure DEST_PATH_IMAGE134
representing pairs of arc-shaped iris areas
Figure DEST_PATH_IMAGE136
Or to arc-shaped iris areas
Figure DEST_PATH_IMAGE138
To carry out
Figure 836769DEST_PATH_IMAGE132
Number of edge equal divisions;
Figure DEST_PATH_IMAGE140
representing pairs of arc-shaped iris areas
Figure 762348DEST_PATH_IMAGE136
Or to arc-shaped iris areas
Figure 40008DEST_PATH_IMAGE138
To proceed with
Figure 554166DEST_PATH_IMAGE132
The edges being equally divided
Figure 339588DEST_PATH_IMAGE132
The number of edges;
Figure DEST_PATH_IMAGE142
representing the first obtained by equally dividing the iris image
Figure 435852DEST_PATH_IMAGE142
Dividing the divided blocks;
D10, calculating the value V of the iris image by formula (7) (the formula is rendered as an image in the source and is not reproduced here).
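Steps D2 to D9 approximate the iris area by an inscribed quadrilateral plus successive rounds of triangles erected toward the boundary. Under the simplifying assumption of a perfectly circular iris centered at the origin (the patent works on real iris boundaries), the refinement behaves like classical polygon doubling:

```python
import math

def iris_area_estimate(radius, rounds):
    """Sketch of steps D2-D9 for a circular iris: start from the inscribed
    square (the "space quadrangle") and, round after round, erect a triangle
    from the midpoint of each chord perpendicularly out to the boundary."""
    # Inscribed square from the bisection points of the bounding-box sides:
    # side = r*sqrt(2), area = 2*r^2.
    area = 2.0 * radius * radius
    chord = radius * math.sqrt(2.0)     # current chord length
    sides = 4                           # chords bounding the current polygon
    for _ in range(rounds):
        half = chord / 2.0
        apothem = math.sqrt(radius * radius - half * half)
        height = radius - apothem       # chord midpoint -> iris boundary
        area += sides * 0.5 * chord * height   # one triangle per chord
        # Each triangle replaces its chord with two shorter chords.
        chord = math.sqrt(half * half + height * height)
        sides *= 2
    return area
```

The estimate approaches pi * r^2 as the number of rounds grows, which suggests why a small preset number of triangle constructions can suffice.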
2. The bloom-filter-based personnel checking method of claim 1, wherein the distance value of each rectangular sub-block in the standard fingerprint image is calculated by the following formula (4):

d̄_j = (1/N) · (d_{1,j} + d_{2,j} + … + d_{N,j})    (4)

in the formula (4), d_{i,j} represents the distance value of the rectangular sub-block, having the same rank number j, of the i-th fingerprint image in the first image set or the second image set; N represents the number of the fingerprint images stored in the first image set or the second image set.
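Claim 2's formula (4) reads as a per-rank average of sub-block distance values across the N fingerprint images in the set; the averaging form is an inference from the claim wording, since the formula itself is only an image:

```python
def standard_subblock_distances(image_distances):
    """Formula (4) sketch: each sub-block of the standard fingerprint image
    takes the mean, over the N fingerprint images in the set, of the distance
    values of the sub-blocks with the same rank number. The averaging form is
    an inference, not confirmed by the source text."""
    n_images = len(image_distances)
    # zip(*...) groups the distance values by rank number across images.
    return [sum(col) / n_images for col in zip(*image_distances)]

# Two fingerprint images, two sub-blocks each.
standard_subblock_distances([[1.0, 2.0], [3.0, 4.0]])  # [2.0, 3.0]
```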
3. The bloom-filter-based personnel checking method according to claim 1, wherein in step S6 the degrees of deviation of the values of the person to be checked from each of the first to fourth image sets are calculated by formula (8) (the formula is rendered as an image in the source and is not reproduced here); in formula (8), the first term represents the degree of deviation of the corresponding value from each first image set; the second term represents the degree of deviation from each second image set; the third term represents the degree of deviation from each third image set; and the fourth term represents the degree of deviation from each fourth image set.
4. The bloom filter-based people-check method according to claim 1, wherein in step S8, people-check matching is performed by a matching rule expressed by the following method steps:
e1, judging whether the first image set or the second image set is matched as the checking basis for the person to be checked,
if yes, go to step E2;
if not, jumping to step E3;
e2, fingerprint comparison is carried out on the fingerprint image of the person to be checked and each fingerprint image in the first image set or the second image set,
if the comparison is successful, outputting a fingerprint comparison result and terminating the personnel checking process;
if the comparison fails, jumping to step E3;
e3, judging whether the third image set serving as the checking basis for the person to be checked is matched or not,
if yes, go to step E4;
if not, jumping to step E5;
e4, comparing the face image of the person to be checked with each face image in the third image set,
if the comparison is successful, outputting a face comparison result and terminating the personnel checking process;
if the comparison fails, jumping to step E5;
e5, judging whether the fourth image set serving as the checking basis for the person to be checked is matched or not,
if yes, go to step E6;
if not, jumping to step E7;
e6, comparing the iris image of the person to be checked with each iris image in the fourth image set,
if the comparison is successful, outputting an iris comparison result and terminating the personnel checking process;
if the comparison fails, jumping to step E7;
e7, performing fingerprint comparison of the fingerprint image of the person to be checked against all the first image sets or the second image sets, face comparison of the face image of the person to be checked against all the third image sets, and iris comparison of the iris image of the person to be checked against all the fourth image sets,
if any comparison is successful, outputting a comparison result and terminating the personnel checking process, otherwise, outputting a comparison failure result.
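The role the title assigns to the bloom filter is consistent with using it as a cheap membership pre-test in steps E1/E3/E5 before running the expensive biometric comparisons of steps E2/E4/E6. The sketch below is a hypothetical reading, not the patent's specified construction; the class and function names are assumptions:

```python
import hashlib

class BloomFilter:
    """Minimal bloom filter: cheaply tests whether an item could belong to a
    set. May report false positives, never false negatives."""
    def __init__(self, size=1024, hashes=3):
        self.size, self.hashes, self.bits = size, hashes, bytearray(size)

    def _positions(self, item):
        # Derive `hashes` bit positions from salted SHA-256 digests.
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def might_contain(self, item):
        return all(self.bits[p] for p in self._positions(item))

def check_person(person_id, set_filters, compare_fns):
    """Sketch of the matching rule of steps E1-E7: for each image set in
    order, the bloom filter decides whether the set is worth searching
    (steps E1/E3/E5); on a hit, the corresponding biometric comparison runs
    (steps E2/E4/E6). Step E7's exhaustive fallback is omitted here."""
    for bloom, compare in zip(set_filters, compare_fns):
        if bloom.might_contain(person_id) and compare(person_id):
            return True
    return False
```

Pre-filtering this way skips whole image sets for most people, at the cost of occasionally running a comparison that fails (a bloom-filter false positive), which the cascade tolerates by falling through to the next set.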
CN202210662860.6A 2022-06-13 2022-06-13 Personnel checking method based on bloom filter Active CN114780769B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210662860.6A CN114780769B (en) 2022-06-13 2022-06-13 Personnel checking method based on bloom filter

Publications (2)

Publication Number Publication Date
CN114780769A CN114780769A (en) 2022-07-22
CN114780769B true CN114780769B (en) 2022-09-13

Family

ID=82420940

Country Status (1)

Country Link
CN (1) CN114780769B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109376725A (en) * 2018-12-21 2019-02-22 北京无线电计量测试研究所 A kind of identification check method and apparatus based on iris recognition
WO2021027540A1 (en) * 2019-08-13 2021-02-18 深圳壹账通智能科技有限公司 Lost person information matching method and apparatus, computer device, and storage medium
CN112562151A (en) * 2020-12-03 2021-03-26 浪潮云信息技术股份公司 Access control system based on bloom filter
CN113705455A (en) * 2021-08-30 2021-11-26 平安银行股份有限公司 Identity verification method and device, electronic equipment and readable storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102722696B (en) * 2012-05-16 2014-04-16 西安电子科技大学 Identity authentication method of identity card and holder based on multi-biological characteristics
CN103632132B (en) * 2012-12-11 2017-02-15 广西科技大学 Face detection and recognition method based on skin color segmentation and template matching
US9224042B2 (en) * 2013-04-17 2015-12-29 Honeywell International Inc. Cross-sensor iris matching
CN108768966B (en) * 2018-05-14 2019-05-03 北京邮电大学 Block platform chain and member node and node identities authentication method
US20200036708A1 (en) * 2018-06-15 2020-01-30 Proxy, Inc. Biometric credential improvement methods and apparatus
CN108932775B (en) * 2018-07-10 2020-08-07 蒋钱 Fingerprint lock identity identification system
CN111435558A (en) * 2018-12-26 2020-07-21 杭州萤石软件有限公司 Identity authentication method and device based on biological characteristic multi-mode image
CN111339885B (en) * 2020-02-19 2024-05-28 平安科技(深圳)有限公司 User identity determining method and related device based on iris recognition
CN112329519B (en) * 2020-09-21 2024-01-02 中国人民武装警察部队工程大学 Safe online fingerprint matching method



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant