CN107909031B - Crime scene fingerprint line leaving area frequency dynamic reconstruction method - Google Patents
Publication number: CN107909031B (application CN201711128381.1A)
Authority: CN (China)
Prior art keywords: fingerprint, data, line, area, point
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06V40/12 Fingerprints or palmprints — G06V40/13 Sensors therefor
- G06V10/26 Segmentation of patterns in the image field — G06V10/267 by performing operations on regions, e.g. growing, shrinking or watersheds
- G06V10/34 Smoothing or thinning of the pattern; Morphological operations; Skeletonisation
- G06V10/44 Local feature extraction — G06V10/443 by matching or filtering
- G06V10/98 Detection or correction of errors — G06V10/993 Evaluation of the quality of the acquired pattern
- G06V40/12 Fingerprints or palmprints — G06V40/1347 Preprocessing; Feature extraction
Abstract
The invention relates to a method for dynamically reconstructing the frequency of crime scene fingerprint line leaving areas, comprising the following steps: identifying and extracting the peripheral contour data of the area covered by the ridges of a scene fingerprint; reconstructing the scene fingerprint line leaving area on the corresponding archive (inked impression) fingerprint; and dynamically reconstructing the frequency of the fingerprint line leaving areas over all crime scenes. The method is intended mainly for very-large-scale fingerprint systems with multi-algorithm comparison architectures. It can effectively improve the image quality of the fingerprint system database and the case-solving efficiency of the system, and is of outstanding value for analyzing the risk of missed matches, locking the range of missed targets, and evaluating the practical feasibility of various flat fingerprint image collections in forensic science.
Description
Technical Field
The invention provides a dynamic reconstruction algorithm model for the frequency of crime scene fingerprint line leaving areas, based on big data mining and pattern recognition technologies, and belongs to the field of biometric recognition.
Background
An Automated Fingerprint Identification System (AFIS) is currently the most widely applied individual-feature identification system in the world. It is the preferred means by which countries identify the perpetrators of cases and the victims of major incidents, and the preferred method for police to quickly and accurately establish the identity of persons under examination, unknown bodies, practitioners in special industries, persons of limited capacity, missing persons, fugitive suspects, key management subjects and the like. Its great case-solving efficiency, quick response, direct identification of persons and accurate strikes are generally acclaimed by the case-handling organs of all countries.
The construction quality of the fingerprint database, particularly the quality of the fingerprint image data, directly determines the system's case-solving level. Storing fingerprint data with incomplete impression areas, poor ridge clarity or poor feature rendering lowers the front-ranking rate of target candidate fingerprints, markedly raises the miss rate and increases the difficulty of manual verification; it may even mislead a fingerprint expert into wrongly identifying an innocent person or wrongly excluding an involved person. Moreover, storing a large amount of low-quality data slows system comparison feedback, lengthens the waiting time of foreground users, and dampens the enthusiasm of front-line forensic institutions for the practical application of fingerprint information.
Given the characteristics of the comparison algorithms of a given brand of AFIS and the organization of a given type of collection service, designing a ten-finger fingerprint image quality control algorithm is not difficult (almost all AFIS foreground collection workstations and background databases have such functions). However, police users in all countries are dissatisfied with the quality control mode of existing AFIS, and collecting fingerprint data with the 'system foreground quality control module' switched off has become a common phenomenon in the industry. The complaints of police collectors focus mainly on the fact that the quality evaluation programs and algorithms provided by AFIS manufacturers are 'black box modules': the evaluation mechanism is opaque, the actual performance cannot be measured, and collectors would rather trust their own judgment than that of the machine.
In addition, the fingerprint evidence comparison experts of forensic science departments have for years complained about the same quality evaluation 'black box modules'. Their complaints focus on the following: the quality evaluation algorithms of existing fingerprint vendors mainly aim at the completeness of archive fingerprint images (ensuring that the ridges of every part of every finger position of the ten-finger fingerprints are collected) and the clarity of the archive images (ensuring that the image quality reaches the minimum standard for machine feature extraction and raising the front-ranking rate of target archives in the comparison candidate list). Used as the quality threshold for ten-finger fingerprint collection, such a comprehensive index far exceeds both what most collectors can achieve (in practice it is difficult to collect complete fingerprints of all ten finger positions of a suspect) and the practical needs of comparison work. In practice, whether for machine comparison or manual verification, what matters is the image quality of the archive areas corresponding to the areas of the fingerprint evidence left by the target suspect at the crime scene. Owing to the structure of the human hand and to action habits, the ten fingers leave prints at crime scenes with different frequencies, and the different areas of each finger position also differ greatly in their leaving frequencies. The finger positions used most frequently, the surfaces most often in contact with scene objects, and the finger areas whose ridges are actually left at scenes are the parts on which archive image quality control should concentrate, and the image quality of those areas is the key point of fingerprint quality control. In other words, the comprehensive, high-quality collection standard pursued by the mainstream quality evaluation algorithms of existing AFIS is severely disconnected from the actual needs of police collection departments and forensic science departments.
Therefore, the forensic science industry needs a technology for dynamically reconstructing the frequency of crime scene fingerprint line leaving areas. Such a technology would provide a real-time, accurate weighting-parameter model for how the ridge image quality of 'different areas of the same finger position' and of 'different finger positions of the same person' contributes to the overall quality rating of the ten-finger fingerprints, thereby remedying the lack of statistical support in existing quality evaluation algorithms for the 'area quality to finger-position quality' and 'finger-position quality to overall ten-finger quality' weighting parameters.
Disclosure of Invention
In order to solve the above problems, the invention establishes, on the basis of 'big data mining' and 'pattern recognition' technologies, a method for dynamically reconstructing the frequency of crime scene fingerprint line leaving areas.
The technical scheme adopted by the invention for solving the technical problems is as follows: a crime scene fingerprint line leaving area frequency dynamic reconstruction method comprises the following steps:
identifying and extracting data of peripheral contour lines of a field fingerprint line coverage area;
reconstructing the scene fingerprint line leaving area on the corresponding archive (inked impression) fingerprint;
and dynamically reconstructing the frequency of the fingerprint line leaving area in the crime scene.
The identification and data extraction of the peripheral contour line of the field fingerprint line coverage area comprises the following steps:
step S02.1, extracting the scene fingerprint deposited ridges and converting them into structured ridge data Sgql;
step S02.2, converting the Sgql data into control-point curve data and storing them in a relational database;
step S02.3, extracting the peripheral contour data of the area covered by the scene fingerprint ridges.
The step S02.1 comprises the steps of:
(1) removing image noise from the live fingerprint image Pql to obtain a live fingerprint enhancement map Peql;
(2) extracting foreground lines, judging the area according to the quality, and removing scale lines;
(3) extraction and calibration of the direction field: first obtaining the direction field of the scale-removed Peql with a gradient algorithm and smoothing it; then, according to the Qrql values, covering the direction-field data of low-Qrql areas with the direction-field data of adjacent high-Qrql quality areas, to obtain the overall scene fingerprint direction-field data Dql;
(4) calculating the streak line frequency: projecting along the direction of the Peql lines by using the direction field parameters provided by Dql; measuring an extreme point of the one-dimensional projection graph, and solving the frequency of the extreme point, namely the striae frequency Frql;
(5) adopting a Gabor filtering algorithm, carrying out line sharpening processing on the Peql by taking Dql and Frql as parameters, and reconnecting disconnected lines to obtain Gabor filtering result data Pgql of the Peql;
(6) binarization: carrying out binarization on Pgql to obtain a binary image Bgql;
(7) line thinning: converting Bgql into a thinning map with a set pixel width, namely a line map Tgql, by using a Hilditch algorithm;
(8) tracking and data storage of the refined lines: and detecting all the striae end points in the Tgql, tracking each striae of the Tgql one by taking all the end points as starting points, and storing the coordinates of each pixel point forming the striae according to the striae tracking sequence to obtain the structured striae data Sgql of the Tgql.
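The endpoint detection and ridge tracking of step (8) can be sketched as follows. This is a minimal illustration, not the patented implementation; the tiny thinned map and the helper names are assumptions for the example.

```python
# Illustrative sketch of step (8): on a thinned (1-pixel-wide) ridge map Tgql,
# a ridge pixel with exactly one ridge neighbour is an endpoint; each ridge is
# then followed from an endpoint, storing pixels in tracking order (Sgql).

NEIGHBORS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]

def endpoints(tgql):
    """Collect (x, y) of all ridge pixels having exactly one ridge neighbour."""
    pts = []
    for y, row in enumerate(tgql):
        for x, v in enumerate(row):
            if v != 1:
                continue
            n = sum(1 for dy, dx in NEIGHBORS
                    if 0 <= y + dy < len(tgql) and 0 <= x + dx < len(row)
                    and tgql[y + dy][x + dx] == 1)
            if n == 1:
                pts.append((x, y))
    return pts

def trace(tgql, start):
    """Follow one ridge from an endpoint, recording each pixel once."""
    path, seen, cur = [], set(), start
    while cur is not None:
        seen.add(cur)
        path.append(cur)
        x, y = cur
        cur = None
        for dy, dx in NEIGHBORS:
            nx, ny = x + dx, y + dy
            if (0 <= ny < len(tgql) and 0 <= nx < len(tgql[0])
                    and tgql[ny][nx] == 1 and (nx, ny) not in seen):
                cur = (nx, ny)
                break
    return path

# A tiny thinned map containing one horizontal ridge.
tgql = [[0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 0, 0]]
eps = endpoints(tgql)
sgql = trace(tgql, eps[0])
```

Each traced pixel list becomes one structured ridge record in Sgql.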
And (2) extracting foreground lines, performing region segmentation according to quality judgment, and excluding the scale lines, wherein the step comprises the following steps:
a) processing each point of the Peql by adopting a direction consistency algorithm, extracting a texture area with direction regularity, namely a direction consistency parameter in a set range, taking the texture area as a fingerprint line leaving area Rrql, and taking other areas except the fingerprint line leaving area Rrql as background areas Riql irrelevant to fingerprint lines;
b) setting a threshold value of a connectivity analysis algorithm and a threshold value of a direction consistency algorithm, and correcting Rrql and Riql data according to the two algorithms to form continuous distribution;
c) acquiring image quality indexes such as clarity and contrast for each pixel of Rrql, weighting and summing these indexes into a comprehensive quality index Qrql that reflects the overall quality at each pixel of the foreground Prql, and dividing Prql into a plurality of quality areas Pgql according to the Qrql values;
d) detecting whether a plurality of straight line segments which are arranged in parallel and at equal intervals exist in the Peql, namely a scale; if the parallel equidistant straight line sections exist, the area where the parallel equidistant straight line sections exist is marked as a scale area, and the scale is removed.
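The foreground/background test of step a) can be sketched with a standard gradient-coherence measure. This is a hedged illustration: the block data, the coherence formula and the threshold are assumptions standing in for the invention's actual direction-consistency algorithm.

```python
# Sketch of step a): classify an image block as ridge foreground (Rrql) or
# background (Riql) by how consistently its gradients are oriented.
import math

def coherence(block):
    """Orientation consistency of one gray-level block, in [0, 1]."""
    gxx = gyy = gxy = 0.0
    for y in range(1, len(block) - 1):
        for x in range(1, len(block[0]) - 1):
            gx = (block[y][x + 1] - block[y][x - 1]) / 2.0  # central difference
            gy = (block[y + 1][x] - block[y - 1][x]) / 2.0
            gxx += gx * gx
            gyy += gy * gy
            gxy += gx * gy
    denom = gxx + gyy
    if denom == 0:
        return 0.0          # no texture at all
    return math.hypot(gxx - gyy, 2 * gxy) / denom

# Vertical stripes (period 4): strongly oriented texture.
stripes = [[255 if (x % 4) >= 2 else 0 for x in range(8)] for _ in range(8)]
# Constant block: no directional texture.
flat = [[128] * 8 for _ in range(8)]

is_ridge = coherence(stripes) > 0.5   # assumed threshold
```

Blocks above the threshold would be assigned to Rrql, the rest to Riql, before the connectivity correction of step b).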
Step S02.2 includes the following steps:
calling the Sgql data and, for each ridge line, starting from the left start point and ending at the right end point, extracting one point every interval d as a control point; the left start point and the right end point of the line are also defined as control points;
storing all extracted control points in the relational database SDB1 according to the control-point structure, and then storing the Sgql data in SDB1 according to the Cgql curve structure.
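The control-point sampling of step S02.2 can be sketched as below. The function name, the sample ridge and the interval d are illustrative assumptions.

```python
# Sketch of step S02.2: convert traced ridge pixels (one Sgql line) into
# control points sampled every d pixels, always keeping the start point and
# the terminating end point of the line.

def control_points(sgql_line, d):
    pts = sgql_line[::d]                 # start point plus every d-th pixel
    if sgql_line and sgql_line[-1] != pts[-1]:
        pts.append(sgql_line[-1])        # keep the terminating endpoint too
    return pts

line = [(x, 0) for x in range(10)]       # a traced ridge of 10 pixels
cgql = control_points(line, 4)
```

The resulting point lists are what would be stored in SDB1 as the Cgql curve structure.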
Step S02.3 comprises the following steps:
(2) setting a scanning sliding window;
(3) establishing a field fingerprint MDOCgql model for storing data of the DOCgql data distribution range;
performing a Z-shaped scan over all unit data of MDOCgql with the scanning sliding window; for each scan line, recording the coordinates of the leftmost and rightmost units that contain data as contour-point coordinates; if a scan line contains no data units, jumping to the next line; if a scan line contains only one data unit, recording the coordinates of that unit as both the leftmost and rightmost contour points;
after all line scans are completed, multiplying the x and y values of every recorded coordinate by 8, so that each point is mapped to the point (8x, 8y) on the Pql image;
connecting all mapped points one by one in clockwise order, starting from the point with the lowest y value, to form DOCql.
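The row scan of step S02.3 can be sketched as follows; the grid contents and the factor 8 (the downsampling ratio stated above) are used illustratively.

```python
# Sketch of step S02.3: scan a boolean occupancy grid (MDOCgql) row by row,
# keep the leftmost and rightmost occupied cells of each row, and map each
# kept cell back onto the Pql image by multiplying its coordinates by 8.

def contour_points(grid):
    pts = []
    for y, row in enumerate(grid):
        xs = [x for x, v in enumerate(row) if v]
        if not xs:
            continue                      # empty scan line: skip to next row
        pts.append((xs[0], y))            # leftmost unit
        if xs[-1] != xs[0]:
            pts.append((xs[-1], y))       # rightmost unit, when distinct
    return [(8 * x, 8 * y) for x, y in pts]

grid = [[0, 1, 1, 0],
        [1, 1, 1, 1],
        [0, 0, 1, 0]]
docql = contour_points(grid)
```

Connecting the returned points clockwise from the lowest-y point would yield the peripheral contour DOCql.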
The reconstruction of the scene fingerprint line leaving area on the corresponding archive (inked impression) fingerprint comprises the following steps:
s03.1, obtaining reference data Tm of the best detail feature point matching reference coordinate;
step S03.2, analyzing and calculating Tm to obtain the best rigid motion relation M reconstructed in the field fingerprint line leaving area;
step S03.3, using M as a parameter, maps the DOCql data to the corresponding ridge area on the corresponding Pkt image to obtain the reconstructed peripheral-contour data.
Step S03.1 comprises the following steps:
(1) defining the data structure and parameters of the scene-fingerprint/archive-fingerprint feature-matching weighted bipartite graph model BGltm:
establishing an array pMnt[N] to represent all minutiae, where N is the number of minutiae; each vertex in the bipartite graph represents one minutia; defining a two-dimensional array pW[M][N] to represent the matching weight of each minutia pair in the bipartite graph;
(2) modeling of weighted bipartite graphs
using IFV to represent the rotation-translation-independent quantity between each pair of matching minutiae of the feature data Mql of the scene fingerprint and the feature data Mkt of the archive fingerprint, i.e., the local pattern of minutia matching;
constructing the rotation-translation-independent quantity IFVi from minutia i on Pql and the minutiae closest to i; constructing IFVj from the minutia j on Pkt that corresponds to and matches i, together with the minutiae closest to j;
in the above formula, qi is the image quality Qrql at the coordinate position of minutia i, and qj is the image quality Qrql at the coordinate position of minutia j;
taking all minutiae of Pql and Pkt as vertices and wij as the edge weights, constructing the weighted bipartite graph;
(3) data Tm was derived using the Kuhn-Munkres algorithm for the BGltm model.
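The maximum-weight matching of step (3) can be illustrated as follows. The patent specifies the Kuhn-Munkres (Hungarian) algorithm; for a tiny assumed weight matrix pW, an exhaustive permutation search finds the same optimum and keeps the sketch short, so it stands in for Kuhn-Munkres here.

```python
# Hedged sketch of step (3): find the matching of scene minutiae to archive
# minutiae that maximises the total bipartite-graph weight, yielding Tm.
from itertools import permutations

def best_matching(pw):
    """Return (total weight, pairs) maximising sum of pw[i][perm[i]]."""
    m, n = len(pw), len(pw[0])
    best = (float("-inf"), None)
    for perm in permutations(range(n), m):
        w = sum(pw[i][j] for i, j in enumerate(perm))
        if w > best[0]:
            best = (w, list(enumerate(perm)))
    return best

# Assumed 3x3 minutia-to-minutia weights w_ij.
pW = [[0.9, 0.1, 0.2],
      [0.3, 0.8, 0.1],
      [0.2, 0.4, 0.7]]
tm_weight, tm_pairs = best_matching(pW)
```

For real minutia counts the exhaustive search is infeasible, which is why the Hungarian algorithm (polynomial time) is used instead.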
Step S03.2 includes the following steps:
for the minutia matching pairs pij within Tm, under the rigid-body transformation assumption, the rotation of i, j about the common coordinate origin of Pql and Pkt through the angle Tθ satisfies the following relationship:

x' = x·cosTθ − y·sinTθ + Tx
y' = x·sinTθ + y·cosTθ + Ty

where (x, y) and (x', y') are the coordinates of a matching pair, Tx and Ty are the translation distances, and Tθ is the rotation angle;
the rigid-body motion parameters are then calculated as:

Tx = xj − (xi·cosTθ − yi·sinTθ)
Ty = yj − (xi·sinTθ + yi·cosTθ)
Tθ = θj − θi

defining Mij = (Tx, Ty, Tθ) as the motion parameters from i to j;
calculating the motion-parameter set {Mij} over all possible matching pairs pij and averaging to obtain the best rigid-motion relation M = (Tx, Ty, Tθ), where pij denotes a possible matching pair.
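Step S03.2 can be sketched as below. The sample minutiae and the averaging by simple arithmetic mean are illustrative assumptions.

```python
# Sketch of step S03.2: recover rigid-motion parameters (Tx, Ty, Ttheta) from
# matched minutia pairs, then average over all pairs to get M.
import math

def motion_params(i, j):
    """(x, y, theta) of scene minutia i and archive minutia j -> (Tx, Ty, Ttheta)."""
    (xi, yi, ti), (xj, yj, tj) = i, j
    t = tj - ti
    tx = xj - (xi * math.cos(t) - yi * math.sin(t))
    ty = yj - (xi * math.sin(t) + yi * math.cos(t))
    return tx, ty, t

def average_motion(pairs):
    ms = [motion_params(i, j) for i, j in pairs]
    n = len(ms)
    return tuple(sum(c) / n for c in zip(*ms))

# Two pairs related by a pure translation of (5, -2) and no rotation.
pairs = [((0.0, 0.0, 0.3), (5.0, -2.0, 0.3)),
         ((4.0, 1.0, 1.1), (9.0, -1.0, 1.1))]
M = average_motion(pairs)
```

A robust implementation would typically reject outlier pairs before averaging; the patent's averaging step is reproduced here in its simplest form.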
The dynamic reconstruction of the frequency of the criminal scene fingerprint line leaving area comprises the following steps:
s04.01, establishing a DRR template: the DRR template has a structure of ten square images which respectively correspond to finger positions of ten finger fingerprints;
partitioning all the squares according to the grids Md, and coding all the Md according to positions;
s04.02, performing regional frequency reconstruction of DRR to obtain dynamic reconstruction data Fdrr of regional frequency of fingerprint striae in the crime scene;
(1) extraction and conversion of DRR region data
Scanning DRR data by using a scanning sliding window, and recording Md coordinate point data corresponding to a field fingerprint ridge remaining reconstruction area; after scanning is finished, converting the DRR area into grid data of a group of Md coordinate points;
(2) reconstructing all DRR data on each finger position to obtain frequency dynamic reconstruction data Fdrr of a crime scene fingerprint line leaving area;
s04.03, graphically displaying the Fdrr data.
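The frequency accumulation of step S04 can be sketched as follows. The cell codes, the per-case sets and the normalisation by case count are illustrative assumptions about how Fdrr might be tallied.

```python
# Sketch of step S04: per finger position, count how often each DRR grid cell
# Md falls inside a reconstructed scene line leaving area, giving Fdrr.
from collections import Counter

def reconstruct_frequency(cases):
    """cases: list of sets of Md cell codes covered by one scene fingerprint."""
    freq = Counter()
    for cells in cases:
        freq.update(cells)
    total = len(cases)
    return {cell: n / total for cell, n in freq.items()}

# Three scene fingerprints matched to the same finger position.
cases = [{"A1", "A2", "B1"}, {"A2", "B1"}, {"A2"}]
fdrr = reconstruct_frequency(cases)
```

Cell A2 was left at every scene, B1 at two of three, A1 at one of three; the resulting per-cell frequencies are what step S04.03 would render graphically.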
The invention has the following beneficial effects and advantages:
1. In step 2 of the method, pattern recognition techniques such as image enhancement, region segmentation, direction-field extraction, ridge-frequency analysis, Gabor filtering, binarization, ridge thinning, ridge tracking, ridge flow-direction consistency checking, ridge continuity analysis and Hough transformation are used to mine the scene fingerprint images stored with 'comparison relation data' in the police AFIS. This guarantees the objectivity and accuracy of automatic scene-ridge extraction while effectively eliminating the interference of 'background-irrelevant ridges' and 'scale images', so that the peripheral contour data of the 'scene line leaving area' are extracted accurately;
2. In step 3 of the method, the minutia data of the scene fingerprints and their matched archive fingerprints are first analyzed with the weighted bipartite-graph model and the Kuhn-Munkres algorithm to select the best reference-coordinate data; next, a rigid-body transformation parameter algorithm computes the 'best rigid-motion relation data' for reconstructing the scene line leaving area; finally, with these data as parameters, the 'peripheral contour data of the scene line leaving area' are mapped onto the corresponding ridge area of the matching archive fingerprint image, yielding objective and accurate 'peripheral contour reconstruction data of the scene line leaving area';
3. In step 4 of the method, a specially designed statistical matrix model performs frequency statistics on the 'peripheral contour reconstruction area data' obtained in step 3, producing the dynamic frequency reconstruction data Fdrr of the crime scene fingerprint line leaving areas. This provides a real-time, accurate weighting-parameter model for how the ridge quality of 'different areas of the same finger position' and of 'different finger positions of the same person' contributes to the overall quality rating of ten-finger fingerprints, thereby supplying the statistical support that existing quality evaluation algorithms lack for their 'area quality to finger-position quality' and 'finger-position quality to overall ten-finger quality' weighting parameters.
4. In addition, in step 4 of the method, specially designed 'deposit-area frequency to visible-spectrum frequency' and 'deposit-area frequency to three-dimensional surface map' conversion algorithms realize dynamic simulation of the crime scene leaving frequency over the whole area of each finger position. The simulation method and its background data can be used directly to evaluate the practical feasibility of various flat fingerprint image collections (such as entry and exit administration, identity cards, and fingerprint registration of drivers and examinees) in forensic science.
5. The method is mainly intended for very-large-scale fingerprint systems with multi-algorithm comparison architectures. It can effectively improve the image quality of the fingerprint system database and the case-solving efficiency of the system, and is of outstanding value for analyzing the risk of missed matches, locking the range of missed targets, and evaluating the practical feasibility of various flat fingerprint image collections (such as entry and exit administration, identity cards, and fingerprint registration of drivers and examinees) in forensic science.
Drawings
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2a is a live fingerprint image Pql extracted by the big data method;
FIG. 2b is a graph of the Sgql effect extracted by Pql;
FIG. 3 is a diagram of the highlighting effect of an area left with fingerprint lines in place;
FIG. 4 is a schematic diagram of a local pattern of detail feature matching;
FIG. 5 is a diagram of a scene fingerprint and its weighted minutia bipartite graph against the matched archive fingerprint;
FIG. 6a is a first diagram illustrating the reconstruction effect of the peripheral contour of the left-over area of the fingerprint ridge in situ;
FIG. 6b is a second diagram illustrating the reconstruction effect of the peripheral contour of the fingerprint line leaving area in situ;
FIG. 7 is a schematic diagram of a DRR template for a finger position;
FIG. 8 is a graph showing the effect of the "visible light wave wavelength heat map" of the Fdrr data of ten fingers;
fig. 9 is a display effect diagram of "three-dimensional frequency map" of certain finger Fdrr data.
Detailed Description
The present invention will be described in further detail with reference to examples.
As shown in fig. 1, the method comprises the following steps:
step S01: large data samples.
This step dynamically samples big data from the information already stored in the existing AFIS databases of forensic science departments. The data to be extracted mainly include:
1. Fingerprint comparison hit-relation data. These are the records gradually accumulated by fingerprint system users (generally forensic science departments) in practical comparison work. Each record states that a scene fingerprint extracted from the crime scene of a certain case has been proved, through system comparison and independent manual verification by several groups of fingerprint experts (at least two persons per group), to originate from the same finger as a certain finger position of a certain person's (generally a suspect's) archive fingerprint.
2. The scene fingerprint evidence data collected, databased and matched by forensic science departments at crime scenes, including the image data (Pql) of the matched scene fingerprints, their text information (case and evidence related) and their feature data (Mql).
3. The ten-finger archive fingerprint data collected, databased and matched by police (judicial) departments, including the image data of a given finger position of an archive fingerprint (hereinafter Pkt), its text information (the person printed) and its feature data (hereinafter Mkt).
Step S02: and identifying and extracting data of peripheral contour lines of the field fingerprint line coverage area.
In this step, supported by pattern recognition techniques, the peripheral contour of the area covered by the scene fingerprint ridges in the Pql data is automatically identified and its coordinate data extracted. The key technical points are as follows:
s02.1, automatic extraction technology of on-site fingerprint remaining lines
Due to the limitation of the left-over condition of the material evidence, the definition of fingerprint lines on Pql images is poor, and therefore, the automatic extraction of field lines is a relatively complex problem. Therefore, the step adopts various processing modes such as image enhancement, image segmentation, directional field extraction, ridge frequency calculation, Gabor filtering, binarization, thinning, ridge tracking storage and the like, and the objectivity and the accuracy of extracting the fingerprint ridge data on site are ensured. As shown in fig. 2a, 2 b.
The specific method of the step is as follows:
(1) image enhancement
Removing image noise with algorithms such as gray-level normalization, histogram equalization, mean filtering, Gaussian filtering and directional filtering, to obtain the enhanced image Peql of the Pql image;
(2) extraction of foreground lines, quality determination, region segmentation and scale line exclusion
Scene fingerprint evidence can be left on the surfaces of all kinds of objects at a crime scene, and when the crime scene technician photographs the scene fingerprint, the original patterns on the surface of the carrying object (generally produced by machining, decoration, wear and the like) are captured as background together with the evidence ridges. In addition, according to the forensic science standards for fingerprint evidence photography, to guarantee the correct scale of scene fingerprint images entered into the AFIS, technicians must include a 1:1 ruler (scale) image beside the fingerprint. Therefore, to extract the scene ridge data objectively and accurately, the 'false ridges' produced by 'background-irrelevant patterns' and 'scale lines' must be identified and excluded. The specific method is as follows:
a) processing Peql points by adopting a direction consistency algorithm, extracting a texture region with direction regularity as a fingerprint line left region (Rrql), and defining a region with serious noise and disordered direction consistency as a background region (Riql);
θij = π/2 + (1/2)·arctan( ΣhΣk 2·∂x(h,k)·∂y(h,k) / ΣhΣk (∂x(h,k)² − ∂y(h,k)²) )

rij = sqrt( (ΣhΣk (∂x(h,k)² − ∂y(h,k)²))² + (ΣhΣk 2·∂x(h,k)·∂y(h,k))² ) / ΣhΣk (∂x(h,k)² + ∂y(h,k)²)

where xi, yi are the horizontal and vertical coordinates of a pixel (i, j) in the fingerprint image, θij is the direction of pixel (i, j), h and k are the summation (iteration) variables over the local window, ∂x denotes the partial derivative with respect to x, ∂y the partial derivative with respect to y, and rij denotes the direction consistency.
b) Performing 'foreground/background attribute judgment' on isolated small regions of partial sporadic distribution in the Rrql and the Riql by adopting a connectivity analysis algorithm, and repeatedly correcting the data of the Rrql and the Riql according to a judgment result until the Rrql and the Riql form complete continuous distribution;
c) through calculation of direction consistency, gray-level mean and variance, image quality indexes such as clarity and contrast are obtained for each pixel of Rrql; these indexes are weighted and summed into a comprehensive quality index (Qrql for short) reflecting the overall quality at each pixel of the foreground Prql. According to the Qrql values, Prql is divided into several areas of different quality (Pgql for short);
d) A Hough transform algorithm detects whether several parallel, equally spaced straight line segments (the ruler scale) exist in Peql; if so, the area where they lie is marked as the scale area and the scale is removed.
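As a rough illustration of step d), the sketch below checks a list of Hough line peaks for a group of near-parallel, equally spaced lines. The peak-list input, the tolerances and the `find_scale_region` name are assumptions for illustration; the patent only specifies that a Hough transform detects the parallel equidistant segments.

```python
import numpy as np

def find_scale_region(peaks, angle_tol=2.0, spacing_tol=1.5, min_lines=4):
    """Detect a ruler ("scale") among Hough line peaks.

    peaks: list of (theta_deg, rho) tuples, one per detected straight line.
    A scale is declared when at least `min_lines` lines share (almost)
    the same angle and their rho offsets are (almost) equally spaced.
    Returns the sorted rho values of the scale lines, or None.
    """
    peaks = sorted(peaks, key=lambda p: p[0])
    groups = []                      # group lines by similar angle
    for theta, rho in peaks:
        for g in groups:
            if abs(g[0][0] - theta) <= angle_tol:
                g.append((theta, rho))
                break
        else:
            groups.append([(theta, rho)])
    for g in groups:
        if len(g) < min_lines:
            continue
        rhos = sorted(r for _, r in g)
        gaps = np.diff(rhos)
        # equidistant: every gap close to the median gap
        if np.all(np.abs(gaps - np.median(gaps)) <= spacing_tol):
            return rhos
    return None
```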
(3) Extraction and calibration of the orientation field
First, the direction field of the scale-removed Peql is obtained with a gradient algorithm and smoothed. Then, guided by the Qrql values, the direction field data of Qrql high-value areas are used to calibrate (overwrite) the direction field data of the adjacent Qrql low-value areas (within neighbouring quality areas), yielding accurate "field fingerprint overall direction field data" (hereinafter Dql). The direction field formula obtained by the gradient method is as follows:
where xi and yi are the horizontal and vertical coordinates of a pixel point (i, j) in the fingerprint image; θij is the direction of pixel (i, j); h and k are the accumulation iteration variables; ∂/∂x and ∂/∂y denote the partial derivatives with respect to x and y; and rij denotes the direction consistency.
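The gradient-method orientation field can be sketched as the standard block-wise least-squares estimate; the patent does not give its exact accumulation windows, so the block size and the `orientation_field` name here are assumptions:

```python
import numpy as np

def orientation_field(img, block=16):
    """Block-wise ridge orientation via the gradient (least-squares) method.

    For every block x block window the local ridge direction is
    theta = 0.5 * atan2(2 * sum(Gx*Gy), sum(Gx^2 - Gy^2)) + pi/2,
    i.e. perpendicular to the dominant gray-level gradient.
    Returns an array of angles in [0, pi).
    """
    gy, gx = np.gradient(img.astype(float))   # partial derivatives d/dy, d/dx
    h, w = img.shape
    th = np.zeros((h // block, w // block))
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            bx = gx[i:i + block, j:j + block]
            by = gy[i:i + block, j:j + block]
            num = 2.0 * np.sum(bx * by)
            den = np.sum(bx * bx - by * by)
            th[i // block, j // block] = (0.5 * np.arctan2(num, den) + np.pi / 2) % np.pi
    return th
```

Smoothing and the Qrql-guided overwrite of low-quality blocks would then operate on the returned angle matrix.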
(4) Calculation of the frequency of the striae
Using the direction field parameters provided by Dql, Peql is projected along the ridge direction; the extreme points of the one-dimensional projection are measured and their frequency calculated, which is the ridge frequency (Frql);
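A minimal sketch of the projection-based frequency estimate: the block is projected onto the axis perpendicular to the ridge direction, and the mean spacing of the projection maxima gives the ridge period. The function name and binning scheme are illustrative assumptions.

```python
import numpy as np

def ridge_frequency(block, theta):
    """Estimate the ridge frequency Frql of an image block.

    theta is the ridge direction; each pixel is projected onto the
    normal direction, the 1-D projection profile is built by binning,
    and the reciprocal of the mean peak spacing is returned.
    """
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # signed distance of every pixel along the normal to the ridges
    d = xs * np.cos(theta + np.pi / 2) + ys * np.sin(theta + np.pi / 2)
    bins = np.round(d - d.min()).astype(int)
    proj = np.bincount(bins.ravel(), weights=block.ravel().astype(float))
    cnt = np.bincount(bins.ravel())
    proj = proj / np.maximum(cnt, 1)
    # local maxima of the projection mark ridge centres
    peaks = [i for i in range(1, len(proj) - 1)
             if proj[i] > proj[i - 1] and proj[i] >= proj[i + 1]]
    if len(peaks) < 2:
        return 0.0
    return 1.0 / float(np.mean(np.diff(peaks)))
```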
(5) gabor filtering
A Gabor filtering algorithm, with Dql and Frql as parameters, performs ridge sharpening on Peql and reconnects broken ridges, producing the Gabor filtering result data of Peql (Pgql for short);
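A hedged sketch of the Gabor step: one even-symmetric kernel tuned to a single orientation and frequency is correlated with the image. The real method applies per-pixel Dql and Frql parameters; the kernel size and sigma here are assumed values.

```python
import numpy as np

def gabor_kernel(theta, freq, sigma=4.0, size=11):
    """Even-symmetric Gabor kernel tuned to ridge direction `theta`
    and ridge frequency `freq` (cycles per pixel)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # rotate coordinates so xr runs across the ridges
    xr = x * np.cos(theta + np.pi / 2) + y * np.sin(theta + np.pi / 2)
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)
    return g - g.mean()   # zero DC response so flat areas map to ~0

def gabor_filter(img, theta, freq):
    """Filter the whole image with one kernel (single-orientation sketch);
    the patent's method would use per-pixel Dql and Frql fields."""
    k = gabor_kernel(theta, freq)
    half = k.shape[0] // 2
    pad = np.pad(img.astype(float), half, mode='edge')
    out = np.zeros_like(img, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(pad[i:i + 2 * half + 1, j:j + 2 * half + 1] * k)
    return out
```

A kernel matching the local ridge wave responds strongly; a mis-oriented kernel responds weakly, which is what sharpens ridges and suppresses noise.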
(6) binarization
Pgql is binarized with a gray threshold T: pixels whose gray value is greater than T are set to 0 (non-ridge), and pixels whose gray value is less than or equal to T are set to 1 (ridge), turning Pgql into a binary image with values 0 and 1 (hereinafter Bgql);
(7) line thinning
Using the Hilditch algorithm, Bgql is converted into a thinned map with ridges one pixel wide, i.e., the line map (Tgql for short);
(8) tracking and data storage of refined streaks
All ridge end points in Tgql are detected; starting from these end points, each ridge of Tgql is tracked one by one, and the coordinates of every pixel forming the ridge are stored in tracking order, yielding the structured ridge data Sgql of Tgql.
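The end-point detection and ridge tracking of step (8) can be sketched over a set of skeleton pixel coordinates. This is a simplification: branch points get no special treatment here, which the patent leaves unspecified.

```python
def trace_ridges(skeleton_pixels):
    """Trace 1-pixel-wide ridges into ordered point lists (Sgql sketch).

    skeleton_pixels: set of (x, y) tuples of a thinned line image Tgql.
    End points (pixels with exactly one 8-neighbour) are detected first;
    each ridge is then walked pixel by pixel from an end point, storing
    coordinates in tracking order.
    """
    nb = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

    def neighbours(p):
        return [(p[0] + dx, p[1] + dy) for dx, dy in nb
                if (p[0] + dx, p[1] + dy) in skeleton_pixels]

    endpoints = [p for p in skeleton_pixels if len(neighbours(p)) == 1]
    visited, ridges = set(), []
    for start in endpoints:
        if start in visited:
            continue
        ridge, cur = [start], start
        visited.add(start)
        while True:
            nxt = [q for q in neighbours(cur) if q not in visited]
            if not nxt:
                break
            cur = nxt[0]          # follow the (unique) continuation
            visited.add(cur)
            ridge.append(cur)
        ridges.append(ridge)
    return ridges
```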
The Sgql computed from Rrql carries a large amount of information, which is inconvenient to store and to use in the subsequent positioning processing. This step therefore uses a specially designed structure definition to extract the Cp control points and Cgql curves of the Sgql data and store them in the relational database SDB1 for later use. The specific method of this step is as follows:
(1) Sgql control point (Cp) structure definition, and the Cgql curve solving method and its structure definition
typedef struct tagGFRT_SplineCtrlPoint
{
    double fx;              // control point x coordinate
    double fy;              // control point y coordinate
} GFRT_SplineCtrlPoint;     // control point structure
(The expression of the x, y coordinates of the control points follows the minutiae position coordinate system defined in the public safety industry standard of the People's Republic of China "Expression method of fingerprint feature points and direction coordinates", GA-775 (2008), hereinafter referred to as Cc.)
The Cgql curve equation is the B-spline curve C(u) = Σi Ni,p(u)·Cpi, where Ni,0(u) denotes the i-th B-spline basis function of degree 0, Ni,p(u) denotes the i-th B-spline basis function of degree p, ui is the i-th component of the node (knot) vector, p is the degree of the basis function, u is the node vector, and i is the index of the node ui within u.
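The B-spline basis functions named above can be evaluated with the standard Cox-de Boor recursion; the sketch below (an illustration, not the patent's code) also evaluates the curve point C(u) as the basis-weighted sum of control points:

```python
def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u).
    (Half-open basis intervals; u at the very end of the knot vector
    needs special-casing in production code.)"""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, u, knots)
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def bspline_point(ctrl, p, u, knots):
    """Evaluate the curve C(u) = sum_i N_{i,p}(u) * Cp_i."""
    n = len(ctrl)
    x = sum(bspline_basis(i, p, u, knots) * ctrl[i][0] for i in range(n))
    y = sum(bspline_basis(i, p, u, knots) * ctrl[i][1] for i in range(n))
    return (x, y)
```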
The structure of the Cgql curve data is defined as:
Calling the Sgql data obtained in step S02.1, for each thinned ridge, starting from its left start point and ending at its right end point, one point is extracted from the ridge at intervals of a distance d as a control point (hereinafter Cp); the left start point and the right end point of the thinned ridge are themselves also defined as Cp.
All extracted Cp are first stored in the relational database SDB1 according to the control point data structure defined in step S02.2 (1); the Sgql data are then stored in SDB1 according to the Cgql curve data structure defined in step S02.2 (1).
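Control-point extraction at a fixed interval d, with both ridge end points forced into the set, might look like this (the function name and the list-of-coordinates input are assumptions):

```python
def extract_control_points(ridge, d=5):
    """Sample control points Cp from one traced ridge (an Sgql entry):
    every d-th point from the left start point onward, with the right
    end point always appended so both ridge ends are control points."""
    cps = ridge[::d]
    if ridge[-1] != cps[-1]:
        cps.append(ridge[-1])
    return cps
```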
S02.3, extracting peripheral contour line data of field fingerprint line coverage area
"Peripheral contour line data of the field fingerprint line coverage area" (hereinafter DOCgql) means the coordinate data of the "peripheral contour line control points of the field fingerprint legacy area" (hereinafter CCECp), i.e., the pixel points on the live fingerprint image Pql that simulate the spatial distribution of the peripheral contour of the field fingerprint line coverage area.
(1) Establishing an MCp model
The model is a specially designed matrix model for reflecting the "Cp point existence condition data" (hereinafter ECp). Its data structure and assignment method are as follows:
MCp is a matrix of 512 × 512 cells. Each cell of the matrix stores 2 bits of ECp data. With reference to the Cc coordinate system, all cells are assigned a value: a cell corresponding to a Cp coordinate is assigned 01, and a cell not corresponding to any Cp coordinate is assigned 00;
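A direct sketch of the MCp assignment rule (numpy is used for the 512 × 512 cell matrix; the 2-bit cells are emulated with a byte array):

```python
import numpy as np

def build_mcp(control_points, size=512):
    """MCp model: a size x size matrix of 2-bit cells holding the ECp
    'Cp point existence' data — 0b01 where a control point falls,
    0b00 everywhere else."""
    mcp = np.zeros((size, size), dtype=np.uint8)
    for x, y in control_points:
        mcp[y, x] = 0b01          # row = y, column = x in the Cc frame
    return mcp
```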
(2) design of parameters for scanning sliding windows
Since large gaps exist between Cgql curves, a scanning sliding window that is too small reduces the distribution continuity of the scanning result data (hereinafter DOCgql) of the peripheral contour line of the field fingerprint line coverage area, while a window that is too large distorts the DOCgql data. Repeated verification showed that a sliding window of 8 × 8 cells with a step of 8 cells is suitable.
(3) Establishing an on-site fingerprint MDOCgql model
The MDOCgql model is a specially designed matrix model for storing DOCgql data distribution range data, and the data structure and the assignment method are as follows:
A matrix of 64 × 64 cells is built, each cell storing 2 bits. With reference to the Cc coordinate system, a Z-type (raster) scan of the MCp model data is performed starting from the coordinate origin, and the maximum ECp value within the current sliding window is written into the corresponding MDOCgql cell;
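The 8 × 8, stride-8 window maximum that fills MDOCgql is effectively a max-pooling of the MCp matrix; a compact numpy sketch:

```python
import numpy as np

def build_mdocgql(mcp, win=8):
    """MDOCgql model: scan MCp with a win x win, stride-win sliding
    window and keep each window's maximum ECp value, reducing the
    512 x 512 MCp matrix to a 64 x 64 matrix."""
    h, w = mcp.shape
    return mcp.reshape(h // win, win, w // win, win).max(axis=(1, 3))
```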
(4) Extraction and storage of the peripheral contour line control points (CCECp) of the field fingerprint legacy area

With reference to the Cc coordinate system, all MDOCgql cell data are Z-scanned using a 1 × 1 cell sliding window with a step of 1. For each scan line, the leftmost cell coordinate with ECp = 1 and the rightmost cell coordinate with ECp = 1 are recorded (hereinafter CECp). If a scan line contains no cell with ECp = 1, the scan jumps to the next line. If a scan line contains only one cell with ECp = 1, that cell's coordinate is likewise recorded as a CECp coordinate.
After all 64 scan lines are finished, the x and y values of all CECp coordinate data, defined in the Cc coordinate system, are each multiplied by 8 and mapped to the points CCECp(8x, 8y) on the Pql image.
Connecting all CCECp one by one in clockwise order, starting from the point with the lowest y value, forms DOCgql.
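The per-row leftmost/rightmost contour-point recording, the ×8 mapping back onto Pql, and the clockwise chaining can be sketched as follows (the clockwise ordering here simply walks down the right edge and back up the left edge, one possible reading of the patent's description):

```python
import numpy as np

def extract_contour_points(mdoc, scale=8):
    """For every row of MDOCgql record the leftmost and rightmost cell
    with ECp == 1 (the CECp coordinates); rows without any hit are
    skipped; a row with a single hit contributes that cell twice.
    Coordinates are multiplied by `scale` to map back onto the Pql
    image (the CCECp points), then chained clockwise."""
    lefts, rights = [], []
    for y in range(mdoc.shape[0]):
        xs = np.flatnonzero(mdoc[y] == 1)
        if xs.size == 0:
            continue
        lefts.append((int(xs[0]) * scale, y * scale))
        rights.append((int(xs[-1]) * scale, y * scale))
    # clockwise chain: down the right edge, back up the left edge
    return rights + lefts[::-1]
```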
(5) Method for highlighting the inner zone of DOCgql on the Pql image
After DOCgql extraction is completed, if the background interference of non-ridge areas needs to be eliminated, the complement of the DOCgql interior with respect to the equally sized rectangular grid can be taken, and a shadow mask layer can be placed over the complement area with the GDI+ method, so that the ridge legacy area image of the field fingerprint is highlighted. As shown in fig. 3.
Step S03: reconstructing the fingerprint line legacy area of the live fingerprint onto the corresponding inked (file) fingerprint.
In this step, pattern recognition techniques reconstruct the peripheral contour data of the field fingerprint ridge legacy area onto the corresponding ridge area of the corresponding Pkt image. The specific technical method is as follows:
s03.1, calculation of optimal detail feature point matching "reference coordinate reference data" Tm
In this step, Mql and the corresponding Mkt from step S01 are analyzed with a weighted bipartite graph model (hereinafter bipartite graph), the Kuhn-Munkres algorithm (hereinafter K-M algorithm) and a rigid body transformation model, and the optimal local pattern of minutiae matching is selected as the "reference coordinate reference data" (hereinafter Tm) for mapping DOCgql to the corresponding region of Pkt. The specific steps are as follows:
(1) data structure and parameter definition of field fingerprint-fingerprint feature matching weighted bipartite graph model (BGltm model for short)
An array minu pMnt[N] is established to represent all minutiae, where N is the number of minutiae. Each vertex of the bipartite graph represents one minutia. A two-dimensional array pW[M][N] is defined to represent the matching weight of each minutia pair in the bipartite graph.
(2) Modeling of weighted bipartite graphs
IFV (Invariant Feature Vector) represents the rotation- and translation-invariant quantity between each pair of matched minutiae, i.e., the local pattern of minutiae matching. As shown in fig. 4.
The rotation-translation invariant IFVi is constructed with minutia i of Pql as the center together with the minutia nearest to i; IFVj is constructed with the minutia j on Pkt that corresponds to and matches i as the center, together with the minutia nearest to j.
Let wij be the matching similarity between IFVi and IFVj.
In the above formula, qi is the image quality Qrql at the coordinate position of minutia i, and qj is the image quality Qrql at the coordinate position of minutia j; the algorithm for q is the same as the method described in step S02.1 (2).
With all minutiae of Pql and Pkt as vertices and wij as the weights, a weighted bipartite graph is constructed, as shown in fig. 5.
(3) The best Tm is derived by optimization over the BGltm model using the Kuhn-Munkres algorithm.
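For intuition, the maximum-weight bipartite matching that the K-M algorithm solves can be brute-forced on small graphs; the sketch below enumerates assignments instead of implementing Kuhn-Munkres (the two yield the same optimum, brute force just scales factorially):

```python
from itertools import permutations

def best_matching(w):
    """Maximum-weight matching on a small weighted bipartite graph.

    w[i][j] is the similarity of live minutia i and file minutia j.
    Returns (best_total_weight, assignment) where assignment[i] = j.
    """
    m, n = len(w), len(w[0])
    best, best_assign = float('-inf'), None
    for perm in permutations(range(n), m):
        total = sum(w[i][perm[i]] for i in range(m))
        if total > best:
            best, best_assign = total, list(perm)
    return best, best_assign
```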
S03.2, Tm is analyzed with the rigid body transformation model to compute the best rigid body motion relation for reconstruction of the field fingerprint line legacy area (hereinafter M)
For each minutiae matching pair pij within Tm, under the rigid body transformation condition with the Pql and Pkt coordinate origins as the rotation center, the rotation angle Tθ of i, j satisfies the following relationship:

x' = x·cos Tθ − y·sin Tθ + Tx
y' = x·sin Tθ + y·cos Tθ + Ty

In the above formula, (x, y) and (x', y') are the matching pair coordinates, i.e., the coordinates in the live and ten-finger fingerprint images respectively; Tx and Ty denote the translation distances and Tθ the rotation angle.
The parameters for calculating rigid body motion are:
Tx = xj − (xi·cos Tθ − yi·sin Tθ)
Ty = yj − (xi·sin Tθ + yi·cos Tθ)
Tθ = θj − θi
xi, yi and xj, yj denote the abscissa and ordinate of the matched point pair pi, pj respectively; θi, θj are the minutia angles of pi, pj; Tθ is the rotation angle. pi and pj are located in the live and ten-finger fingerprint images respectively.
Mij = (Tx, Ty, Tθ) is defined as the motion parameters from i to j.
The motion parameter set {Mij} of all possible matching pairs pij is computed, and averaging yields the best rigid motion relation M = (Tx, Ty, Tθ).
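The rigid-motion parameter formulas and the averaging over {Mij} translate almost directly into code (the pair format and function names are assumptions):

```python
import math

def rigid_motion(pairs):
    """Average rigid motion M = (Tx, Ty, Ttheta) over matched minutiae
    pairs. Each pair is ((xi, yi, thetai), (xj, yj, thetaj)), the first
    point in the live image and the second in the ten-print image."""
    ms = []
    for (xi, yi, ti), (xj, yj, tj) in pairs:
        tt = tj - ti
        tx = xj - (xi * math.cos(tt) - yi * math.sin(tt))
        ty = yj - (xi * math.sin(tt) + yi * math.cos(tt))
        ms.append((tx, ty, tt))
    n = len(ms)
    return tuple(sum(m[k] for m in ms) / n for k in range(3))

def apply_motion(point, m):
    """Map one contour point from the live image onto the file print."""
    (x, y), (tx, ty, tt) = point, m
    return (x * math.cos(tt) - y * math.sin(tt) + tx,
            x * math.sin(tt) + y * math.cos(tt) + ty)
```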
S03.3, with M as the parameter, the DOCgql data are mapped onto the corresponding ridge area of the corresponding Pkt image to obtain the "peripheral contour reconstruction data".
Using the motion relation M = (Tx, Ty, Tθ) obtained in the previous step, a rigid body transformation is applied to the peripheral contour data of the field fingerprint, producing the reconstruction data of the field fingerprint ridge legacy area on the corresponding ten-finger fingerprint image (hereinafter DRR); the data storage format is unchanged. As shown in figs. 6a and 6b.
Step S04: and dynamically reconstructing the frequency of the fingerprint line leaving area in the crime scene.
In the step, a special statistical model is established, and the regional frequency of the DRR data is dynamically reconstructed, and the specific technical method comprises the following steps:
s04.01, establishing a DRR template
The DRR template consists of ten pure-white square images of 1280 × 1280 pixels arranged 5 × 2 (hereinafter D1, D2, D3 … D10), corresponding to finger positions 01-10 of the ten-finger fingerprint. Each square is partitioned by a standard grid of 160 rows × 160 columns (hereinafter Md; each grid cell is 8 × 8 pixels), all Md are coded by position, and the counters of all Md (hereinafter Mcd) are initialized to 0. A plane rectangular coordinate system is established with the upper-left corner of each square as the origin, and the position coordinate of each Md is its horizontal and vertical grid number (x, y). See Table 1. As shown in fig. 7.
Numbering | Finger position | Numbering | Finger position
D1 | Right thumb | D6 | Left thumb
D2 | Right index finger | D7 | Left index finger
D3 | Right middle finger | D8 | Left middle finger
D4 | Right ring finger | D9 | Left ring finger
D5 | Right little finger | D10 | Left little finger
S04.02, DRR regional frequency reconstruction (obtaining the "crime scene fingerprint line legacy area frequency dynamic reconstruction data", hereinafter Fdrr)
(1) Extraction and conversion of DRR region data
The DRR data are Z-scanned using a 1 × 1 cell scanning sliding window with a step of 1, and the Md coordinate point data corresponding to the field fingerprint ridge legacy reconstruction area are recorded. After scanning is completed, the DRR region has been converted into grid data consisting of a set of Md coordinate points.
(2) Mcd assignment and computation method for the "crime scene fingerprint line legacy area frequency dynamic reconstruction data"
Steps S01, S02 and S03 are repeated until the DRR region information of all comparison relation data stored in the fingerprint system (the related site fingerprints and their matched file fingerprints) has been processed and all DRR region data have been reconstructed on D1, D2, D3 … D10, upon which the crime scene fingerprint line legacy area frequency dynamic reconstruction data Fdrr are obtained.
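The Mcd counter accumulation can be sketched as incrementing, for every DRR record, the counters of the Md cells it covers (the data shapes here are assumptions based on the 160 × 160 Md grid described in S04.01):

```python
def accumulate_fdrr(templates, drr_records):
    """Accumulate Mcd counters for the Fdrr frequency reconstruction.

    templates: dict finger_id -> 160 x 160 list-of-lists of counters
    (Mcd), all initialised to 0.  drr_records: iterable of
    (finger_id, md_points) where md_points is the set of Md grid
    coordinates covered by one DRR region.  Every Md cell hit by a DRR
    region has its counter incremented by 1.
    """
    for finger, md_points in drr_records:
        grid = templates[finger]
        for gx, gy in md_points:
            grid[gy][gx] += 1
    return templates
```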
S04.03 and Fdrr data graph display method
The Fdrr data can be displayed as a "plane multicolor frequency heat map" (converting the Fdrr distribution rule of the Mcd into a visible-light-wavelength heat map shown to the user) or as a "three-dimensional frequency map" (converting the Fdrr distribution rule of the Mcd into the height of a colored surface shown to the user), forming a graphical analysis result that reflects the "crime scene fingerprint line legacy area frequency" dynamically and in real time.
(1) Display method of the "visible-light-wavelength heat map" of Fdrr data
a) The maximum Mcd value is set as max; the Md are grouped by finger position, abscissa and ordinate; the Md frequency ratio is set as Mcd/max and stored in the database as the reference value for the Mcd color display.
b) When displayed, the Mcd distribution of all Md is rendered over the visible full spectrum. Md with ratio 0 are white; as the Mcd frequency increases, the Md are displayed with the hexadecimal color codes '#FF0000' (red), '#FF7F00' (orange), '#FFFF00' (yellow), '#00FF00' (green), '#00FFFF' (cyan), '#0000FF' (blue) and '#FF00FF' (violet). A full-spectrum dynamic image display of the frequency distribution rule of the ridge legacy positions is thus realized. As shown in fig. 8.
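A possible ratio-to-color mapping for the heat map, linearly interpolating between white and the spectrum stops (the interpolation itself is an assumption; the patent only lists the stop colors):

```python
def ratio_to_color(ratio):
    """Map an Md frequency ratio (Mcd / max) onto full-spectrum stops:
    white for 0, then red, orange, yellow, green, cyan, blue and violet
    as the frequency rises. Colors between stops are linearly
    interpolated in RGB and returned as a hex code string."""
    stops = [(0xFF, 0xFF, 0xFF), (0xFF, 0x00, 0x00), (0xFF, 0x7F, 0x00),
             (0xFF, 0xFF, 0x00), (0x00, 0xFF, 0x00), (0x00, 0xFF, 0xFF),
             (0x00, 0x00, 0xFF), (0xFF, 0x00, 0xFF)]
    if ratio <= 0:
        return '#FFFFFF'
    pos = min(ratio, 1.0) * (len(stops) - 1)
    i = min(int(pos), len(stops) - 2)
    f = pos - i
    r, g, b = (round(a + (c - a) * f) for a, c in zip(stops[i], stops[i + 1]))
    return '#%02X%02X%02X' % (r, g, b)
```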
(2) "Three-dimensional frequency map" display of Fdrr data
a) The maximum Mcd value is set as max; the Md are grouped by finger position, abscissa and ordinate; the Md frequency ratio is set as Mcd/max and stored in the database as the reference value for the Mcd height display.
b) In the 3D display, max is taken as the maximum height of the colored surface, and the surface height corresponding to each Md is displayed according to its ratio, realizing a three-dimensional display of the frequency distribution rule of the ridge legacy positions. As shown in fig. 9.
Claims (9)
1. A crime scene fingerprint line leaving area frequency dynamic reconstruction method is characterized by comprising the following steps:
identifying and extracting data of peripheral contour lines of a field fingerprint line coverage area;
reconstructing the fingerprint line legacy area of the site fingerprint on the inked (file) fingerprint;
dynamically reconstructing the frequency of the fingerprint line leaving area in the crime scene;
the dynamic reconstruction of the frequency of the criminal scene fingerprint line leaving area comprises the following steps:
s04.01, establishing a DRR template: the DRR template has a structure of ten square images which respectively correspond to finger positions of ten finger fingerprints;
partitioning all the squares according to the grids Md, and coding all the Md according to positions;
s04.02, performing regional frequency reconstruction of DRR to obtain dynamic reconstruction data Fdrr of regional frequency of fingerprint striae in the crime scene;
(1) extraction and conversion of DRR region data
According to the motion relation M, rigid body transformation is carried out on peripheral outline data of the field fingerprint to obtain reconstructed data of a field fingerprint line remaining area on a corresponding ten-finger fingerprint image;
scanning DRR data by using a scanning sliding window, and recording Md coordinate point data corresponding to a field fingerprint ridge remaining reconstruction area; after scanning is finished, converting the DRR area into grid data of a group of Md coordinate points;
(2) reconstructing all DRR data on each finger position to obtain frequency dynamic reconstruction data Fdrr of a crime scene fingerprint line leaving area;
s04.03, graphically displaying the Fdrr data.
2. The method according to claim 1, wherein the step of identifying and extracting the peripheral contour line of the fingerprint striae covering area at the crime scene comprises the steps of:
s02.1, extracting the field fingerprint legacy lines and converting the field fingerprint legacy lines into structured line data Sgql;
s02.2, extracting and storing the control points Cp of the structured line data and the B-spline basis function Cgql curves;
and S02.3, extracting peripheral contour line data of the field fingerprint line coverage area.
3. The method according to claim 2, wherein the step S02.1 includes the following steps:
(1) removing image noise from the live fingerprint image Pql to obtain a live fingerprint enhancement map Peql;
(2) extracting foreground lines, judging the area according to the quality, and removing scale lines;
(3) extraction and calibration of the direction field: first obtaining the direction field of the scale-removed Peql with a gradient algorithm and smoothing it, then, according to the "comprehensive quality index" Qrql reflecting the overall quality of each pixel of the foreground Prql, covering the direction field data of Qrql low-value areas with the direction field data of Qrql high-value areas in adjacent quality areas, and acquiring the field fingerprint overall direction field data Dql;
(4) calculating the streak line frequency: projecting along the direction of the Peql lines by using the direction field parameters provided by Dql; measuring an extreme point of the one-dimensional projection graph, and solving the frequency of the extreme point, namely the striae frequency Frql;
(5) adopting a Gabor filtering algorithm, carrying out line sharpening processing on the Peql by taking Dql and Frql as parameters, and reconnecting disconnected lines to obtain Gabor filtering result data Pgql of the Peql;
(6) binarization: carrying out binarization on Pgql to obtain a binary image Bgql;
(7) line thinning: converting Bgql into a thinning map with a set pixel width, namely a line map Tgql, by using a Hilditch algorithm;
(8) tracking and data storage of the refined lines: and detecting all the striae end points in the Tgql, tracking each striae of the Tgql one by taking all the end points as starting points, and storing the coordinates of each pixel point forming the striae according to the striae tracking sequence to obtain the structured striae data Sgql of the Tgql.
4. The method according to claim 3, wherein the step (2) of extracting foreground striae and performing region segmentation according to quality judgment comprises the following steps:
a) processing each point of the Peql by adopting a direction consistency algorithm, extracting a texture area with direction regularity, namely a direction consistency parameter in a set range, taking the texture area as a fingerprint line leaving area Rrql, and taking other areas except the fingerprint line leaving area Rrql as background areas Riql irrelevant to fingerprint lines;
b) setting a threshold value of a connectivity analysis algorithm and a threshold value of a direction consistency algorithm, and correcting Rrql and Riql data according to the two algorithms to form continuous distribution;
c) acquiring image quality indexes such as the pixel clarity and contrast of Rrql, and weighting and summing these indexes into a comprehensive quality index Qrql reflecting the overall quality of each pixel of the foreground Prql; dividing Prql into several quality areas Pgql according to the Qrql value;
d) detecting whether a plurality of straight line segments which are arranged in parallel and at equal intervals exist in the Peql, namely a scale; if the parallel equidistant straight line sections exist, the area where the parallel equidistant straight line sections exist is marked as a scale area, and the scale is removed.
5. The method according to claim 2, wherein the step S02.2 includes the following steps;
(1) defining a Cp control point structure and a Cgql curve structure of the structured line data;
(2) extracting and storing a Cp control point and a Cgql curve;
calling Sgql data, starting from the starting point of the left side thread line to the ending point of the right side thread line, extracting a certain point on the thread line at intervals of a distance d to be used as a control point Cp, and defining the starting point of the left side and the ending point of the right side of the thread line as Cp;
and firstly, storing all extracted Cp in a relational database SDB1 according to a control point structure, and then storing Sgql data in a relational database SDB1 according to a Cgql curve structure.
6. The method according to claim 2, wherein step S02.3 comprises the following steps:
(1) establishing an MCp model: establishing a matrix model for reflecting the Cp point existence condition data ECp;
(2) setting a scanning sliding window;
(3) establishing a field fingerprint MDOCgql model for storing data of the DOCgql data distribution range;
(4) extracting and storing a peripheral contour line control point CCECp of a field fingerprint legacy area;
performing Z-type scanning on all unit data of the MDOCgql by using a scanning sliding window; recording the scanning unit coordinate of the leftmost side ECp ═ 1 and the scanning unit coordinate of the rightmost side ECp ═ 1 of each line of scanning results, and recording as CECp; if the scanning line has no ECp-1 unit, jumping to the next line; if the scan line has only 1 unit of ECp ═ 1, then the coordinate of that unit is also recorded as the CECp coordinate;
after all line scans are completed, multiplying the x and y values of all CECp coordinate data by 8 respectively and mapping them to the points CCECp(8x, 8y) of the Pql image;
connecting all CCECps one by one in a clockwise sequence from the lowest point of the y value to form DOCql;
all the above CCECp coordinates are stored in a database.
7. The method according to claim 1, wherein said reconstructing the fingerprint line legacy area of the crime scene fingerprint on the fingerprint stored in the fingerprint database comprises the following steps:
s03.1, obtaining reference data Tm of the best detail feature point matching reference coordinate;
step S03.2, analyzing and calculating Tm to obtain the best rigid motion relation M reconstructed in the field fingerprint line leaving area;
step S03.3, with M as a parameter, mapping the DOCgql data in the peripheral contour line data of the fingerprint line coverage area of the site fingerprint onto the corresponding ridge area of the Pkt image, where Pkt is the image data of the fingerprint at a certain finger position of the compared fingerprint file, to obtain the peripheral contour reconstruction data.
8. The method according to claim 7, wherein the step S03.1 comprises the following steps:
(1) defining the data structure and parameters of the site fingerprint-inked fingerprint feature matching weighted bipartite graph model BGltm:
establishing an array minu pMnt [ N ] to represent all detail characteristic points, wherein N is the number of the detail characteristic points; each vertex in the bipartite graph represents a detail feature point; defining a two-dimensional array pW [ M ] [ N ] to represent the matching weight of each detail feature point in the bipartite graph;
(2) modeling of weighted bipartite graphs
using IFV to represent the rotation-translation invariant quantity between each pair of matched minutiae between the feature data Mql of the site fingerprint and the feature data Mkt of the inked fingerprint, i.e., the local pattern of minutiae matching, where θi is the direction angle of minutia i, the second component is the direction angle of the minutia nearest to i, and disi is the Euclidean distance between the two minutiae;
constructing a rotation-translation independent quantity IFVi by taking the minutiae i of the image data Pql of the live fingerprints in the comparison as the center and the minutiae closest to the i; constructing a rotation and translation independent quantity IFVj by taking a detail feature point j corresponding to and matched with i on the Pkt as a center and a detail feature point closest to the j;
in the above formula, qi is the image quality Qrql at the coordinate position of minutia i, and qj is the image quality Qrql at the coordinate position of minutia j;
with all minutiae of Pql and Pkt as vertices and wij as the weights, constructing a weighted bipartite graph;
(3) data Tm was derived using the Kuhn-Munkres algorithm for the BGltm model.
9. The method according to claim 8, wherein the step S03.2 comprises the following steps:
for each minutiae matching pair pij within Tm, under the rigid body transformation condition with the Pql and Pkt coordinate origins as the rotation center, the rotation angle Tθ of i, j satisfies the following relationship:

x' = x·cos Tθ − y·sin Tθ + Tx
y' = x·sin Tθ + y·cos Tθ + Ty

where (x, y) and (x', y') represent the matching pair coordinates, Tx and Ty the translation distances, and Tθ the rotation angle;
the parameters for calculating rigid body motion are:
Tx = xj − (xi·cos Tθ − yi·sin Tθ)
Ty = yj − (xi·sin Tθ + yi·cos Tθ)
Tθ = θj − θi
defining Mij = (Tx, Ty, Tθ) as the motion parameters from i to j, where xi, yi and xj, yj denote the abscissa and ordinate of the matched point pair pi, pj respectively, and θi, θj represent the minutia angles of pi, pj;
computing the motion parameter set {Mij} of all possible matching pairs pij and averaging to obtain the best rigid motion relation M = (Tx, Ty, Tθ), where pij represents a possible matching point pair.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711128381.1A CN107909031B (en) | 2017-11-15 | 2017-11-15 | Crime scene fingerprint line leaving area frequency dynamic reconstruction method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107909031A CN107909031A (en) | 2018-04-13 |
CN107909031B true CN107909031B (en) | 2021-06-08 |
Family
ID=61845517
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711128381.1A Active CN107909031B (en) | 2017-11-15 | 2017-11-15 | Crime scene fingerprint line leaving area frequency dynamic reconstruction method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107909031B (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108416342B (en) * | 2018-05-28 | 2022-02-18 | 杭州电子科技大学 | Fingerprint identification method combining thin node and thin line structure |
CN109003259B (en) * | 2018-06-25 | 2022-02-18 | 张威 | Fingerprint quality evaluation method based on line quality expert visual cognition machine learning |
CN109271890B (en) * | 2018-08-29 | 2021-07-13 | 墨奇科技(北京)有限公司 | Method and apparatus for automatic fingerprint image extraction |
WO2020042035A1 (en) * | 2018-08-29 | 2020-03-05 | Moqi Technology (beijing) Co., Ltd. | Method and device for automatic fingerprint image acquisition |
CN109977879A (en) * | 2019-03-28 | 2019-07-05 | 山东省计算中心(国家超级计算济南中心) | A kind of acquisition of fingerprint on site matches control methods and system with long-range |
CN110674745B (en) * | 2019-09-24 | 2022-11-04 | 山东省计算中心(国家超级计算济南中心) | Fingerprint restoration method and system based on field legacy fingerprints |
CN113642102B (en) * | 2021-07-23 | 2024-03-15 | 一汽奔腾轿车有限公司 | Automatic modeling method for rigid body pairs in collision model |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1595428A (en) * | 2004-07-15 | 2005-03-16 | 清华大学 | Fingerprint identification method based on density chart model |
WO2005059809A2 (en) * | 2003-12-15 | 2005-06-30 | Infineon Technologies Ag | Method for identifying a fingerprint and fingerprint sensor |
CN101770567A (en) * | 2008-12-31 | 2010-07-07 | 杭州中正生物认证技术有限公司 | Method for identifying biological features |
CN103559476A (en) * | 2013-09-16 | 2014-02-05 | 中国联合网络通信集团有限公司 | Fingerprint matching method and device thereof |
CN104361331A (en) * | 2014-12-05 | 2015-02-18 | 南京信息工程大学 | Fingerprint matching method based on bipartite graph optimal matching |
CN104834923A (en) * | 2015-06-01 | 2015-08-12 | 西安电子科技大学 | Fingerprint image registering method based on global information |
CN105138959A (en) * | 2015-07-28 | 2015-12-09 | 苏州南光电子科技有限公司 | Image processing based fingerprint matching and control method |
CN105740753A (en) * | 2014-12-12 | 2016-07-06 | 比亚迪股份有限公司 | Fingerprint identification method and fingerprint identification system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2944581A1 (en) * | 2013-04-02 | 2014-10-09 | Clarkson University | Fingerprint pore analysis for liveness detection |
2017-11-15: Application CN201711128381.1A filed in China; granted as patent CN107909031B (status: Active)
Non-Patent Citations (2)
Title |
---|
Reconstructing Ridge Frequency Map from Minutiae Template of Fingerprints; Wei Tang et al.; 2013 IEEE Sixth International Conference on Biometrics: Theory, Applications and Systems (BTAS); 2013-10-02; Sections 5 and 6 of the main text, Figure 6 *
Research and Design of an Embedded Fingerprint Identification System; Liu Xiaoli; China Master's Theses Full-text Database, Information Science and Technology Series; 2017-03-15 (No. 03); I138-5529 *
Also Published As
Publication number | Publication date |
---|---|
CN107909031A (en) | 2018-04-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107909031B (en) | Crime scene fingerprint line leaving area frequency dynamic reconstruction method | |
CN106778586B (en) | Off-line handwritten signature identification method and system | |
CN103116763B (en) | Living-body face detection method based on HSV color space statistical features | |
CN103679675B (en) | Remote sensing image fusion method oriented to water quality quantitative remote sensing application | |
CN106503739A (en) | Hyperspectral remote sensing image SVM classification method and system combining spectral and textural features | |
CN105469076B (en) | Face alignment verification method based on multi-instance learning | |
CN100385451C (en) | Deformed fingerprint identification method based on local triangle structure characteristic collection | |
CN106295124A (en) | Method for comprehensively analyzing the likelihood probability of gene polyadenylation signal maps using multiple image detection techniques | |
CN101630364A (en) | Method for gait information processing and identity identification based on fusion feature | |
CN111126240B (en) | Three-channel feature fusion face recognition method | |
CN103984920B (en) | Three-dimensional face identification method based on sparse representation and multiple feature points | |
CN103034838A (en) | Special vehicle instrument type identification and calibration method based on image characteristics | |
CN116188880B (en) | Cultivated land classification method and system based on remote sensing image and fuzzy recognition | |
CN111666813B (en) | Subcutaneous sweat gland extraction method of three-dimensional convolutional neural network based on non-local information | |
CN110222660B (en) | Signature authentication method and system based on dynamic and static feature fusion | |
CN102163343B (en) | Three-dimensional model optimal viewpoint automatic obtaining method based on internet image | |
CN110866442B (en) | Integrated person-and-ID verification system and method based on real-time face recognition | |
CN117275080A (en) | Eye state identification method and system based on computer vision | |
CN115147726B (en) | City form map generation method and device, electronic equipment and readable storage medium | |
Thottolil et al. | Automatic building footprint extraction using random forest algorithm from high resolution google earth images: A feature-based approach | |
CN109753912A (en) | Tensor-based multispectral palmprint matching method | |
Jain | Automatic Fingerprint Matching Using Extended Feature Set | |
Huang et al. | Classification of very high spatial resolution imagery based on the fusion of edge and multispectral information | |
CN108021874A (en) | Hyperspectral endmember extraction preprocessing method based on combined spatial-spectral information | |
CN111401275B (en) | Information processing method and device for identifying grassland edge |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||