GB2526874A - Synthetic latent fingerprint images - Google Patents

Synthetic latent fingerprint images

Info

Publication number
GB2526874A
GB2526874A GB1410063.0A GB201410063A
Authority
GB
United Kingdom
Prior art keywords
fingerprint image
image
feature points
synthetic
initial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1410063.0A
Other versions
GB201410063D0 (en)
Inventor
Thomas Kirwan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
UK Secretary of State for Defence
Original Assignee
UK Secretary of State for Defence
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by UK Secretary of State for Defence filed Critical UK Secretary of State for Defence
Priority to GB1410063.0A priority Critical patent/GB2526874A/en
Publication of GB201410063D0 publication Critical patent/GB201410063D0/en
Priority to PCT/GB2015/000156 priority patent/WO2015185883A1/en
Publication of GB2526874A publication Critical patent/GB2526874A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1347 Preprocessing; Feature extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1335 Combining adjacent partial images (e.g. slices) to create a composite input or reference pattern; Tracking a sweeping finger movement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/13 Sensors therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Collating Specific Patterns (AREA)

Abstract

A method of manipulating a fingerprint image, with associated feature points, to create a synthetic latent fingerprint image. The method comprises: receiving an original image 2; transforming the initial image 4 onto the background of a synthetic image, using different transformations on separate regions of the fingerprint; using the same transformations for each associated region to map the original feature positions 6 to the correct points on the synthetic image; and outputting 8 the synthetic latent fingerprint image. The transformations may include resizing, rotating, smearing, or adding image noise. An original image may be used to generate one or more different synthetic images, which in turn may each be used to generate more. Feature points may include pores, dots, incipient ridges, ridge edge protrusions, or minutiae, which in turn may comprise ridge endings, ridge bifurcations, short ridges, islands, spurs or crossovers. This invention has possible use in forensic analysis training.

Description

SYNTHETIC LATENT FINGERPRINT IMAGES
Technical Field of the Invention
This invention relates to synthetic latent fingerprint images, and to the training of trainee latent fingerprint analysts.
Background to the Invention
Fingerprints are commonly left behind on objects after being handled, and may be analysed to help determine the identity of the person handling the objects. Latent fingerprint images of fingerprints may be developed by techniques such as powder dusting or cyanoacrylate ester fuming, as will be apparent to those skilled in the art, and then compared to database(s) of plain fingerprint images.
A plain fingerprint image of the person's finger may be taken under laboratory conditions, to include the whole area of the person's fingertip, and to give as clear a fingerprint as practically possible. A plain fingerprint image may for example comprise lines upon a uniform background, the lines corresponding to ridges of the finger.
However, latent fingerprint images developed from fingerprints that are unintentionally left behind on objects typically include the background colouration/texture of the object which was touched, may have imperfections such as smearing, broken lines, and poor contrast between lines and background, and may only comprise part of the person's full fingerprint. The term Latent Fingerprint thus pertains to recordings of friction ridges (e.g. epidermal or papillary ridges as found on fingertips) under arbitrary and non-ideal conditions, and is in contrast to a Plain Fingerprint where the fingerprint is recorded under controlled and typically ideal conditions.
The matching of latent fingerprint images to plain fingerprint images is commonly done by identifying feature points in the latent fingerprint images, and checking whether the positions and the types of these feature points correspond to the positions and types of the feature points in the plain fingerprint images. The feature points may for example comprise points where fingerprint ridges end or converge, known in the art as minutiae points, or points where pores in the fingerprint are present.
The identification of feature points within latent fingerprint images is a difficult task, and is typically carried out by latent fingerprint analysts, each of whom may require up to 5 years of training in how to accurately identify and classify feature points in latent fingerprint images.
The training requires the use of latent fingerprint images that have known feature points, so that a trainee latent fingerprint analyst can attempt to identify and classify the feature points in the latent fingerprint image, and so that the feature points identified by the trainee latent fingerprint analyst can be evaluated against the known feature points. Then the trainee latent fingerprint analyst can be informed of which points they failed to identify or classify correctly.
Latent fingerprint images with known feature points are difficult to create: they require someone to place a fingerprint on a surface, forensically triage the print, and image it.
They then require analysis by at least one or two experienced latent fingerprint analysts to identify and classify all the feature points that are present within them, before passing them to a trainee latent fingerprint analyst for training purposes.
Far more latent fingerprint images with known feature points are required to effectively train latent fingerprint analysts than are currently readily available from public databases.
It is therefore an object of the invention to provide a means for generating synthetic latent fingerprint images having known feature points and having at least one characteristic representative of those of real latent fingerprints as may be found in forensic investigations.
Summary of the Invention
According to one aspect of the invention, there is provided a method of generating a synthetic latent fingerprint image and positions of feature points within the synthetic latent fingerprint image. The method comprises:
- receiving an initial fingerprint image and positions of feature points within the initial fingerprint image;
- transforming the initial fingerprint image onto a background image to provide a synthetic latent fingerprint image, this step including transforming one region of the fingerprint image differently than a separate region of the fingerprint image;
- mapping the positions of the feature points within the initial fingerprint image to positions of the feature points within the synthetic latent fingerprint image, based upon the transforming; and
- outputting the synthetic latent fingerprint image and the positions of the feature points that are within the synthetic latent fingerprint image to enable the training of a fingerprint forensic analyst trainee, the training including the trainee identifying feature points of the synthetic latent fingerprint image and including assessment of a level of success of the trainee by comparison of positions of potential feature points identified by the trainee against the positions of the outputted feature points.
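By way of non-limiting illustration only (the patent does not prescribe any particular implementation), the four steps above might be sketched in Python as follows, assuming a simple rotate-and-shrink as the transformation and using the Pillow and NumPy libraries; all function and parameter names here are illustrative rather than taken from the patent.

```python
import numpy as np
from PIL import Image


def generate_synthetic_latent(initial, feature_points, background,
                              angle_deg=20.0, scale=0.8):
    """Rotate and shrink a plain fingerprint image, superimpose it on a
    background image, and map the feature points by the same transformation."""
    initial = initial.convert("L")
    w, h = initial.size

    # Step 1: transform the fingerprint image (a simple rotate-and-shrink here).
    fp = initial.rotate(angle_deg, expand=True, fillcolor=255)
    fp = fp.resize((max(1, int(fp.width * scale)), max(1, int(fp.height * scale))))

    # Step 2: map each feature point with the same rotation/scaling about the centre.
    theta = np.deg2rad(angle_deg)
    cx, cy = w / 2.0, h / 2.0
    ncx, ncy = fp.width / 2.0, fp.height / 2.0
    mapped = []
    for x, y in feature_points:
        dx, dy = x - cx, y - cy
        # Pillow rotates anticlockwise; the image y axis points downwards.
        rx = dx * np.cos(theta) + dy * np.sin(theta)
        ry = -dx * np.sin(theta) + dy * np.cos(theta)
        mapped.append((rx * scale + ncx, ry * scale + ncy))

    # Step 3: superimpose the transformed print onto the background image;
    # dark ridge pixels are pasted, white areas leave the background visible.
    out = background.convert("L").copy()
    out.paste(fp, (0, 0), mask=Image.eval(fp, lambda p: 255 - p))

    # Step 4: output the synthetic latent image and the mapped feature points.
    return out, mapped
```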
The method enables the generation of synthetic latent fingerprint images having known feature points, which can be used as latent fingerprint images having known feature points for the training and evaluation of latent fingerprint analysts.
The term Synthetic Latent Fingerprint Image pertains to an image of a fingerprint where the fingerprint has at least one characteristic of a Latent Fingerprint as may be found during a forensic investigation, as opposed to of a Plain Fingerprint.
The training of the fingerprint analyst is performed manually, in the sense that it is done by the trainee manually inspecting images. However, the images may be displayed on a computer screen by a computer for these purposes. Furthermore, the training may also be partly automated, requiring the trainee to input analysis in response to viewing the image, whereupon another image may be automatically displayed. The assessment of the level of success of the trainee may similarly be automated, and feedback may also be automatically provided by the above method.
Advantageously, to simulate the latent fingerprint images that are typically encountered in real life, the transforming of the initial fingerprint image onto the background image may comprise various operations such as:
- removing at least one region of the initial fingerprint image while retaining another region thereof. This may be performed to simulate when only part of a person's finger touches an object;
- altering the initial fingerprint image according to a non-linear spatial transform such that at least one region of the initial fingerprint image is enlarged or compressed in preference to another region thereof. This may be performed to simulate compression/stretching of parts of a fingertip surface that can occur when the finger touches an object;
- smearing at least one portion of the initial fingerprint image such that at least one region is smeared in preference to another region thereof. This may be performed to simulate lateral movement of parts of a fingertip along an object that can occur when the finger touches an object. This can be achieved optionally by providing a rotational smearing transformation with respect to an axis of rotation, that causes at least one region respectively distal to the axis of rotation to be smeared in preference to a region respectively proximate to the axis of rotation; and
- adding image noise to at least one region of the initial fingerprint image in preference to another region thereof. This may be performed to simulate image noise that may be introduced by processes such as powder dusting or cyanoacrylate ester fuming used to develop a fingerprint into a latent fingerprint image.
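As a purely illustrative sketch (not from the patent), two of these region-differential operations, removing one region of the print and adding noise to another, could be written as follows; the images are assumed to be 2-D greyscale NumPy arrays, and the rectangle coordinates and noise level are assumed parameters.

```python
import numpy as np


def remove_region(img, top, bottom, left, right):
    """Blank out one rectangular region (white background replaces the ridges),
    simulating only part of the fingertip touching the object."""
    out = img.copy()
    out[top:bottom, left:right] = 255
    return out


def add_noise_to_region(img, top, bottom, left, right, sigma=30.0, seed=None):
    """Add Gaussian image noise to one region only (e.g. the upper fifth),
    leaving the remainder of the image untouched."""
    rng = np.random.default_rng(seed)
    out = img.astype(np.float32)
    patch = out[top:bottom, left:right]
    out[top:bottom, left:right] = patch + rng.normal(0.0, sigma, size=patch.shape)
    return np.clip(out, 0, 255).astype(np.uint8)
```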
The method may further comprise repeating the step of transforming the initial fingerprint image onto a background image (e.g. transforming, mapping, and outputting) a plurality of times, with varying transformations. This may be used to generate a fingerprint image with more complex deterioration as compared to applying one transformation.
The method may further comprise repeating the steps of transforming, mapping, and outputting (optionally multiple times each), for example as listed above, to generate a plurality of synthetic latent fingerprint images from the initial fingerprint image.
Accordingly, just one initial fingerprint image with known feature points may be all that is required to generate a vast array of latent fingerprint images for training purposes. The parameters that are used to transform the initial fingerprint image into a synthetic latent fingerprint image may be randomly varied in order to automatically produce multiple synthetic latent fingerprint images, each one being different to the others.
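A hypothetical driver loop along these lines is sketched below; it assumes some single-image generator (for example the generate_synthetic_latent sketch given earlier) and draws random rotation and scaling parameters on each pass, so that every generated synthetic latent fingerprint image differs from the others.

```python
import random


def generate_training_set(generate_one, initial, feature_points, background,
                          n=50, seed=0):
    """Call a single-image generator n times with randomly drawn parameters.
    `generate_one` is assumed to accept angle_deg and scale keyword arguments
    and to return an (image, mapped_feature_points) pair."""
    rng = random.Random(seed)
    dataset = []
    for _ in range(n):
        angle = rng.uniform(-180.0, 180.0)   # random overall rotation
        scale = rng.uniform(0.6, 0.95)       # random size reduction
        image, mapped = generate_one(initial, feature_points, background,
                                     angle_deg=angle, scale=scale)
        dataset.append((image, mapped))
    return dataset
```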
Advantageously, the initial fingerprint image may be a plain fingerprint image comprising lines upon a uniform background, the lines corresponding to ridges of a finger. The plain fingerprint image may be used multiple times to generate multiple latent fingerprint images all having the same starting point of the plain fingerprint image.
Alternatively, a synthetic latent fingerprint image resulting from performing the method a first time may be used as the initial fingerprint image in a subsequent repetition of the method, to generate another synthetic latent fingerprint image. The generated synthetic latent fingerprint image will typically be of lower quality than the initial synthetic latent fingerprint image due to the additional transformation step, and so this method may be used to generate a series of synthetic latent fingerprint images which are progressively harder and harder to read.
The feature points may comprise minutiae points, the minutiae points being one or more of ridge endings, ridge bifurcations, short ridges, islands, spurs, and crossovers, etc., as will be apparent to those skilled in the art.
The feature points may further comprise one or more of pores, dots, incipient ridges, and ridge edge protrusions to further test and train the abilities of latent print analysts.
Advantageously, the background image may comprise at least one of texture and patterning. The texture and patterning may be designed to increase the difficulty of identifying and classifying feature points of the generated synthetic latent fingerprint image, and/or may be designed to simulate a surface of an object upon which a fingerprint may be left and a latent fingerprint image developed.
The term Region pertains to a non-trivial area of the fingerprint image (and in practice is a region within a boundary defined by the fingerprint in the fingerprint image). For the purposes of this invention a transformation step is applied differentially to two non-overlapping regions, or only to one of two non-overlapping regions. The regions may be neighbouring or distal within the image, and there may be further regions where the transformation step is further applied differently or to a different extent. The application of the transformation may vary in the extent it is applied according to a smooth function (for example in the case of rotational smearing where the amount of smearing varies smoothly with position), and it is only necessary to be able to identify two non-trivially sized regions (e.g. large enough for a feature point to at least originally have been identifiable therein) where the transformation has provided a different effect, or an effect to a different degree, therebetween.
Uniform changes such as changing the contrast ratio, brightness, position, or overall orientation of the image do not provide this, and neither does uniformly adding image noise or a background pattern, or linearly stretching or compressing the image. However, rotational smearing does so, because the amount of smearing varies depending on proximity to an axis of rotation thereof.
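One way such a rotational smearing could be approximated, offered only as an assumption rather than the patent's own method, is to average the image over several small rotations about a chosen axis: pixels distal to the axis travel further per degree of rotation and are therefore smeared more than pixels proximate to it, giving the region-differential effect described above.

```python
import numpy as np
from PIL import Image


def rotational_smear(img, centre, max_angle_deg=6.0, steps=12):
    """Average a greyscale image over several small rotations about `centre`.
    Regions far from the axis move further per step and so smear more than
    regions close to the axis; the smear amount varies smoothly with position."""
    img = img.convert("L")
    acc = np.zeros((img.height, img.width), dtype=np.float64)
    for angle in np.linspace(-max_angle_deg / 2.0, max_angle_deg / 2.0, steps):
        # No `expand`, so every rotated copy keeps the original image size.
        rotated = img.rotate(angle, center=centre, fillcolor=255)
        acc += np.asarray(rotated, dtype=np.float64)
    return Image.fromarray((acc / steps).astype(np.uint8))
```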
Optionally the invention includes the step of manually training a trainee latent fingerprint analyst to analyse latent fingerprints, this step including presenting the trainee with a synthetic latent fingerprint image as generated in the above method, requiring the trainee to attempt to identify and/or classify feature points in the latent fingerprint image, evaluating feature points identified by the trainee against the known feature points, and informing the trainee of feature points he or she failed to identify or classify correctly.
According to another aspect of the invention there is provided a computer program product to configure a computer to perform the method of generating synthetic latent fingerprints of the first aspect of the invention. The computer program product may for example be embodied in a computer readable storage media storing the computer program for configuring and controlling a computer to perform the above-described methods, or the computer program product may be in the form of a digital network signal transmitted to the computer via a communications network such as the Internet.
According to another aspect of the invention there is provided a synthetic latent fingerprint image generator for generating synthetic latent fingerprint images for permitting the manual training of a fingerprint forensic analyst trainee, the manual training including the trainee identifying feature points of the synthetic latent fingerprint image and including assessment of a level of success of the trainee by comparison of positions of potential feature points identified by the trainee against the positions of the outputted feature points, the synthetic latent fingerprint image generator comprising:
- an input for receiving an initial fingerprint image and positions of feature points within the initial fingerprint image;
- a processor configured to transform the initial fingerprint image onto a background image to provide a synthetic latent fingerprint image, this including transforming one region of the fingerprint image differently than a separate region of the fingerprint image, and to map the positions of the feature points within the initial fingerprint image to positions of the feature points within the synthetic latent fingerprint image, based upon the transforming; and
- an output for outputting the synthetic latent fingerprint image with the positions of the feature points that are within the synthetic latent fingerprint image.
Brief Description of the Drawings
Illustrative embodiments of the invention will now be described by way of example only and with reference to the accompanying drawings, in which:
Fig. 1 shows a flow diagram of a method according to an embodiment of the invention;
Fig. 2 shows a schematic diagram of a computing device configured as a synthetic latent fingerprint image generator for performing the method of Fig. 1;
Fig. 3 shows a plain fingerprint image having feature points, the plain fingerprint image and feature points suitable for use as an initial fingerprint image and initial feature points in the method of Fig. 1;
Fig. 4 shows a schematic diagram of an orange bottle having a patterned label;
Fig. 5 shows a synthetic latent fingerprint image based on transforming the plain fingerprint image of Fig. 3 onto the patterned label of Fig. 4, according to a first type of transformation;
Fig. 6 shows a synthetic latent fingerprint image based on transforming the plain fingerprint image of Fig. 3 onto the patterned label of Fig. 4, according to a second type of transformation;
Fig. 7 shows a synthetic latent fingerprint image based on transforming the plain fingerprint image of Fig. 3 onto the patterned label of Fig. 4, according to a third type of transformation;
Fig. 8 shows a synthetic latent fingerprint image based on transforming the plain fingerprint image of Fig. 3 onto the patterned label of Fig. 4, according to a fourth type of transformation;
Fig. 9 shows a rotation and reduction of the plain fingerprint image of Fig. 3, the rotation and reduction forming a first step of fifth and sixth types of transformation;
Fig. 10 shows an image of a pressure map that is used in the fifth and sixth types of transformation;
Fig. 11 shows an intermediate image of the fifth and sixth types of transformation, the intermediate image generated by applying the pressure map of Fig. 10 to the rotated and reduced plain fingerprint image of Fig. 9;
Fig. 12 shows a synthetic latent fingerprint image based on superimposing the intermediate image of Fig. 11 onto the patterned label of Fig. 4, according to the fifth type of transformation;
Fig. 13 shows a further intermediate image of the sixth type of transformation, the further intermediate image generated by applying motion blur to the intermediate image of Fig. 11; and
Fig. 14 shows a synthetic latent fingerprint image based on superimposing the further intermediate image of Fig. 13 onto the patterned label of Fig. 4, according to the sixth type of transformation.
Detailed Description
An embodiment of the invention will now be described with reference to the flow diagram of Fig. 1 and the schematic diagram of Fig. 2.
The schematic diagram of Fig. 2 shows a computing device CD, for example a personal computer. The computing device CD comprises a processor uP and a memory MRY for use by the processor. The computing device CD has an input IP such as a connection to a network, disk drive, or scanner. The computing device CD also has an output OP, such as a connection to a network, disk drive, or printer. The computing device CD has been configured using a computer software program product to carry out the method shown in Fig. 1.
Referring to Fig. 1, in a first step 2 the computing device CD receives an initial fingerprint image from the input IP, for example by requesting it from a network. A set of initial feature points of the initial fingerprint image is also received, either along with the initial fingerprint image, or by an experienced fingerprint analyst analysing the initial fingerprint image and entering the initial feature points into the computing device manually.
In a subsequent step 4, the processor uP of the computing device transforms the initial fingerprint image onto a background image that is retrieved from the memory MRY, including transforming one region of the fingerprint image differently than a separate region of the fingerprint image. This results in a synthetic latent fingerprint image which may be useful in a training program. The background image is typically an image of a surface upon which fingerprints may be found, for example a surface of a door handle or a carry crate.
In a step 6, the processor uP maps the positions of the initial feature points to positions where the same feature points will be found in the latent fingerprint image. The mapping is based on the operations that were carried out on the initial fingerprint image to transform it onto the background image. The mapped feature points therefore designate the positions within the latent fingerprint image where the feature points of the initial fingerprint image will be found.
In a final step 8, the processor uP outputs the transformed image, which is the synthetic latent fingerprint image, to the output OP, for example by sending the synthetic latent fingerprint image over a network. The mapped feature points are also output to the output OP, along with the synthetic latent fingerprint image. The synthetic latent fingerprint image can then be presented to trainee latent fingerprint analysts with an instruction to identify and classify the feature points within the synthetic latent fingerprint image, and with the mapped feature points constituting the correct answers to that instruction.
In alternate embodiments, the step 6 may be carried out before the step 4, or the steps 4 and 6 may be performed substantially simultaneously to one another. The processor uP is configured to randomly select and apply differing types and degrees of transformations to the initial fingerprint image when transforming the initial fingerprint image onto the background image, such that the steps 4, 6, and 8 can be repeated again and again to keep producing new synthetic latent fingerprint images.
Fig. 3 shows a plain fingerprint image 10. The plain fingerprint image 10 is a high-quality fingerprint image as may be obtained in a laboratory from a person willingly providing their fingerprint. The plain fingerprint image covers substantially the whole of the person's fingertip, and is clearly defined on a uniform (typically white) background.
There are five feature points that have been highlighted on the plain fingerprint image of Fig. 3, at the positions of five features 11, 12, 13, 14, and 15. The feature 11 is the termination of a finger ridge approaching from the bottom right side of the image, the feature 12 is the termination of a finger ridge approaching from the bottom left side of the image, the feature 13 is the joining together of two finger ridges approaching from the bottom left side of the image, the feature 14 is the termination of a finger ridge approaching from the bottom of the image, and the feature 15 is the termination of a finger ridge approaching from the bottom right of the image.
Each feature point comprises a position where the corresponding feature (minutia) is present in the image, and in this particular embodiment further comprises the type of the feature, so that trainee latent fingerprint analysts can be tested on identifying the feature type as well as identifying the feature position.
In the method of Fig. 1, the plain fingerprint image 10 is suitable for use as an initial fingerprint image, and the feature points marked on Fig. 3 are suitable for use as initial feature points.
Fig. 4 shows a diagram of an Orange drink bottle 20 having a patterned label 22. The Orange bottle 20 simulates a typical surface upon which latent fingerprint images may be found, and may be used in conjunction with the plain fingerprint image 10 of Fig. 3 to synthesise synthetic latent fingerprint images according to the following illustrative embodiments of the invention. In particular, the surface region 24 of the patterned label 22 is used as a background image 25 in the following four different types of transformations, which are described below with reference to Figs. 5-8.
Fig. 5 shows a synthetic latent fingerprint image 30 generated from the plain fingerprint image 10 and the background image 25 using a first type of transformation. The transformation comprises slightly rotating the image 10 anticlockwise, reducing the rotated image in size to increase the difficulty of identifying features, faintening and removing the area 32 of the image to simulate light fingertip pressure at the area 32, and adding noise 34 to the upper fifth of the image to simulate imperfect latent fingerprint generation techniques.
Finally, the image was superimposed on the background image 25 to give the synthetic latent fingerprint image 30.
The feature points of the plain fingerprint image 10 are mapped according to the same transformations as described above, and the positions of the features 14 and 15 in the synthetic latent fingerprint image 30 are marked on Fig. 5. Specifically, the feature points of features 14 and 15 in the plain fingerprint image are rotated about one another with the same rotation as in the above transformation, the feature points are moved closer to one another according to the reduction of the image size in the above transformation, and the feature points are defined with respect to the background image in the above superimposition. The above transformation steps of faintening and noise addition clearly do not have any influence on the positions of the features, and so are not taken into account when performing the mapping.
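For illustration only, the point mapping described above can be expressed by composing the rotation, the size reduction and the placement on the background as homogeneous 3x3 matrices and applying the composite matrix to each feature-point position; this matrix representation is an assumption, as the patent does not specify a particular one.

```python
import numpy as np


def map_points(points, angle_deg, scale, centre, offset):
    """Map (x, y) feature points by an anticlockwise rotation about `centre`,
    a uniform size reduction, and a translation by `offset` onto the background."""
    theta = np.deg2rad(angle_deg)
    cx, cy = centre
    to_origin = np.array([[1.0, 0.0, -cx], [0.0, 1.0, -cy], [0.0, 0.0, 1.0]])
    # Anticlockwise rotation expressed in image coordinates (y axis pointing down).
    rot = np.array([[np.cos(theta), np.sin(theta), 0.0],
                    [-np.sin(theta), np.cos(theta), 0.0],
                    [0.0, 0.0, 1.0]])
    scl = np.diag([scale, scale, 1.0])
    place = np.array([[1.0, 0.0, cx + offset[0]],
                      [0.0, 1.0, cy + offset[1]],
                      [0.0, 0.0, 1.0]])
    composite = place @ scl @ rot @ to_origin
    pts = np.hstack([np.asarray(points, dtype=float), np.ones((len(points), 1))])
    return (pts @ composite.T)[:, :2]
```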
The synthetic latent fingerprint image 30 could be presented to a trainee latent fingerprint analyst with an instruction to identify feature points, and the identified feature points could be compared against the mapped feature points above for training purposes, for example to evaluate the trainee latent fingerprint analyst's performance.
Fig. 6 shows a synthetic latent fingerprint image 40 generated from the plain fingerprint image 10 and the background image 25 using a second type of transformation. The transformation comprises rotating the image 10 clockwise by around 180 degrees, removing the area 42 of the image to simulate a smearing of the fingertip across the surface, and adding noise 44 to the bottom left and upper right corners of the image to simulate imperfect latent fingerprint generation techniques. Finally, the image was superimposed on the background image 25 to give the synthetic latent fingerprint image 40.
The feature points of the plain fingerprint image 10 are mapped according to the same transformations, and the positions of the features 12, 14, and 15 in the synthetic latent fingerprint image 40 are marked on Fig. 6.
Fig. 7 shows a synthetic latent fingerprint image 50 generated from the plain fingerprint image 10 and the background image 25 using a third type of transformation. The transformation comprises rotating the image 10 clockwise by around 145 degrees, reducing the rotated image in size to increase the difficulty of identifying features, cropping away the area 52 of the image to simulate only a central region of the fingertip touching the surface, and adding noise 54 to the upper right corner of the image to simulate imperfect latent fingerprint generation techniques. Finally, the image was superimposed on the background image 25 to give the synthetic latent fingerprint image 50.
The feature points of the plain fingerprint image 10 are mapped according to the same transformations, and the positions of the features 11, 13, and 15 in the synthetic latent fingerprint image 50 are marked on Fig. 7.
Fig. 8 shows a synthetic latent fingerprint image 60 generated from the plain fingerprint image 10 and the background image 25 using a fourth type of transformation. The transformation comprises rotating the image 10 clockwise by around 170 degrees, reducing the rotated image in size to increase the difficulty of identifying features, compressing the right side of the rotated image but not the left side (a non-linear transform) to simulate compression and squidging together of the fingerprint ridges of the fingertip at the right side of the rotated image, and adding noise 64 to the upper right corner of the image to simulate imperfect latent fingerprint generation techniques. Finally, the image was superimposed on the background image 25 to give the synthetic latent fingerprint image 60.
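A minimal sketch of such a one-sided, non-linear compression is given below, assuming a simple piecewise coordinate remapping in which the left half of the print keeps its width while the right half is squeezed horizontally; the compression factor is an illustrative parameter, not a value taken from the patent.

```python
import numpy as np


def compress_right_half(img, factor=0.6):
    """Squeeze the right half of a 2-D greyscale array horizontally by `factor`
    while the left half keeps its original width (a simple non-linear warp)."""
    h, w = img.shape
    mid = w // 2
    new_w = mid + int((w - mid) * factor)
    xs = np.arange(new_w)
    # Identity mapping on the left half; the right half of the source is read
    # at a faster rate so that it is compressed into fewer output columns.
    src_x = np.where(xs < mid, xs, mid + (xs - mid) / factor).astype(int)
    src_x = np.clip(src_x, 0, w - 1)
    return img[:, src_x]
```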
The feature points of the plain fingerprint image 10 are mapped according to the same transformations, and the positions of the features 11 and 15 in the synthetic latent fingerprint image 60 are marked on Fig. 8.
Further details on how the various image transformations could be performed will now be described with reference to Figs. 9 to 14, and in relation to fifth and sixth types of image transformation. According to the fifth and sixth types of image transformation, the plain fingerprint image 10 is rotated anticlockwise by around 115 degrees, and reduced in size to provide a rotated and reduced plain fingerprint image 70 that is shown in Fig. 9. The feature points of the plain fingerprint image are also rotated and reduced by the same amounts, to provide rotated and reduced feature points that designate the positions of the features 11-15 of the rotated and reduced plain fingerprint image 70.
Then, a pressure map 80 that is shown in Fig. 10 is generated. The pressure map defines pressures at which a simulated fingertip touches the surface region 24 of the Orange bottle 20, and comprises white areas where the fingertip presses against the surface region 24 with high pressure, and black areas where the fingertip either presses against the surface region 24 with lower pressure or does not press against the surface region 24 at all. Clearly, in practice, the pressure map would have more pressure graduations than just the two black and white pressure graduations shown in Fig. 10. For example, a greyscale pressure map with 256 pressure graduations may be used.
Next, the pressure map 80 is applied to the rotated and reduced plain fingerprint image 70, which results in the intermediate fingerprint image 90 that is shown in Fig. 11. The areas of the rotated and reduced plain fingerprint image 70 that the pressure map 80 indicates the simulated fingertip contacts the surface region 24 with high pressure (white) are maintained in the intermediate fingerprint image 90, and the areas of the rotated and reduced plain fingerprint image 70 that the pressure map 80 indicates the simulated fingertip contacts the surface region 24 with low pressure (black) are removed in the intermediate fingerprint image 90. The rotated and reduced feature points remain unchanged by this step. The features 11, 12, 14, and 15 that are within still-visible parts of the fingerprint image are indicated on Fig. 11.
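A minimal sketch of this masking step is given below, assuming a greyscale pressure map as suggested above: where the map is bright the ridges are kept, and where it is dark they fade towards the white background. The function name and blending rule are illustrative assumptions rather than the patent's own implementation.

```python
import numpy as np
from PIL import Image


def apply_pressure_map(fingerprint, pressure):
    """Blend each pixel of the fingerprint image between its ridge value and
    white (255) according to the local value of a greyscale pressure map:
    bright map areas keep the ridges, dark areas remove them entirely."""
    fp = np.asarray(fingerprint.convert("L"), dtype=np.float32)
    pm = np.asarray(pressure.convert("L").resize(fingerprint.size),
                    dtype=np.float32) / 255.0
    out = fp * pm + 255.0 * (1.0 - pm)
    return Image.fromarray(out.astype(np.uint8))
```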
Subsequently, according to the fifth type of transformation, the intermediate fingerprint image 90 is superimposed onto the background image 25, for example by adding the images together. This results in a synthetic latent fingerprint image 100 that is shown in Fig. 12. The rotated and reduced feature points are then defined with respect to the synthetic latent fingerprint image 100, by translating them from a co-ordinate system of the intermediate fingerprint image 90 to a co-ordinate system of the synthetic latent fingerprint image 100. No changes to the rotated and reduced feature points may be required if the background image was the same size and orientation as the intermediate fingerprint image 90, and therefore the same size and orientation as the synthetic latent fingerprint image 100.
According to the sixth type of transformation, the intermediate fingerprint image 90 is further transformed before superimposing it on the background image 25. In particular, the intermediate fingerprint image 90 is processed by a motion blur filter to produce a further intermediate fingerprint image 110 that is shown in Fig. 13.
The motion blur filter applies a smearing effect to the intermediate fingerprint image 90, to simulate smearing of a fingerprint that is left behind on the surface region 24 of the Orange bottle 20, as the fingertip is withdrawn from the surface region 24 at a non-perpendicular angle to the surface region 24. The feature points may be slightly translated in the direction of motion blur and in accordance with the amount of motion blur that is applied.
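One way such a motion blur filter could be realised, offered as an assumption rather than the patent's prescribed method, is to average copies of the image shifted along the smear direction and then nudge the feature points by half of the total shift, as sketched below.

```python
import numpy as np


def motion_blur(img, shift_px=9):
    """Smear a 2-D greyscale array horizontally by averaging `shift_px`
    shifted copies of it (np.roll wraps at the border, which is harmless
    for a print surrounded by white padding)."""
    acc = np.zeros(img.shape, dtype=np.float64)
    for s in range(shift_px):
        acc += np.roll(img, s, axis=1).astype(np.float64)
    return (acc / shift_px).astype(np.uint8)


def shift_points(points, shift_px=9):
    """Nudge feature points by half the blur length in the blur direction."""
    return [(x + shift_px / 2.0, y) for x, y in points]
```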
Then, the further intermediate fingerprint image 110 is superimposed upon the background image 25 in just the same manner as described above in relation to the fifth type of transformation, producing the synthetic latent fingerprint image 120 that is shown in Fig. 14.
The above embodiments are for illustrative purposes only, and greyscale or colour images may be used in practice to greatly improve the intelligibility of the images, rather than the pure black and white images described herein. Many types of image processing methods in addition to those described herein may be used to produce image transformations that simulate latent fingerprints, as will be apparent to those skilled in the art. Various other embodiments of the invention falling within the scope of the appended claims will also be apparent to those skilled in the art.

Claims (16)

  1. A method of generating a synthetic latent fingerprint image and positions of feature points within the synthetic latent fingerprint image, comprising:
  - receiving an initial fingerprint image and positions of feature points within the initial fingerprint image;
  - transforming the initial fingerprint image onto a background image to provide a synthetic latent fingerprint image, this step including transforming one region of the fingerprint image differently than a separate region of the fingerprint image;
  - mapping the positions of the feature points within the initial fingerprint image to positions of the feature points within the synthetic latent fingerprint image, based upon the transforming; and
  - outputting the synthetic latent fingerprint image and the positions of the feature points that are within the synthetic latent fingerprint image to enable the manual training of a fingerprint forensic analyst trainee, the manual training including the trainee identifying feature points of the synthetic latent fingerprint image and including assessment of a level of success of the trainee by comparison of positions of potential feature points identified by the trainee against the positions of the outputted feature points.
  2. The method of claim 1, wherein transforming the initial fingerprint image onto the background image comprises applying only a part of the initial fingerprint image onto the background image.
  3. The method of claim 1 or 2, wherein transforming the initial fingerprint image onto the background image comprises at least one of re-sizing and/or rotating a region of the initial fingerprint image differentially as compared to at least one other region thereof.
  4. The method of any preceding claim, wherein transforming the initial fingerprint image onto the background image comprises altering the initial fingerprint image according to a non-linear spatial transform.
  5. The method of any preceding claim, wherein transforming the initial fingerprint image onto the background image comprises smearing a region of the initial fingerprint image differentially as compared to at least one other region thereof.
  6. The method of any preceding claim, wherein transforming the initial fingerprint image onto the background image comprises adding image noise to a region of the initial fingerprint image differentially as compared to at least one other region thereof.
  7. The method of any preceding claim, wherein the initial fingerprint image is a plain fingerprint image comprising lines upon a uniform background, the lines corresponding to ridges of a finger.
  8. The method of any preceding claim, wherein the output synthetic fingerprint image and the positions of the feature points that are within the synthetic latent fingerprint image are used as an initial fingerprint image and positions of feature points within the initial fingerprint image, in a repetition of the steps of transforming, mapping, and outputting to generate another synthetic latent fingerprint image.
  9. The method of any preceding claim, further comprising repeating the steps of transforming, mapping, and outputting a plurality of times, with varying transformations, to generate a plurality of synthetic latent fingerprint images from the initial fingerprint image.
  10. The method of any preceding claim, wherein the feature points comprise minutiae points, the minutiae points being one or more of ridge endings, ridge bifurcations, short ridges, islands, spurs, and crossovers.
  11. The method of any preceding claim, wherein the feature points comprise one or more of pores, dots, incipient ridges, and ridge edge protrusions.
  12. The method of any preceding claim, wherein the background image comprises at least one of texture and patterning.
  13. A computer program product to configure a computer to perform the method of any preceding claim.
  14. The computer program product of claim 13, wherein the computer program product is a computer readable storage media storing a computer program for configuring a computer to perform the method of any one of claims 1 to 12.
  15. A synthetic latent fingerprint image generator for generating synthetic latent fingerprint images for permitting the manual training of a fingerprint forensic analyst trainee, the manual training including the trainee identifying feature points of the synthetic latent fingerprint image and including assessment of a level of success of the trainee by comparison of positions of potential feature points identified by the trainee against the positions of the outputted feature points, the synthetic latent fingerprint image generator comprising:
  - an input for receiving an initial fingerprint image and positions of feature points within the initial fingerprint image;
  - a processor configured to transform the initial fingerprint image onto a background image to provide a synthetic latent fingerprint image, this including transforming one region of the fingerprint image differently than a separate region of the fingerprint image, and to map the positions of the feature points within the initial fingerprint image to positions of the feature points within the synthetic latent fingerprint image, based upon the transforming; and
  - an output for outputting the synthetic latent fingerprint image with the positions of the feature points that are within the synthetic latent fingerprint image.
  16. A method of generating a synthetic latent fingerprint image and positions of feature points within the synthetic latent fingerprint image, the method substantially as described herein with reference to the accompanying drawings.
GB1410063.0A 2014-06-06 2014-06-06 Synthetic latent fingerprint images Withdrawn GB2526874A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1410063.0A GB2526874A (en) 2014-06-06 2014-06-06 Synthetic latent fingerprint images
PCT/GB2015/000156 WO2015185883A1 (en) 2014-06-06 2015-06-03 Synthetic latent fingerprint images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1410063.0A GB2526874A (en) 2014-06-06 2014-06-06 Synthetic latent fingerprint images

Publications (2)

Publication Number Publication Date
GB201410063D0 GB201410063D0 (en) 2014-07-16
GB2526874A true GB2526874A (en) 2015-12-09

Family

ID=51214837

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1410063.0A Withdrawn GB2526874A (en) 2014-06-06 2014-06-06 Synthetic latent fingerprint images

Country Status (2)

Country Link
GB (1) GB2526874A (en)
WO (1) WO2015185883A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107239737A (en) * 2017-05-03 2017-10-10 广东欧珀移动通信有限公司 A kind of optical finger print recognition methods and Related product
CN108307041A (en) * 2017-12-26 2018-07-20 努比亚技术有限公司 A kind of method, mobile terminal and storage medium obtaining operational order according to fingerprint
CN111639521B (en) * 2020-04-14 2023-12-01 天津极豪科技有限公司 Fingerprint synthesis method, fingerprint synthesis device, electronic equipment and computer readable storage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101149787A (en) * 2006-09-20 2008-03-26 中国科学院自动化研究所 Fingerprint synthesis method based on orientation field model and Gabor filter

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3103045A1 (en) * 2019-11-07 2021-05-14 Idemia Identity & Security France A method of augmenting a training image base representing a fingerprint on a background using a generative adversarial network
US11610081B2 (en) 2019-11-07 2023-03-21 Idemia Identity & Security France Method for augmenting a training image base representing a print on a background by means of a generative adversarial network

Also Published As

Publication number Publication date
GB201410063D0 (en) 2014-07-16
WO2015185883A1 (en) 2015-12-10

Similar Documents

Publication Publication Date Title
US10685224B1 (en) Unsupervised removal of text from form images
Dhivya et al. Copy-move forgery detection using SURF feature extraction and SVM supervised learning technique
Qidwai et al. Digital image processing: an algorithmic approach with MATLAB
An et al. Face image super-resolution using 2D CCA
Davenport et al. Joint manifolds for data fusion
Lei et al. Scale insensitive and focus driven mobile screen defect detection in industry
DE102017009276A1 (en) TO PRODUCE A THREE-DIMENSIONAL MODEL FROM A SCANNED OBJECT
GB2526874A (en) Synthetic latent fingerprint images
JP2019523474A (en) Method for extended authentication of material subject
Priesnitz et al. Syncolfinger: Synthetic contactless fingerprint generator
Priesnitz et al. Deep learning-based semantic segmentation for touchless fingerprint recognition
US8655084B2 (en) Hand-based gender classification
Panetta et al. Unrolling post-mortem 3D fingerprints using mosaicking pressure simulation technique
CN113963193A (en) Method and device for generating vehicle body color classification model and storage medium
Sakurai et al. Restoring aspect ratio distortion of natural images with convolutional neural network
CN115965844B (en) Multi-focus image fusion method based on visual saliency priori knowledge
Makihara et al. Grasp pose detection for deformable daily items by pix2stiffness estimation
Singh et al. Implementation and evaluation of DWT and MFCC based ISL gesture recognition
CN115546796A (en) Non-contact data acquisition method and system based on visual computation
Viacheslav et al. Low-level features for inpainting quality assessment
Mohedano et al. Improving object segmentation by using EEG signals and rapid serial visual presentation
CN113191942A (en) Method for generating image, method for training human detection model, program, and device
CN112380995B (en) Face recognition method and system based on deep feature learning in sparse representation domain
van der Maaten et al. Identifying the real van gogh with brushstroke textons
Li et al. Evaluating the efficacy of skincare product: A realistic short-term facial pore simulation

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)