WO2019021173A1 - System for and method of adjusting the orientation of a captured image of a skewed fingerprint - Google Patents

System for and method of adjusting the orientation of a captured image of a skewed fingerprint

Info

Publication number
WO2019021173A1
Authority
WO
WIPO (PCT)
Prior art keywords
navigation
reference point
foreground
value
axis
Prior art date
Application number
PCT/IB2018/055494
Other languages
French (fr)
Inventor
Ishmael Sbusiso MSIZA
Original Assignee
Mmapro It Solutions (Pty) Ltd
Priority date
Filing date
Publication date
Application filed by Mmapro It Solutions (Pty) Ltd filed Critical Mmapro It Solutions (Pty) Ltd
Publication of WO2019021173A1 publication Critical patent/WO2019021173A1/en
Priority to ZA2020/01145A priority Critical patent/ZA202001145B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1347Preprocessing; Feature extraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/24Aligning, centring, orientation detection or correction of the image
    • G06V10/242Aligning, centring, orientation detection or correction of the image by image rotation, e.g. by 90 degrees

Definitions

  • THIS invention is in the field of a system for and a method of adjusting the orientation of a captured image of a skewed fingerprint, and is also in the field of a system for and method of establishing a centroid of a captured image of a skewed fingerprint for use in adjusting the orientation of the skewed fingerprint.
  • the performance of an automated fingerprint recognition system is often quantified in terms of its match rate, non-match rate, and execution speed.
  • the system's performance can be adversely affected when a captured fingerprint is rotated when measured against the image background and is not in an upright position at an angle of approximately 0 degrees relative to the vertical axis of the captured image.
  • Fingerprints can be ordered into a number of categories, known as fingerprint classes, by performing fingerprint analytics on captured images of fingerprints.
  • fingerprint classes include the Central Twins (CT), Tented Arch (TA), Left Loop (LL), Right Loop (RL), and Plain Arch (PA). These fingerprint classes are often determined based on the relationship between fingerprint characteristics or landmarks known as singular points.
  • a fingerprint core is forensically defined as the inner-most turning point of a fingerprint loop.
  • a fingerprint delta is a point where the fingerprint ridges tend to form a triangular shape.
  • a fingerprint loop is formed by a ridge pattern that emanates from one side of a fingerprint, flows inwards, and returns in the original direction.
  • one main disadvantage of using the fingerprint core as a reference is the fact that not all types of fingerprints have a core as part of their pattern, such as fingerprints that belong to the PA fingerprint class. Furthermore, locating the fingerprint core could be cumbersome or even impossible in some cases.
  • the term 'foreground' of a fingerprint image is often used by those skilled in the art to refer to the part of the image including the alternating ridge-furrow patterns
  • the term 'background' of a fingerprint image is often used by those skilled in the art to refer to the remaining part of the image (i.e. the part of the image that does not contain the fingerprint). Accordingly, these terms should be understood, for purposes of this specification, as embracing such meanings.
  • a method of adjusting the orientation of a captured image of a skewed fingerprint comprising the steps of: separating a foreground of the captured image from a background of the captured image; estimating a centroid of the foreground with respect to a predefined reference point that is located in one of the foreground and background of the captured image of the fingerprint, wherein the estimated centroid of the foreground defines a first foreground axis; estimating an angle of orientation of a predefined point of the foreground of the captured image with respect to the first foreground axis of the estimated centroid; and pivoting or rotating the captured image by the estimated orientation angle so as to correct the orientation of the skewed fingerprint in the captured image.
  • a system for adjusting the orientation of a captured image of a skewed fingerprint comprising: a processor; and a memory which is connected to the processor, the memory containing instructions which when executed causes the processor to: provide data relating to the captured image; separate a foreground of the image from a background of the image; estimate a centroid of the foreground with respect to a predefined point of reference that is located in one of the foreground and background of the captured image, wherein the estimated centroid of the foreground defines a first foreground axis; estimate an angle of orientation of a predefined point of the foreground of the captured image with respect to the first foreground axis of the estimated centroid; and pivot or rotate the image by the estimated orientation angle.
  • a non-transitory computer readable device storing instructions thereon which when executed by a processor of a computing device performs the functions of: separating, by means of at least one processor of the computing device, a foreground of the captured image from a background of the captured image; estimating, by means of at least one processor of the computing device, a centroid of the foreground with respect to a predefined reference point that is located in one of the foreground and background of the captured image of the fingerprint, wherein the estimated centroid of the foreground defines a first foreground axis; estimating, by means of at least one processor of the computing device, an angle of orientation of a predefined point of the foreground of the captured image with respect to the first foreground axis of the estimated centroid; and
  • a method of establishing a centroid of a foreground of a captured image of a skewed fingerprint comprising: providing a captured image of a skewed fingerprint, the captured image having a foreground and background; and estimating a centroid of the foreground with respect to a predefined reference point located in one of the foreground and background of the captured image of the fingerprint.
  • a system for establishing a centroid of a foreground of a captured image of a skewed fingerprint comprising: a processor; and a memory that is coupled to the processor, the memory containing instructions which when executed by the processor causes the processor to: provide a captured image of a skewed fingerprint, the captured image having a foreground and background; and estimate the location of a centroid of the foreground with respect to a predefined reference point in one of the foreground and background of the captured image of the fingerprint.
  • a non-transitory device which when executed by a processor of a computing device causes the processor to perform the actions of: providing a captured image of a skewed fingerprint, the captured image having a foreground and background; and estimating a centroid of the foreground with respect to a predefined reference point that is located in one of the foreground and background of the captured image of the fingerprint.
  • FIG. 1 shows a prior art reference point utilised in fingerprint analysing techniques;
  • FIG. 2 is an example illustration of a fingerprint that would be classified under the Plain Arch (PA) fingerprint class;
  • FIG. 3 is a flow diagram illustrating steps in a method of adjusting the orientation of a captured image of a fingerprint according to the invention;
  • FIG. 4 is a first example captured image of a fingerprint;
  • FIG. 5 is the image of Figure 4, with a contrast thereof being enhanced;
  • FIG. 6 is a mask image generated by a foreground separation module shown in Figure 3;
  • FIG. 7 is a second example captured image of a fingerprint, wherein the fingerprint is rotated in a counter-clockwise direction;
  • FIG. 8 is a schematic representation showing, as a line, the interface of the foreground and background of the fingerprint of Figure 7;
  • FIG. 9 is an adjusted or corrected image which is produced by a correcting module shown in Figure 3 according to the invention; and
  • FIG. 10 is a high-level block diagram illustrating a system for adjusting the orientation of a captured image of a fingerprint.
  • example embodiments of a method of and a system for adjusting the orientation of a captured image of a skewed fingerprint are generally designated by the reference numeral 10 in Figures 3 and 10.
  • Figure 1 shows a prior art technique wherein a fingerprint core 12 is utilised as a reference point in fingerprint analytics.
  • Figure 2 shows a fingerprint belonging to the Plain Arch (PA) fingerprint class, in which a ridge pattern 14 emanates from a left- hand side of the fingerprint and terminates at a right-hand side. Because the ridge pattern does not form a loop, this fingerprint does not have a fingerprint core. A fingerprint delta is generally also not present in PA fingerprints.
  • prior art techniques which utilise a fingerprint core as a reference fail when presented with a fingerprint that belongs to the PA fingerprint class.
  • in Figure 3 there is shown a flow diagram of the method 10 of adjusting the orientation of a captured image of a skewed fingerprint.
  • the method 10 comprises, at 16, capturing of the image of a fingerprint with a fingerprint reader. It should be appreciated that, in other embodiments of the invention, the image could be captured by scanning a physical representation of a fingerprint.
  • an optional step is performed wherein a contrast of the captured image is adjusted and/or enhanced.
  • a foreground of the captured image is separated from a background of the image.
  • data relating to a centroid of the foreground is generated by a centroid location module, as will be described in more detail below.
  • an orientation angle of the captured image of the skewed fingerprint relative to an axis (i.e. foreground axis) of the foreground is estimated by utilising the data relating to the centroid.
  • a corrected or adjusted image is generated by pivoting or rotating the captured image by the estimated orientation angle and storing data relating to a corrected image onto a database.
  • Figure 4 shows a first example captured image 28 of a fingerprint 30.
  • the fingerprint 30 comprises alternating ridges 32.1 to 32.n (which are represented by the dark lines) and furrows 34, which are indicated by the white spaces in between the dark lines of the ridges 32.
  • the contrast enhancement module 18 increases the difference in intensity values of the ridges 32 and the furrows 34 so that the ridges 32 become darker, and the furrows 34 become lighter. This is achieved through the use of a normalization technique, where a standard deviation of the intensity values of the image 28 is set to unity, and a mean thereof is set to zero.
  • the input of the contrast enhancement module 18 is in this case the first example image 28, whereas the output of the contrast enhancement module 18 is an enhanced image 36, also referred to as a normalised image, shown in Figure 5.
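The normalisation step described above can be sketched as follows (an illustrative Python sketch; the patent does not prescribe an implementation, and the function name is our own):

```python
import numpy as np

def enhance_contrast(image):
    """Normalise a grayscale fingerprint image so that its intensity
    values have zero mean and unit standard deviation, increasing the
    contrast between dark ridges and light furrows."""
    image = np.asarray(image, dtype=np.float64)
    std = image.std()
    if std == 0:  # flat image: nothing to enhance
        return image - image.mean()
    return (image - image.mean()) / std

# Example: a tiny synthetic ridge/furrow patch
patch = np.array([[10.0, 200.0], [10.0, 200.0]])
normalised = enhance_contrast(patch)
```

After normalisation the patch has zero mean and unit standard deviation, which is exactly the property the module relies on.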
  • the foreground separation module 20 is configured to separate the foreground 38 of the enhanced image 36 from the background 40 via a variance-based technique, based on the fact that there is high variance in the foreground 38, and low variance in the background 40. More specifically, the variance in darkness of pixels in the foreground 38 and background 40 of the enhanced image 36 is, for example, analysed by the foreground separation module 20. A heuristically determined variance threshold is hence determined and utilised to separate the foreground 38 from the background 40.
  • the input of the foreground separation module 20 is data relating to the enhanced image 36 and the output of the foreground separation module 20 is data relating to a mask image 42, shown in Figure 6.
  • the foreground 38 is, for example, assigned an intensity value of 255 (grayscale white), whereas the background 40 is assigned an intensity value of 0 (grayscale black).
  • the foreground 38 is hence separated from the background 40 by applying a mask to the image 36, thereby masking the background 40 to yield the foreground 38.
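A minimal sketch of such a variance-based separation follows, assuming a block-wise analysis; the block size, threshold value, and function name are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def separate_foreground(image, block=8, var_threshold=0.1):
    """Variance-based foreground/background separation: the image is
    divided into small blocks, and blocks whose intensity variance
    exceeds a (heuristically chosen) threshold are marked as
    foreground (255); the rest become background (0)."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=np.uint8)
    for r in range(0, h, block):
        for c in range(0, w, block):
            tile = image[r:r + block, c:c + block]
            if tile.var() > var_threshold:
                mask[r:r + block, c:c + block] = 255
    return mask

# Example: left half has a high-variance stripe pattern (foreground),
# right half is constant (background)
img = np.zeros((16, 16))
img[::2, :8] = 1.0
img[:, 8:] = 0.5
mask = separate_foreground(img, block=8, var_threshold=0.1)
```

The resulting mask assigns 255 to the striped (high-variance) half and 0 to the flat half, mirroring the mask image of Figure 6.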
  • in Figure 7 there is shown a second example captured image 44 of the fingerprint 30, with the fingerprint 30 being skewed, i.e. rotated in a counter-clockwise direction with respect to the vertical.
  • the angle at which the fingerprint image is captured with respect to the vertical forms part of a fingerprint quality measure.
  • the centroid location module 22 is configured to receive as input, data relating to the mask 42 and/or data relating to the enhanced image 36. It will be appreciated that a second mask and a second enhanced image may be similarly generated for the second example image 44 or for further fingerprint images (not shown).
  • Figure 8 is a schematic two-dimensional representation 46 of the image 44 of Figure 7 illustrating the outline 48 of the foreground 38 of the fingerprint 30.
  • the location of the centroid (x_fc, y_fc) of the foreground 38 is estimated through two sets of both horizontal and vertical navigation in the foreground area 38, which ultimately define four navigational coordinates, the mean of which defines the coordinates of the centroid (x_fc, y_fc).
  • the x-coordinate, x_fc, and the y-coordinate, y_fc, are determined through the process of navigating horizontally and vertically (that is, in the x-axis direction and y-axis direction of a Cartesian plane) within a foreground of the captured image of the skewed fingerprint, as will be described in more detail below.
  • the starting point of each navigation set is the centre of the fingerprint image background (x_b, y_b), which are the first and second reference point coordinates of the fingerprint image background.
  • the centre of the image background would fall within the foreground area 38.
  • the x-coordinate of the background centre is given by x_b = W/2, and the y-coordinate by y_b = H/2, where W and H are the width and height, in pixels, of the captured image.
  • the method of determining the location of the centroid (x_fc, y_fc) takes place in several interlinked stages.
  • the first stage includes conducting a first navigation from the reference point (x_b, y_b) towards a first direction (i.e. the right), and a navigation towards a second direction (i.e. the left).
  • the first reference point coordinate x_b defines a first reference point axis A
  • the second reference point coordinate y_b defines a second reference point axis C which is transverse, i.e. orthogonal, to the first reference point axis A.
  • a horizontal right-ward navigation starts at the point (x_b, y_b) and increases, in steps/increments of 1 pixel, the value of the x-coordinate, while the y-coordinate remains constant.
  • the value of this x-coordinate is increased up to the pixel that marks the interface between the foreground 38 and the background 40, if it exists. Alternatively, it is increased up to the pixel that marks the right-most edge of the image background 40 or that of the image 44.
  • This extreme-right x-coordinate is recorded and stored as x_r1, being a first value.
  • the horizontal, left-ward navigation, i.e. navigation in a second direction that is opposite the first direction, starts at the point (x_b, y_b) and decreases, in steps of 1 pixel, the value of the x-coordinate, while the y-coordinate remains constant.
  • the value of the x-coordinate is decreased up to the pixel that marks the interface between the foreground 38 and the background 40, if it exists. Alternatively, it is decreased up to the pixel that marks the left-most edge of the image background 40 or that of the image 44.
  • This extreme-left x-value is recorded and stored as x_l1, being a second value.
  • the x-coordinate of the first navigation set, being a third value, is determined as the mean of the first and second values, x_t1 = (x_r1 + x_l1)/2. The first navigation reference point (or first navigational set) is hence defined by first and second navigational coordinates x_t1, y_b.
  • the first navigational coordinate x_t1 defines a first navigational axis that corresponds with the first reference point axis A (i.e. horizontal axis A), and the second navigational coordinate y_b defines a second navigational axis B (i.e. vertical axis B) which is transverse to the first reference point axis A.
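A single navigation of this kind — stepping one pixel at a time until the foreground/background interface or the image edge is reached — can be sketched as follows. The mask convention (255 = foreground, 0 = background) follows the mask image described earlier; the function name is illustrative:

```python
import numpy as np

def navigate(mask, x, y, dx=0, dy=0):
    """Step one pixel at a time from (x, y) in direction (dx, dy),
    stopping at the last foreground pixel before the
    foreground/background interface, or at the image edge."""
    h, w = mask.shape
    while 0 <= x + dx < w and 0 <= y + dy < h and mask[y + dy, x + dx] == 255:
        x, y = x + dx, y + dy
    return x, y

# Example: a mask whose foreground occupies columns 3..10
mask = np.zeros((12, 12), dtype=np.uint8)
mask[:, 3:11] = 255
x_r1, _ = navigate(mask, 6, 5, dx=1)   # right-ward navigation
x_l1, _ = navigate(mask, 6, 5, dx=-1)  # left-ward navigation
```

Starting from a point inside the foreground, the right-ward call stops at the right-most foreground column and the left-ward call at the left-most one, yielding the first and second values.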
  • the second stage includes conducting a second navigation (i.e. a vertical navigation along the B-axis) with respect to the first navigation set having first and second navigation coordinates x_t1, y_b.
  • This vertical navigation occurs in two stages, namely, the upward navigation (in a third direction along the second navigational axis B) and the downward navigation (in a fourth direction that is opposite the third direction along the second navigational axis B).
  • the upward navigation, i.e. in the third direction along the axis B, starts at the point (x_t1, y_b) and decreases, in steps of 1 pixel, the value of the y-coordinate, while the x-coordinate (i.e. x_t1) remains constant.
  • the y-coordinate is decreased up to the pixel that marks the interface between the foreground 38 and the background 40, if it exists. Alternatively, it is decreased up to the pixel that marks the upper-most edge of the image background 40 or that of the image 44.
  • This extreme-top y-coordinate is recorded and stored as y_u1, being a fourth value.
  • the downward navigation (in the fourth direction) also starts at the point (x_t1, y_b) and increases, in steps of 1 pixel, the value of the y-coordinate, while the x-coordinate (i.e. x_t1) remains constant.
  • the value of the y-coordinate is increased up to the point that marks the interface between the foreground 38 and the background 40, if it exists. Alternatively, it is increased up to the pixel that marks the lower-most edge of the image background 40 or that of the image 44.
  • This extreme-bottom y-value is recorded and stored as y_l1, being a fifth value.
  • the y-coordinate of the second navigation set, being a sixth value, is determined as y_t1 = (y_u1 + y_l1)/2. The vertical navigation thus uses the third value (x_t1) in determining the sixth value (y_t1).
  • the second navigation reference point accordingly has navigation coordinates x_t1, y_t1 (i.e. the third and sixth values).
  • a third navigation set is determined in the same manner as above, however, it occurs in reverse, whereby it first commences with the vertical navigation along the vertical axis C in a fifth (upward) direction and sixth (downward) direction, with respect to the starting point (x_b, y_b). Similar to the above-described process, y_u2, being a seventh value, and y_l2, being an eighth value, are determined, and accordingly the y-coordinate of the third navigation set (i.e. third navigation reference point), being a ninth value, is determined as y_t2 = (y_u2 + y_l2)/2.
  • the coordinates of the third navigation set include fifth and sixth navigation coordinates x_b, y_t2, wherein the fifth navigation coordinate x_b defines a fourth navigation axis D which is parallel to the first reference point axis A and transverse to the third navigation axis C.
  • a fourth navigation set is determined by performing the horizontal navigation along the D axis in a seventh (rightward) direction and eighth (leftward) direction, with starting point (x_b, y_t2). Accordingly, similar to the above-described procedure, x_r2 and x_l2, which are tenth and eleventh values, are determined. Accordingly, the x-coordinate of the fourth navigation set, being a twelfth value, is determined as x_t2 = (x_r2 + x_l2)/2.
  • the centroid location module 22 is able to accurately estimate the centroid location of the foreground 38 by calculating the respective averages of the aforementioned third and twelfth values (x_t1 and x_t2), and sixth and ninth values (y_t1 and y_t2).
  • a first foreground coordinate, in the x-direction, of the centroid location is determined by x_fc = (x_t1 + x_t2)/2.
  • a second foreground coordinate, in the y-direction, of the centroid location is determined by y_fc = (y_t1 + y_t2)/2.
  • alternatively, the coordinates of the centroid location could be (x_t1, y_t1) or (x_t2, y_t2).
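The four navigation sets can be drawn together into a single centroid-estimation sketch (illustrative Python; the integer pixel arithmetic and the function names are our assumptions, and the mask uses 255 for the foreground as before):

```python
import numpy as np

def navigate(mask, x, y, dx=0, dy=0):
    """One-pixel steps from (x, y) in direction (dx, dy) until the
    foreground/background interface or the image edge is reached."""
    h, w = mask.shape
    while 0 <= x + dx < w and 0 <= y + dy < h and mask[y + dy, x + dx] == 255:
        x, y = x + dx, y + dy
    return x, y

def estimate_centroid(mask):
    """Estimate the foreground centroid (x_fc, y_fc) via the four
    navigation sets: horizontal-then-vertical starting from the
    background centre, and vertical-then-horizontal."""
    h, w = mask.shape
    xb, yb = w // 2, h // 2              # background centre (reference point)

    # First set: horizontal navigation from (xb, yb)
    xr1, _ = navigate(mask, xb, yb, dx=1)
    xl1, _ = navigate(mask, xb, yb, dx=-1)
    xt1 = (xr1 + xl1) // 2

    # Second set: vertical navigation from (xt1, yb)
    _, yu1 = navigate(mask, xt1, yb, dy=-1)
    _, yl1 = navigate(mask, xt1, yb, dy=1)
    yt1 = (yu1 + yl1) // 2

    # Third set: vertical navigation from (xb, yb)
    _, yu2 = navigate(mask, xb, yb, dy=-1)
    _, yl2 = navigate(mask, xb, yb, dy=1)
    yt2 = (yu2 + yl2) // 2

    # Fourth set: horizontal navigation from (xb, yt2)
    xr2, _ = navigate(mask, xb, yt2, dx=1)
    xl2, _ = navigate(mask, xb, yt2, dx=-1)
    xt2 = (xr2 + xl2) // 2

    # Centroid: averages of the two navigation sets
    return (xt1 + xt2) // 2, (yt1 + yt2) // 2

# Example: a rectangular foreground spanning rows 4..15, columns 2..13
mask = np.zeros((20, 20), dtype=np.uint8)
mask[4:16, 2:14] = 255
centroid = estimate_centroid(mask)
```

For this rectangle the navigation-based estimate lands at the (integer-rounded) geometric centre of the foreground.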
  • the first foreground coordinate defines a first foreground axis F_x, being a first foreground axis of the location of the estimated centroid of the foreground 38.
  • the second foreground coordinate defines a second foreground axis F_y, being a second foreground axis of the location of the estimated centroid of the foreground 38.
  • the foreground 38 defines a longitudinal axis Y which is slanted relative to the substantially vertical, second foreground axis F_y.
  • the longitudinal axis Y passes through an upper edge G of the foreground 38 and the centroid (x_fc, y_fc).
  • a theoretical upper-edge position is determined by navigating upwardly, in steps of 1 pixel, from the foreground centroid (x_fc, y_fc) (while maintaining the x_fc coordinate constant) up to the point that marks the interface between the foreground 38 and the background 40.
  • this point is marked as E, and is referred to herein as the theoretical position at which an upper edge of the foreground 38 should be located.
  • this point E has coordinates (x_e, y_e), and the y_e coordinate is a fifteenth value.
  • the x-values on the left-hand side of point E would be negative, and those on the right-hand side would be positive.
  • the method further includes navigating towards the right up to a point that marks the interface between the foreground 38 and the background 40 to establish a sixteenth value in the x-direction.
  • in this example, navigating horizontally to the right immediately falls into the background 40, indicating that the foreground is not skewed in the clockwise direction.
  • the method 10 also includes navigating horizontally to the left in increments of 1 pixel up to a point, G, that marks the interface between the foreground 38 and the background 40.
  • the point G, referred to herein as the actual upper edge of the skewed foreground 38, has coordinates (x_g, y_g), and the x_g coordinate is a seventeenth value.
  • the method further includes calculating the vertical length/distance between the point y_e (i.e. the fifteenth value) and the foreground centroid y_fc (i.e. the fourteenth value), and also calculating the horizontal distance between the point x_g and the point x_e. Once these distances have been computed, the method includes calculating an angle a as follows: a = arctan((x_g - x_e) / (y_fc - y_e)).
  • the angle a defined between the longitudinal axis Y and the second foreground axis F_y indicates the extent to which the point G of the foreground 38 (and essentially the foreground 38) is oriented/skewed with respect to the second foreground axis F_y (as shown in Figure 8).
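The E/G navigation and the angle calculation can be sketched as follows (an illustrative Python sketch; the handling of the clockwise case and the small example mask are our assumptions):

```python
import math
import numpy as np

def estimate_angle(mask, x_fc, y_fc):
    """Estimate the skew angle a.  From the centroid (x_fc, y_fc),
    navigate straight up (x_fc held constant) to the theoretical
    upper-edge point E; then navigate horizontally along that row to
    the actual upper-edge point G.  The angle is the arctangent of
    the horizontal offset (x_g - x_e) over the vertical distance
    (y_fc - y_e); it is negative when G lies to the left of E,
    i.e. for a counter-clockwise skew."""
    w = mask.shape[1]
    ye, xe = y_fc, x_fc
    while ye - 1 >= 0 and mask[ye - 1, xe] == 255:
        ye -= 1                      # upward navigation to E
    xg = xe
    while xg - 1 >= 0 and mask[ye, xg - 1] == 255:
        xg -= 1                      # leftward navigation to G
    if xg == xe:                     # nothing to the left: try the clockwise side
        while xg + 1 < w and mask[ye, xg + 1] == 255:
            xg += 1
    return math.degrees(math.atan2(xg - xe, y_fc - ye))

# Example: a small mask whose topmost row is offset to the left,
# mimicking a counter-clockwise-skewed foreground
mask = np.zeros((10, 10), dtype=np.uint8)
mask[5:9, 2:8] = 255   # main body of the foreground
mask[4, 2:5] = 255     # top row, shifted left
angle = estimate_angle(mask, x_fc=5, y_fc=6)
```

For this shape G lies to the left of E, so the returned angle is negative, consistent with the clockwise correction applied later.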
  • a foreground centroid designated by reference numeral 50 as shown in Figure 9, which is determined by the centroid location module 22, as described above, is preferably defined as the accurate geometric centre of the foreground 38 of the fingerprint 30.
  • the location of this fingerprint centroid may be determined by locating a geometric centre of the second example image 44, which corresponds to the arithmetic mean or average position of all the points or pixels in the foreground 38 of the image 44 with respect to a reference point of the schematic representation 46 of the image 44, as described previously.
  • Data relating to the centroid 50, such as the second foreground axis F_y and the centroid coordinates (x_fc, y_fc), is utilised as an input for the rotation estimation module 24.
  • the rotation estimation module 24 is configured to estimate the orientation angle a between, for example, the longitudinal axis Y of the foreground, as described above, and the second foreground axis F_y.
  • the output of the orientation estimation module 24 is a data structure comprising the angle a and a rotation direction, such as clockwise or anti-clockwise, indicating the direction in which the captured image 44 (i.e. the combination of the foreground and background thereof) must be rotated by the angle a from the longitudinal axis Y in the direction of the second foreground axis F_y.
  • This data structure may be stored in a database 60, 66, which are described below.
  • the orientation adjustment module 26 is configured to generate a corrected image 52 (shown in Figure 9) by pivoting or rotating the image 44 by the estimated orientation angle a.
  • the corrected image 52 is generated by pivoting or rotating the captured image 44 of the skewed fingerprint 30 by the angle a in the clockwise direction since the calculated value of a was negative.
  • the resultant/corrected image 52 will have the longitudinal axis Y thereof taking the place of the second foreground axis F_y (i.e. being substantially upright), and the second foreground axis F_y will accordingly be rotated in the clockwise direction by the estimated angle a, as shown in Figure 9.
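The pivot/rotate step can be sketched with a plain inverse-mapping rotation (an illustrative stand-in; the patent does not prescribe a rotation algorithm, and the sign convention must be matched against the sign of the estimated angle a):

```python
import numpy as np

def rotate_image(image, angle_deg, pivot=None):
    """Rotate a grayscale image by angle_deg about a pivot point
    (default: image centre) using inverse mapping with
    nearest-neighbour sampling: for each output pixel, the source
    pixel is found by applying the inverse rotation, so the result
    has no holes."""
    h, w = image.shape
    cy, cx = ((h - 1) / 2.0, (w - 1) / 2.0) if pivot is None else pivot
    theta = np.deg2rad(angle_deg)
    cos_t, sin_t = np.cos(theta), np.sin(theta)
    out = np.zeros_like(image)
    ys, xs = np.indices((h, w))
    # Inverse rotation of the output grid back into source coordinates
    xr = cos_t * (xs - cx) - sin_t * (ys - cy) + cx
    yr = sin_t * (xs - cx) + cos_t * (ys - cy) + cy
    xi, yi = np.rint(xr).astype(int), np.rint(yr).astype(int)
    valid = (0 <= xi) & (xi < w) & (0 <= yi) & (yi < h)
    out[valid] = image[yi[valid], xi[valid]]
    return out

# Example: a single bright pixel at the top-left corner,
# rotated by 90 degrees about the image centre
img = np.zeros((3, 3), dtype=np.uint8)
img[0, 0] = 1
rotated = rotate_image(img, 90.0)
```

In practice one would pivot about the foreground centroid 50 by passing `pivot=(y_fc, x_fc)`, per the closing remark of the description.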
  • the foreground centroid 50 is utilised as a reference point to aid in the estimation of the angle by which the skewed fingerprint in the captured image 44 needs to be tilted in order to generate the corrected image 52 (i.e. with the fingerprint 30 being oriented substantially upright, so that the upper edge G thereof in the corrected image 52 can also be oriented substantially upright with respect to a horizontal axis X of the corrected image 52).
  • many classes of fingerprints may be analysed utilising the method 10, including Central Twins, Tented Arch, Left Loop, Right Loop, Plain Arch, and even other fingerprints which do not belong to any one of the aforementioned classes.
  • Figure 10 shows a high-level block diagram illustrating a system 10 for correcting the orientation of the fingerprint image.
  • a plurality of distributed fingerprint readers 16.1 to 16.n capture a plurality of images relating to fingerprints.
  • Data relating to an image of a fingerprint is stored into a first database 60.
  • the system 10 preferably comprises the first database 60 and a processor 64 which is connected to the first database 60 and configured to utilise the data relating to the fingerprint image and to execute an image correcting algorithm in accordance with instructions which are stored in a memory (not shown) which is coupled to the processor 64.
  • the instructions (i.e. algorithm) in the memory (not shown) comprise a foreground separation module 20 for separating a foreground of the image from a background of the image.
  • the algorithm further comprises a centroid location module 22 for generating data relating to a centroid of the foreground.
  • the algorithm yet further comprises an orientation estimation module 24 for estimating an orientation direction and angle a between a longitudinal axis Y of the skewed foreground image (as shown in Figure 8) and the second foreground axis F_y of the foreground; and further comprises an orientation adjustment module 26 for generating a corrected image by pivoting or rotating the image by the estimated orientation angle a.
  • Data relating to the corrected image 52 is stored into a second database 66.
  • a back end 62 of an operator may be utilised comprising the processor 64 and the second database 66.
  • the first and second databases 60, 66 form a single database.
  • Backend 62 may hence receive data relating to a plurality of scanned fingerprint images which were captured by the plurality of fingerprint readers 16.1 to 16.n.
  • the fingerprint reader 16 may form part of the system and/or method 10; however, embodiments are possible wherein the system 10 connects to the database 60, shown in Figure 10, which comprises pre-stored data relating to fingerprints.
  • the step of providing the fingerprint image may comprise retrieving data relating to the fingerprint image from the database 60.
  • One or more of the fingerprint readers 16.1 to 16.n may hence form part of the system 10. It will further be appreciated that the method 10 may be performed without performing the contrast enhancement step 18.
  • the method 10 is enabled to rotate fingerprint images in the counter-clockwise direction as well.
  • the image 44 may be pivoted or rotated about the centroid 50, alternatively the image 44 may be pivoted about another pivot point (not shown).

Abstract

This invention relates to a method of adjusting the orientation of a captured image of a skewed fingerprint, the method comprising the steps of: separating a foreground of the captured image from a background of the captured image; estimating a centroid of the foreground with respect to a predefined reference point that is located in one of the foreground and background of the captured image of the fingerprint, wherein the estimated centroid of the foreground defines a first foreground axis; estimating an angle of orientation of a predefined point of the foreground of the captured image with respect to the first foreground axis of the estimated centroid; and pivoting or rotating the captured image by the estimated orientation angle so as to correct the orientation of the skewed fingerprint in the captured image. The invention also relates to a unique manner of establishing a centroid of the foreground of the captured image.

Description

SYSTEM FOR AND METHOD OF ADJUSTING THE ORIENTATION OF A
CAPTURED IMAGE OF A SKEWED FINGERPRINT
FIELD OF INVENTION
THIS invention is in the field of a system for and a method of adjusting the orientation of a captured image of a skewed fingerprint, and is also in the field of a system for and method of establishing a centroid of a captured image of a skewed fingerprint for use in adjusting the orientation of the skewed fingerprint.
BACKGROUND OF INVENTION
The performance of an automated fingerprint recognition system is often quantified in terms of its match rate, non-match rate, and execution speed. The system's performance can be adversely affected when a captured fingerprint is rotated when measured against the image background and is not in an upright position at an angle of approximately 0 degrees relative to the vertical axis of the captured image.
Fingerprints can be ordered into a number of categories, known as fingerprint classes, by performing fingerprint analytics on captured images of fingerprints. These fingerprint classes include the Central Twins (CT), Tented Arch (TA), Left Loop (LL), Right Loop (RL), and Plain Arch (PA). These fingerprint classes are often determined based on the relationship between fingerprint characteristics or landmarks known as singular points.
There are two types of singular points, namely, the fingerprint core and the fingerprint delta. A fingerprint core is forensically defined as the inner-most turning point of a fingerprint loop. A fingerprint delta is a point where the fingerprint ridges tend to form a triangular shape. A fingerprint loop is formed by a ridge pattern that emanates from one side of a fingerprint, flows inwards, and returns in the original direction.
In order to adjust the orientation or re-align a captured fingerprint, it is necessary to locate some point of reference and then perform the orientation adjustment relative to that point of reference. Literature reveals that practitioners normally use the fingerprint core as a point of reference for image re-alignment.
One main disadvantage of using the fingerprint core as a reference is the fact that not all types of fingerprints have a core as part of their pattern, such as fingerprints that belong to the PA fingerprint class. Furthermore, locating the fingerprint core could be cumbersome or even impossible in some cases.
With a view of addressing the above disadvantages, it has been proposed to locate the True Fingerprint Centre Point (TFCP) as a reference, where the TFCP is the centre of the fingerprint image foreground. However, previously disclosed methods of determining the centre of the fingerprint image foreground, and using such centre for re-aligning a captured fingerprint have proved to be inaccurate and therefore unsuccessful.
The term 'foreground' of a fingerprint image is often used by those skilled in the art to refer to the part of the image that includes the alternating ridge-furrow patterns, and the term 'background' of a fingerprint image is often used by those skilled in the art to refer to the remaining part of the image (i.e. the part of the image that does not contain the fingerprint). Accordingly, these terms should be understood, for purposes of this specification, as embracing such meanings.
It is an object of the present invention to provide a system for and a method of adjusting the orientation of a captured image of a skewed fingerprint with which the applicant believes the aforementioned problems may at least be alleviated and/or which may provide a useful alternative for the known systems and/or methods.
It is another object of the present invention to provide a new system for and method of determining a centroid of a captured image of the skewed fingerprint for use in correcting the orientation of the captured image of the skewed fingerprint.
SUMMARY OF INVENTION
According to a first aspect of the invention there is provided a method of adjusting the orientation of a captured image of a skewed fingerprint, the method comprising the steps of: separating a foreground of the captured image from a background of the captured image; estimating a centroid of the foreground with respect to a predefined reference point that is located in one of the foreground and background of the captured image of the fingerprint, wherein the estimated centroid of the foreground defines a first foreground axis; estimating an angle of orientation of a predefined point of the foreground of the captured image with respect to the first foreground axis of the estimated centroid; and pivoting or rotating the captured image by the estimated orientation angle so as to correct the orientation of the skewed fingerprint in the captured image.
According to a second aspect of the invention there is provided a system for adjusting the orientation of a captured image of a skewed fingerprint, the system comprising: a processor; and a memory which is connected to the processor, the memory containing instructions which when executed causes the processor to: provide data relating to the captured image; separate a foreground of the image from a background of the image; estimate a centroid of the foreground with respect to a predefined point of reference that is located in one of the foreground and background of the captured image, wherein the estimated centroid of the foreground defines a first foreground axis; estimate an angle of orientation of a predefined point of the foreground of the captured image with respect to the first foreground axis of the estimated centroid; and pivot or rotate the image by the estimated orientation angle.
According to a third aspect of the invention, there is provided a non-transitory computer readable device storing instructions thereon which when executed by a processor of a computing device performs the functions of: separating, by means of at least one processor of the computing device, a foreground of the captured image from a background of the captured image; estimating, by means of at least one processor of the computing device, a centroid of the foreground with respect to a predefined reference point that is located in one of the foreground and background of the captured image of the fingerprint, wherein the estimated centroid of the foreground defines a first foreground axis; estimating, by means of at least one processor of the computing device, an angle of orientation of a predefined point of the foreground of the captured image with respect to the first foreground axis of the estimated centroid; and
pivoting or rotating, by means of at least one processor of the computing device, the captured image by the estimated orientation angle so as to correct the orientation of the skewed fingerprint in the captured image. According to a fourth aspect of the invention there is provided a method of establishing a centroid of a foreground of a captured image of a skewed fingerprint, the method comprising: providing a captured image of a skewed fingerprint, the captured image having a foreground and background; and estimating a centroid of the foreground with respect to a predefined reference point located in one of the foreground and background of the captured image of the fingerprint.
According to a fifth aspect of the invention there is provided a system for establishing a centroid of a foreground of a captured image of a skewed fingerprint, the system comprising: a processor; and a memory that is coupled to the processor, the memory containing instructions which when executed by the processor causes the processor to: provide a captured image of a skewed fingerprint, the captured image having a foreground and background; and estimate the location of a centroid of the foreground with respect to a predefined reference point in one of the foreground and background of the captured image of the fingerprint.
According to a sixth aspect of the invention, there is provided a non-transitory computer readable device storing instructions which when executed by a processor of a computing device cause the processor to perform the actions of: providing a captured image of a skewed fingerprint, the captured image having a foreground and background; and estimating a centroid of the foreground with respect to a predefined reference point that is located in one of the foreground and background of the captured image of the fingerprint. The other features of the invention are set out in the detailed description and the claims.
BRIEF DESCRIPTION OF DRAWINGS
The objects of this invention and the manner of obtaining them, will become more apparent, and the invention itself will be better understood, by reference to the following description of embodiments of the invention taken in conjunction with the accompanying diagrammatic drawings, wherein:
FIG. 1 shows a prior art reference point utilised in fingerprint analysing techniques;
FIG. 2 is an example illustration of a fingerprint that would be classified under the Plain Arch (PA) fingerprint class;
FIG. 3 is a flow diagram illustrating steps in a method of adjusting the orientation of a captured image of a fingerprint according to the invention;
FIG. 4 is a first example captured image of a fingerprint;
FIG. 5 is the image of Figure 4, with a contrast thereof being enhanced;
FIG. 6 is a mask image generated by a foreground separation module shown in Figure 3;
FIG. 7 is a second example captured image of a fingerprint, wherein the fingerprint is rotated in a counter-clockwise direction;
FIG. 8 is a schematic representation showing, as a line, the interface of the foreground and background of the fingerprint of Figure 7;
FIG. 9 is an adjusted or corrected image which is produced by a correcting module shown in Figure 3 according to the invention; and
FIG. 10 is a high-level block diagram illustrating a system for adjusting the orientation of a captured image of a fingerprint.
DETAILED DESCRIPTION OF AN EXAMPLE EMBODIMENT
The following description of the invention is provided as an enabling teaching of the invention. Those skilled in the relevant art will recognise that many changes can be made to the embodiment described, while still attaining the beneficial results of the present invention. It will also be apparent that some of the desired benefits of the present invention can be attained by selecting some of the features of the present invention without utilising other features. Accordingly, those skilled in the art will recognise that modifications and adaptations to the present invention are possible and can even be desirable in certain circumstances, and are a part of the present invention. Thus, the following description is provided as illustrative of the principles of the present invention and not a limitation thereof.
Referring to the figures, in which like features are indicated by like numerals, example embodiments of a method of and a system for adjusting the orientation of a captured image of a skewed fingerprint are generally designated by the reference numeral 10 in Figures 3 and 10.
Figure 1 shows a prior art technique wherein a fingerprint core 12 is utilised as a reference point in fingerprint analytics. Figure 2 shows a fingerprint belonging to the Plain Arch (PA) fingerprint class, in which a ridge pattern 14 emanates from a left-hand side of the fingerprint and terminates at a right-hand side. Because the ridge pattern does not form a loop, this fingerprint does not have a fingerprint core. A fingerprint delta is generally also not present in PA fingerprints. Hence, prior art techniques which utilise a fingerprint core as a reference fail when presented with a fingerprint that belongs to the PA fingerprint class.

Referring to Figure 3, there is shown a flow diagram of the method 10 of adjusting the orientation of a captured image of a skewed fingerprint. The method 10 comprises, at 16, capturing the image of a fingerprint with a fingerprint reader. It should be appreciated that, in other embodiments of the invention, the image could be captured by scanning a physical representation of a fingerprint. At 18, an optional step is performed wherein a contrast of the captured image is adjusted and/or enhanced. At 20, a foreground of the captured image is separated from a background of the image. At 22, data relating to a centroid of the foreground is generated by a centroid location module, as will be described in more detail below. At 24, an orientation angle of the captured image of the skewed fingerprint relative to an axis (i.e. a foreground axis) of the foreground is estimated by utilising the data relating to the centroid. At 26, a corrected or adjusted image is generated by pivoting or rotating the captured image by the estimated orientation angle and storing data relating to the corrected image in a database.
Figure 4 shows a first example captured image 28 of a fingerprint 30. The fingerprint 30 comprises alternating ridges 32.1 to 32.n (which are represented by the dark lines) and furrows 34 which are indicated by the white spaces in between the dark lines of the ridges 32. The contrast enhancement module 18 increases the difference in intensity values of the ridges 32 and the furrows 34 so that the ridges 32 become darker and the furrows 34 become lighter. This is achieved through the use of a normalisation technique, where the standard deviation of the intensity values of the image 28 is set to unity, and the mean thereof is set to zero. The input of the contrast enhancement module 18 is, in this case, the first example image 28, whereas the output of the contrast enhancement module 18 is an enhanced image 36, also referred to as a normalised image, shown in Figure 5.
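By way of illustration, the zero-mean, unit-standard-deviation normalisation performed by the contrast enhancement module 18 could be sketched as follows. This is a minimal sketch under the assumptions stated in the comments; the function and variable names are illustrative and do not form part of this specification.

```python
import numpy as np

def normalise(image):
    """Contrast enhancement by normalisation (a sketch): the standard
    deviation of the image intensity values is set to unity and the
    mean is set to zero, as described for module 18."""
    img = image.astype(float)
    std = img.std()
    if std == 0:
        # Flat image: only the mean can be removed.
        return img - img.mean()
    return (img - img.mean()) / std
```

In the normalised (enhanced) image, dark ridges fall below zero and light furrows rise above zero, which increases the ridge-furrow contrast.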
The foreground separation module 20 is configured to separate the foreground 38 of the enhanced image 36 from the background 40 via a variance-based technique, based on the fact that there is high variance in the foreground 38 and low variance in the background 40. More specifically, the variance in darkness of pixels in the foreground 38 and background 40 of the enhanced image 36 is, for example, analysed by the foreground separation module 20. A heuristically determined variance threshold is hence utilised to separate the foreground 38 from the background 40. The input of the foreground separation module 20 is data relating to the enhanced image 36 and the output of the foreground separation module 20 is data relating to a mask image 42, shown in Figure 6. The foreground 38 is, for example, assigned an intensity value of 255 (grayscale white), whereas the background 40 is assigned an intensity value of 0 (grayscale black). The foreground 38 is hence separated from the background 40 by applying a mask to the image 36, thereby masking the background 40 to yield the foreground 38.
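A variance-based separation of this kind could be sketched as follows. The block size and variance threshold below are illustrative assumptions only; the specification states merely that the threshold is heuristically determined.

```python
import numpy as np

def foreground_mask(image, block=16, var_threshold=100.0):
    """Variance-based foreground/background separation (a sketch).
    The image is tiled into blocks; blocks whose intensity variance
    exceeds a heuristically chosen threshold are marked foreground
    (255, grayscale white), the rest background (0, grayscale black)."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=np.uint8)
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = image[y:y + block, x:x + block]
            if tile.var() > var_threshold:
                mask[y:y + block, x:x + block] = 255
    return mask
```

The resulting mask image corresponds to the mask 42 of Figure 6: high-variance ridge-furrow regions survive as white foreground, while near-uniform background regions are blacked out.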
In Figure 7 is shown a second example captured image 44 of the fingerprint 30, but with the fingerprint 30 being skewed, i.e. rotated in a counter-clockwise direction with respect to the vertical. In practice, it is difficult for a subject whose fingerprint is being captured to impress his or her finger correctly, without rotating or pivoting the finger. The angle at which the fingerprint image is captured, with respect to the vertical, forms part of a fingerprint quality measure. There is therefore a need for an efficient reference point location technique that is able to process all types of fingerprints (including the PA fingerprint of Figure 2) and that preferably does so as part of the early processes of fingerprint manipulation and analysis.
The centroid location module 22 is configured to receive as input, data relating to the mask 42 and/or data relating to the enhanced image 36. It will be appreciated that a second mask and a second enhanced image may be similarly generated for the second example image 44 or for further fingerprint images (not shown).
Figure 8 is a schematic two-dimensional representation 46 of the image 44 of Figure 7 illustrating the outline 48 of the foreground 38 of the fingerprint 30. The location of the centroid (x_fc, y_fc) of the foreground 38 is estimated through two sets of, both, horizontal and vertical navigation in the foreground area 38, which ultimately define four navigational coordinates, the mean of which defines the coordinates of the centroid (x_fc, y_fc). The x-coordinate, x_fc, and the y-coordinate, y_fc, are determined through the process of navigating horizontally and vertically (that is, in the x-axis direction and y-axis direction of a Cartesian plane) within the foreground of the captured image of the skewed fingerprint, as will be described in more detail below.
The starting point of each navigation set is the centre of the fingerprint image background (x_b, y_b), where x_b and y_b are the first and second reference point coordinates of the fingerprint image background. Preferably, the centre of the image background would fall within the foreground area 38. For an image of width W and height H, the x-coordinate of the background centre is given by:
x_b = 0.5 × W, while the y-coordinate of the background centre is given by: y_b = 0.5 × H.
Accordingly, the method of determining the location of the centroid (x_fc, y_fc) takes place in several interlinked stages. The first stage includes conducting a first navigation from the reference point (x_b, y_b) towards a first direction (i.e. the right), and a navigation towards a second direction (i.e. the left). The first reference point coordinate x_b defines a first reference point axis A and the second reference point coordinate y_b defines a second reference point axis C which is transverse, i.e. orthogonal, to the first reference point axis A. A horizontal, right-ward navigation starts at the point (x_b, y_b) and increases the value of the x-coordinate in steps/increments of 1 pixel of the captured digital image along the A axis in the first direction, while the y-coordinate remains unchanged. The value of this x-coordinate is increased up to the pixel that marks the interface between the foreground 38 and the background 40, if it exists. Alternatively, it is increased up to the pixel that marks the right-most edge of the image background 40 or that of the image 44. This extreme-right x-coordinate is recorded and stored as x_r1, being a first value. Similarly, the horizontal, left-ward navigation (i.e. navigation in a second direction that is opposite the first direction) starts at the point (x_b, y_b) and decreases the value of the x-coordinate in steps of 1 pixel of the captured image along the A axis in the second direction, while the y-coordinate remains constant. The value of the x-coordinate is decreased up to the pixel that marks the interface between the foreground 38 and the background 40, if it exists. Alternatively, it is decreased up to the pixel that marks the left-most edge of the image background 40 or that of the image 44. This extreme-left x-value is recorded and stored as x_l1, being a second value.
Following this two-stage horizontal navigation along the A axis in the first and second directions, it now becomes possible to compute the x-coordinate, x_f1, of a first navigation reference point (or first navigation set), being a third value. The third value is given by: x_f1 = (x_r1 + x_l1) / 2.
As can be seen in Figure 8, the first navigation reference point (or first navigation set) is defined by first and second navigation coordinates x_f1, y_b. The first navigation coordinate x_f1 defines a first navigation axis that corresponds with the first reference point axis A (i.e. the horizontal axis A), and the second navigation coordinate y_b defines a second navigation axis B (i.e. the vertical axis B) which is transverse to the first reference point axis A.
The second stage includes conducting a second navigation (i.e. a vertical navigation along the B axis) with respect to the first navigation set having first and second navigation coordinates x_f1, y_b. This vertical navigation occurs in two stages, namely, the upward navigation (in a third direction along the second navigation axis B) and the downward navigation (in a fourth direction that is opposite the third direction along the second navigation axis B). The upward navigation (i.e. in the third direction along the axis B) starts at the point (x_f1, y_b) and decreases the value of the y-coordinate in steps of 1 pixel, while the x-coordinate (i.e. x_f1) remains unchanged. The y-coordinate is decreased up to the pixel that marks the interface between the foreground 38 and the background 40, if it exists. Alternatively, it is decreased up to the pixel that marks the upper-most edge of the image background 40 or that of the image 44. This extreme-top y-coordinate is recorded and stored as y_u1, being a fourth value. The downward navigation (in the fourth direction) also starts at the point (x_f1, y_b) and increases the value of the y-coordinate in steps of 1 pixel, while the x-coordinate (i.e. x_f1) remains constant. The value of the y-coordinate is increased up to the point that marks the interface between the foreground 38 and the background 40, if it exists. Alternatively, it is increased up to the pixel that marks the lower-most edge of the image background 40 or that of the image 44. This extreme-bottom y-value is recorded and stored as y_l1, being a fifth value. Following this two-stage vertical navigation, it now becomes possible to compute the y-coordinate, y_f1, of a second navigation set (i.e. second navigation reference point), being a sixth value, as follows: y_f1 = (y_u1 + y_l1) / 2.
It follows from the above description that the vertical navigation thus uses the third value (x_f1) in determining the sixth value (y_f1). As can be seen in Figure 8, the second navigation reference point has third and fourth navigation coordinates x_f1, y_f1.
A third navigation set is determined in the same manner as above, however, it occurs in reverse, whereby it first commences with the vertical navigation along the vertical axis C in a fifth (upward) direction and a sixth (downward) direction, with respect to the starting point (x_b, y_b). Similar to the above-described process, y_u2, being a seventh value, and y_l2, being an eighth value, are determined, and accordingly the y-coordinate of the third navigation set (i.e. third navigation reference point), being a ninth value, y_f2, is determined as: y_f2 = (y_u2 + y_l2) / 2. As can be seen in Figure 8, the coordinates of the third navigation set include fifth and sixth navigation coordinates x_b, y_f2, which define a fourth navigation axis D that is parallel to the first reference point axis A and transverse to the second reference point axis C.
Subsequently, a fourth navigation set is determined by performing the horizontal navigation along the D axis in a seventh (rightward) direction and an eighth (leftward) direction, with starting point (x_b, y_f2). Accordingly, similar to the above-described procedure, x_r2 and x_l2, being tenth and eleventh values, are determined. Accordingly, the x-coordinate of the fourth navigation set, being a twelfth value, is determined as: x_f2 = (x_r2 + x_l2) / 2.
Following the above navigations, the centroid location module 22 is able to accurately estimate the centroid location of the foreground 38 by calculating the respective averages of the aforementioned third and twelfth values, and of the sixth and ninth values. A first foreground coordinate, in the x-direction, of the centroid location, being a thirteenth value, is determined by: x_fc = (x_f1 + x_f2) / 2.
Also, a second foreground coordinate, in the y-direction, of the centroid location, being a fourteenth value, is determined by: y_fc = (y_f1 + y_f2) / 2.
It should be appreciated that in further example embodiments of the invention, the coordinates of the centroid location could be (x_f1, y_f1) or (x_f2, y_f2).
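The four navigation sets described above, and the averaging that yields the centroid, can be sketched as follows. This is a minimal illustration operating on a mask image of the kind produced by the foreground separation module 20 (255 for foreground, 0 for background); integer division is used for pixel coordinates, and all names are illustrative.

```python
import numpy as np

def navigate(mask, x, y, dx, dy):
    """Step 1 pixel at a time from (x, y) in direction (dx, dy) until
    the foreground/background interface, or the image edge, is reached;
    return the last coordinate pair visited."""
    h, w = mask.shape
    while 0 <= x + dx < w and 0 <= y + dy < h and mask[y + dy, x + dx] == 255:
        x, y = x + dx, y + dy
    return x, y

def foreground_centroid(mask):
    """Estimate the foreground centroid (x_fc, y_fc) via the four
    navigation sets, starting from the background centre (x_b, y_b)."""
    h, w = mask.shape
    xb, yb = w // 2, h // 2                 # background centre
    # First navigation: horizontal, from (x_b, y_b).
    xr1, _ = navigate(mask, xb, yb, 1, 0)   # first value
    xl1, _ = navigate(mask, xb, yb, -1, 0)  # second value
    xf1 = (xr1 + xl1) // 2                  # third value
    # Second navigation: vertical, from (x_f1, y_b).
    _, yu1 = navigate(mask, xf1, yb, 0, -1) # fourth value
    _, yl1 = navigate(mask, xf1, yb, 0, 1)  # fifth value
    yf1 = (yu1 + yl1) // 2                  # sixth value
    # Third navigation: vertical, from (x_b, y_b).
    _, yu2 = navigate(mask, xb, yb, 0, -1)  # seventh value
    _, yl2 = navigate(mask, xb, yb, 0, 1)   # eighth value
    yf2 = (yu2 + yl2) // 2                  # ninth value
    # Fourth navigation: horizontal, from (x_b, y_f2).
    xr2, _ = navigate(mask, xb, yf2, 1, 0)  # tenth value
    xl2, _ = navigate(mask, xb, yf2, -1, 0) # eleventh value
    xf2 = (xr2 + xl2) // 2                  # twelfth value
    # Centroid: mean of the two estimates per axis.
    return (xf1 + xf2) // 2, (yf1 + yf2) // 2
```

For a symmetric foreground region, the estimate coincides with the geometric centre of that region; for a skewed fingerprint, the four sets compensate for the asymmetry of the outline.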
As can be seen in Figure 8, the first foreground coordinate defines a first foreground axis Fx, being a first foreground axis of the location of the estimated centroid of the foreground 38. The second foreground coordinate defines a second foreground axis Fy, being a second foreground axis of the location of the estimated centroid of the foreground 38. Again, as seen in Figure 8, the foreground 38 defines a longitudinal axis Y which is slanted relative to the substantially vertical, second foreground axis Fy. The longitudinal axis Y passes through an upper edge G of the foreground 38 and the centroid (x_fc, y_fc). In an embodiment, the upper edge G is determined by navigating upwardly, in steps of 1 pixel, from the foreground centroid (x_fc, y_fc) (while maintaining the x_fc coordinate constant) up to the point that marks the interface between the foreground 38 and the background 40. In Figure 8, this point is marked as E, and is referred to herein as a theoretical position at which an upper edge of the foreground 38 should be located. This point E has coordinates (x_e, y_e), and the y_e coordinate has a fifteenth value. By way of example, the x-values on the left-hand side of point E would be negative, and those on the right-hand side would be positive.
Now, in order to establish whether the foreground 38 is skewed with respect to the vertical, the method further includes navigating towards the right up to a point that marks the interface between the foreground 38 and the background 40, to establish a sixteenth value in the x-direction. In this example, as shown in Figure 8, navigating horizontally to the right immediately falls into the background 40, indicating that the foreground is not skewed in the clockwise direction. The method 10 also includes navigating horizontally to the left in increments of 1 pixel up to a point, G, that marks the interface between the foreground 38 and the background 40. The point G, referred to herein as the actual upper edge of the skewed foreground 38, has coordinates (x_g, y_g), and the x_g coordinate has a seventeenth value.
The method further includes calculating the vertical length/distance between the point y_e (i.e. the fifteenth value) and the foreground centroid y_fc (i.e. the fourteenth value), and also calculating the horizontal distance between the point x_g and the point x_e. Once these distances have been computed, the method includes calculating an angle a as follows:
tan(a) = (horizontal distance between point x_g and point x_e) / (vertical distance between point y_e and point y_fc)

a = arctan{(horizontal distance between point x_g and point x_e) / (vertical distance between point y_e and point y_fc)}
In general, when the resultant value of a is negative, the negative value indicates that the foreground is skewed in the anti-clockwise direction, and the fingerprint image 44 would need to be rotated in the clockwise direction by the value of a. Similarly, a positive a value would be indicative that the foreground 38 is skewed in the clockwise direction, and needs to be rotated in the anti-clockwise direction in order to correct the orientation of the captured fingerprint image. In general, the angle a defined between the longitudinal axis Y and the second foreground axis Fy indicates the extent to which the point G of the foreground 38 (and essentially the foreground 38 as a whole) is oriented/skewed with respect to the second foreground axis Fy (as shown in Figure 8).
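The arctangent relation above can be sketched as follows. This is an illustrative helper only; the sign convention follows the example in which x-values to the left of point E are negative, so a negative result indicates an anti-clockwise skew.

```python
import math

def skew_angle(x_g, x_e, y_e, y_fc):
    """Estimate the skew angle a (in degrees) from the actual upper
    edge G = (x_g, y_g), the theoretical upper edge E = (x_e, y_e),
    and the centroid y-coordinate y_fc. Negative result: foreground
    skewed anti-clockwise, so the image must be rotated clockwise."""
    horizontal = x_g - x_e   # signed: negative when G lies left of E
    vertical = y_fc - y_e    # vertical distance from E down to the centroid
    return math.degrees(math.atan2(horizontal, vertical))
```

For instance, an upright fingerprint (G coinciding with E) yields an angle of zero, while a counter-clockwise-rotated fingerprint, as in Figure 7, yields a negative angle.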
As mentioned previously, a foreground centroid, designated by reference numeral 50 as shown in Figure 9 and determined by the centroid location module 22 as described above, is preferably defined as the accurate geometric centre of the foreground 38 of the fingerprint 30. As mentioned previously, the location of this fingerprint centroid may be performed via locating a geometric centre of the second example image 44, which corresponds to the arithmetic mean or average position of all the points or pixels in the foreground 38 of the image 44 with respect to a reference point of the schematic representation 46 of the image 44, as described previously. Data relating to the centroid 50, such as the second foreground axis Fy and the centroid coordinates (x_fc, y_fc), is utilised as an input for the orientation estimation module 24. The orientation estimation module 24 is configured to estimate the orientation angle a between, for example, the longitudinal axis Y of the foreground, as described above, and the second foreground axis Fy.
The output of the orientation estimation module 24 is a data structure comprising the angle a and a rotation direction, such as clockwise or anti-clockwise, indicating the direction in which the captured image 44 (i.e. the combination of the foreground and background thereof) must be rotated by a from the longitudinal axis Y in the direction of the second foreground axis Fy. This data structure may be stored in a database 60, 66, which databases are described below.
The orientation adjustment module 26 is configured to generate a corrected image 52 (shown in Figure 9) by pivoting or rotating the image 44 by the estimated orientation angle a. In this example, the corrected image 52 is generated by pivoting or rotating the captured image 44 of the skewed fingerprint 30 by the angle a in the clockwise direction, since the calculated value of a was negative. The resultant/corrected image 52 will have the longitudinal axis Y thereof taking the place of the second foreground axis Fy (i.e. being substantially upright), and the second foreground axis Fy will accordingly be rotated in the clockwise direction by the estimated angle a, as shown in Figure 9. As mentioned previously, the foreground centroid 50 is utilised as a reference point to aid in the estimation of the angle by which the skewed fingerprint in the captured image 44 needs to be tilted in order to generate the corrected image 52 (i.e. with the fingerprint 30 being oriented substantially upright, so that the upper edge G thereof in the corrected image 52 can also be oriented substantially upright with respect to a horizontal axis X of the corrected image 52). It will be appreciated that many classes of fingerprints may be analysed utilising the method 10, including Central Twins, Tented Arch, Left Loop, Right Loop, Plain Arch, and even other fingerprints which do not belong to any one of the aforementioned classes.
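The pivoting/rotation step performed by the orientation adjustment module 26 could be sketched, for example, with a nearest-neighbour inverse mapping. This is an illustrative implementation under stated assumptions (a production system would typically use a library rotation routine), and the angle sign convention shown is an assumption, not part of the specification.

```python
import math
import numpy as np

def rotate_about(image, centre, degrees, fill=0):
    """Rotate a grayscale image about the pivot point centre = (x, y)
    by the given angle, using inverse-mapped nearest-neighbour
    sampling. Pixels that map outside the source image are set to
    the fill value (the background)."""
    h, w = image.shape
    cx, cy = centre
    cos_a = math.cos(math.radians(degrees))
    sin_a = math.sin(math.radians(degrees))
    out = np.full_like(image, fill)
    for y in range(h):
        for x in range(w):
            # Inverse rotation: find the source pixel for output (x, y).
            sx = cos_a * (x - cx) + sin_a * (y - cy) + cx
            sy = -sin_a * (x - cx) + cos_a * (y - cy) + cy
            sx, sy = int(round(sx)), int(round(sy))
            if 0 <= sx < w and 0 <= sy < h:
                out[y, x] = image[sy, sx]
    return out
```

Here the centroid 50 would serve as the pivot point, and the signed angle a (with its rotation direction) from the orientation estimation module determines the rotation applied.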
Figure 10 shows a high-level block diagram illustrating a system 10 for correcting the orientation of the fingerprint image. A plurality of distributed fingerprint readers 16.1 to 16.n capture a plurality of images relating to fingerprints. Data relating to an image of a fingerprint is stored in a first database 60. The system 10 preferably comprises the first database 60 and a processor 64 which is connected to the first database 60 and configured to utilise the data relating to the fingerprint image and to execute an image correcting algorithm in accordance with instructions which are stored in a memory (not shown) which is coupled to the processor 64. The instructions (i.e. the algorithm) in the memory (not shown) comprise a foreground separation module 20 for separating a foreground of the image from a background of the image. The algorithm further comprises a centroid location module 22 for generating data relating to a centroid of the foreground. The algorithm yet further comprises an orientation estimation module 24 for estimating an orientation direction and angle a between a longitudinal axis Y of the skewed foreground image (as shown in Figure 8) and the second foreground axis Fy; and further comprises an orientation adjustment module 26 for generating a corrected image by pivoting or rotating the image by the estimated orientation angle a. Data relating to the corrected image 52 is stored in a second database 66. It will be appreciated that a back end 62 of an operator may be utilised, comprising the processor 64 and the second database 66. However, other embodiments may be possible wherein the first and second databases 60, 66 form a single database. The back end 62 may hence receive data relating to a plurality of scanned fingerprint images which were captured by the plurality of fingerprint readers 16.1 to 16.n.
It will be appreciated that there are many variations in detail on the invention as herein defined and/or described without departing from the scope and spirit of this disclosure.
For example, referring to Figure 3, the fingerprint reader 16 may form part of the system and/or method 10; however, embodiments are possible wherein the system 10 connects to the database 60 comprising pre-stored data relating to fingerprints, as shown in Figure 10. Hence, in the method 10, the step of providing the fingerprint image may comprise retrieving data relating to the fingerprint image from the database 60. One or more of the fingerprint readers 16.1 to 16.n (shown in Figure 10) may hence form part of the system 10. It will further be appreciated that the method 10 may be performed without performing the contrast enhancement step 18.
It will further be appreciated that even though in the example depicted in Figures 7 and 9 the second image 44 was rotated in the clockwise direction, the method 10 is enabled to rotate fingerprint images in the counter-clockwise direction as well. The image 44 may be pivoted or rotated about the centroid 50, alternatively the image 44 may be pivoted about another pivot point (not shown).

Claims

1. A method of adjusting the orientation of a captured image of a skewed fingerprint, the method comprising the steps of: separating a foreground of the captured image from a background of the captured image; estimating a centroid of the foreground with respect to a predefined reference point that is located in one of the foreground and background of the captured image of the fingerprint, wherein the estimated centroid of the foreground defines a first foreground axis; estimating an angle of orientation of a predefined point of the foreground of the captured image with respect to the first foreground axis of the estimated centroid; and pivoting or rotating the captured image by the estimated orientation angle so as to correct the orientation of the skewed fingerprint in the captured image.
2. The method of claim 1 , wherein the step of estimating the centroid of the foreground comprises determining the coordinates of the centroid of the foreground, which coordinates are the mean values of the coordinates of the points of at least the foreground of the captured image with respect to the predefined reference point, the step of estimating the centroid including the steps of: defining a first reference point coordinate and a second reference point coordinate of the predefined reference point located in one of the background or foreground of the captured image, preferably the background of the captured image, wherein the first and second reference point coordinates define a first reference point axis and a second reference point axis that is transverse to the first reference point axis; conducting a first navigation with respect to the predefined reference point including: calculating a first value in a first direction along the first reference point axis between the first reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating a second value in a second direction that is opposite the first direction along the first reference point axis between the first reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; calculating a third value that is the average between the first value and the second value with respect to the first reference point coordinate, to define a first navigation reference point having a first navigation coordinate defining a first navigation axis that is coaxial with the first reference point axis, and having a second navigation coordinate that corresponds with the second reference point coordinate and defines a second navigation axis that is transverse to the first navigation axis; conducting a second 
navigation with respect to the first navigation reference point including: calculating a fourth value in a third direction along the second navigation axis between the second navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating a fifth value in a fourth direction that is opposite the third direction along the second navigation axis between the second navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating a sixth value that is the average between the fourth value and the fifth value with respect to the second navigation coordinate, to define a second navigation reference point having a third navigation coordinate defining a third navigation axis that is parallel to the first navigation axis, wherein the third navigation coordinate corresponds with the first navigation coordinate, and wherein the second navigation reference point has a fourth coordinate defining a fourth navigational axis that is coaxial with the second navigation axis.
3. The method of claim 2, wherein the step of estimating the centroid further comprises: conducting a third navigation with respect to the reference point including: calculating a seventh value in a fifth direction along the second reference point axis between the second reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating an eighth value in a sixth direction that is opposite the fifth direction along the second reference point axis between the second reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; calculating a ninth value that is the average between the seventh value and the eighth value with respect to the second reference point coordinate, to define a third navigation reference point having a fifth navigation co-ordinate defining a fifth navigation axis that is coaxial with the second reference point axis, and having a sixth navigation coordinate that corresponds with the first reference point coordinate and defines a sixth navigation axis that is transverse to the fifth navigation axis; conducting a fourth navigation with respect to the third navigation reference point including: calculating a tenth value in a seventh direction along the sixth navigation axis between the sixth navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating an eleventh value in an eighth direction that is opposite the seventh direction along the sixth navigation axis between the sixth navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating a twelfth value that is the average between the tenth value and 
the eleventh value with respect to the sixth navigation coordinate, to define a fourth navigation reference point having a seventh navigation coordinate that corresponds with the fifth navigation coordinate, and wherein the fourth navigation reference point has an eighth coordinate defining an eighth navigational axis that is coaxial with the sixth navigation axis.
4. The method of claim 3, wherein the step of estimating the centroid of the foreground further comprises: determining the average of the third value and twelfth value to establish a thirteenth value that corresponds with a first co-ordinate of the centroid; and determining the average of the sixth value and ninth value to establish a fourteenth value that corresponds with a second co-ordinate of the centroid, wherein the first coordinate and second coordinate of the centroid are x and y coordinates of the Cartesian plane, respectively.
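The four navigations of claims 2 to 4 can be sketched under one plausible reading: each navigation scans a whole row or column, stopping at the foreground/background interface where one exists and at the image periphery where none does, and takes the midpoint of the scanned extent. The function names, the midpoint fallback, and the corner-pixel default for the reference point are all assumptions for illustration.

```python
import numpy as np

def _midpoint(line):
    # Midpoint of the foreground extent along one scan line; where the
    # line meets no foreground, both scans stop at the image periphery,
    # so the midpoint of the whole line is returned.
    idx = np.nonzero(line)[0]
    if idx.size == 0:
        return (len(line) - 1) / 2
    return (idx[0] + idx[-1]) / 2

def navigation_centroid(fg, x0=0, y0=0):
    # (x0, y0) is the predefined reference point, here defaulting to a
    # background corner pixel.
    x1 = _midpoint(fg[y0, :])        # navigations 1-2: the "third value" ...
    y2 = _midpoint(fg[:, int(x1)])   # ... then the "sixth value"
    y3 = _midpoint(fg[:, x0])        # navigations 3-4: the "ninth value" ...
    x4 = _midpoint(fg[int(y3), :])   # ... then the "twelfth value"
    # Thirteenth and fourteenth values: the claimed pairwise averages,
    # taken as the x and y coordinates of the centroid (claim 4).
    return (x1 + x4) / 2, (y2 + y3) / 2

# Same synthetic blob as before: foreground at rows 20-59, cols 30-69.
fg = np.zeros((100, 100), dtype=bool)
fg[20:60, 30:70] = True
est = navigation_centroid(fg)   # -> (49.5, 44.5)
```

The estimate (49.5, 44.5) differs from the true pixel-mean centroid (49.5, 39.5) because two of the four scan lines pass entirely through background; how closely the navigations converge depends on where the reference point is placed.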
5. The method of claim 4, wherein the second co-ordinate of the centroid defines the first foreground axis.
6. The method of claim 5, wherein the step of estimating an angle of orientation of a predefined point of the foreground of the captured image with respect to the first foreground axis of the estimated foreground centroid comprises defining the predefined point of the foreground of the captured image as an edge of the foreground, wherein the step of defining the edge of the foreground of the captured image includes the step of determining a location of the edge of the foreground, wherein the step of determining the location of the edge of the foreground includes: navigating in a ninth direction from the estimated centroid along the first foreground axis between the centroid and an interface of the foreground and background of the captured image to determine a fifth navigation reference point having a ninth navigation coordinate that corresponds with the value of the x-coordinate of the estimated centroid, the ninth navigation coordinate defining a ninth navigational axis that is transverse to the first foreground axis, and wherein the fifth navigation reference point has a tenth navigation coordinate having a fifteenth value; navigating in a tenth direction along the ninth navigational axis between the ninth navigation coordinate and an interface of the foreground and background, if it exists, to determine a sixth navigational reference point having a twelfth navigation coordinate having a sixteenth value, and a thirteenth navigation coordinate that corresponds with the value of the tenth navigation coordinate; whereas if it does not exist, navigating in an eleventh direction that is opposite the tenth direction along the ninth navigational axis between the ninth navigation coordinate and an interface of the foreground and background to determine a seventh navigational reference point having a fourteenth navigation coordinate having a seventeenth value, and a fifteenth navigation coordinate that corresponds with the value of the tenth navigation coordinate, wherein 
either one of the sixteenth value or the seventeenth value defines an upper edge of the skewed foreground; and determining the angle of orientation as the arctangent of a fraction whose numerator is the difference between the sixteenth value and the fifteenth value (or between the seventeenth value and the fifteenth value), and whose denominator is the difference between the fifteenth value and the fourteenth value.
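The edge-walk of claim 6 can be sketched as follows. This is a hedged reading: walk up the vertical first foreground axis from the centroid to the upper foreground/background interface, then walk along that row to the foreground's edge, and take the arctangent of the horizontal offset over the vertical rise. The test mask and rounding convention are assumptions.

```python
import math
import numpy as np

def orientation_angle(fg, cx, cy):
    x0, y0 = int(round(cx)), int(round(cy))
    x, y = x0, y0
    # Ninth direction: upward along the first foreground axis until
    # the pixel above is no longer foreground (the upper interface).
    while y > 0 and fg[y - 1, x]:
        y -= 1
    # Tenth direction: along that row to the right-hand interface ...
    while x + 1 < fg.shape[1] and fg[y, x + 1]:
        x += 1
    # ... otherwise the eleventh (opposite) direction, to the left.
    if x == x0:
        while x > 0 and fg[y, x - 1]:
            x -= 1
    # Arctan of horizontal offset (numerator) over vertical rise
    # (denominator), in degrees.
    return math.degrees(math.atan2(x - x0, y0 - y))

# Hypothetical test mask: a thin band drifting rightward as it rises,
# i.e. a print skewed in the clockwise direction (claim 8's convention
# would call the resulting angle positive).
fg = np.zeros((100, 100), dtype=bool)
for row in range(10, 70):
    c = 90 - row                      # band centre, per row
    fg[row, c - 3:c + 4] = True
ys, xs = np.nonzero(fg)
angle = orientation_angle(fg, xs.mean(), ys.mean())
```

For this clockwise-skewed band the returned angle is positive; an anticlockwise-skewed band would yield a negative angle.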
7. The method of claim 6, including the step of determining the rotation direction and accordingly rotating the captured image of the fingerprint in that direction such that the upper edge of the corrected image of the fingerprint is substantially upright with respect to a vertical axis.
8. The method of claim 7, wherein the step of determining the rotation direction includes establishing whether the value of the determined angle of orientation is positive or negative, wherein a negative value of the angle of orientation indicates that the foreground is skewed in the anticlockwise direction and that the rotation direction for correcting the orientation of the fingerprint is clockwise, and wherein a positive value of the angle of orientation indicates that the foreground is skewed in the clockwise direction and that the rotation direction for correcting the orientation of the fingerprint is anticlockwise.
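Claim 8's decision rule reduces to a sign check on the estimated angle. The string labels below are illustrative, not part of the claim.

```python
def rotation_direction(angle_deg):
    # Claim 8: a negative angle means the foreground is skewed
    # anticlockwise, so the correcting rotation is clockwise; a
    # positive angle means clockwise skew, corrected anticlockwise.
    if angle_deg < 0:
        return "clockwise"
    if angle_deg > 0:
        return "anticlockwise"
    return "none"   # already upright
```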
9. The method of claim 8, including the further step of storing the corrected image in a database.
10. The method of claim 1, further including the step of adjusting a contrast of the captured image of the fingerprint.
11. The method of claim 1, wherein the step of separating the foreground from the background includes applying a mask to the image, thereby masking the background to yield the foreground.
12. The method of claim 1, wherein the step of providing the fingerprint image may include capturing the image of the fingerprint with a fingerprint capturing device.
13. A system for adjusting the orientation of a captured image of a skewed fingerprint, the system comprising: a processor; and a memory which is connected to the processor, the memory containing instructions which when executed cause the processor to: provide data relating to the captured image; separate a foreground of the image from a background of the image; estimate a centroid of the foreground with respect to a predefined point of reference that is located in one of the foreground and background of the captured image, wherein the estimated centroid of the foreground defines a first foreground axis; estimate an angle of orientation of a predefined point of the foreground of the captured image with respect to the first foreground axis of the estimated centroid; and pivot or rotate the image by the estimated orientation angle.
14. The system of claim 13, wherein the instructions that cause the processor to estimate the centroid of the foreground further cause the processor to determine the coordinates of the centroid of the foreground, which coordinates are the mean values of the coordinates of the points of at least the foreground with respect to the predefined reference point, wherein the said instructions, which when executed in determining the coordinates of the centroid, causes the processor to: define a first reference point coordinate and a second reference point coordinate of the predefined reference point of the background of the captured image, wherein the first and second reference point coordinates define a first reference point axis and a second reference point axis that is transverse to the first reference point axis; conduct a first navigation with respect to the reference point including: calculating a first value in a first direction along the first reference point axis between the first reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating a second value in a second direction that is opposite the first direction along the first reference point axis between the first reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; calculate a third value that is the average between the first value and the second value with respect to the first reference point coordinate, to define a first navigation reference point having a first navigation co-ordinate defining a first navigation axis that is coaxial with the first reference point axis, and having a second navigation coordinate that corresponds with the second reference point coordinate and defines a second navigation axis that is transverse to the first navigation axis; conduct a second 
navigation with respect to the first navigation reference point including: calculating a fourth value in a third direction along the second navigation axis between the second navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating a fifth value in a fourth direction that is opposite the third direction along the second navigation axis between the second navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculate a sixth value that is the average between the fourth value and the fifth value with respect to the second navigation coordinate, to define a second navigation reference point having a third navigation coordinate defining a third navigation axis that is parallel to the first navigation axis, wherein the third navigation coordinate corresponds with the first navigation coordinate, and wherein the second navigation reference point has a fourth coordinate defining a fourth navigational axis that is coaxial with the second navigation axis.
15. The system of claim 14, wherein the instructions that cause the processor to estimate the centroid further cause the processor to: conduct a third navigation with respect to the reference point including: calculating a seventh value in a fifth direction along the second reference point axis between the second reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating an eighth value in a sixth direction that is opposite the fifth direction along the second reference point axis between the second reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; calculate a ninth value that is the average between the seventh value and the eighth value with respect to the second reference point coordinate, to define a third navigation reference point having a fifth navigation co-ordinate defining a fifth navigation axis that is coaxial with the second reference point axis, and having a sixth navigation coordinate that corresponds with the first reference point coordinate and defines a sixth navigation axis that is transverse to the fifth navigation axis; conduct a fourth navigation with respect to the third navigation reference point including: calculating a tenth value in a seventh direction along the sixth navigation axis between the sixth navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating an eleventh value in an eighth direction that is opposite the seventh direction along the sixth navigation axis between the sixth navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculate a twelfth value that is the 
average between the tenth value and the eleventh value with respect to the sixth navigation coordinate, to define a fourth navigation reference point having a seventh navigation coordinate that corresponds with the fifth navigation coordinate, and wherein the fourth navigation reference point has an eighth coordinate defining an eighth navigational axis that is coaxial with the sixth navigation axis.
16. The system of claim 15, wherein the instructions that cause the processor to estimate the angle of orientation of the predefined point of the foreground of the captured image with respect to the first foreground axis of the estimated centroid location further cause the processor to define the predefined point of the foreground of the captured image as an edge of the foreground, wherein the step of defining the edge of the foreground of the captured image includes causing the processor to: navigate in a ninth direction from the estimated centroid along the first foreground axis between the centroid and an interface of the foreground and background of the captured image to determine a fifth navigation reference point having a ninth navigation coordinate that corresponds with the value of the x-coordinate of the estimated centroid, the ninth navigation coordinate defining a ninth navigational axis that is transverse to the first foreground axis, and wherein the fifth navigation reference point has a tenth navigation coordinate having a fifteenth value; navigate in a tenth direction along the ninth navigational axis between the ninth navigation coordinate and an interface of the foreground and background, if it exists, to determine a sixth navigational reference point having a twelfth navigation coordinate having a sixteenth value, and a thirteenth navigation coordinate that corresponds with the value of the tenth navigation coordinate; whereas if the interface of the foreground and background does not exist, navigate in an eleventh direction that is opposite the tenth direction along the ninth navigational axis between the ninth navigation coordinate and an interface of the foreground and background to determine a seventh navigational reference point having a fourteenth navigation coordinate having a seventeenth value, and a fifteenth navigation coordinate that corresponds with the value of the tenth navigation coordinate, wherein either one of the sixteenth value 
or the seventeenth value defines an upper edge of the skewed foreground; and determine the angle of orientation as the arctangent of a fraction whose numerator is the difference between the sixteenth value and the fifteenth value (or between the seventeenth value and the fifteenth value), and whose denominator is the difference between the fifteenth value and the fourteenth value.
17. The system of claim 16, wherein the memory contains instructions which when executed cause the processor to determine the rotation direction and accordingly further cause the processor to rotate the captured image of the fingerprint in that direction such that the upper edge of the corrected image of the fingerprint is substantially upright with respect to a vertical axis.
18. The system of claim 17, wherein the instructions that cause the processor to determine the rotation direction further cause the processor to establish whether the value of the determined angle of orientation is positive or negative, wherein a negative value of the angle of orientation indicates that the foreground is skewed in the anticlockwise direction and that the rotation direction for correcting the orientation of the fingerprint is clockwise, and wherein a positive value of the angle of orientation indicates that the foreground is skewed in the clockwise direction and that the rotation direction for correcting the orientation of the fingerprint is anticlockwise.
19. The system of claim 13, wherein the memory contains instructions which when executed by the processor cause the processor to adjust a contrast of the captured image of the fingerprint.
20. The system of claim 13, wherein the instructions that cause the processor to separate the foreground from the background further includes instructions that can cause the processor to apply a mask to the image, thereby masking the background to yield the foreground.
21. The system of claim 13, comprising an image capturing device for capturing the fingerprint image, whereafter data relating to the captured fingerprint image is stored in a database.
22. A non-transitory computer readable device storing instructions thereon which, when executed by a processor of a computing device, cause the processor to perform the functions of: separating, by means of at least one processor of the computing device, a foreground of the captured image from a background of the captured image; estimating, by means of at least one processor of the computing device, a centroid of the foreground with respect to a predefined reference point that is located in one of the foreground and background of the captured image of the fingerprint, wherein the estimated centroid of the foreground defines a first foreground axis; estimating, by means of at least one processor of the computing device, an angle of orientation of a predefined point of the foreground of the captured image with respect to the first foreground axis of the estimated centroid; and pivoting or rotating, by means of at least one processor of the computing device, the captured image by the estimated orientation angle so as to correct the orientation of the skewed fingerprint in the captured image.
23. A method of establishing a centroid of a foreground of a captured image of a skewed fingerprint, the method comprising: providing a captured image of a skewed fingerprint, the captured image having a foreground and background; and estimating a centroid of the foreground with respect to a predefined reference point located in one of the foreground and background of the captured image of the fingerprint.
24. The method of claim 23, wherein the step of estimating the centroid of the foreground comprises determining the coordinates of the centroid of the foreground, which coordinates are the mean values of the coordinates of the points of at least the foreground with respect to the predefined reference point, the step of estimating the centroid including the steps of: defining a first reference point coordinate and a second reference point coordinate of the predefined reference point located in one of the background or foreground of the captured image, preferably the background of the captured image, wherein the first and second reference point coordinates define a first reference point axis and a second reference point axis that is transverse to the first reference point axis; conducting a first navigation with respect to the predefined reference point including: calculating a first value in a first direction along the first reference point axis between the first reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating a second value in a second direction that is opposite the first direction along the first reference point axis between the first reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; calculating a third value that is the average between the first value and the second value with respect to the first reference point coordinate, to define a first navigation reference point having a first navigation coordinate defining a first navigation axis that is coaxial with the first reference point axis, and having a second navigation coordinate that corresponds with the second reference point coordinate and defines a second navigation axis that is transverse to the first navigation axis; conducting a second navigation with respect to 
the first navigation reference point including: calculating a fourth value in a third direction along the second navigation axis between the second navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating a fifth value in a fourth direction that is opposite the third direction along the second navigation axis between the second navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating a sixth value that is the average between the fourth value and the fifth value with respect to the second navigation coordinate, to define a second navigation reference point having a third navigation coordinate defining a third navigation axis that is parallel to the first navigation axis, wherein the third navigation coordinate corresponds with the first navigation coordinate, and wherein the second navigation reference point has a fourth coordinate defining a fourth navigational axis that is coaxial with the second navigation axis.
25. The method of claim 24, wherein the step of estimating the centroid further comprises: conducting a third navigation with respect to the reference point including: calculating a seventh value in a fifth direction along the second reference point axis between the second reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating an eighth value in a sixth direction that is opposite the fifth direction along the second reference point axis between the second reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; calculating a ninth value that is the average between the seventh value and the eighth value with respect to the second reference point coordinate, to define a third navigation reference point having a fifth navigation co-ordinate defining a fifth navigation axis that is coaxial with the second reference point axis, and having a sixth navigation coordinate that corresponds with the first reference point coordinate and defines a sixth navigation axis that is transverse to the fifth navigation axis; conducting a fourth navigation with respect to the third navigation reference point including: calculating a tenth value in a seventh direction along the sixth navigation axis between the sixth navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating an eleventh value in an eighth direction that is opposite the seventh direction along the sixth navigation axis between the sixth navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating a twelfth value that is the average between the tenth value 
and the eleventh value with respect to the sixth navigation coordinate, to define a fourth navigation reference point having a seventh navigation coordinate that corresponds with the fifth navigation coordinate, and wherein the fourth navigation reference point has an eighth coordinate defining an eighth navigational axis that is coaxial with the sixth navigation axis.
26. The method of claim 25, wherein the step of estimating the centroid of the foreground further comprises: determining the average of the third value and twelfth value to establish a thirteenth value that corresponds with a first co-ordinate of the centroid; and determining the average of the sixth value and ninth value to establish a fourteenth value that corresponds with a second co-ordinate of the centroid, wherein the first coordinate and second coordinate of the centroid are x and y coordinates of the Cartesian plane.
27. A system for establishing a centroid of a foreground of a captured image of a skewed fingerprint, the system comprising: a processor; and a memory that is coupled to the processor, the memory containing instructions which when executed by the processor cause the processor to: provide a captured image of a skewed fingerprint, the captured image having a foreground and background; and estimate the location of a centroid of the foreground with respect to a predefined reference point in one of the foreground and background of the captured image of the fingerprint.
28. The system of claim 27, wherein the instructions causing the processor to estimate the centroid of the foreground further comprise instructions which when executed cause the processor to determine the coordinates of the centroid of the foreground, which coordinates are the mean values of the coordinates of the points of at least the foreground with respect to the predefined reference point, wherein the said instructions which when executed cause the processor to: define a first reference point coordinate and a second reference point coordinate of the predefined reference point of the background of the captured image, wherein the first and second reference point coordinates define a first reference point axis and a second reference point axis that is transverse to the first reference point axis; conduct a first navigation with respect to the reference point including: calculating a first value in a first direction along the first reference point axis between the first reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating a second value in a second direction that is opposite the first direction along the first reference point axis between the first reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; calculate a third value that is the average between the first value and the second value with respect to the first reference point coordinate, to define a first navigation reference point having a first navigation co-ordinate defining a first navigation axis that is coaxial with the first reference point axis, and having a second navigation coordinate that corresponds with the second reference point coordinate and defines a second navigation axis that is transverse to the first navigation axis; conduct a second navigation with 
respect to the first navigation reference point including: calculating a fourth value in a third direction along the second navigation axis between the second navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating a fifth value in a fourth direction that is opposite the third direction along the second navigation axis between the second navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculate a sixth value that is the average between the fourth value and the fifth value with respect to the second navigation coordinate, to define a second navigation reference point having a third navigation coordinate defining a third navigation axis that is parallel to the first navigation axis, wherein the third navigation coordinate corresponds with the first navigation coordinate, and wherein the second navigation reference point has a fourth coordinate defining a fourth navigational axis that is coaxial with the second navigation axis.
29. The system of claim 28, wherein the instructions that cause the processor to estimate the centroid further cause the processor to: conduct a third navigation with respect to the reference point including: calculating a seventh value in a fifth direction along the second reference point axis between the second reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and
calculating an eighth value in a sixth direction that is opposite the fifth direction along the second reference point axis between the second reference point coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; calculate a ninth value that is the average between the seventh value and the eighth value with respect to the second reference point coordinate, to define a third navigation reference point having a fifth navigation co-ordinate defining a fifth navigation axis that is coaxial with the second reference point axis, and having a sixth navigation coordinate that corresponds with the first reference point coordinate and defines a sixth navigation axis that is transverse to the fifth navigation axis; conduct a fourth navigation with respect to the third navigation reference point including: calculating a tenth value in a seventh direction along the sixth navigation axis between the sixth navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculating an eleventh value in an eighth direction that is opposite the seventh direction along the sixth navigation axis between the sixth navigation coordinate and one of the interface of the background and foreground of the captured image and periphery/edge of the background of the captured image; and calculate a twelfth value that is the average between the tenth value and the eleventh value with respect to the sixth navigation coordinate, to define a fourth navigation reference point having a seventh navigation coordinate that corresponds with the fifth navigation coordinate, and wherein the fourth navigation reference point has an eighth coordinate defining an eighth navigational axis that is coaxial with the sixth navigation axis.
30. The system of claim 29, wherein the instructions that cause the processor to estimate the centroid of the foreground further cause the processor to: determine the average of the third value and twelfth value to establish a thirteenth value that corresponds with a first coordinate of the centroid; and determine the average of the sixth value and ninth value to establish a fourteenth value that corresponds with a second coordinate of the centroid.
31. A non-transitory device storing instructions which, when executed by a processor of a computing device, cause the processor to perform the actions of: providing a captured image of a skewed fingerprint, the captured image having a foreground and background; and estimating a centroid of the foreground with respect to a predefined reference point that is located in one of the foreground and background of the captured image of the fingerprint.
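The claims above describe estimating the foreground centroid by a sequence of navigations: from a reference point, travel in two opposite directions along an axis until reaching the foreground/background interface or the image edge, average the two stopping positions, then repeat on the transverse axis. The following is a minimal, illustrative sketch of one plausible reading of that scheme, not the claimed implementation itself; the function names, the binary foreground mask representation, and the assumption that the reference point lies inside the foreground are all assumptions introduced for the example.

```python
import numpy as np

def _is_fg(mask, fixed, axis, pos):
    # Read the mask pixel at the navigated position; `fixed` is the
    # coordinate held constant on the other axis.
    return mask[pos, fixed] if axis == 0 else mask[fixed, pos]

def _span_midpoint(mask, fixed, axis, start):
    # Navigate in two opposite directions along `axis` from `start`,
    # stopping at the foreground/background interface or the image
    # periphery/edge, and return the average of the two stopping
    # positions (the span midpoint).
    limit = mask.shape[axis]
    lo = hi = start
    while lo - 1 >= 0 and _is_fg(mask, fixed, axis, lo - 1):
        lo -= 1
    while hi + 1 < limit and _is_fg(mask, fixed, axis, hi + 1):
        hi += 1
    return (lo + hi) // 2

def estimate_centroid(mask, ref, iterations=2):
    # Alternate horizontal and vertical navigations, each replacing one
    # coordinate of the current navigation reference point with the
    # midpoint of the foreground span through it (roughly the first
    # through fourth navigations of the claims).
    r, c = ref
    for _ in range(iterations):
        c = _span_midpoint(mask, r, axis=1, start=c)  # navigate along a row
        r = _span_midpoint(mask, c, axis=0, start=r)  # navigate along a column
    return r, c
```

For a convex foreground region this converges quickly to a point near the geometric center, which can then serve as the pivot for the orientation adjustment.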
PCT/IB2018/055494 2017-07-24 2018-07-24 System for and method of adjusting the orientation of a captured image of a skewed fingerprint WO2019021173A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
ZA2020/01145A ZA202001145B (en) 2017-07-24 2020-02-24 System for and method of adjusting the orientation of a captured image of a skewed fingerprint

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ZA2017/05008 2017-07-24
ZA201705008 2017-07-24

Publications (1)

Publication Number Publication Date
WO2019021173A1 true WO2019021173A1 (en) 2019-01-31

Family

ID=65039459

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2018/055494 WO2019021173A1 (en) 2017-07-24 2018-07-24 System for and method of adjusting the orientation of a captured image of a skewed fingerprint

Country Status (2)

Country Link
WO (1) WO2019021173A1 (en)
ZA (1) ZA202001145B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1227429B1 (en) * 2001-01-29 2006-08-02 Nec Corporation Fingerprint identification system and method
CN103077377B (en) * 2012-12-31 2015-07-29 清华大学 Fingerprint rectification method based on orientation field distribution

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ELLINGSGAARD, J: "Fingerprint Alteration Detection", THESIS, 30 June 2013 (2013-06-30), Technical University of Denmark, XP055570484, Retrieved from the Internet <URL:http://www2.imm.dtu.dk/pubdb/views/edoc_download.php/6601/pdf/imm6601.pdf> [retrieved on 20181016] *

Also Published As

Publication number Publication date
ZA202001145B (en) 2021-09-29

Similar Documents

Publication Publication Date Title
US11227144B2 (en) Image processing device and method for detecting image of object to be detected from input data
JP5772821B2 (en) Facial feature point position correction apparatus, face feature point position correction method, and face feature point position correction program
US8811744B2 (en) Method for determining frontal face pose
US20150199583A1 (en) Matching Process Device, Matching Process Method, and Inspection Device Employing Same
US20180336407A1 (en) Image processing system
CN108573471B (en) Image processing apparatus, image processing method, and recording medium
KR102073468B1 (en) System and method for scoring color candidate poses against a color image in a vision system
JP6836561B2 (en) Image processing device and image processing method
US9569850B2 (en) System and method for automatically determining pose of a shape
US9710707B1 (en) Detecting iris orientation
TW201516969A (en) Visual object tracking method
WO2018176514A1 (en) Fingerprint registration method and device
JP2018128897A (en) Detection method and detection program for detecting attitude and the like of object
US11049268B2 (en) Superimposing position correction device and superimposing position correction method
CN114549400A (en) Image identification method and device
EP3223193A1 (en) Image processing device, image processing method and image processing program
JP5048609B2 (en) Object tracking device and program
US9846807B1 (en) Detecting eye corners
JP2008152555A (en) Image recognition method and image recognition device
CN114936997A (en) Detection method, detection device, electronic equipment and readable storage medium
JP2008116206A (en) Apparatus, method, and program for pattern size measurement
JP2019149119A (en) Image processing device, image processing method, and program
JPWO2011083665A1 (en) Similarity calculation device, similarity calculation method, and program
WO2019021173A1 (en) System for and method of adjusting the orientation of a captured image of a skewed fingerprint
CN115937003A (en) Image processing method, image processing device, terminal equipment and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18838451

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18838451

Country of ref document: EP

Kind code of ref document: A1
