CN101650781B - Grayscale normalization method of fingerprint images - Google Patents

Grayscale normalization method of fingerprint images

Info

Publication number
CN101650781B
CN101650781B CN200910044242XA CN200910044242A
Authority
CN
China
Prior art keywords
category
image
image block
gray
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN200910044242XA
Other languages
Chinese (zh)
Other versions
CN101650781A (en)
Inventor
祝恩
殷建平
李永
胡春风
陈晖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National University of Defense Technology filed Critical National University of Defense Technology
Priority to CN200910044242XA priority Critical patent/CN101650781B/en
Publication of CN101650781A publication Critical patent/CN101650781A/en
Application granted granted Critical
Publication of CN101650781B publication Critical patent/CN101650781B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a grayscale normalization method for fingerprint images, which aims to solve the technical problem of blocking artifacts while offering fast operation and good real-time performance. The technical scheme is as follows: partition the image into non-overlapping blocks; for each block, compute the gray histogram statistics of a larger block that shares its center; perform grayscale normalization using gray histogram equalization and gray stretching; after one image block has been processed, compute the gray histogram statistics of the next adjacent image block with an incremental method; and finally overwrite the original image with the grayscale normalization result. The method overcomes the blocking-artifact problem, runs fast, has good real-time performance, and is well suited to real-time embedded systems.

Description

Grayscale normalization method of fingerprint images
Technical field
The present invention relates to fingerprint image preprocessing methods in the field of fingerprint recognition within computer science, and in particular to a grayscale normalization method for fingerprint images.
Background technology
Fingerprint recognition, as a biometric identity authentication technology, is finding increasingly wide application. A high-performance fingerprint recognition system requires accurate and fast feature extraction and matching algorithms. Feature extraction usually proceeds through orientation computation, image segmentation, image enhancement, ridge extraction and thinning, and minutiae extraction and screening. To improve the accuracy of feature extraction, the image gray levels are usually normalized as a preprocessing step before feature extraction. Fingerprint images captured by different acquisition devices, or by the same device under different conditions, have different gray characteristics. Grayscale normalization of fingerprint images means adjusting the gray levels so that images with different gray-contrast characteristics have similar gray-contrast characteristics after normalization, and so that different texture regions within the same image also have similar gray-contrast characteristics. Grayscale normalization can reduce the gray-feature differences between images and enhance the contrast between the ridges and valleys of the fingerprint image, thereby improving the adaptability of feature extraction algorithms.
Lin Hong (Lin Hong, Yifei Wan, Anil Jain. Fingerprint Image Enhancement: Algorithm and Performance Evaluation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1998, 20(8): 777-789) proposed a grayscale normalization method for fingerprint images based on the gray mean and variance: let the gray mean of fingerprint image I be M and its gray variance be VAR, and let the desired mean and variance after normalization be M0 and VAR0; after normalization, the original gray value I(i, j) of pixel (i, j) becomes the new gray value G(i, j):
G(i, j) = M0 + sqrt(VAR0 × (I(i, j) − M)² / VAR),  if I(i, j) > M
G(i, j) = M0 − sqrt(VAR0 × (I(i, j) − M)² / VAR),  if I(i, j) ≤ M        (1)
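For illustration, a minimal Python/NumPy sketch of the normalization in formula (1); the function name hong_normalize and the default target values M0 = 128 and VAR0 = 2000 are choices made for this example, not values from the cited paper.

    import numpy as np

    def hong_normalize(img, M0=128.0, VAR0=2000.0):
        """Global mean/variance normalization of formula (1)."""
        img = img.astype(np.float64)
        M = img.mean()                       # gray mean of the whole image
        VAR = img.var() + 1e-12              # gray variance (guarded against a flat image)
        delta = np.sqrt(VAR0 * (img - M) ** 2 / VAR)
        out = np.where(img > M, M0 + delta, M0 - delta)
        return np.clip(out, 0, 255).astype(np.uint8)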
The Hong method processes the entire image, with M and VAR computed over the whole image. Many images contain texture regions with different gray characteristics, so this method cannot reduce the gray-feature differences between different texture regions: it can only normalize the mean and variance of the image as a whole, and after normalization the gray features of different local texture regions still differ greatly.
To normalize the gray features of each local region of a fingerprint image, Byung-Gyu Kim (Byung-Gyu Kim, Dong-Jo Park. Adaptive Image Normalisation Based on Block Processing for Enhancement of Fingerprint Image. Electronics Letters, 2002, 38(14): 696-698) applied the mean-and-variance normalization to each local block: the image is divided into non-overlapping image blocks, the gray mean and variance of each image block are computed, and formula (1) is then used to normalize each local image block. This method gives different local texture regions of the same image similar gray features, but it easily produces blocking artifacts: because the image is divided into non-overlapping blocks and the mean and variance used as normalization parameters are computed from each block alone, adjacent blocks do not join smoothly after normalization.
The Kim method can be further improved to overcome blocking artifacts. When normalizing a local image block, the Kim method computes the gray mean M and gray variance VAR of that block as normalization parameters and then uses formula (1) to compute the new gray value of each pixel. Suppose the block size is α × α and the center of the current block is (i, j). One may instead take a block of size β × β (β > α) centered at (i, j) and compute the gray mean M' and gray variance VAR' of this larger block as the normalization parameters for the current block. The larger β − α is, the less visible the blocking artifacts, but the more CPU time is required. This variant is referred to as the Kim extended method.
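For comparison, a sketch of the Kim extended method as just described, assuming odd α and β, an 8-bit grayscale NumPy array, and a small variance guard to avoid division by zero; the block placement and border handling are illustrative assumptions, not taken from the cited papers.

    import numpy as np

    def kim_extended_normalize(img, alpha=3, beta=9, M0=128.0, VAR0=2000.0):
        """Normalize each alpha x alpha block with the mean/variance of the
        co-centered beta x beta block, applying formula (1) to the block's pixels."""
        img = img.astype(np.float64)
        out = img.copy()
        h, w = img.shape
        hb, ha = beta // 2, alpha // 2
        for ci in range(hb, h - hb, alpha):          # block centers, row direction
            for cj in range(hb, w - hb, alpha):      # block centers, column direction
                big = img[ci - hb:ci + hb + 1, cj - hb:cj + hb + 1]
                M, VAR = big.mean(), big.var() + 1e-9     # beta x beta statistics
                blk = img[ci - ha:ci + ha + 1, cj - ha:cj + ha + 1]
                delta = np.sqrt(VAR0 * (blk - M) ** 2 / VAR)
                out[ci - ha:ci + ha + 1, cj - ha:cj + ha + 1] = \
                    np.where(blk > M, M0 + delta, M0 - delta)
        return np.clip(out, 0, 255).astype(np.uint8)   # border pixels outside the grid keep their values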
In summary, the Hong method easily leaves the gray features of local texture regions visibly different in the normalization result, and the ridge-valley contrast of some local regions cannot be enhanced; the Kim method overcomes this problem of the Hong method but easily produces blocking artifacts; the Kim extended method overcomes blocking artifacts but requires more CPU time, sacrificing speed and making it ill-suited to real-time systems. How to overcome blocking artifacts while improving the speed of grayscale normalization is therefore a technical problem of great concern to those skilled in the art.
Summary of the invention
The technical problem to be solved by the present invention is: while solving the problem that different local texture regions of a fingerprint image have different gray features after grayscale normalization, also solve the blocking-artifact problem, with fast operation and good real-time performance.
To solve the above technical problem, the technical scheme of the present invention is: partition the image into non-overlapping blocks; for each block, compute the gray histogram statistics of a larger block that shares its center; perform grayscale normalization using gray histogram equalization and gray stretching; after one image block has been processed, compute the gray histogram statistics of the next adjacent image block with an incremental method.
The specific technical scheme is as follows:
In the first step, the fingerprint image is partitioned into blocks: the fingerprint image I to be grayscale-normalized is divided into non-overlapping blocks of size α × α. The height of image I is height, its width is width, and the gray value of the pixel in row i, column j of image I is I(i, j) (0 ≤ i < height, 0 ≤ j < width). The non-overlapping blocks of size α × α are called category-A image blocks; all category-A image blocks form a category-A image block matrix, and A(ci, cj) identifies the category-A image block in row ci, column cj of this matrix, where rows and columns are numbered from 0, 0 ≤ ci, 0 ≤ cj. For each category-A image block A(ci, cj), a rectangular region of size β × β (β ≥ α) whose center coincides with that of A(ci, cj) is taken; this region is denoted B(ci, cj) and called the category-B image block. When partitioning, the category-A blocks are determined first, and then a corresponding category-B block is determined for each category-A block. When choosing the parameters, the value of β is determined first, and then the value of α. The side length β of a category-B image block is generally 2 to 3 times the ridge width. Fingerprint scanners typically capture images at 500 dpi; in that case the inter-ridge distance is about 8 pixels and the ridge width is about 4 to 5 pixels. The value of β should not be too large, otherwise more computation time is needed; β is generally between 9 and 15 pixels. Once β has been determined, α ranges from 1 to β pixels: for a given β, the smaller α is, the higher the quality of the normalized image but the longer the running time, while the larger α is, the faster the method runs but the lower the quality of the normalized image. For convenience of computation, α and β are usually odd, so that each block is symmetric about its center pixel. So that no category-B image block exceeds the border of image I, the center pixel of a leftmost-column category-A block is β/2 pixels from the left border of the image and the center pixel of a bottom-row category-A block is β/2 pixels from the bottom border of the image, and the category-A image block matrix has
⌊(height − β)/α⌋ + 1 rows and ⌊(width − β)/α⌋ + 1 columns, so 0 ≤ ci ≤ ⌊(height − β)/α⌋ and 0 ≤ cj ≤ ⌊(width − β)/α⌋.
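A small sketch of the block geometry of the first step, assuming odd α and β; the generator name block_grid, the inclusive-bounds convention, and the row/column count derived above are choices made for this example (block row 0 is counted from the bottom of the image, as in Fig. 1).

    def block_grid(height, width, alpha=3, beta=9):
        """Yield (ci, cj, a_bounds, b_bounds) for every category-A block.

        Bounds are inclusive (row0, row1, col0, col1).  Block centers start
        beta//2 pixels from the image border so that every beta x beta
        category-B block stays inside the image."""
        n_rows = (height - beta) // alpha + 1    # rows of the category-A block matrix
        n_cols = (width - beta) // alpha + 1     # columns of the category-A block matrix
        for ci in range(n_rows):
            for cj in range(n_cols):
                center_i = (height - 1 - beta // 2) - ci * alpha   # ci counts up from the bottom row
                center_j = beta // 2 + cj * alpha
                a_bounds = (center_i - alpha // 2, center_i + alpha // 2,
                            center_j - alpha // 2, center_j + alpha // 2)
                b_bounds = (center_i - beta // 2, center_i + beta // 2,
                            center_j - beta // 2, center_j + beta // 2)
                yield ci, cj, a_bounds, b_bounds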
In the second step, a copy G of image I is created, and the category-A image blocks are normalized one by one using either the recomputation method or the incremental computing method; whenever an image block has been normalized, its normalization result is saved into the copy G. The normalization process for image block A(ci, cj) is as follows (an illustrative sketch of steps 2.1.2 and 2.2-2.4 is given after step 2.4):
2.1 Count, for the image block B(ci, cj) of size β × β whose center coincides with that of A(ci, cj), the number of pixels c[k] whose gray value is k (0 ≤ k ≤ 255). c[k] is counted in two cases.
2.1.1 If A(ci, cj) is a category-A image block in the leftmost column, i.e. cj = 0, the recomputation method is used to compute c[k]: first set every c[k] to 0, then for each pixel (i, j) in B(ci, cj), add 1 to c[I(i, j)] (k = I(i, j)).
2.1.2 If A(ci, cj) is not a category-A image block in the leftmost column, i.e. cj > 0, then c[k] still holds the gray statistics of B(ci, cj−1) and only needs to be modified to obtain the gray statistics of B(ci, cj). The concrete method is the incremental computing method: for each pixel (i, j) in B(ci, cj−1) − B(ci, cj) (a set difference: regarding the two image blocks as sets of pixels, this subtraction yields the pixels that belong to B(ci, cj−1) but not to B(ci, cj)), subtract 1 from c[I(i, j)]; for each pixel (i, j) in B(ci, cj) − B(ci, cj−1) (the pixels that belong to B(ci, cj) but not to B(ci, cj−1)), add 1 to c[I(i, j)].
The recomputation method could also be used to compute c[k] in this second case, but it requires more running time than the incremental computing method.
2.2 Apply gray equalization to image block B(ci, cj): after equalization, gray level k in B(ci, cj) becomes b[k]. The equalization result is stored in b[k]; the gray values of the pixels in B(ci, cj) are not changed. The concrete computation of b[k] is:
2.2.1 Initialize the variable sc = 0.
2.2.2 For k from 0 to 255, compute b[k] in turn:
2.2.2.1 sc = sc + c[k];
2.2.2.2 b[k] = (sc × 256 − β²)/β².
2.3 Map the gray interval [b[low], b[up]] onto the gray interval [0, 255], where low and up denote the minimum and maximum gray values in image block B(ci, cj). The mapping is equivalent to stretching [b[low], b[up]] to [0, 255]; a gray value b[k] (low ≤ k ≤ up) is mapped to e[k] computed with formula (2).
e[k] = (b[k] − b[low]) × 255 / (b[up] − b[low])    (2)
2.4 Update the pixel gray values of the region corresponding to A(ci, cj) in the image copy G: for each pixel (i, j) in A(ci, cj), update the gray value G(i, j) in G with formula (3).
G(i,j)=e[I(i,j)],(i,j)∈A(ci,cj) (3)
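A minimal sketch of steps 2.1.2 and 2.2-2.4 in Python/NumPy, assuming an 8-bit grayscale image, odd β, and inclusive (row0, row1, col0, col1) block bounds; the helper names and the integer arithmetic used for formula (2) are illustrative assumptions, not part of the patent text.

    import numpy as np

    def incremental_histogram(c, img, b_prev, b_cur):
        """Step 2.1.2: turn the gray histogram c of B(ci, cj-1) into that of B(ci, cj).
        b_prev and b_cur are inclusive bounds of two beta x beta blocks in the same
        block row, shifted to the right by alpha columns (alpha <= beta)."""
        r0, r1, p0, p1 = b_prev
        _,  _,  q0, q1 = b_cur
        for col in range(p0, q0):                  # columns leaving the window
            for row in range(r0, r1 + 1):
                c[img[row, col]] -= 1
        for col in range(p1 + 1, q1 + 1):          # columns entering the window
            for row in range(r0, r1 + 1):
                c[img[row, col]] += 1

    def normalize_block(img, g, c, a_bounds, beta):
        """Steps 2.2-2.4: equalize the histogram c of B(ci, cj), stretch to [0, 255],
        and write the remapped gray values of A(ci, cj) into the copy g."""
        # 2.2 cumulative equalization: b[k] = (sc*256 - beta^2) / beta^2
        b = np.empty(256, dtype=np.int64)
        sc = 0
        for k in range(256):
            sc += int(c[k])
            b[k] = (sc * 256 - beta * beta) // (beta * beta)
        # 2.3 stretch [b[low], b[up]] onto [0, 255] (formula (2), integer arithmetic)
        occupied = np.nonzero(c)[0]
        low, up = occupied[0], occupied[-1]
        if b[up] == b[low]:
            e = np.zeros(256, dtype=np.int64)      # flat block: map everything to 0
        else:
            e = (b - b[low]) * 255 // (b[up] - b[low])
        # 2.4 remap the A-block pixels of the original image into the copy (formula (3))
        r0, r1, c0, c1 = a_bounds
        g[r0:r1 + 1, c0:c1 + 1] = np.clip(e[img[r0:r1 + 1, c0:c1 + 1]], 0, 255)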
In the third step, after all category-A blocks have been normalized, the copy image G holds the grayscale normalization result, and image I is overwritten with the copy image G.
Compared with existing methods, the present invention provides the following beneficial effects:
Blocking the image overcomes the problem of the existing Hong method that different texture regions have visibly different gray features in the normalization result; taking a larger corresponding category-B image block for each category-A image block overcomes the blocking-artifact problem of the Kim method; and under the same blocking parameters (α and β) the method runs faster than the Kim extended method, because it uses gray equalization and stretching, which are simple to compute, and performs the gray statistics with the incremental computing method. The present invention therefore has better real-time performance and is better suited to real-time embedded systems. With the same blocking parameters α = 3 and β = 9, the running time of the present invention is about 43% of that of the Kim extended method; with α = 3 and β = 15, it is about 18%, while the quality of the normalized images is comparable.
Description of drawings
Fig. 1 is a schematic diagram of the image blocking in the first step of the present invention.
Fig. 2 is the overall flow chart of the present invention.
Fig. 3 is the original image used for testing.
Fig. 4 compares the results of the Kim method and of the present invention for α = β = 300.
Fig. 5 compares the results of the Kim method and of the present invention for α = β = 9.
Fig. 6 compares the results of the Kim method and of the present invention for α = 5, β = 9.
Fig. 7 compares the results of the Kim method and of the present invention for α = 3, β = 9.
Fig. 8 compares the results of the Kim method and of the present invention for α = β = 15.
Fig. 9 compares the results of the Kim method and of the present invention for α = 3, β = 15.
Embodiment
Fig. 1 is a schematic diagram of image blocking with the present invention. I is a fingerprint image of height height and width width, and the gray value of the pixel in row i, column j of the image is I(i, j). Image I is divided into non-overlapping blocks of size α × α, called category-A image blocks; all category-A image blocks form a category-A image block matrix, the category-A image block in row ci, column cj of this matrix is identified by A(ci, cj), and the rows and columns of the matrix are numbered from 0, so A(2, 1) is the category-A image block in the 3rd row, 2nd column. For each category-A image block A(ci, cj), a rectangular region of size β × β (β ≥ α) whose center coincides with that of A(ci, cj) is taken; this region is denoted B(ci, cj) and called the category-B image block, so B(2, 1) is the image block of size β × β whose center coincides with that of A(2, 1). So that no category-B image block exceeds the border of image I, the center pixel of a leftmost-column category-A block is β/2 pixels from the left border of the image and the center pixel of a bottom-row category-A block is β/2 pixels from the bottom border; the center pixel of the bottom-left category-A image block A(0, 0) is therefore (β/2, β/2), and the category-A image block matrix has ⌊(height − β)/α⌋ + 1 rows and ⌊(width − β)/α⌋ + 1 columns.
Fig. 2 is the overall flow chart of the present invention. The detailed process is as follows (an illustrative sketch of the whole flow is given after step 3):
1. Let the image to be processed be I, and create a copy G of I. (Intermediate results of processing I are kept in G.)
2. Starting from the category-A image block A(0, 0) at the lower-left corner of the image, normalize the category-A image blocks one by one and save the normalization results in G. The detailed process of normalizing the current block A(ci, cj) is:
2.1 Compute the gray statistics c[k] of the category-B image block B(ci, cj) whose center coincides with that of A(ci, cj); c[k] is the number of pixels in image block B(ci, cj) whose gray value is k. c[k] is computed in two cases.
2.1.1 If cj = 0, the recomputation method is used to compute c[k].
2.1.2 If cj > 0, c[k] still holds the gray statistics of B(ci, cj−1) and only needs to be modified to obtain the gray statistics of B(ci, cj), so the incremental computing method is used to compute c[k].
2.2 Apply gray equalization to c[k] to obtain the new gray values b[k]. The concrete computation is:
2.2.1 Initialize the variable sc = 0.
2.2.2 For k from 0 to 255, compute b[k] in turn:
2.2.2.1 sc = sc + c[k];
2.2.2.2 b[k] = (sc × 256 − β²)/β².
2.3 Compute the minimum gray value low and the maximum gray value up in image block B(ci, cj). To compute low, test 0, 1, 2, 3, ... in turn until a k is found for which c[k] is not equal to 0, and assign that k to low. To compute up, test 255, 254, 253, 252, ... in turn until a k is found for which c[k] is not equal to 0, and assign that k to up.
2.4 Map the gray interval [b[low], b[up]] onto the gray interval [0, 255]; the mapping is equivalent to stretching [b[low], b[up]] to [0, 255], and a gray value b[k] (low ≤ k ≤ up) is mapped to e[k] = (b[k] − b[low]) × 255 / (b[up] − b[low]).
2.5 Update the pixel gray values of the region corresponding to A(ci, cj) in the image copy G: if the gray value of a pixel is k, its updated gray value is e[k].
2.6 If some category-A image blocks have not yet been normalized, take the next image block (in bottom-up, left-to-right order, assigning ci the row number and cj the column number of the next image block) and go to step 2.1; otherwise go to step 3.
3. When all blocks have been processed, the copy image G holds the grayscale-normalized result image; overwrite image I with the copy image G, i.e. copy G to I, and I is the normalization result.
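Tying the flow of Fig. 2 together, an illustrative end-to-end driver that reuses the incremental_histogram and normalize_block helpers sketched after step 2.4 above; the uint8 assumption, the bounds convention, and the per-row recomputation at cj = 0 follow the description, while the function name and the defaults α = 3, β = 9 are this example's own.

    import numpy as np

    def normalize_fingerprint(img, alpha=3, beta=9):
        """Overall flow of Fig. 2: block image I, normalize each category-A block
        from the histogram of its category-B block, then overwrite I with the copy G."""
        h, w = img.shape
        g = img.copy()                                    # copy G of image I
        n_rows = (h - beta) // alpha + 1                  # rows of the category-A block matrix
        n_cols = (w - beta) // alpha + 1                  # columns of the category-A block matrix
        c = np.zeros(256, dtype=np.int64)                 # gray histogram of the current B block
        for ci in range(n_rows):                          # bottom-up over block rows
            prev_b = None
            for cj in range(n_cols):                      # left-to-right within a block row
                center_i = (h - 1 - beta // 2) - ci * alpha
                center_j = beta // 2 + cj * alpha
                a = (center_i - alpha // 2, center_i + alpha // 2,
                     center_j - alpha // 2, center_j + alpha // 2)
                b = (center_i - beta // 2, center_i + beta // 2,
                     center_j - beta // 2, center_j + beta // 2)
                if cj == 0:                               # step 2.1.1: recompute from scratch
                    r0, r1, c0, c1 = b
                    c[:] = np.bincount(img[r0:r1 + 1, c0:c1 + 1].ravel(), minlength=256)
                else:                                     # step 2.1.2: incremental update
                    incremental_histogram(c, img, prev_b, b)
                normalize_block(img, g, c, a, beta)
                prev_b = b
        img[:, :] = g                                     # step 3: overwrite I with G
        return img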
Fig. 3 is the original image used for testing; the image size is 300 × 300.
Fig. 4-Fig. 9 compare the normalization results and running times of the present invention with those of the Hong method and the Kim (extended) method on the original image in Fig. 3. The experimental platform is an Intel Pentium CPU at 1.86 GHz. The running times of the present invention given in Fig. 5-Fig. 9 are those obtained with the incremental computing method; the running times quoted for the recomputation method refer to replacing the incremental computing method in step 2.1.2 with the recomputation method.
Fig. 4(a) is the result of the Kim method with α = β = 300 (in this case the Kim method degenerates to the Hong method); the running time is 5.5 ms. Fig. 4(b) is the result of the present invention with α = β = 300; the running time is 0.6 ms. The quality of the two normalized images is comparable, but the latter's running time is about 11% of the former's. Both normalize the whole image without blocking, and the gray-contrast characteristics of the different texture regions remain visibly different.
Fig. 5(a) is the result of the Kim method with α = β = 9; the running time is 6.2 ms. Fig. 5(b) is the result of the present invention with α = β = 9; the running time is 3.0 ms. The quality of the two normalized images is comparable, but the latter's running time is about 48% of the former's. The blocking used in Fig. 5 overcomes the problem seen in the result images of Fig. 4. Because the category-B blocks are the same size as the category-A blocks, blocking artifacts can be seen in the result images: some regions have jagged edges, adjacent blocks do not join smoothly, and two parallel ridges are joined together in some places instead of being clearly separated. These blocking artifacts are even more evident in Fig. 8.
Fig. 6(a) is the result of the Kim method with α = 5, β = 9; the running time is 17.6 ms. Fig. 6(b) is the result of the method of the present invention with α = 5, β = 9; the running time is 7.9 ms. The quality of the two normalized images is comparable, but the latter's running time is about 45% of the former's. In Fig. 6 the category-B block size is 9 × 9 and the category-A block size is 5 × 5; it can be seen that the blocking artifacts of Fig. 5 are alleviated in Fig. 6, because the value of α has been reduced.
Fig. 7(a) is the result of the Kim method with α = 3, β = 9; the running time is 47.7 ms. Fig. 7(b) is the result of the method of the present invention with α = 3, β = 9; the running time is 20.7 ms. The quality of the two normalized images is comparable, but the latter's running time is about 43% of the former's. In Fig. 7 the category-B block size is 9 × 9 and the category-A block size is 3 × 3, and there are no obvious blocking artifacts in the result images. Evidently, the smaller α is relative to β, the better blocking artifacts are overcome.
Fig. 8(a) is the result of the Kim method with α = β = 15; the running time is 5.5 ms. Fig. 8(b) is the result of the method of the present invention with α = β = 15; the running time is 1.6 ms. The quality of the two normalized images is comparable, but the latter's running time is about 29% of the former's. In Fig. 8 the category-B blocks are the same size as the category-A blocks and the blocks are larger than in Fig. 5; it can be seen that the result images of Fig. 8 show more obvious blocking artifacts than those of Fig. 5, so when the two block sizes are equal, the larger the blocks, the more obvious the blocking artifacts.
Fig. 9(a) is the result of the Kim method with α = 3, β = 15; the running time is 126.2 ms. Fig. 9(b) is the result of the method of the present invention with α = 3, β = 15; the running time is 29.6 ms. The quality of the two normalized images is comparable, but the latter's running time is about 18% of the former's. In Fig. 9 the category-B block size is 15 × 15 and the category-A block size is 3 × 3, and there are no obvious blocking artifacts in the result images.
In general: (1) for a fixed β, the larger α is, the more the normalized image shows jagged edges, unsmooth joins between adjacent blocks, and poorly connected adjacent ridges; the smaller α is, the better these phenomena are overcome, but the more CPU time is consumed; (2) when the two block sizes are equal, the larger the blocks, the more obvious the blocking artifacts; (3) for the same α and β, the present invention runs faster than the Kim method and the Kim extended method, and the smaller α is relative to β, the more obvious the speed advantage of the present invention. For fingerprint images at 500 dpi resolution, β = 9 and α = 3 or 5 can generally be used; with these settings, high-quality normalization results are obtained while keeping a fast running speed.
Therefore, using the present invention to normalize the gray levels of a fingerprint image overcomes blocking artifacts and, compared with existing methods under the same blocking parameters, runs faster and is suitable for real-time embedded systems.

Claims (3)

1. A grayscale normalization method for fingerprint images, characterized by comprising the following steps:
In the first step, the fingerprint image I to be grayscale-normalized is divided into non-overlapping blocks of size α × α: the height of image I is height, its width is width, and the gray value of the pixel in row i, column j of image I is I(i, j); the non-overlapping blocks of size α × α are called category-A image blocks, all category-A image blocks form a category-A image block matrix, A(ci, cj) identifies the category-A image block in row ci, column cj of this matrix, and 0 ≤ ci, 0 ≤ cj; for each category-A image block A(ci, cj), a rectangular region of size β × β, β ≥ α, whose center coincides with that of A(ci, cj) is taken; this region is denoted B(ci, cj) and called the category-B image block; when partitioning, the category-A blocks are determined first, and then a corresponding category-B block is determined for each category-A block; when choosing the parameters, the value of β is determined first and then the value of α; the side length β of a category-B image block is 2 to 3 times the ridge width, and α ranges from 1 to β pixels;
In the second step, a copy G of image I is created, and the category-A image blocks are normalized one by one using either the recomputation method or the incremental computing method; whenever an image block has been normalized, its normalization result is saved into the copy G; the normalization process for image block A(ci, cj) is:
2.1 Count, for the image block B(ci, cj) of size β × β whose center coincides with that of A(ci, cj), the number of pixels c[k] whose gray value is k, 0 ≤ k ≤ 255; c[k] is counted in two cases:
2.1.1 If A(ci, cj) is a category-A image block in the leftmost column, i.e. cj = 0, the recomputation method is used to compute c[k]: first set every c[k] to 0, then for each pixel (i, j) in B(ci, cj), add 1 to c[I(i, j)];
2.1.2 If A(ci, cj) is not a category-A image block in the leftmost column, i.e. cj > 0, c[k] is modified with the incremental computing method, as follows: for each pixel (i, j) in B(ci, cj−1) − B(ci, cj), subtract 1 from c[I(i, j)]; for each pixel (i, j) in B(ci, cj) − B(ci, cj−1), add 1 to c[I(i, j)];
2.2 Apply gray equalization to image block B(ci, cj): after equalization, gray level k in B(ci, cj) becomes b[k], and the equalization result is stored in b[k]; the concrete computation of b[k] is:
2.2.1 Initialize the variable sc = 0;
2.2.2 For k from 0 to 255, compute b[k] in turn:
2.2.2.1 sc = sc + c[k];
2.2.2.2 b[k] = (sc × 256 − β²)/β²;
2.3 Map the gray interval [b[low], b[up]] onto the gray interval [0, 255], where low and up denote, respectively, the minimum and maximum gray values in image block B(ci, cj); the mapping is equivalent to stretching [b[low], b[up]] to [0, 255], and a gray value b[k] is mapped to e[k] computed with formula (2), where low ≤ k ≤ up:
e[k] = (b[k] − b[low]) × 255 / (b[up] − b[low])    (2)
2.4 Update the pixel gray values of the region corresponding to A(ci, cj) in the image copy G: for each pixel (i, j) in A(ci, cj), update the gray value G(i, j) in G with formula (3):
G(i,j)=e[I(i,j)],(i,j)∈A(ci,cj) (3)
In the third step, after all category-A blocks have been normalized, the copy image G holds the grayscale normalization result, and image I is overwritten with the copy image G.
2. The grayscale normalization method for fingerprint images according to claim 1, characterized in that β is between 9 and 15 pixels, and α and β are odd.
3. The grayscale normalization method for fingerprint images according to claim 1, characterized in that, when partitioning, the center pixel of a leftmost-column category-A block is β/2 pixels from the left border of the image, the center pixel of a bottom-row category-A block is β/2 pixels from the bottom border of the image, and the category-A image block matrix has ⌊(height − β)/α⌋ + 1 rows and ⌊(width − β)/α⌋ + 1 columns.
CN200910044242XA 2009-09-03 2009-09-03 Grayscale normalization method of fingerprint images Expired - Fee Related CN101650781B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200910044242XA CN101650781B (en) 2009-09-03 2009-09-03 Grayscale normalization method of fingerprint images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200910044242XA CN101650781B (en) 2009-09-03 2009-09-03 Grayscale normalization method of fingerprint images

Publications (2)

Publication Number Publication Date
CN101650781A CN101650781A (en) 2010-02-17
CN101650781B true CN101650781B (en) 2011-04-20

Family

ID=41673017

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200910044242XA Expired - Fee Related CN101650781B (en) 2009-09-03 2009-09-03 Grayscale normalization method of fingerprint images

Country Status (1)

Country Link
CN (1) CN101650781B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103530641B (en) * 2013-09-30 2016-08-17 西安空间无线电技术研究所 A kind of implementation method of high speed real-time image gray level co-occurrence matrix angular second moment
CN104111455A (en) * 2014-07-29 2014-10-22 上海无线电设备研究所 Microwave imaging radar image data gray level quantification method and device
CN111709949A (en) * 2020-08-19 2020-09-25 武汉精测电子集团股份有限公司 Outdoor display screen detection and repair method, device, equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1580686A2 (en) * 2004-03-22 2005-09-28 Lg Electronics Inc. Fingerprint recognition system and method
CN101266644A (en) * 2008-04-02 2008-09-17 范九伦 Fingerprint image thinning method based on formwork

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1580686A2 (en) * 2004-03-22 2005-09-28 Lg Electronics Inc. Fingerprint recognition system and method
CN101266644A (en) * 2008-04-02 2008-09-17 范九伦 Fingerprint image thinning method based on formwork

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Yang Ju. Research and Implementation of Fingerprint Image Preprocessing and Feature Extraction Algorithms. China Excellent Master's Theses Full-text Database, 2006. *

Also Published As

Publication number Publication date
CN101650781A (en) 2010-02-17

Similar Documents

Publication Publication Date Title
US9846932B2 (en) Defect detection method for display panel based on histogram of oriented gradient
Pai et al. Adaptive thresholding algorithm: Efficient computation technique based on intelligent block detection for degraded document images
WO2021217851A1 (en) Abnormal cell automatic labeling method and apparatus, electronic device, and storage medium
WO2021003938A1 (en) Image classification method and apparatus, computer device and storage medium
CN103020971A (en) Method for automatically segmenting target objects from images
CN102999926B (en) A kind of image vision significance computational methods merged based on low-level image feature
CN104392231A (en) Block and sparse principal feature extraction-based rapid collaborative saliency detection method
CN108427969A (en) A kind of paper sheet defect sorting technique of Multiscale Morphological combination convolutional neural networks
CN104217213A (en) Medical image multi-stage classification method based on symmetry theory
CN104282008A (en) Method for performing texture segmentation on image and device thereof
CN101650781B (en) Grayscale normalization method of fingerprint images
Seetharaman et al. Statistical distributional approach for scale and rotation invariant color image retrieval using multivariate parametric tests and orthogonality condition
CN103839066A (en) Feature extraction method based on biological vision
CN106991753A (en) A kind of image binaryzation method and device
CN102855484B (en) Based on object detection method, the Apparatus and system of Local Integral image procossing
Gong et al. Breast density analysis based on glandular tissue segmentation and mixed feature extraction
Dornaika et al. Object-centric contour-aware data augmentation using superpixels of varying granularity
Wang et al. Expression recognition method based on evidence theory and local texture
CN111127407B (en) Fourier transform-based style migration forged image detection device and method
CN106056575B (en) A kind of image matching method based on like physical property proposed algorithm
Jiang et al. An improved svm classifier for medical image classification
CN107146215A (en) A kind of conspicuousness detection method based on color histogram and convex closure
CN114913345A (en) Simplified image feature extraction method based on SIFT algorithm of FPGA
CN102999763B (en) Based on the top-down vision significance extracting method of scale selection
CN110097058A (en) Irregular form image object automatic marking method based on sub-region right combination

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110420

Termination date: 20160903

CF01 Termination of patent right due to non-payment of annual fee