CN100517371C - Stitching Method of Fingerprint Image Frame Sequence Based on Waveform Matching - Google Patents


Info

Publication number
CN100517371C
CN100517371C · CNB2007101000085A · CN200710100008A
Authority
CN
China
Prior art keywords
waveform
image
image frame
reference area
frame
Prior art date
Legal status
Expired - Fee Related
Application number
CNB2007101000085A
Other languages
Chinese (zh)
Other versions
CN101086766A (en)
Inventor
Wang Peng (王朋)
Zhang Youguang (张有光)
Current Assignee
Beihang University
Original Assignee
Beihang University
Priority date
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CNB2007101000085A priority Critical patent/CN100517371C/en
Publication of CN101086766A publication Critical patent/CN101086766A/en
Application granted granted Critical
Publication of CN100517371C publication Critical patent/CN100517371C/en

Landscapes

  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention relates to a fingerprint image stitching method for controlling an image processing process. For each image frame, only part of the frame is selected as a reference area; a binary grayscale waveform is obtained from it, and the waveform transition information of the current frame is matched against the overlapping area of the already stitched image to carry out the stitching. The method requires little computation, is simple to implement, completes the stitching of fingerprint images quickly, and is applicable to all kinds of swipe (scratch-type) fingerprint image acquisition modules.

Description

Stitching Method of Fingerprint Image Frame Sequence Based on Waveform Matching

Technical Field

The invention relates to a method for controlling an image processing process, and in particular to a method for stitching fingerprint image frames.

Background

Automatic fingerprint identification systems are now used in an ever wider range of devices, such as mobile phones and computers. When such a system is built into a mobile phone in particular, a small fingerprint sensor is required, since a large one would affect the phone's layout and size; in terms of product cost, a small fingerprint sensor also needs a smaller integrated circuit die and is therefore cheaper. For these reasons, swipe (scratch-type) sensors are finding ever wider use.

A swipe fingerprint sensor works as follows: it captures many frames of fingerprint images with a certain overlapping area between adjacent frames, and the individual frames must be stitched into one complete fingerprint image. A fingerprint image stitching algorithm is therefore essential for swipe fingerprint sensors.

In practical applications the automatic fingerprint identification system must respond quickly, so the stitching algorithm has to be fast and the stitched image has to meet the requirements of the fingerprint recognition algorithm.

A search of the prior art found Chinese Patent Publication No. CN1694118A, published on November 9, 2005, entitled "Seamless stitching method for sliding fingerprint sequences based on extended phase correlation", which discloses a seamless stitching method for swipe fingerprint sequences using extended phase correlation. Its drawback is that a Fourier transform must be computed for every image frame in order to perform the stitching.

Chinese Patent Publication No. CN 1804862A, published on July 19, 2006, entitled "Stitching method for fingerprint image frames", discloses a method for stitching fingerprint image frames. Its drawback is that the grayscale variance must be computed for every image frame in order to perform the stitching.

Summary of the Invention

The present invention overcomes the above shortcomings and provides a fingerprint image frame stitching method that processes only the binarized reference area of each image frame during stitching, thereby reducing the amount of computation and speeding up the stitching.

The technical solution adopted by the present invention is a method for stitching fingerprint image frames in which each frame is stitched using only the binary waveform transition information extracted from its reference area.

The method may include the following steps:

1) After receiving the fingerprint image frames captured by the fingerprint image acquisition module, read in the first frame.

2) Reduce the image frame, merge it into the stitched image, apply the reference area extraction templates one after another to extract a base reference area, and compute the waveform transition information of the extracted area. If the base area waveform transition information can be obtained, read in the next frame and go to step 3); if it cannot, read in the next frame and return to step 2).

3) Reduce the image frame, extract the reference area of the current frame with the reference area extraction template used for the previous frame, and compute the waveform transition information of the reference area.

4) Determine the overlapping area between the current image frame and the stitched image.

5) If, in the image area above the calibrated repeated row of the current frame's reference area, there is an image region that can serve as a base area, take it as the new base area, update the base area waveform transition information, merge the current frame into the stitched image, read in the next frame, and go to step 3); otherwise, go to step 6).

6) Use another reference area extraction template to extract a new base reference area and compute its waveform transition information to obtain new base area waveform transition information, then extract a new reference area and compute its waveform transition information, until steps 4) and 5) lead back to step 3); if all reference area extraction templates have been used and step 3) still cannot be entered, go to step 7) to finish the stitching.

7) If no image frame yields base area waveform transition information, or all image frames have been processed, or some frame cannot be merged into the stitched image, or the stitched image has reached the preset size threshold, terminate the stitching algorithm; if the stitching succeeded, output the stitched image, otherwise report a stitching failure.

The reduction of an image frame consists of deleting the topmost and bottommost rows of the frame captured by the fingerprint image acquisition module.

The reference area is the image region extracted from an image frame or from the stitched image by a reference area extraction template; its size is 1/6 to 1/3 of the image frame.
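As a concrete illustration, the reduction and the template-based extraction could look like the minimal Python sketch below. Representing a template as a 1-based (first column, last column) pair is an assumption made only for illustration; the patent does not prescribe any data structure.

    import numpy as np

    def reduce_frame(frame):
        """Frame reduction: drop the topmost and bottommost rows, the rows
        most affected by noise, sweat and smudges."""
        return np.asarray(frame)[1:-1, :]

    def extract_reference(frame, template):
        """Cut the reference area out of a (reduced) frame.

        template: 1-based inclusive (first_col, last_col) pair, e.g. (65, 128)
        as in the first template of Embodiment 1; this tuple representation is
        an illustrative assumption, not part of the patent."""
        frame = np.asarray(frame)
        first_col, last_col = template
        return frame[:, first_col - 1:last_col]

With a 16-row by 192-column frame, reduce_frame returns a 14-row by 192-column array, from which the templates of Embodiments 1 to 3 cut a 64-, 48- or 32-column strip.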

Step 4) is implemented through the following steps (an illustrative sketch follows the list):

1) Overlay the current image frame on the stitched image so that its top row coincides with the top row of the base area, and compute the waveform similarity between the part of the current frame that overlaps the base area and the base area.

2) Move the current frame upward one row at a time until its bottom row coincides with the bottom row of the base area; after each move, compute the waveform similarity between the overlapping part of the current frame and the base area.

3) The region of the current frame corresponding to the maximum of all these waveform similarities is the calibrated repeated region.

4) The image row of the calibrated repeated region whose row waveform similarity with its corresponding base row is maximal is the calibrated repeated row of the current frame, and its corresponding base row is the calibration base row.

5) Shift the waveform of the calibrated repeated row horizontally; after each one-pixel shift, compute the row waveform similarity with the calibration base row. The horizontal shift of the calibrated repeated row waveform corresponding to the maximum row waveform similarity is the horizontal offset of the current frame.
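A hedged sketch of this search is given below. Because the base area is only a few rows high, sliding the frame from "top row on top row" to "bottom row on bottom row" amounts to sliding a window of the base area's height over the rows of the current frame's reference area. The function name find_overlap and the max_shift bound on the horizontal search are assumptions made for illustration; row_similarity is passed in as a callable, for example the rule-based score defined further below.

    def find_overlap(frame_rows, base_rows, row_similarity, max_shift=8):
        """Steps 1)-5): locate the calibrated repeated row and the horizontal
        offset of the current frame relative to a base area.

        frame_rows: list of transition-position lists, one per row of the
                    current frame's reference area (assumed at least as long
                    as base_rows)
        base_rows:  list of transition-position lists of the base area rows
        row_similarity(a, b) -> numeric score"""
        n = len(base_rows)
        # Steps 1)-3): slide an n-row window over the frame rows; the window
        # with the highest summed similarity is the calibrated repeated region.
        best_start, best_scores = 0, None
        for start in range(len(frame_rows) - n + 1):
            scores = [row_similarity(frame_rows[start + k], base_rows[k])
                      for k in range(n)]
            if best_scores is None or sum(scores) > sum(best_scores):
                best_start, best_scores = start, scores
        # Step 4): the row of that region with the highest individual score is
        # the calibrated repeated row; its partner is the calibration base row.
        k = max(range(n), key=lambda i: best_scores[i])
        repeated_row, base_row = best_start + k, k
        # Step 5): shift the repeated row's transitions horizontally, one pixel
        # at a time, and keep the shift with the highest similarity.
        best_dx, best_dx_score = 0, None
        for dx in range(-max_shift, max_shift + 1):
            shifted = [p + dx for p in frame_rows[repeated_row]]
            s = row_similarity(shifted, base_rows[base_row])
            if best_dx_score is None or s > best_dx_score:
                best_dx, best_dx_score = dx, s
        return repeated_row, best_dx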

When the current frame is merged into the stitched image in step 5), the overlapping part takes the weighted average of the gray values of the two overlapping pixels as the stitching result; the weights of the two gray values are non-negative and sum to 1.
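The weighted averaging itself is straightforward. The sketch below assumes 8-bit gray values and uses the 0.6/0.4 weights of Embodiment 1 as defaults (Embodiments 2 and 3 use 0.5/0.5 and 0.3/0.7).

    import numpy as np

    def blend_overlap(mosaic_part, frame_part, w_mosaic=0.6, w_frame=0.4):
        """Weighted average of overlapping pixels; the two weights are
        non-negative and sum to 1."""
        m = np.asarray(mosaic_part, dtype=np.float32)
        f = np.asarray(frame_part, dtype=np.float32)
        return np.clip(w_mosaic * m + w_frame * f, 0, 255).astype(np.uint8)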

The base area is 3 consecutive image rows in the base reference area of the stitched image, each row containing at least one waveform transition.

The waveform transition information is the position and order of the transitions from 0 to 1 or from 1 to 0 in the binary waveform corresponding to each image row.

The waveform transition information may in particular be the position and order of the transitions from a 1 to two consecutive 0s in the binary waveform.
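The extraction of this transition information from one image row can be sketched as follows. The polarity of the binarization (pixels at or above the threshold map to 1) is an assumption; the text only states that a threshold is applied and that the position of the first 0 of each 1 to 0 0 transition is recorded.

    import numpy as np

    def row_transitions(row, threshold):
        """Binarize one image row and return the positions of its transitions
        from a 1 to two consecutive 0s (the position of the first 0 is kept),
        in left-to-right order."""
        bits = (np.asarray(row) >= threshold).astype(np.uint8)  # assumed polarity
        positions = []
        for i in range(len(bits) - 2):
            if bits[i] == 1 and bits[i + 1] == 0 and bits[i + 2] == 0:
                positions.append(i + 1)  # index of the first 0
        return positions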

The row waveform similarity is determined by the position, number and order of the waveform transitions contained in the two waveforms; the two waveforms involved in each computation belong to the base area and to the reference area of the current image frame, respectively.

The row waveform similarity is computed according to the following rules (an illustrative sketch follows the list):

1) if two transition positions differ by 3 or 4, add 1 to the similarity, and if the two transitions occupy the same ordinal position in their respective waveforms, add a further 1;

2) if two transition positions differ by 2, add 2 to the similarity, and if the two transitions occupy the same ordinal position in their respective waveforms, add a further 1;

3) if two transition positions differ by 1, add 3 to the similarity, and if the two transitions occupy the same ordinal position in their respective waveforms, add a further 2;

4) if two transition positions are identical, add 4 to the similarity, and if the two transitions occupy the same ordinal position in their respective waveforms, add a further 2;

5) if two waveforms satisfying one or more of rules 1) to 4) contain the same number of transitions, add 2 to the similarity.
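One plausible reading of these rules is sketched below: every transition position of one waveform is compared with every transition position of the other, the "same order" bonus is granted when the two transitions occupy the same ordinal position in their rows, and rule 5) is applied once if at least one pair matched. The patent does not spell out whether the comparison runs over all pairs or only over matched pairs, so this pairing scheme is an assumption.

    def row_similarity(trans_a, trans_b):
        """Rule-based similarity of two rows, given their lists of transition
        positions (as produced by a row_transitions-style extraction)."""
        score, matched = 0, False
        for i, pa in enumerate(trans_a):
            for j, pb in enumerate(trans_b):
                d = abs(pa - pb)
                if d in (3, 4):          # rule 1)
                    score += 1 + (1 if i == j else 0)
                elif d == 2:             # rule 2)
                    score += 2 + (1 if i == j else 0)
                elif d == 1:             # rule 3)
                    score += 3 + (2 if i == j else 0)
                elif d == 0:             # rule 4)
                    score += 4 + (2 if i == j else 0)
                else:
                    continue
                matched = True
        if matched and len(trans_a) == len(trans_b):   # rule 5)
            score += 2
        return score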

Brief Description of the Drawings

Fig. 1 is the overall flow chart of the present invention.

Fig. 2 is the program flow chart for computing the base area waveform transition information during program initialization.

Fig. 3 is the program flow chart for stitching a single image frame.

Fig. 4 shows an image frame sequence captured by the Authentec AES2510 swipe sensor used by the present invention.

Fig. 5 is the stitching result obtained with Embodiment 1 of the stitching method of the present invention.

Fig. 6 is the stitching result obtained with Embodiment 2 of the stitching method of the present invention.

Fig. 7 is the stitching result obtained with Embodiment 3 of the stitching method of the present invention.

Detailed Description of the Embodiments

The present invention is described further below in conjunction with the embodiments.

The number of image frames captured by a swipe fingerprint sensor varies from capture to capture, and two adjacent frames contain an overlapping area that matches, or can approximately be regarded as matching. The degree of this match is measured by the waveform similarity, i.e. by comparing the waveform transition information of the current image frame with that of the base area. The waveform similarity expresses how similar the waveform corresponding to the current image frame is to the waveform corresponding to the base area; from it, the calibrated repeated row and the horizontal offset of the current frame are determined, which locates the overlapping area. In the present embodiments the similarity is determined by the offset, number and ordinal position of the waveform transitions.

During frame stitching, the topmost and bottommost rows of each frame, which are the rows most susceptible to noise, sweat and smudges, are deleted; this improves the accuracy of the stitching algorithm and at the same time reduces the amount of computation.

Only part of each reduced frame is extracted for the stitching computation, which reduces the amount of computation further.

In the following, the reduction of an image frame means deleting the topmost and bottommost rows of the frame captured by the fingerprint image acquisition module.

The row waveform similarity is computed according to the following rules:

1) if two transition positions differ by 3 or 4, add 1 to the similarity, and if the two transitions occupy the same ordinal position in their respective waveforms, add a further 1;

2) if two transition positions differ by 2, add 2 to the similarity, and if the two transitions occupy the same ordinal position in their respective waveforms, add a further 1;

3) if two transition positions differ by 1, add 3 to the similarity, and if the two transitions occupy the same ordinal position in their respective waveforms, add a further 2;

4) if two transition positions are identical, add 4 to the similarity, and if the two transitions occupy the same ordinal position in their respective waveforms, add a further 2;

5) if two waveforms satisfying one or more of rules 1) to 4) contain the same number of transitions, add 2 to the similarity.

Embodiment 1

The image stitching process of this embodiment is described in detail below with reference to Figs. 1, 2 and 3.

First, assume that each frame is 16 rows by 192 columns with gray levels 0 to 255, and that a normally stitched image is at most 192 rows by 192 columns. Three reference area extraction templates are used; from rows 1 to 14 of the reduced frame they extract columns 65 to 128, columns 1 to 64, and columns 129 to 192, respectively.
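Expressed with the (first column, last column) template representation assumed earlier, the three templates of this embodiment would be, for illustration only:

    # 1-based inclusive column ranges over rows 1-14 of the reduced frame.
    TEMPLATES_EMBODIMENT_1 = [(65, 128), (1, 64), (129, 192)]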

After the first image frame is received, it is reduced and the remainder is merged into the bottom of the stitched image. One reference area extraction template is selected to extract the base reference area, and the binarization threshold is computed as the mean gray value of the reference area minus 16. The base reference area is binarized with this threshold, each image row is converted into a binary sequence, and the corresponding binary waveform is obtained. Each waveform is scanned, and whenever a 1 is followed by two consecutive 0s, the position of the first 0 is recorded. If the waveforms of three consecutive image rows all contain waveform transition information, these three rows are taken as the base area, and the transition information they contain is the base area waveform transition information.
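A minimal sketch of this base-area search, reusing the row_transitions helper assumed above, might look like the following; returning the starting row index together with the per-row transition lists is an illustrative choice, not something the patent specifies.

    import numpy as np

    def find_base_area(reference_area):
        """Embodiment-1 style base-area search: the binarization threshold is
        the mean gray value of the reference area minus 16, and the first run
        of 3 consecutive rows whose waveforms each contain at least one
        1 -> 0 0 transition becomes the base area."""
        ref = np.asarray(reference_area)
        threshold = ref.mean() - 16
        per_row = [row_transitions(r, threshold) for r in ref]
        for start in range(len(per_row) - 2):
            window = per_row[start:start + 3]
            if all(len(t) > 0 for t in window):
                return start, window   # base-area start row and its transition lists
        return None                    # no usable base area in this reference area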

After the second frame is received, the reference area is extracted with the same reference area extraction template as for the previous frame, the binarization threshold is computed, and the reference area is binarized to obtain the corresponding binary waveforms. Each waveform is scanned, and whenever a 1 is followed by two consecutive 0s, the position of the first 0 is recorded, giving the waveform transition information of each row of the reference area.

The waveform similarity between the base area transition information and the reference area transition information is then computed. The similarity between every group of three consecutive waveforms of the reference area and the base area transition information is calculated; the three consecutive image rows corresponding to the maximum of all these similarities form the calibrated repeated region of the second frame. The three waveforms correspond to three row waveform similarities; the image row with the maximum value is the calibrated repeated row of the second frame, and its corresponding base row is the calibration base row. The waveform transition information of the calibrated repeated row is then shifted horizontally, the row waveform similarity is computed for each shift, and the shift corresponding to the maximum similarity is the horizontal offset of the second frame.

Once the calibrated repeated row position and the horizontal offset of the second frame are known, its overlapping area is determined. Three consecutive image rows above the calibrated repeated row, each containing waveform transition information, are selected from the reference area of the second frame, giving the new base area and the new base area waveform transition information. The second frame is then merged into the stitched image: the calibrated repeated row is aligned with the calibration base row, the second frame is shifted left or right according to the horizontal offset, the non-overlapping part is merged directly into the stitched image, and in the overlapping part the weighted average of the gray values is computed with a weight of 0.6 for the stitched-image pixel and 0.4 for the current-frame pixel. Blank areas of the stitched image caused by shifting the frame are filled with pixels of gray value 0.

After the second frame has been processed, subsequent frames are read in and processed in the same way. The final output is the image shown in Fig. 5.

Embodiment 2

The image stitching process of this embodiment is described in detail below with reference to Figs. 1, 2 and 3.

First, assume that each frame is 16 rows by 192 columns with gray levels 0 to 255, and that a normally stitched image is at most 192 rows by 192 columns. Three reference area extraction templates are used; from rows 1 to 14 of the reduced frame they extract columns 73 to 120, columns 25 to 72, and columns 121 to 168, respectively.

After the first image frame is received, it is reduced and the remainder is merged into the bottom of the stitched image. One reference area extraction template is selected to extract the base reference area, and the binarization threshold is computed as the mean gray value of the reference area minus 16. The base reference area is binarized with this threshold, each image row is converted into a binary sequence, and the corresponding binary waveform is obtained. Each waveform is scanned, and whenever a 1 is followed by two consecutive 0s, the position of the first 0 is recorded. If the waveforms of three consecutive image rows all contain waveform transition information, these three rows are taken as the base area, and the transition information they contain is the base area waveform transition information.

After the second frame is received, the reference area is extracted with the same reference area extraction template as for the previous frame, the binarization threshold is computed, and the reference area is binarized to obtain the corresponding binary waveforms. Each waveform is scanned, and whenever a 1 is followed by two consecutive 0s, the position of the first 0 is recorded, giving the waveform transition information of each row of the reference area.

The waveform similarity between the base area transition information and the reference area transition information is then computed. The similarity between every group of three consecutive waveforms of the reference area and the base area transition information is calculated; the three consecutive image rows corresponding to the maximum of all these similarities form the calibrated repeated region of the second frame. The three waveforms correspond to three row waveform similarities; the image row with the maximum value is the calibrated repeated row of the second frame, and its corresponding base row is the calibration base row. The waveform transition information of the calibrated repeated row is then shifted horizontally, the row waveform similarity is computed for each shift, and the shift corresponding to the maximum similarity is the horizontal offset of the second frame.

Once the calibrated repeated row position and the horizontal offset of the second frame are known, its overlapping area is determined. Three consecutive image rows above the calibrated repeated row, each containing waveform transition information, are selected from the reference area of the second frame, giving the new base area and the new base area waveform transition information. The second frame is then merged into the stitched image: the calibrated repeated row is aligned with the calibration base row, the second frame is shifted left or right according to the horizontal offset, the non-overlapping part is merged directly into the stitched image, and in the overlapping part the weighted average of the gray values is computed with a weight of 0.5 for the stitched-image pixel and 0.5 for the current-frame pixel. Blank areas of the stitched image caused by shifting the frame are filled with pixels of gray value 0.

After the second frame has been processed, subsequent frames are read in and processed in the same way. The final output is the image shown in Fig. 6.

Embodiment 3

The image stitching process of this embodiment is described in detail below with reference to Figs. 1, 2 and 3.

First, assume that each frame is 16 rows by 192 columns with gray levels 0 to 255, and that a normally stitched image is at most 192 rows by 192 columns. Three reference area extraction templates are used; from rows 1 to 14 of the reduced frame they extract columns 81 to 112, columns 49 to 80, and columns 113 to 144, respectively.

After the first image frame is received, it is reduced and the remainder is merged into the bottom of the stitched image. One reference area extraction template is selected to extract the base reference area, and the binarization threshold is computed as the mean gray value of the reference area minus 16. The base reference area is binarized with this threshold, each image row is converted into a binary sequence, and the corresponding binary waveform is obtained. Each waveform is scanned, and whenever a 1 is followed by two consecutive 0s, the position of the first 0 is recorded. If the waveforms of three consecutive image rows all contain waveform transition information, these three rows are taken as the base area, and the transition information they contain is the base area waveform transition information.

After the second frame is received, the reference area is extracted with the same reference area extraction template as for the previous frame, the binarization threshold is computed, and the reference area is binarized to obtain the corresponding binary waveforms. Each waveform is scanned, and whenever a 1 is followed by two consecutive 0s, the position of the first 0 is recorded, giving the waveform transition information of each row of the reference area.

The waveform similarity between the base area transition information and the reference area transition information is then computed. The similarity between every group of three consecutive waveforms of the reference area and the base area transition information is calculated; the three consecutive image rows corresponding to the maximum of all these similarities form the calibrated repeated region of the second frame. The three waveforms correspond to three row waveform similarities; the image row with the maximum value is the calibrated repeated row of the second frame, and its corresponding base row is the calibration base row. The waveform transition information of the calibrated repeated row is then shifted horizontally, the row waveform similarity is computed for each shift, and the shift corresponding to the maximum similarity is the horizontal offset of the second frame.

Once the calibrated repeated row position and the horizontal offset of the second frame are known, its overlapping area is determined. Three consecutive image rows above the calibrated repeated row, each containing waveform transition information, are selected from the reference area of the second frame, giving the new base area and the new base area waveform transition information. The second frame is then merged into the stitched image: the calibrated repeated row is aligned with the calibration base row, the second frame is shifted left or right according to the horizontal offset, the non-overlapping part is merged directly into the stitched image, and in the overlapping part the weighted average of the gray values is computed with a weight of 0.3 for the stitched-image pixel and 0.7 for the current-frame pixel. Blank areas of the stitched image caused by shifting the frame are filled with pixels of gray value 0.

After the second frame has been processed, subsequent frames are read in and processed in the same way. The final output is the image shown in Fig. 7.

The gray levels of the image frames may span any range, such as 0 to 255 or 0 to 15; a frame whose gray levels are not in 0 to 255 only needs its gray-level range converted to 0 to 255, and a binary image with gray levels 0 to 1 needs no binarization at all, since the waveform transition information can be extracted from it directly. The present invention is also applicable to other image acquisition and imaging processes that use a capture method similar to swipe fingerprint acquisition.
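A gray-level range conversion of the kind mentioned here is a one-liner; the sketch below assumes the levels are evenly spread over 0..max_level.

    import numpy as np

    def to_8bit(frame, max_level):
        """Scale a frame whose gray levels span 0..max_level (e.g. 0..15)
        onto 0..255 before stitching."""
        return (np.asarray(frame, dtype=np.float32) * (255.0 / max_level)).astype(np.uint8)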

The waveform-matching-based stitching method for fingerprint image frames provided by the present invention has been described in detail above. Specific embodiments have been used to explain its principles and implementation, and the description of these embodiments is intended only to help understand the method and the idea behind it. In summary, the contents of this specification should not be construed as limiting the present invention.

Claims (10)

1. A method for stitching fingerprint image frames, characterized in that the fingerprint image frames to be stitched are first reduced, a reference area is then selected from the reduced frame, and the waveform transition information of the reference area is extracted to carry out the stitching of the fingerprint image frames, comprising the steps of:
1) after receiving the fingerprint image frames captured by the fingerprint image acquisition module, reading in the first frame;
2) reducing the image frame, merging it into the stitched image, applying the reference area extraction templates one after another to extract a base reference area, and computing the waveform transition information of the extracted area; if the base area waveform transition information can be obtained, reading in the next frame and going to step 3); if it cannot be obtained, reading in the next frame and returning to step 2);
3) reducing the image frame, extracting the reference area of the current image frame with the reference area extraction template used for the previous frame, and computing the waveform transition information of the reference area;
4) determining the overlapping area between the current image frame and the stitched image;
5) if, in the image area above the calibrated repeated row of the current frame's reference area, there is an image region that can serve as a base area, taking it as the new base area, updating the base area waveform transition information, merging the current frame into the stitched image, reading in the next frame, and going to step 3); otherwise, going to step 6);
6) using another reference area extraction template to extract a new base reference area and computing its waveform transition information to obtain new base area waveform transition information, then extracting a new reference area and computing its waveform transition information, until steps 4) and 5) lead back to step 3); if all reference area extraction templates have been used and step 3) still cannot be entered, going to step 7) to finish the stitching;
7) if no image frame yields base area waveform transition information, or all image frames have been processed, or some frame cannot be merged into the stitched image, or the stitched image has reached the preset size threshold, terminating the stitching algorithm; if the stitching succeeded, outputting the stitched image, otherwise reporting a stitching failure.
2. The method for stitching fingerprint image frames according to claim 1, characterized in that the reduction of an image frame consists of deleting the topmost and bottommost rows of the frame captured by the fingerprint image acquisition module.
3. The method for stitching fingerprint image frames according to claim 1, characterized in that the reference area is the image region extracted from an image frame or from the stitched image by a reference area extraction template, its size being 1/6 to 1/3 of the image frame.
4. The method for stitching fingerprint image frames according to claim 1, characterized in that step 4) is implemented as follows:
1) overlaying the current image frame on the stitched image so that its top row coincides with the top row of the base area, and computing the waveform similarity between the part of the current frame overlapping the base area and the base area;
2) moving the current frame upward one row at a time until its bottom row coincides with the bottom row of the base area, and computing the waveform similarity between the overlapping part of the current frame and the base area after each move;
3) taking the region of the current frame corresponding to the maximum of all the waveform similarities as the calibrated repeated region;
4) taking the image row of the calibrated repeated region whose row waveform similarity with its corresponding base row is maximal as the calibrated repeated row of the current frame, its corresponding base row being the calibration base row;
5) shifting the waveform of the calibrated repeated row horizontally and computing its row waveform similarity with the calibration base row after each one-pixel shift; the horizontal shift of the calibrated repeated row waveform corresponding to the maximum row waveform similarity is the horizontal offset of the current frame.
5. The method for stitching fingerprint image frames according to claim 1, characterized in that when the current image frame is merged into the stitched image in step 5), the overlapping part takes the weighted average of the gray values of the two overlapping pixels as the stitching result, the weights of the two gray values being non-negative and summing to 1.
6. The method for stitching fingerprint image frames according to claim 1, characterized in that the base area is 3 consecutive image rows in the base reference area of the stitched image, each row containing at least one waveform transition.
7. The method for stitching fingerprint image frames according to claim 1, characterized in that the waveform transition information is the position and order of the transitions from 0 to 1 or from 1 to 0 in the binary waveform corresponding to each image row.
8. The method for stitching fingerprint image frames according to claim 7, characterized in that the waveform transition information is the position and order of the transitions from a 1 to two consecutive 0s in the binary waveform.
9. The method for stitching fingerprint image frames according to claim 4, characterized in that the row waveform similarity is determined by the position, number and order of the waveform transitions contained in the two waveforms, the two waveforms involved in each computation belonging to the base area and to the reference area of the current image frame, respectively.
10. The method for stitching fingerprint image frames according to claim 9, characterized in that the row waveform similarity is computed according to the following rules:
1) if two transition positions differ by 3 or 4, 1 is added to the similarity, and if the two transitions occupy the same ordinal position in their respective waveforms, a further 1 is added;
2) if two transition positions differ by 2, 2 is added to the similarity, and if the two transitions occupy the same ordinal position in their respective waveforms, a further 1 is added;
3) if two transition positions differ by 1, 3 is added to the similarity, and if the two transitions occupy the same ordinal position in their respective waveforms, a further 2 is added;
4) if two transition positions are identical, 4 is added to the similarity, and if the two transitions occupy the same ordinal position in their respective waveforms, a further 2 is added;
5) if two waveforms satisfying one or more of rules 1) to 4) contain the same number of transitions, 2 is added to the similarity.
CNB2007101000085A 2007-06-04 2007-06-04 Stitching Method of Fingerprint Image Frame Sequence Based on Waveform Matching Expired - Fee Related CN100517371C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2007101000085A CN100517371C (en) 2007-06-04 2007-06-04 Stitching Method of Fingerprint Image Frame Sequence Based on Waveform Matching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB2007101000085A CN100517371C (en) 2007-06-04 2007-06-04 Stitching Method of Fingerprint Image Frame Sequence Based on Waveform Matching

Publications (2)

Publication Number Publication Date
CN101086766A CN101086766A (en) 2007-12-12
CN100517371C true CN100517371C (en) 2009-07-22

Family

ID=38937715

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2007101000085A Expired - Fee Related CN100517371C (en) 2007-06-04 2007-06-04 Stitching Method of Fingerprint Image Frame Sequence Based on Waveform Matching

Country Status (1)

Country Link
CN (1) CN100517371C (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012163112A1 (en) * 2011-05-27 2012-12-06 汉王科技股份有限公司 Frame-skipping scanning and recognizing device and method

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102063611B (en) * 2010-01-21 2013-05-29 汉王科技股份有限公司 Method and system for inputting characters
CN102602294B (en) * 2012-03-23 2014-12-10 潍柴动力股份有限公司 Method and device for displaying fuel economy state of car engine
CN103679179A (en) * 2012-09-18 2014-03-26 成都方程式电子有限公司 High-speed local relevance calculation method and device
CN103136517B (en) * 2013-03-04 2015-11-11 杭州景联文科技有限公司 A kind of real-time joining method of rolling fingerprint image sequence selected based on key column
CN106062778B (en) * 2016-04-01 2019-05-07 深圳市汇顶科技股份有限公司 Fingerprint identification method, device and terminal
CN106104575B (en) * 2016-06-13 2019-09-17 北京小米移动软件有限公司 Fingerprint template generation method and device
CN106326869B (en) * 2016-08-26 2021-01-12 Oppo广东移动通信有限公司 Fingerprint identification method and device and mobile terminal
CN106993157A (en) * 2017-04-05 2017-07-28 宇龙计算机通信科技(深圳)有限公司 A kind of intelligent control method and device based on dual camera
AU2018407274B2 (en) 2018-02-12 2022-01-06 Huawei Technologies Co., Ltd. Fingerprint enrollment method and terminal
CN111209872B (en) * 2020-01-09 2022-05-03 浙江工业大学 Real-time rolling fingerprint splicing method based on dynamic programming and multi-objective optimization
CN111524067B (en) * 2020-04-01 2023-09-12 北京东软医疗设备有限公司 Image processing method, device and equipment

Also Published As

Publication number Publication date
CN101086766A (en) 2007-12-12

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
DD01 Delivery of document by public notice

Addressee: Wang Peng

Document name: Notification to Pay the Fees

DD01 Delivery of document by public notice

Addressee: Wang Peng

Document name: Notification of Termination of Patent Right

C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20090722

Termination date: 20100604