WO2019205290A1 - Image detection method and apparatus, computer device and storage medium - Google Patents

Image detection method and apparatus, computer device and storage medium

Info

Publication number
WO2019205290A1
WO2019205290A1 PCT/CN2018/094399 CN2018094399W WO2019205290A1 WO 2019205290 A1 WO2019205290 A1 WO 2019205290A1 CN 2018094399 W CN2018094399 W CN 2018094399W WO 2019205290 A1 WO2019205290 A1 WO 2019205290A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
pixel
finger
finger vein
preset
Prior art date
Application number
PCT/CN2018/094399
Other languages
English (en)
French (fr)
Inventor
惠慧
侯丽
Original Assignee
平安科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 平安科技(深圳)有限公司
Publication of WO2019205290A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/14 - Vascular patterns

Definitions

  • the present application relates to the field of image processing technologies, and in particular, to an image detection method, apparatus, computer device, and storage medium.
  • Finger vein recognition technology is a new generation of biometric recognition technology with potential applications. In order to locate the vein area for identification, an important task is to detect the edge of the finger.
  • At present, the collection quality of commonly used finger vein collection devices is not high and the collected finger vein images are all of low quality, while the existing classical edge detection algorithms have high quality requirements for the collected finger vein images; when the quality of the finger vein image is low, finger edge detection performs poorly and the finger edge cannot be accurately located.
  • The embodiment of the present application provides an image detection method, apparatus, computer device and storage medium to solve the problem that the finger edge cannot be accurately located in low-quality finger vein images.
  • An image detection method, comprising: acquiring an original finger vein image from a collection device; performing a Gabor filter transformation on the finger vein image to obtain an enhanced image; binarizing the enhanced image to obtain a binarized image; performing expansion processing on the binarized image to obtain an expanded image; identifying the finger vein texture in the expanded image and deleting it to obtain a denoised image; and identifying the finger edge texture in the denoised image and extending it to obtain a complete finger edge image.
  • An image detecting device comprising:
  • An acquisition module configured to acquire an original finger vein image from the collection device
  • a transform module configured to perform Gabor filtering transformation on the finger vein image to obtain an enhanced image
  • a binarization module configured to perform binarization processing on the enhanced image to obtain a binarized image
  • An expansion module for performing expansion processing on the binarized image to obtain an expanded image
  • a denoising module configured to identify a finger vein pattern in the expanded image, and perform a deletion process on the finger vein pattern to obtain a denoising image
  • an extension module configured to identify a finger edge texture in the denoised image, and perform extension processing on the finger edge texture to obtain a complete finger edge image.
  • A computer device comprising a memory, a processor, and computer readable instructions stored in the memory and executable on the processor, wherein the processor implements the steps of the image detection method when executing the computer readable instructions.
  • One or more non-volatile readable storage media storing computer readable instructions which, when executed by one or more processors, cause the one or more processors to perform the steps of the image detection method.
  • FIG. 1 is a schematic diagram of an application environment of an image detecting method provided in an embodiment of the present application
  • FIG. 2 is a flowchart of an implementation of an image detecting method provided in an embodiment of the present application
  • FIG. 3 is a flowchart of an implementation of performing grayscale processing on a finger vein image in an image detecting method according to an embodiment of the present application
  • FIG. 4 is a flowchart of the implementation of step S5 in the image detecting method provided by the embodiment of the present application;
  • FIG. 5 is a flowchart of implementing step S6 in the image detecting method provided by the embodiment of the present application.
  • FIG. 6 is a diagram showing an example of a center line and a preset amplitude range in a noise-removed image in the image detecting method provided in the embodiment of the present application;
  • FIG. 7 is a diagram showing an example of locations of a central pixel point and its neighbors in an image detecting method provided in an embodiment of the present application.
  • FIG. 8 is a schematic diagram of an image detecting apparatus provided in an embodiment of the present application.
  • FIG. 9 is a schematic diagram of a computer device provided in an embodiment of the present application.
  • FIG. 1 shows an application environment provided by an embodiment of the present application, where the application environment includes a server and a client, wherein the server and the client are connected through a network, and the client is configured to collect the finger vein image, and Sending the collected finger vein image to the server.
  • The client may specifically be, but is not limited to, a video camera, a still camera, a scanner or another finger vein image collection device with a photographing function; the server is used to perform finger edge detection on the finger vein image.
  • The server can be implemented by an independent server or by a server cluster composed of multiple servers.
  • the image detection method provided by the embodiment of the present application is applied to a server.
  • FIG. 2 shows an implementation flow of the image detecting method provided by this embodiment. Details are as follows:
  • S1 Acquire an original finger vein image from the acquisition device.
  • The original finger vein image refers to a finger vein image collected directly from the finger vein collection device without any processing.
  • It should be noted that, since the quality of the finger vein images acquired by different collection devices differs and the images collected by commonly used finger vein collection devices are of relatively low quality, the method provided by the embodiment of the present application can accurately recognize the finger edges in low-quality finger vein images, thereby effectively improving the accuracy of finger edge detection in finger vein images and the applicability to a variety of different finger vein collection devices.
  • S2 Perform Gabor filtering transformation on the finger vein image to obtain an enhanced image.
  • In the embodiment of the present application, on the basis of the finger vein image acquired in step S1, in order to further improve the quality of the finger vein image, the image is enhanced using the Gabor filter transformation, and finally the processed enhanced image is obtained.
  • the finger vein image is convoluted according to the Gabor filter function, and the enhanced image is obtained by the convolution operation result.
  • The convolution operation refers to performing a series of operations on each pixel point in the finger vein image using a convolution kernel. The convolution kernel is a preset matrix template used to operate on the finger vein image; it may specifically be a square grid structure, such as a 3*3 matrix, in which each element has a preset weight value.
  • When the convolution kernel is used for calculation, the center of the convolution kernel is placed on the target pixel point to be calculated.
  • the product between the weight value of each element in the convolution kernel and the pixel value of the image pixel it covers is calculated and summed, and the result is the new pixel value of the target pixel.
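  • As a concrete illustration of the kernel operation described above, the following minimal NumPy sketch computes the new value of a single target pixel by multiplying each kernel weight with the image pixel it covers and summing the products; the 3*3 averaging kernel and the toy 5*5 image are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def convolve_pixel(image, kernel, row, col):
    """Apply a square kernel at (row, col): multiply each kernel weight by the
    pixel it covers and sum the products to get the new pixel value."""
    k = kernel.shape[0] // 2                          # half-width of the kernel
    patch = image[row - k:row + k + 1, col - k:col + k + 1]
    return float(np.sum(patch * kernel))

# toy example: a 3x3 averaging kernel applied to the centre of a 5x5 image
image = np.arange(25, dtype=np.float32).reshape(5, 5)
kernel = np.full((3, 3), 1.0 / 9.0, dtype=np.float32)
print(convolve_pixel(image, kernel, 2, 2))            # mean of the central 3x3 patch
```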
  • the Gabor filter transform belongs to the windowed Fourier transform.
  • the Gabor function can extract the relevant features of the image in different scales and different directions in the frequency domain to achieve the enhancement effect on the image.
  • S3 Perform binarization processing on the enhanced image to obtain a binarized image.
  • On the basis of the enhanced image acquired in step S2, in order to make the pixel values of the pixel points in the image take only the value 0 or 255, i.e. so that the image presents only the two colors black and white, the enhanced image needs to be further binarized.
  • Binarization is to set the pixel value of the pixel on the image to 0 or 255, that is, to present the entire image with a distinct black and white visual effect.
  • Specifically, each pixel point in the enhanced image acquired in step S2 is scanned. If the pixel value of the pixel point is less than a preset pixel threshold, the pixel value of the pixel point is set to 0, i.e. the pixel point becomes black; if the pixel value of the pixel point is greater than or equal to the preset pixel threshold, the pixel value of the pixel point is set to 255, i.e. the pixel point becomes white, and the binarized image is obtained.
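  • A minimal sketch of the binarization described above is given below, assuming a NumPy array as input; the threshold value of 128 is an assumed example, since the preset pixel threshold is left to the needs of the actual application.

```python
import numpy as np

def binarize(enhanced, threshold=128):
    """Set pixels below the threshold to 0 (black) and all others to 255
    (white), as described in step S3. The threshold value is an assumption."""
    return np.where(enhanced < threshold, 0, 255).astype(np.uint8)
```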
  • S4 Perform expansion processing on the binarized image to obtain an expanded image.
  • On the basis of the binarized image acquired in step S3, the binarized image presents only two colors, black and white, i.e. the image contains only black pixel points and white pixel points, with the background composed of black pixel points and the finger texture composed of white pixel points.
  • In order to connect the broken finger texture in the image, each pixel point in the binarized image is traversed and its pixel value is obtained; if the pixel value of a pixel point is 255, the pixel point is confirmed to be a white pixel point. By detecting discontinuous white pixel points, the places where the finger texture is discontinuous, i.e. the breaks in the finger texture, are determined, and expansion processing is performed on the breaks so that the finger texture becomes more complete, finally yielding the expanded image.
  • The expansion processing refers to setting the boundary pixel points of the broken finger texture in the binarized image to white pixel points; when the boundaries of two finger texture segments are close to each other, i.e. the break in the finger texture is short, the expansion processing can join the break and make the finger texture complete.
  • For example, in the MATLAB application tool, the image can be expanded directly by calling the dilation function imdilate() provided by the tool, which expands the pixel points with a pixel value of 255, i.e. the white pixel points, thereby obtaining the expanded image.
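  • As a rough counterpart to the imdilate() call mentioned above, the following sketch performs the expansion processing (morphological dilation of the white pixels) with OpenCV; the 3*3 structuring element and the single iteration are assumptions rather than values specified in the text.

```python
import cv2

def dilate_image(binary, kernel_size=3):
    """Morphological dilation of the white (255) pixels, analogous to MATLAB's
    imdilate(); small breaks in the finger texture are bridged so that the
    texture becomes continuous. The 3x3 structuring element is an assumed
    choice, not one given by the patent."""
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (kernel_size, kernel_size))
    return cv2.dilate(binary, kernel, iterations=1)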
  • S5 Identify the finger vein pattern in the expanded image, delete the finger vein pattern, and obtain a denoising image.
  • the finger vein pattern refers to the pattern of the finger vein blood vessel, which is different from the finger edge texture, and the finger vein pattern is short in the expanded image, and is irregularly distributed in the entire expanded image.
  • Specifically, in the expanded image obtained in step S4, the length of a texture is determined by the number of consecutive white pixel points; for example, if there are 10 consecutive white pixel points, the texture they form has a length of 10.
  • If the length of a texture formed by consecutive white pixel points is less than a preset length threshold, the texture is confirmed to be a finger vein texture and is deleted, yielding the denoised image. The deletion may specifically be done by changing the pixel values of the pixel points of the finger vein texture from 255 to 0, i.e. converting the white pixel points into black pixel points.
  • It should be noted that the preset threshold can be set according to the needs of the actual application and is not limited here.
  • S6 Identify the edge of the finger in the denoised image, and extend the edge of the finger to obtain a complete image of the finger edge.
  • Specifically, since only the finger edge texture remains in the denoised image after denoising, with a black background and white finger edge texture, the texture formed by consecutive white pixel points in the denoised image is recognized as the finger edge texture.
  • On the basis of the denoised image obtained in step S5, in order to complete the missing portions of the finger edge texture in the image, the finger edge texture needs to be further extended, i.e. the texture formed by consecutive white pixel points is extended, finally yielding the complete finger edge image.
  • the extension processing refers to extending in the direction of the finger on the basis of the edge of the finger portion that has been found until the preset position is reached.
  • the preset position may be a boundary position of the finger vein image, or may be a position at a predetermined distance from the boundary position, which may be specifically set according to the needs of the actual application, and is not limited herein.
  • In the embodiment corresponding to FIG. 2, the enhanced image is obtained by performing the Gabor filter transformation on the finger vein image; the enhanced image is then binarized to obtain the binarized image, and the binarized image is expanded to obtain the expanded image; the finger vein texture in the expanded image is identified and deleted to obtain the denoised image; finally, the finger edge texture in the denoised image is identified and extended to obtain the complete finger edge image.
  • On the one hand, the Gabor filter transformation improves the image quality of the finger vein image, so that the accuracy of detection can be effectively improved when detecting the finger edge; on the other hand, the series of binarization, expansion, denoising and extension processing applied to the Gabor-enhanced image can effectively remove interfering objects and retain and accurately extract the finger edge image, thereby achieving accurate positioning of the finger edge in low-quality finger vein images collected by low-end finger vein collection devices, effectively improving the accuracy of finger edge detection in finger vein images and the applicability to a variety of different finger vein collection devices.
  • Next, on the basis of the embodiment corresponding to FIG. 2, before the Gabor filter transformation is performed on the finger vein image to obtain the enhanced image in step S2, the image may be further grayscaled, as shown in FIG. 3; the image detection method further includes:
  • S7 Traverse the pixel points in the finger vein image to obtain the RGB component values of each pixel point. Specifically, the pixel points in the finger vein image are traversed according to a preset traversal manner and the RGB component values of each pixel point are obtained, where R, G and B represent the colors of the red, green and blue channels respectively.
  • The preset traversal manner may specifically be a row-by-row traversal starting from the pixel point at the upper-left corner of the finger vein image, from top to bottom and from left to right; it may also be a traversal starting from the center line of the finger vein image towards both sides simultaneously, or another traversal manner, which is not limited here.
  • S8 According to the RGB component values of the pixel points, perform grayscale processing on the finger vein image according to formula (1), where x and y are the abscissa and ordinate of each pixel point in the finger vein image, g(x, y) is the gray value of pixel point (x, y) after grayscale processing, R(x, y), G(x, y) and B(x, y) are the color components of the R, G and B channels of pixel point (x, y), k1, k2 and k3 are the proportion parameters corresponding to the R, G and B channels respectively, and σ is a preset adjustment parameter.
  • In the embodiment of the present application, in order to accurately extract the information content of the finger vein image, the finger vein image first needs to be grayscaled. The parameter values of k1, k2, k3 and σ can be set according to the needs of the actual application and are not limited here; adjusting the value ranges of k1, k2 and k3 adjusts the proportions of the R, G and B channels respectively, and adjusting the value range of σ adjusts g(x, y).
  • the RGB model is a commonly used color information expression method, which uses the brightness of the three primary colors of red, green and blue to quantitatively represent the color.
  • This model, also known as the additive color mixing model, mixes colors by superimposing red, green and blue light on one another, and is therefore suitable for display on light-emitting devices such as monitors.
  • Grayscaling means that, in the RGB model, when R = G = B the color represents a single gray shade and the common value is called the gray value; a grayscale image therefore needs only one byte per pixel to store the gray value, whose range is 0-255.
  • It should be noted that in the embodiment of the present application the gray value is calculated as the weighted average of formula (1); in other embodiments the image may also be grayscaled by the component method, the maximum value method or the average value method, which is not limited here.
  • In the embodiment corresponding to FIG. 3, the pixel points in the finger vein image are traversed to obtain the RGB component values of the corresponding pixel points, and grayscale processing is performed on the finger vein image using formula (1) according to the acquired RGB component values of each pixel point, so that the pixel values of the pixel points in the image are set within the range 0-255, further reducing the amount of original image data and improving the calculation efficiency of subsequent processing.
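  • A hedged sketch of this grayscale step is given below; because the image of formula (1) is not reproduced in this text, the combination sigma * (k1*R + k2*G + k3*B), the default weights and the assumed R, G, B channel order are illustrative assumptions only.

```python
import numpy as np

def to_grayscale(rgb, k1=0.299, k2=0.587, k3=0.114, sigma=1.0):
    """Weighted-average grayscaling in the spirit of formula (1); `rgb` is
    assumed to be an H x W x 3 array in R, G, B channel order. The exact form
    of formula (1) is not reproduced in the text, so this combination and the
    default weights are assumptions made for illustration only."""
    r = rgb[..., 0].astype(np.float32)
    g = rgb[..., 1].astype(np.float32)
    b = rgb[..., 2].astype(np.float32)
    gray = sigma * (k1 * r + k2 * g + k3 * b)
    return np.clip(gray, 0, 255).astype(np.uint8)
```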
  • the Gabor filter transform is performed on the finger vein image according to formula (2):
  • where x and y are the abscissa and ordinate of a pixel point in the finger vein image, K is the direction index, θ_k is the direction perpendicular to the finger vein image, m is the scale level, σ_m is the standard deviation of the m-th scale, f_m is the center frequency of the m-th scale, γ is the spatial aspect ratio, ΔΦ is the preset bandwidth, and I(x, y) is the finger vein image; the result of the transformation is the enhanced image.
  • In the embodiment of the present application, the preset bandwidth ΔΦ is 1; it may also be set according to actual requirements and is not limited here.
  • The wavelength of the Gabor filter is determined by the value of v, and the value of v is determined by the value of the preset bandwidth ΔΦ, so the wavelength can be adjusted by setting the value of ΔΦ.
  • the scale level m refers to the number of frequency domain window scales in the Gabor filter, and the number can be set according to the needs of the actual application, and is not limited herein.
  • Specifically, using the preset bandwidth and the direction perpendicular to the finger vein image, the finger vein image is transformed by the Gabor filter function of formula (2), which filters out the high-frequency components of the finger vein image, keeping only the low-frequency part, and filters out the low-frequency components in the direction perpendicular to the texture, keeping only the high-frequency part, so that the image is finally highlighted, i.e. the enhanced image is obtained through the Gabor filter transformation.
  • In the embodiment of the present application, performing the Gabor filter transformation on the finger vein image according to formula (2) can quickly highlight the image and achieve an image enhancement effect, thereby improving the image quality of the finger vein image and the distinguishability of the texture in the finger vein image, so that accurate detection can be achieved in the subsequent finger edge detection and the accuracy of finger edge recognition is improved.
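  • The following sketch applies a multi-direction, multi-scale Gabor filter bank with OpenCV as a rough stand-in for the transformation of formula (2); the direction count, wavelengths, sigma and gamma values used here are assumptions, since the patent's own parameters (θ_k, σ_m, f_m, γ, ΔΦ) are defined by formula (2), whose image is not reproduced in this text.

```python
import cv2
import numpy as np

def gabor_enhance(gray, num_directions=4, wavelengths=(4, 8, 16), sigma=4.0, gamma=0.5):
    """Multi-direction, multi-scale Gabor filtering with a per-pixel maximum
    response. All parameter values here are illustrative assumptions only."""
    gray = gray.astype(np.float32)
    responses = []
    for k in range(num_directions):
        theta = k * np.pi / num_directions                 # filter direction
        for lambd in wavelengths:                          # wavelength ~ 1 / centre frequency
            kernel = cv2.getGaborKernel((31, 31), sigma, theta, lambd, gamma, 0, cv2.CV_32F)
            kernel -= kernel.mean()                        # zero-mean kernel
            responses.append(cv2.filter2D(gray, cv2.CV_32F, kernel))
    enhanced = np.max(np.stack(responses), axis=0)         # keep the strongest response per pixel
    enhanced = cv2.normalize(enhanced, None, 0, 255, cv2.NORM_MINMAX)
    return enhanced.astype(np.uint8)
```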
  • On the basis of the embodiment corresponding to FIG. 2, a specific embodiment is used below to describe in detail the implementation of identifying the finger vein texture in the expanded image and deleting it to obtain the denoised image, as mentioned in step S5.
  • FIG. 4 shows a specific implementation process of step S5 provided by the embodiment of the present application, which is described in detail as follows:
  • S51 Traverse the pixel points in the expanded image to obtain the textures formed by consecutive pixel points with the same preset pixel value.
  • Specifically, in the embodiment of the present application, the preset pixel value may be 255; based on the expanded image acquired in step S4, the pixel points in the expanded image are traversed, the pixel points with a pixel value of 255, i.e. the white pixel points, are identified, and the textures formed by consecutive white pixel points are obtained.
  • For example, if a row of the expanded image contains N pixel points, the pixel values of the i-th to the (i+k)-th pixel points are all 255, and the pixel values of the (i+k+a)-th to the (i+k+a+b)-th pixel points are all 255, i.e. they are all white pixel points, then two textures are identified by traversing the pixel points of this row: one run of consecutive white pixel points from the i-th to the (i+k)-th pixel point, and one from the (i+k+a)-th to the (i+k+a+b)-th pixel point.
  • S52 For each texture, calculate the length of the texture; if the length is less than a preset first threshold, set the pixel values of all pixel points in the texture to the target pixel value to obtain the denoised image.
  • In the embodiment of the present application, since the finger vein texture and the finger edge texture exist at the same time, in order to retain only the finger edge texture it is necessary to identify the finger vein texture and delete it, finally obtaining an image containing only the finger edge texture.
  • Specifically, for each texture acquired in step S51, the length of the texture is determined according to the number of pixel points it contains, and this length is compared with the preset first threshold. If the length of the texture is less than the preset first threshold, the texture is confirmed to be noise, and the pixel value of every pixel point on the texture is set to the target pixel value.
  • In the embodiment of the present application, the target pixel value is 0, i.e. the white pixel points on the texture are changed to black pixel points; if the length of the texture is greater than or equal to the preset first threshold, the texture is confirmed not to be noise and no processing is performed. Finally, the textures whose length is greater than or equal to the preset first threshold, i.e. the finger edge texture image, are retained as the denoised image.
  • It can be understood that if a pixel point with a pixel value of 255 is detected and the pixel values of its neighbor points are both 0, i.e. it is a single white pixel point whose corresponding texture has a length of 1, the pixel value of this pixel point is set to 0, i.e. the white pixel point is changed to a black pixel point; here a neighbor point refers to the pixel point on the left or right side of the pixel point.
  • In the embodiment corresponding to FIG. 4, the pixel points in the expanded image are traversed to obtain the textures formed by consecutive white pixel points; for each texture, the texture length is compared with the preset first threshold and the textures shorter than the preset first threshold are deleted, finally obtaining the denoised image.
  • By comparing the textures with the preset first threshold, textures shorter than the preset first threshold are identified as finger vein textures and textures longer than the preset first threshold are identified as finger edge textures, thereby distinguishing the two kinds of texture; deleting the textures shorter than the preset first threshold removes the finger vein textures from the finger vein image, reducing their interference with the subsequent finger edge detection and improving the accuracy of the subsequent finger edge detection.
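  • A minimal sketch of steps S51-S52 is shown below; it scans each row of the expanded image for runs of consecutive white pixel points and erases runs shorter than the first threshold, whose value of 30 here is an assumption, since the first threshold is left to the needs of the actual application.

```python
import numpy as np

def remove_short_textures(expanded, first_threshold=30):
    """Delete runs of consecutive white pixels whose length is below the first
    threshold, per steps S51-S52. The row-wise run scan and the threshold
    value of 30 are assumptions made for illustration."""
    denoised = expanded.copy()
    for r in range(denoised.shape[0]):
        row = denoised[r]
        c = 0
        while c < row.size:
            if row[c] == 255:
                start = c
                while c < row.size and row[c] == 255:     # measure the white run
                    c += 1
                if c - start < first_threshold:           # too short: treated as vein noise
                    row[start:c] = 0                      # target pixel value is 0 (black)
            else:
                c += 1
    return denoised
```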
  • On the basis of the embodiment corresponding to FIG. 2, a specific embodiment is used below to describe in detail the implementation of identifying the finger edge texture in the denoised image and extending it to obtain the complete finger edge image, as mentioned in step S6.
  • FIG. 5 shows a specific implementation process of step S6 provided by the embodiment of the present application, which is described in detail as follows:
  • S61 Acquire a pixel point of a preset pixel value on a center line in the denoised image as a center pixel point.
  • In the embodiment of the present application, the center line of the denoised image is the line in the middle of the denoised image running perpendicular to the direction of the finger; the pixel points on the center line with a pixel value of 255, i.e. the white pixel points, are selected as the center pixel points.
  • the central pixel points are the pixel points of the upper and lower boundaries of the finger.
  • As shown in FIG. 6, in the denoised image the finger is placed horizontally, the center line is the line in the middle of the denoised image perpendicular to the direction of the finger, and the center pixel points are the pixel points on the upper boundary and the lower boundary of the finger, where the center pixel point M is a pixel point on the upper boundary of the finger.
  • S62 Taking the center pixel point as the starting point, traverse the pixel points within the preset amplitude range to the left to obtain the left edge texture formed by consecutive pixel points with the same preset pixel value.
  • Specifically, as shown in FIG. 6, taking the center pixel point M as the starting point, within the preset amplitude range, the left, upper, lower, upper-left and lower-left pixel points neighboring the starting point are traversed towards the left; if a pixel point with a pixel value of 255 is found among these 5 pixel points, that pixel point is taken as the new starting point and its 5 neighboring pixel points are traversed in the same way, until none of the 5 neighboring pixel points has a pixel value of 255, thereby obtaining the left edge texture composed of consecutive pixel points with a pixel value of 255, i.e. the left edge texture composed of consecutive white pixel points.
  • the position of the starting point and the pixel on the left side, the pixel on the top side, the pixel on the bottom side, the pixel on the upper left side, and the pixel on the lower left side are as shown in FIG. 7 .
  • For example, if the coordinates of the center pixel point M are (0, 0), then the coordinates of the left pixel point A are (-1, 0), the upper pixel point B (0, 1), the lower pixel point C (0, -1), the upper-left pixel point E (-1, 1) and the lower-left pixel point F (-1, -1).
  • Since the finger edge is often not a straight line, the traversal range along the finger edge is limited by the preset amplitude range, which refers to the range obtained by expanding upward and downward by a preset number of pixel points with the center pixel point as the center; the preset amplitude range ensures the accuracy of the traversal while also reducing unnecessary traversal, thereby improving recognition efficiency.
  • It should be noted that, since the finger edge is divided into two parts, the upper boundary of the finger and the lower boundary of the finger, the upper boundary and the lower boundary of the finger edge are traversed at the same time, and the left edge texture of the upper boundary and the left edge texture of the lower boundary are acquired respectively.
  • S63 If the length of the left edge texture is less than a preset second threshold, set the pixel value of the left neighbor point to the preset pixel value, add the left neighbor point to the left edge texture, and continue traversing to the left until the length of the left edge texture reaches the second threshold, where the left neighbor point refers to the pixel point adjacent to the left side of the leftmost pixel point of the edge texture.
  • Specifically, the length of the left edge texture is determined according to the number of pixel points contained in the left edge texture acquired in step S62, and this length is compared with the preset second threshold; if the length is less than the preset second threshold, the left edge texture is confirmed to be incomplete, the pixel value of the left neighbor point of the leftmost pixel point of the left edge texture is set to 255, i.e. the left neighbor point is changed from a black pixel point to a white pixel point, the length is updated to the length of the left edge texture including the left neighbor point, and the traversal continues to the left in the traversal manner of step S62 until the length equals the second threshold.
  • It should be noted that the second threshold determines the length of the finger texture to be intercepted; when the length of the left edge texture equals the second threshold, the left edge texture is considered to be a complete left edge texture, so the second threshold can be used to judge the completeness of the left edge texture.
  • S64 Taking the center pixel point as the starting point, traverse the pixel points within the preset amplitude range to the right to obtain the right edge texture formed by consecutive pixel points with the same preset pixel value. Specifically, as shown in FIG. 6, taking the center pixel point M as the starting point, within the preset amplitude range, the right, upper, lower, upper-right and lower-right pixel points neighboring the starting point are traversed towards the right; if a pixel point with a pixel value of 255 is found among these 5 pixel points, that pixel point is taken as the new starting point and its 5 neighboring pixel points are traversed in the same way, until none of the 5 neighboring pixel points has a pixel value of 255, thereby obtaining the right edge texture composed of consecutive pixel points with a pixel value of 255, i.e. the right edge texture composed of consecutive white pixel points.
  • the positions of the starting point and the pixel on the right side, the pixel on the upper side, the pixel on the lower side, the pixel on the upper right side, and the pixel on the lower right side are as shown in FIG. 7 .
  • For example, if the coordinates of the center pixel point are (0, 0), then the coordinates of the right pixel point are (1, 0), the upper pixel point (0, 1), the lower pixel point (0, -1), the upper-right pixel point (1, 1) and the lower-right pixel point (1, -1).
  • S65 If the length of the right edge texture is less than the preset second threshold, set the pixel value of the right neighbor point to the preset pixel value, add the right neighbor point to the right edge texture, and continue traversing to the right until the length of the right edge texture reaches the second threshold, where the right neighbor point refers to the pixel point adjacent to the right side of the rightmost pixel point of the edge texture. Specifically, the length of the right edge texture is determined according to the number of pixel points contained in the right edge texture acquired in step S64, and this length is compared with the preset second threshold; if the length is less than the preset second threshold, the right edge texture is confirmed to be incomplete, the pixel value of the right neighbor point of the rightmost pixel point of the right edge texture is set to 255, i.e. the right neighbor point is changed from a black pixel point to a white pixel point, the length is updated to the length of the right edge texture including the right neighbor point, and the traversal continues to the right in the traversal manner of step S64 until the length equals the second threshold.
  • It should be noted that there is no necessary order of execution between steps S62-S63 and steps S64-S65; they may be executed in parallel, i.e. the left edge texture and the right edge texture are processed at the same time, which can improve recognition efficiency.
  • S66 Combine the left edge texture and the right edge texture into the finger edge image.
  • In the embodiment of the present application, after the finger edge is extended to the left in the finger vein image through steps S62 and S63 and extended to the right through steps S64 and S65, the obtained left edge texture and right edge texture together form the complete finger edge image.
  • In the embodiment corresponding to FIG. 5, taking the preset center pixel point on the center line of the finger vein image as the starting point, the pixel points within the preset amplitude range are traversed to the left to obtain the left edge texture formed by consecutive pixel points with the preset pixel value; the length of the left edge texture is compared with the preset second threshold, and if the length is less than the preset second threshold, the pixel value of the left neighbor point is set to the preset pixel value, the length is updated to include the left neighbor point, and the traversal continues to the left until the length reaches the second threshold, yielding the complete left edge texture; the right edge texture is processed in the same way to obtain the complete right edge texture, and finally the complete left edge texture and the complete right edge texture form the complete finger edge image.
  • On the one hand, splitting the image at the center line distinguishes the left and right parts of the image, and extending the textures of the left and right parts at the same time allows the textures to be identified quickly and improves the recognition efficiency of the textures; on the other hand, performing the extension within the preset amplitude range ensures the accuracy of the extension processing, thereby achieving accurate positioning of the finger edge, effectively improving the accuracy of finger edge detection in finger vein images and also improving the working efficiency of finger edge detection.
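  • The following sketch outlines steps S61-S66 in a simplified form: starting from the white pixel points on the vertical center line, it traces each edge texture column by column within the amplitude band and then pads the traced edge with white pixel points until the second threshold is reached; the tracing rule is a simplification of the 5-neighbour traversal described above, and the default amplitude and second-threshold values are assumptions rather than values given in the text.

```python
import numpy as np

def extend_finger_edges(denoised, amplitude=10, second_threshold=None):
    """Simplified sketch of steps S61-S66 on a binary (0/255) denoised image."""
    h, w = denoised.shape
    if second_threshold is None:
        second_threshold = w // 2                          # assumed: extend roughly to the border
    edges = denoised.copy()
    mid = w // 2                                           # vertical center line

    def trace(row0, direction):
        # Follow white pixels column by column from (row0, mid), staying within
        # +/- `amplitude` rows of the starting row, then pad the traced edge
        # with white pixels until its length reaches the second threshold.
        row, col, length = row0, mid, 0
        while 0 <= col + direction < w:
            nxt = col + direction
            candidates = [r for r in (row - 1, row, row + 1)
                          if 0 <= r < h and abs(r - row0) <= amplitude and edges[r, nxt] == 255]
            if not candidates:
                break
            row, col = candidates[0], nxt
            length += 1
        while length < second_threshold and 0 <= col + direction < w:   # extension processing
            col += direction
            edges[row, col] = 255                           # set the neighbor point to white
            length += 1

    for r in np.flatnonzero(edges[:, mid] == 255):          # center pixel points (step S61)
        trace(r, -1)                                        # left edge texture (steps S62-S63)
        trace(r, +1)                                        # right edge texture (steps S64-S65)
    return edges                                            # left and right together (step S66)
```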
  • Corresponding to the image detection method in the above method embodiment, FIG. 8 shows an image detection apparatus that corresponds one-to-one with that method; for convenience of description, only the parts related to the embodiment of the present application are shown.
  • the image detecting apparatus includes an acquisition module 81, a transformation module 82, a binarization module 83, an expansion module 84, a denoising module 85, and an extension module 86.
  • Each function module is described in detail as follows:
  • An acquisition module 81 configured to acquire an original finger vein image from the collection device
  • the transforming module 82 is configured to perform Gabor filtering transformation on the finger vein image to obtain an enhanced image
  • a binarization module 83 configured to perform binarization processing on the enhanced image to obtain a binarized image
  • the expansion module 84 is configured to perform expansion processing on the binarized image to obtain an expanded image
  • the denoising module 85 is configured to identify the finger vein texture in the expanded image, perform deletion processing on the finger vein texture, and acquire a denoised image;
  • the extension module 86 is configured to identify a finger edge texture in the denoised image, and extend the finger edge texture to obtain a complete finger edge image.
  • the image detecting apparatus further includes:
  • An RGB obtaining module 87 is configured to traverse pixel points in the finger vein image to obtain RGB component values of each pixel point;
  • the graying module 88 is configured to perform grayscale processing on the finger vein image according to the RGB component value of the pixel point according to the following formula:
  • where x and y are the abscissa and ordinate of each pixel point in the finger vein image, g(x, y) is the gray value of pixel point (x, y) after grayscale processing, R(x, y), G(x, y) and B(x, y) are the color components of the R, G and B channels of pixel point (x, y), k1, k2 and k3 are the proportion parameters corresponding to the R, G and B channels respectively, and σ is a preset adjustment parameter.
  • transform module 82 includes:
  • the Gabor sub-module 821 is configured to perform Gabor filtering transformation on the finger vein image according to the following formula:
  • where x and y are the abscissa and ordinate of a pixel point in the finger vein image, K is the direction index, θ_k is the direction perpendicular to the finger vein image, m is the scale level, σ_m is the standard deviation of the m-th scale, f_m is the center frequency of the m-th scale, γ is the spatial aspect ratio, ΔΦ is the preset bandwidth, and I(x, y) is the finger vein image.
  • the denoising module 85 includes:
  • the obtaining sub-module 851 is configured to traverse pixel points in the expanded image to obtain a texture formed by consecutive pixel points of the same preset pixel value;
  • the filtering sub-module 852 is configured to calculate the length of the texture for each stripe path. If the length is less than the preset first threshold, the pixel value of all the pixels in the texture is set as the target pixel value to obtain a denoising image.
  • extension module 86 includes:
  • a central sub-module 861 configured to acquire a pixel point of a preset pixel value on a center line in the denoised image as a central pixel point;
  • the left edge sub-module 862 is configured to traverse the pixel points in the preset amplitude range to the left, and obtain the left edge texture formed by consecutive pixel points of the same preset pixel value;
  • the left extension sub-module 863 is configured to set a pixel value of the left neighbor point to a preset pixel value if the length of the left edge texture is less than a preset second threshold, and add the left neighbor point to the left edge texture And continuing to traverse to the left until the length of the left edge line reaches a second threshold, wherein the left neighbor point refers to a pixel point adjacent to the left side of the leftmost pixel point of the edge line;
  • the right edge sub-module 864 is configured to traverse the pixel points in the preset amplitude range to the right, and obtain the right edge texture formed by consecutive pixel points of the same preset pixel value;
  • the right extension sub-module 865 is configured to set the pixel value of the right neighbor point to a preset pixel value if the length of the right edge line is less than a preset second threshold, and add the right neighbor point to the right edge line And continuing to traverse to the right until the length of the right edge line reaches a second threshold, wherein the right neighbor point refers to a pixel point adjacent to the right side of the rightmost pixel point of the edge line;
  • the sub-module 866 is configured to combine the left edge texture and the right edge texture into a finger edge image.
  • The embodiment provides one or more non-volatile readable storage media storing computer readable instructions which, when executed by one or more processors, cause the one or more processors to execute the image detection method in the method embodiment, or, when executed by one or more processors, implement the functions of the modules of the image detection apparatus in the apparatus embodiment; to avoid repetition, details are not repeated here.
  • The non-volatile readable storage medium may include any entity or device capable of carrying the computer readable instruction code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, and the like.
  • FIG. 9 is a schematic diagram of a computer device according to an embodiment of the present application.
  • The computer device 90 of this embodiment includes a processor 91, a memory 92, and computer readable instructions 93, such as an image detection program, stored in the memory 92 and executable on the processor 91.
  • the processor 91 implements the steps in the above-described embodiments of the respective image detecting methods when the computer readable instructions 93 are executed, such as steps S1 through S6 shown in FIG.
  • the processor 91 implements the functions of the modules/units in the various apparatus embodiments described above when the computer readable instructions 93 are executed, such as the functions of the modules 81 through 86 shown in FIG.
  • computer readable instructions 93 may be partitioned into one or more modules/units, one or more modules/units being stored in memory 92 and executed by processor 91 to complete the application.
  • The one or more modules/units may be a series of computer readable instruction segments capable of performing particular functions, and the instruction segments are used to describe the execution process of the computer readable instructions 93 in the computer device 90.
  • For example, the computer readable instructions 93 can be divided into an acquisition module, a transformation module, a binarization module, an expansion module, a denoising module and an extension module; the specific functions of each module are as shown in Embodiment 2 and, to avoid repetition, are not described here one by one.
  • Computer device 90 can be a computing device such as a desktop computer, a notebook, a palmtop computer, and a cloud server.
  • Computer device 90 may include, but is not limited to, processor 91, memory 92. It will be understood by those skilled in the art that FIG. 9 is merely an example of computer device 90 and does not constitute a limitation to computer device 90. It may include more or fewer components than those illustrated, or may combine certain components, or different components.
  • computer device 90 may also include input and output devices, network access devices, buses, and the like.
  • The processor 91 may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
  • Memory 92 may be an internal storage unit of computer device 90, such as a hard disk or memory of computer device 90.
  • The memory 92 may also be an external storage device of the computer device 90, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a flash memory card (Flash Card) provided on the computer device 90.
  • the memory 92 may also include both an internal storage unit of the computer device 90 and an external storage device.
  • Memory 92 is used to store computer readable instructions and other programs and data required by computer device 90.
  • the memory 92 can also be used to temporarily store data that has been output or is about to be output.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)

Abstract

An image detection method and apparatus, a computer device and a storage medium. The image detection method comprises: acquiring an original finger vein image from a collection device (S1); performing a Gabor filter transformation on the finger vein image to obtain an enhanced image (S2); binarizing the enhanced image to obtain a binarized image (S3); performing expansion processing on the binarized image to obtain an expanded image (S4); identifying the finger vein texture in the expanded image and deleting it to obtain a denoised image (S5); and identifying the finger edge texture in the denoised image and extending it to obtain a complete finger edge image (S6). The method achieves accurate positioning of the finger edge, thereby effectively improving the accuracy of finger edge detection in finger vein images and the applicability to a variety of different finger vein collection devices.

Description

一种图像检测方法、装置、计算机设备及存储介质
本申请以2018年4月28日提交的申请号为201810398765.3,名称为“一种图像检测方法、装置、终端设备及存储介质”的中国发明专利申请为基础,并要求其优先权。
技术领域
本申请涉及图像处理技术领域,尤其涉及一种图像检测方法、装置、计算机设备及存储介质。
背景技术
手指静脉识别技术是新一代的生物特征识别技术,具有潜在在广泛应用。为定位用于识别的静脉区域,一个重要的工作是对手指边缘进行检测。
目前,常用的指静脉采集设备的采集质量均不高,采集到的均为低质量手指静脉图像,而现有的经典边缘检测算法对采集到的手指静脉图像的质量要求较高,当手指静脉图像的质量较低时,其手指边缘检测效果不理想,无法准确定位手指边缘。
发明内容
本申请实施例提供一种图像检测方法、装置、计算机设备及存储介质,以解决对低质量的手指静脉图像无法准确定位手指边缘的问题。
一种图像检测方法,包括:
从采集设备中获取原始的手指静脉图像;
对所述手指静脉图像进行Gabor滤波变换,得到增强图像;
对所述增强图像进行二值化处理,获取二值化图像;
对所述二值化图像进行膨胀处理,得到膨胀化图像;
识别所述膨胀化图像中的指静脉纹路,对所述指静脉纹路作删除处理,获取去噪图像;
识别所述去噪图像中的手指边缘纹路,对所述手指边缘纹路作延伸处理,得到完整的手指边缘图像。
一种图像检测装置,包括:
采集模块,用于从采集设备中获取原始的手指静脉图像;
变换模块,用于对所述手指静脉图像进行Gabor滤波变换,得到增强图像;
二值化模块,用于对所述增强图像进行二值化处理,获取二值化图像;
膨胀化模块,用于对所述二值化图像进行膨胀处理,得到膨胀化图像;
去噪模块,用于识别所述膨胀化图像中的指静脉纹路,对所述指静脉纹路作删除处理,获取去噪图像;
延伸模块,用于识别所述去噪图像中的手指边缘纹路,对所述手指边缘纹路作延伸处理,得到完整的手指边缘图像。
一种计算机设备,包括存储器、处理器以及存储在所述存储器中并可在所述处理器上运行的计算机可读指令,所述处理器执行所述计算机可读指令时实现上述图像检测方法的步骤。
一个或多个存储有计算机可读指令的非易失性可读存储介质,所述计算机可读指令被一个或多个处理器执行时,使得所述一个或多个处理器执行所述图像检测方法的步骤。
本申请的一个或多个实施例的细节在下面的附图和描述中提出,本申请的其他特征和优点将从说明书、附图以及权利要求变得明显。
附图说明
为了更清楚地说明本申请实施例的技术方案,下面将对本申请实施例的描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1是本申请实施例中提供的图像检测方法的应用环境示意图;
图2是本申请实施例中提供的图像检测方法的实现流程图;
图3是本申请实施例提供的图像检测方法中对手指静脉图像进行灰度化处理的实现流程图;
图4是本申请实施例提供的图像检测方法中步骤S5的实现流程图;
图5是本申请实施例提供的图像检测方法中步骤S6的实现流程图;
图6是本申请实施例中提供的图像检测方法中去噪图像中的中心线及预设幅度范围的示例图;
图7是本申请实施例中提供的图像检测方法中中心像素点及其近邻点的位置的示例图;
图8是本申请实施例中提供的图像检测装置的示意图;
图9是本申请实施例中提供的计算机设备的示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
图1示出了本申请实施例提供的应用环境,该应用环境包括服务端和客户端,其中,服务端和客户端之间通过网络进行连接,客户端用于对手指静脉图像进行采集,并且将采集到的手指静脉图像发送到服务端,客户端具体可以但不限于是摄像机、相机、扫描仪或者带有其他拍照功能的手指静脉图像采集设备;服务端用于对手指静脉图像进行手指边缘检测,服务端具体可以用独立的服务器或者多个服务器组成的服务器集群实现。本申请实施例提供的图像检测方法应用于服务端。
请参阅图2,图2示出本实施例提供的图像检测方法的实现流程。详述如下:
S1:从采集设备中获取原始的手指静脉图像。
在本申请实施例中,原始的手指静脉图像是指未经过任何处理,直接从指静脉采集设备中采集到的手指静脉图像。
需要说明的是,由于不同的指静脉采集设备获取到的手指静脉图像的质量不同,通常使用的指静脉采集设备采集到的手指静脉图像的质量均比较低,通过本申请实施例提供的方法,能够对低质量的手指静脉图像进行准确的边缘识别,从而有效提高手指静脉图像中手指边缘检测的准确性,以及对多种不同指静脉采集设备的适用性。
S2:对手指静脉图像进行Gabor滤波变换,得到增强图像。
在本申请实施例中,根据步骤S1获取的手指静脉图像,为了进一步提高该手指静脉图像的质量,采用Gabor滤波变换的方法对图像作增强处理,最终得到处理后的增强图像。
具体地,根据Gabor滤波函数对手指静脉图像进行卷积运算,通过卷积运算结果获取增强图像。其中,卷积运算指的是使用一个卷积核对手指静脉图像中的每个像素点进行一系列操作,卷积核是预设的矩阵模板,用于与手指静脉图像进行运算,其具体可以是一个四方形的网格结构,例如3*3的矩阵,该矩阵中的每个元素都有一个预设的权重值,在使 用卷积核进行计算时,将卷积核的中心放置在要计算的目标像素点上,计算卷积核中每个元素的权重值和其覆盖的图像像素点的像素值之间的乘积并求和,得到的结果即为目标像素点的新像素值。
Gabor滤波变换属于加窗傅里叶变换,Gabor函数可以在频域不同尺度、不同方向上提取图像的相关特征,实现对图像的增强效果。
S3:对增强图像进行二值化处理,获取二值化图像。
在本申请实施例中,在步骤S2获取的增强图像的基础上,为了让图像中的像素点的像素值只呈现0或者255,即图像只呈现黑色或者白色两种颜色,需要进一步对该增强图像进行二值化处理。
二值化,就是将图像上的像素点的像素值设置为0或255,也就是将整个图像呈现出明显的只有黑和白的视觉效果。
具体地,扫描步骤S2获取的增强图像中的每个像素点,若该像素点的像素值小于预设的像素阈值,则将该像素点的像素值设为0,即为像素点变为黑色;若该像素点的像素值大于等于预设值的像素阈值,则将该像素点的像素值设为255,即像素点变为白色,得到二值化图像。
S4:对二值化图像进行膨胀处理,得到膨胀化图像。
在本申请实施例中,在步骤S3获取的二值化图像的基础上,由于二值化图像只呈现黒和白两种颜色,即图像中只有黑色像素点和白色像素点,且背景色由黑色像素点构成,手指纹路由白色像素点构成,为了使该图像中断裂的手指纹路连接起来,通过遍历二值化图像中的每个像素点,获取每个像素点的像素值,若该像素点的像素值为255则确认该像素点为白色像素点,通过检测不连续的白色像素点,确定手指纹路不连续的地方,即手指纹路的断裂处,对该断裂处进行膨胀处理,通过膨胀处理使手指纹路变得更加完整,最终得到膨胀化图像。
膨胀处理,是指将二值化图像中断裂的手指纹路的边界像素点设置为白色像素点,当两条手指纹路的边界相距较近,即手指纹路发生断裂的地方其断裂的长度较小时,通过膨胀处理能够将该断裂处连接起来,从而使手指纹路变得完整。
例如:在matlab应用工具中,通过调用该工具提供的膨胀函数imdilate()可以直接对图像进行膨胀处理,将图像中像素值为255的像素点,即白色像素点进行膨胀,从而得到膨胀化图像。
S5:识别膨胀化图像中的指静脉纹路,对指静脉纹路作删除处理,获取去噪图像。
在本申请实施例中,指静脉纹路是指手指静脉血管的纹路,不同于手指边缘纹路,指静脉纹路在膨胀化图像中长度较短,并且不规则的分布在整个膨胀化图像中。
具体地,在步骤S4得到的膨胀化图像中,根据连续的白色像素点的数量确定纹路的长度,例如,若连续的白色像素点的个数为10个,则其构成的纹路的长度为10。
若连续的白色像素点构成的纹路的长度小于预设长度阈值,则确认该纹路为指静脉纹路,并对其进行删除处理,得到去噪图像。其中,删除处理具体可以通过将指静脉纹路的像素点的像素值由255修改为0,即将白色像素点转化为黑色像素点。
需要说明的是,预设阈值可以根据实际应用的需要进行设置,此处不做限制。
S6:识别去噪图像中的手指边缘纹路,对手指边缘纹路作延伸处理,得到完整的手指边缘图像。
具体地,由于经过去噪处理后的去噪图像中只剩下手指边缘纹路,且背景色为黑色,手指边缘纹路为白色,故将去噪图像中连续的白色像素点构成的纹路识别为手指边缘纹路,根据步骤S5获取的去噪图像,为了让图像中手指边缘纹路缺少的部分补充完整,需要进一步对手指边缘纹路作延伸处理,即对由连续的白色像素点构成的纹路作延伸处理,最终得到处理后的完整的手指边缘图像。
延伸处理,是指在已经找到的部分手指边缘纹路的基础上,按照手指的方向进行延伸,直到达到预设位置为止。其中,预设位置可以是手指静脉图像的边界位置,还可以是与边界位置相距预定距离的位置,其具体可以根据实际应用的需要进行设置,此处不做限制。
在图2对应的实施例中,通过对手指静脉图像进行Gabor滤波变换得到增强图像,进一步对增强图像进行二值化处理获取二值化图像,再对二值化图像进行膨胀处理得到膨胀化图像,通过识别膨胀化图像中的指静脉纹路并作删除处理,得到处理后的去噪图像,最后识别去噪图像中的手指边缘纹路并对其作延伸处理,获取处理后的完整的手指边缘图像。一方面,通过Gabor滤波变换提高手指静脉图像的图像质量,使得在对手指边缘检测时能够有效提高检测的准确性;另一方面,通过对Gabor变换后的增强图像依次进行二值化处理、膨胀处理、去噪处理和延伸处理这一系列的处理过程,能够有效去除干扰对象,保留并准确提取手指边缘图像,从而实现对低端指静脉采集设备采集到的低质量手指静脉图像进行手指边缘的准确定位,有效提高手指静脉图像中手指边缘检测的准确性,以及对多种不同指静脉采集设备的适用性。
接下来,在图2对应的实施例的基础之上,在步骤S2提及的对手指静脉图像进行Gabor滤波变换,得到增强图像之前,还可以进一步对图像作灰度化处理,如图3所示,该图像检测方法还包括:
S7:对手指静脉图像中的像素点进行遍历,获取每个像素点的RGB分量值。
具体地,按照预设的遍历方式对手指静脉图像中的像素点进行遍历,获取每个像素点的RGB分量值,其中,R、G、B分别代表红、绿、蓝三个通道的颜色。
其中,预设的遍历方式具体可以是以手指静脉图像的左上角像素点为起点,从上往下从左往右的逐行遍历,也可以是从手指静脉图像的中线位置同时向两边遍历,还可以是其他遍历方式,此处不做限制。
S8:根据像素点的RGB分量值,按照公式(1)对手指静脉图像作灰度化处理:
Figure PCTCN2018094399-appb-000001
其中,x和y为手指静脉图像中每个像素点的横坐标和纵坐标,g(x,y)为像素点(x,y)灰度化处理后的灰度值,R(x,y)为像素点(x,y)的R通道的颜色分量,G(x,y)为像素点(x,y)的G通道的颜色分量,B(x,y)为像素点(x,y)的B通道的颜色分量,k 1,k 2,k 3分别为R通道,G通道和B通道对应的占比参数,σ为预设的调节参数。
在本申请实施例中,为了实现对手指静脉图像中信息内容的准确提取,首先需要对手指静脉图像进行灰度化处理,其中,k 1,k 2,k 3和σ的参数值可以根据实际应用的需要进行设置,此处不做限制,通过调节k 1,k 2,k 3的取值范围可以分别对R通道,G通道和B通道的占比进行调整,通过调节σ的取值范围对g(x,y)进行调整。
RGB模型是目前常用的一种彩色信息表达方式,它使用红、绿、蓝三原色的亮度来定量表示颜色。该模型也称为加色混色模型,是以RGB三色光互相叠加来实现混色的方法,因而适合于显示器等发光体的显示。
灰度化是指在RGB模型中,如果R=G=B时,则色彩表示只有一种灰度颜色,其中R=G=B的值叫灰度值,因此,灰度图像每个像素只需一个字节存放灰度值,灰度范围为 0-255。
需要说明的是,在本申请实施例中,通过公式(1)进行加权平均计算灰度值,在其他实施例中还可以采用分量法、最大值法或者平均值法对图像进行灰度化处理,此处不做限制。
在图3对应的实施例中,通过遍历手指静脉图像中的像素点并获取对应像素点的RGB分量值,根据获取到的每个像素点的RGB分量值,利用公式(1)对手指静脉图像进行灰度化处理,从而实现将图像中像素点的像素值范围设定在0-255之间,进一步减少图像原始数据量,提高在后续处理计算中的计算效率。
在图2对应的实施例的基础之上,下面通过一个具体的实施例对步骤S2提及的对手指静脉图像进行Gabor滤波变换,得到增强图像的具体实现方法进行详细说明,详述如下:
按照公式(2)对手指静脉图像进行Gabor滤波变换:
Figure PCTCN2018094399-appb-000002
其中,
Figure PCTCN2018094399-appb-000003
为Gabor滤波函数,x和y为手指静脉图像中像素点的横坐标和纵坐标,K为方向指数,θ k为垂直于手指静脉图像的方向,m为尺度级别,σ m为第m级尺度的标准差,f m为第m级的中心频率,γ为空间纵横比,ΔΦ为预设带宽,I(x,y)为手指静脉图像,
Figure PCTCN2018094399-appb-000004
为Gabor滤波变换后的增强图像。
在本申请实施例中,预设带宽ΔΦ为1,也可以根据实际需求进行设定,此处不做限制,Gabor滤波的波长由v的取值所决定,v的取值由预设带宽ΔΦ的取值决定,因此通过设置ΔΦ的值达到调节波长的目的。尺度级别m是指Gabor滤波中的频域窗尺度的数量,该数量可以根据实际应用的需要进行设置,此处不做限制。
具体地,使用预设带宽以及垂直于手指静脉图像的方向,利用公式(2)的Gabor滤波函数对手指静脉图像进行变换,从而将手指静脉图像的高频波滤掉,只留下低频部分,在垂直于纹路的方向上将低频波滤掉,只留下高频部分,最终使图像变得高亮,即通过Gabor滤波变换后得到增强图像。
在本申请实施例中,通过公式(2)对手指静脉图像进行Gabor滤波变换,能够快速地将图像变得高亮,达到图像增强的效果,从而提高手指静脉图像的图像质量,以及对手指静脉图像中纹路的辨别率,以便在后续进行手指边缘检测时能够实现准确检测,提高手指 边缘识别的准确性。
在图2对应的实施例的基础之上,下面通过一个具体的实施例对步骤S5中所提及的识别膨胀化图像中的指静脉纹路,对指静脉纹路作删除处理,获取去噪图像的具体实现方法进行详细说明。
请参阅图4,图4示出了本申请实施例提供的步骤S5的具体实现流程,详述如下:
S51:对膨胀化图像中的像素点进行遍历,获取相同预设像素值的连续像素点构成的纹路。
具体地,在本申请实施例中,预设像素值具体可以为255,根据步骤S4获取到的膨胀化图像,对该膨胀化图像中的像素点进行遍历,识别像素值为255的像素点,即白色像素点,获取连续的白色像素点构成的纹路。
例如,若膨胀化图像中的某行存在N个像素点,且第i个像素点至第i+k个像素点的像素值均为255,第i+k+a个像素点至第i+k+a+b个像素点的像素值均为255,即都为白色像素点,则通过遍历该行的像素点识别出两条纹路,即i+k个连续白色像素点和a+b个连续白色像素点。
S52:针对每条纹路,计算该纹路的长度,若长度小于预设的第一阈值,则将该纹路中所有像素点的像素值设置为目标像素值,得到去噪图像。
在本申请实施例中,由于手指静脉纹路和手指边缘纹路同时存在,为了只保留手指边缘纹路,需要识别出手指静脉纹路并将其删除,最终得到只有手指边缘纹路的图像。
具体地,针对步骤S51获取的每条纹路,根据该纹路包含的像素点数量确定该纹路的长度,并将该纹路的长度与预设的第一阈值进行比较。若该纹路的长度小于预设的第一阈值,则确认该纹路为噪点,并将该纹路上每个像素点的像素值均设为目标像素值,在本申请实施例中,目标像素值为0,即将该纹路上的白色像素点都改为黑色像素点;若该纹路的长度大于或者等于预设第一阈值,则确认该纹路不为噪点,不做处理,最终保留大于预设第一阈值的纹路图像,即手指边缘纹路图像,作为去噪图像。
可以理解的是,若检测到像素值为255的像素点,且该像素点的近邻点的像素值均为0,即该像素点为单个白色的像素点,其对应的纹路的长度的值为1,则将该像素点的像素值设为0,即该白色像素点被改为黑色像素点。其中,近邻点是指该像素点左边和右边的一个像素点。
在图4对应的实施例中,通过对膨胀化图像中的像素点进行遍历,获取由连续的白色像素点构成的纹路,针对每条纹路,根据纹路长度与预设的第一阈值进行对比,将小于预设的第一阈值的纹路进行删除,最终得到去噪图像,将纹路与预设的第一阈值进行比较,将小于预设的第一阈值的纹路识别为手指静脉纹路,将大于预设的第一阈值的纹路识别为手指边缘纹路,以此区分两种纹路,通过删除小于预设的第一阈值的纹路来排除手指静脉图像中的手指静脉纹路,减少手指静脉纹路对后续手指边缘检测的干扰,提高后续手指边缘检测的准确性。
在图2对应的实施例的基础之上,下面通过一个具体的实施例来对步骤S6中所提及的识别去噪图像中的手指边缘纹路,对手指边缘纹路作延伸处理,得到完整的手指边缘图像的具体实现方法进行详细说明。
请参阅图5,图5示出了本申请实施例提供的步骤S6的具体实现流程,详述如下:
S61:获取去噪图像中的中心线上预设像素值的像素点作为中心像素点。
在本申请实施例中,去噪图像的中心线是在去噪图像的中间位置并且与与手指垂直的方向的直线,选取该中心线上像素值为255的像素点,即白色像素点,作为中心像素点。可以理解的,中心像素点为手指上边界和下边界的像素点。
如图6所示,在该去噪图像中,手指水平放置,中心线为去噪图像的中间位置且与手指方向垂直的直线,中心像素点为手指上边界和手指下边界的像素点,其中,中心像素点 M为手指上边界的像素点。
S62:以中心像素点为起始点,向左对预设幅度范围内的像素点进行遍历,获取相同预设像素值的连续像素点构成的左边缘纹路。
具体地,继续如图6所示,以中心像素点M作为起始点,在预设幅度范围内,向左分别对该起始点近邻的左边像素点、上边像素点、下边像素点、左上边像素点和左下边像素点进行遍历,若从这5个像素点中遍历到像素值为255的像素点,则以该像素值为255的像素点为起始点,继续对该点近邻的5个像素点进行遍历,直到近邻的5个像素点的像素值都不为255为止,获取由像素值为255的连续的像素点构成的左边缘纹路,即由连续的白色像素点构成的左边缘纹路。
其中,起始点及其左边像素点、上边像素点、下边像素点、左上边像素点和左下边像素点的位置如图7所示。
例如,若中心像素点M的坐标为(0,0),则左边像素点A的坐标为(-1,0),上边像素点B的坐标为(0,1),下边像素点C的坐标为(0,-1),左上边像素点E的坐标为(-1,1),左下边像素点F的坐标为(-1,-1)。
由于手指边缘往往不是一条直线,因此通过预设幅度范围限定对手指边缘的遍历范围,预设幅度范围是指以中心像素点为中心,分别向上和向下扩展预设数量的像素点得到的幅度范围,通过预设幅度范围可以保证遍历的准确性,同时也减少不必要的遍历,从而提高识别效率。
需要说明的是,由于手指边缘分为手指上边界和手指下边界两部分,故在遍历时,同时对手指边缘上边界和手指边缘下边界进行遍历,分别获取到手指边缘上边界的左边缘纹路和手指下边界的左边缘纹路。
S63:若左边缘纹路的长度小于预设的第二阈值,则将左侧近邻点的像素值设置为预设像素值,并将左侧近邻点添加到左边缘纹路后,继续向左进行遍历,直到该左边缘纹路的长度达到第二阈值为止,其中,左侧近邻点是指与边缘纹路最左边的像素点左侧相邻的像素点。
具体地,根据步骤S62获取的左边缘纹路包含的像素点的数量,确定该左边缘纹路的长度,并将该长度与预设的第二阈值进行对比,若该长度小于预设的第二阈值,则确认该左边缘纹路不完整,将该左边缘纹路最左边像素点的左侧近邻点的像素值设为255,即将该左侧近邻点由黑色像素点改为白色像素点,并将该长度更新为包含该左侧近邻点的左边缘纹路的长度后,继续按照步骤S63的遍历方式向左进行遍历,直到长度等于第二阈值为止。
需要说明的是,第二阈值决定了需要截取指纹的长度,当左边缘纹路的长度与第二阈值相等的情况下,则认为该左边缘纹路为完整的左边缘纹路,故可以通过第二阈值来判断左边缘纹路的完整性。
S64:以中心像素点为起始点,向右对预设幅度范围内的像素点进行遍历,获取相同预设像素值的连续像素点构成的右边缘纹路。
具体地,如图6所示,以中心像素点M作为起始点,在预设幅度范围内,向右分别对该起始点近邻的右边像素点、上边像素点、下边像素点、右上边像素点和右下边像素点进行遍历,若从这5个像素点中遍历到像素值为255的像素点,则以该像素值为255的像素点为起始点,继续对该点近邻的5个像素点进行遍历,直到近邻的5个像素点的像素值都不为255为止,获取由像素值为255的连续的像素点构成的右边缘纹路,即由连续的白色像素点构成的右边缘纹路。
其中,起始点及其右边像素点、上边像素点、下边像素点、右上边像素点和右下边像素点的位置如图7所示。例如,若中心像素点的坐标为(0,0),则右边像素点的坐标为(1,0),上边像素点的坐标为(0,1),下边像素点的坐标为(0,-1),右上边像素点 的坐标为(1,1),左下边像素点的坐标为(1,-1)。S65:若右边缘纹路的长度小于预设的第二阈值,则将右侧近邻点的像素值设置为预设像素值,并将右侧近邻点添加到右边缘纹路后,继续向右进行遍历,直到该右边缘纹路的长度达到第二阈值为止,其中,右侧近邻点是指与边缘纹路最右边的像素点右侧相邻的像素点。
具体地,根据步骤S64获取的右边缘纹路包含的像素点的数量,确定该右边缘纹路的长度,并将该长度与预设的第二阈值进行对比,若该长度小于预设的第二阈值,则确认该右边缘纹路不完整,将该右边缘纹路最右边像素点的右侧近邻点的像素值设为255,即将该右侧近邻点由黑色像素点改为白色像素点,并将该长度更新为包含该右侧近邻点的右边缘纹路的长度后,继续按照步骤S64的遍历方式向右进行遍历,直到长度等于第二阈值为止。需要说明的是,步骤S62至步骤S63与步骤S64至步骤S65之间没有必然的先后执行顺序,其可以是并列执行的关系,即同时对左边缘纹路和右边缘纹路进行处理,可以提高识别效率。
S66:将左边缘纹路和右边缘纹路组成手指边缘图像。
在本申请实施例中,通过步骤S62和S63对指静脉图像中手指边缘向左进行延伸处理,并通过步骤S64和S65对指静脉图像中手指边缘向右进行延伸处理后,将得到的的左边缘纹路和右边缘纹路,共同组成完整的手指边缘图像。
在图5对应的实施例中,通过以指静脉图像中的中心线上的预设中心像素点作为起始点,向左对预设幅度范围内的像素点进行遍历,获取由连续的预设像素点构成的左边缘纹路,将该左边缘纹路的长度与预设的第二阈值进行对比,若该长度小于预设的第二阈值,则将左侧近邻点的像素值设置为预设像素值,并将该长度更新为包含左侧近邻点的长度,接着继续向左进行遍历,直到长度达到第二阈值为止,得到完整的左边缘纹路,同理,按照同样的方式对右边缘纹路进行处理,得到完整的右边缘纹路,最后将完整的左边缘纹路和完整的右边缘纹路组成完整的手指边缘图像,一方面利用中心线对图像进行分割达到区分图像左右两部分,再同时对左右两部分纹路进行延伸处理,能够快速识别纹路,提高纹路的识别效率,另一方面在预设幅度范围内进行延伸处理,能够保证延伸处理的准确性,提高准确度,从而实现对手指边缘的准确定位,有效提高手指静脉图像中手指边缘检测的准确性,同时也提高了手指边缘检测的工作效率。
应理解,上述实施例中各步骤的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。
对应于上述方法实施例中的图像检测方法,图8示出了与上述方法实施例中提供的图像检测方法一一对应的图像检测装置,为了便于说明,仅示出了与本申请实施例相关的部分。
如图8所示,该图像检测装置包括:采集模块81,变换模块82,二值化模块83,膨胀化模块84,去噪模块85,延伸模块86。各功能模块详细说明如下:
采集模块81,用于从采集设备中获取原始的手指静脉图像;
变换模块82,用于对手指静脉图像进行Gabor滤波变换,得到增强图像;
二值化模块83,用于对增强图像进行二值化处理,获取二值化图像;
膨胀化模块84,用于对二值化图像进行膨胀处理,得到膨胀化图像;
去噪模块85,用于识别膨胀化图像中的指静脉纹路,对指静脉纹路作删除处理,获取去噪图像;
延伸模块86,用于识别去噪图像中的手指边缘纹路,对手指边缘纹路作延伸处理,得到完整的手指边缘图像。
进一步地,该图像检测装置还包括:
RGB获取模块87,用于对手指静脉图像中的像素点进行遍历,获取每个像素点的RGB分量值;
灰度化模块88,用于根据像素点的RGB分量值,按照如下公式对手指静脉图像作灰度化处理:
Figure PCTCN2018094399-appb-000005
其中,x和y为手指静脉图像中每个像素点的横坐标和纵坐标,g(x,y)为像素点(x,y)灰度化处理后的灰度值,R(x,y)为像素点(x,y)的R通道的颜色分量,G(x,y)为像素点(x,y)的G通道的颜色分量,B(x,y)为像素点(x,y)的B通道的颜色分量,k 1,k 2,k 3分别为R通道,G通道和B通道对应的占比参数,σ为预设的调节参数。
进一步地,变换模块82包括:
Gabor子模块821,用于按照如下公式对手指静脉图像进行Gabor滤波变换:
Figure PCTCN2018094399-appb-000006
Figure PCTCN2018094399-appb-000007
Figure PCTCN2018094399-appb-000008
Figure PCTCN2018094399-appb-000009
Figure PCTCN2018094399-appb-000010
Figure PCTCN2018094399-appb-000011
其中,
Figure PCTCN2018094399-appb-000012
为Gabor滤波函数,x和y为手指静脉图像中像素点的横坐标和纵坐标,K为方向指数,θ k为垂直于手指静脉图像的方向,m为尺度级别,σ m为第m级尺度的标准差,f m为第m级的中心频率,γ为空间纵横比,ΔΦ为预设带宽,I(x,y)为手指静脉图像,
Figure PCTCN2018094399-appb-000013
为增强图像。
进一步地,去噪模块85包括:
获取子模块851,用于对膨胀化图像中的像素点进行遍历,获取相同预设像素值的连续像素点构成的纹路;
筛选子模块852,用于针对每条纹路,计算该纹路的长度,若长度小于预设的第一阈值,则将该纹路中所有像素点的像素值设置为目标像素值,得到去噪图像。
进一步地,延伸模块86包括:
中心子模块861,用于获取去噪图像中的中心线上预设像素值的像素点作为中心像素点;
左边缘子模块862,用于以中心像素点为起始点,向左对预设幅度范围内的像素点进行遍历,获取相同预设像素值的连续像素点构成的左边缘纹路;
左延伸子模块863,用于若左边缘纹路的长度小于预设的第二阈值,则将左侧近邻点 的像素值设置为预设像素值,并将左侧近邻点添加到左边缘纹路后,继续向左进行遍历,直到左边缘纹路的长度达到第二阈值为止,其中,左侧近邻点是指与边缘纹路最左边的像素点左侧相邻的像素点;
右边缘子模块864,用于以中心像素点为起始点,向右对预设幅度范围内的像素点进行遍历,获取相同预设像素值的连续像素点构成的右边缘纹路;
右延伸子模块865,用于若右边缘纹路的长度小于预设的第二阈值,则将右侧近邻点的像素值设置为预设像素值,并将右侧近邻点添加到右边缘纹路后,继续向右进行遍历,直到右边缘纹路的长度达到第二阈值为止,其中,右侧近邻点是指与边缘纹路最右边的像素点右侧相邻的像素点;
组成子模块866,用于将左边缘纹路和右边缘纹路组成手指边缘图像。
本实施例提供的一种图像检测装置中各模块实现各自功能的过程,具体可参考前述方法实施例的描述,此处不再赘述。
本实施例提供一个或多个存储有计算机可读指令的非易失性可读存储介质,该非易失性可读存储介质上存储有计算机可读指令,该计算机可读指令被被一个或多个处理器执行时,使得所述一个或多个处理器执行方法实施例中图像检测方法,或者,该计算机可读指令被一个或多个处理器执行时实现装置实施例中图像检测装置中各模块的功能。为避免重复,这里不再赘述。
可以理解地,所述非易失性可读存储介质可以包括:能够携带所述计算机可读指令代码的任何实体或装置、记录介质、U盘、移动硬盘、磁碟、光盘、计算机存储器、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、电载波信号和电信信号等。
图9是本申请一实施例提供的计算机设备的示意图。如图9所示,该实施例的计算机设备90包括:处理器91、存储器92以及存储在存储器92中并可在处理器91上运行的计算机可读指令93,例如图像检测程序。处理器91执行计算机可读指令93时实现上述各个图像检测方法实施例中的步骤,例如图2所示的步骤S1至步骤S6。或者,处理器91执行计算机可读指令93时实现上述各装置实施例中各模块/单元的功能,例如图8所示模块81至模块86的功能。
示例性的,计算机可读指令93可以被分割成一个或多个模块/单元,一个或者多个模块/单元被存储在存储器92中,并由处理器91执行,以完成本申请。一个或多个模块/单元可以是能够完成特定功能的一系列计算机可读指令指令段,该指令段用于描述计算机可读指令93在计算机设备90中的执行过程。例如,计算机可读指令93可以被分割成采集模块,变换模块,二值化模块,膨胀化模块,去噪模块,延伸模块,各模块的具体功能如实施例2所示,为避免重复,此处不一一赘述。
计算机设备90可以是桌上型计算机、笔记本、掌上电脑及云端服务器等计算设备。计算机设备90可包括,但不仅限于,处理器91、存储器92。本领域技术人员可以理解,图9仅仅是计算机设备90的示例,并不构成对计算机设备90的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件,例如计算机设备90还可以包括输入输出设备、网络接入设备、总线等。
所称处理器91可以是中央处理单元(Central Processing Unit,CPU),还可以是其他通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(ApplicationSpecific Integrated Circuit,ASIC)、现成可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
存储器92可以是计算机设备90的内部存储单元,例如计算机设备90的硬盘或内存。存储器92也可以是计算机设备90的外部存储设备,例如计算机设备90上配备的插接式 硬盘,智能存储卡(Smart Media Card,SMC),安全数字(Secure Digital,SD)卡,闪存卡(Flash Card)等。进一步地,存储器92还可以既包括计算机设备90的内部存储单元也包括外部存储设备。存储器92用于存储计算机可读指令以及/计算机设备90所需的其他程序和数据。存储器92还可以用于暂时地存储已经输出或者将要输出的数据。
所属领域的技术人员可以清楚地了解到,为了描述的方便和简洁,仅以上述各功能单元、模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能单元、模块完成,即将所述装置的内部结构划分成不同的功能单元或模块,以完成以上描述的全部或者部分功能。
以上所述实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的精神和范围,均应包含在本申请的保护范围之内。

Claims (20)

  1. 一种图像检测方法,其特征在于,所述图像检测方法包括:
    从采集设备中获取原始的手指静脉图像;
    对所述手指静脉图像进行Gabor滤波变换,得到增强图像;
    对所述增强图像进行二值化处理,获取二值化图像;
    对所述二值化图像进行膨胀处理,得到膨胀化图像;
    识别所述膨胀化图像中的指静脉纹路,对所述指静脉纹路作删除处理,获取去噪图像;
    识别所述去噪图像中的手指边缘纹路,对所述手指边缘纹路作延伸处理,得到完整的手指边缘图像。
  2. 如权利要求1所述的图像检测方法,其特征在于,所述对所述手指静脉图像进行Gabor滤波变换,得到增强图像之前,所述图像检测方法还包括:
    对所述手指静脉图像中的像素点进行遍历,获取每个所述像素点的RGB分量值;
    根据所述像素点的RGB分量值,按照如下公式对所述手指静脉图像作灰度化处理:
    Figure PCTCN2018094399-appb-100001
    其中,x和y为所述手指静脉图像中每个像素点的横坐标和纵坐标,g(x,y)为像素点(x,y)灰度化处理后的灰度值,R(x,y)为所述像素点(x,y)的R通道的颜色分量,G(x,y)为所述像素点(x,y)的G通道的颜色分量,B(x,y)为所述像素点(x,y)的B通道的颜色分量,k 1,k 2,k 3分别为所述R通道,所述G通道和所述B通道对应的占比参数,σ为预设的调节参数。
  3. 如权利要求1所述的图像检测方法,其特征在于,所述对所述手指静脉图像进行Gabor滤波变换,得到增强图像包括:
    按照如下公式对所述手指静脉图像进行Gabor滤波变换:
    Figure PCTCN2018094399-appb-100002
    Figure PCTCN2018094399-appb-100003
    Figure PCTCN2018094399-appb-100004
    Figure PCTCN2018094399-appb-100005
    Figure PCTCN2018094399-appb-100006
    Figure PCTCN2018094399-appb-100007
    其中,
    Figure PCTCN2018094399-appb-100008
    为Gabor滤波函数,x和y为所述手指静脉图像中像素点的横坐标和纵坐标,θ k为垂直于所述手指静脉图像的方向,m为尺度级别,σ m为第m级尺度的标准差,f m为第m级尺度的中心频率,γ为空间纵横比,ΔΦ为预设带宽,I(x,y)为所述手指静脉 图像,
    Figure PCTCN2018094399-appb-100009
    为所述增强图像。
  4. 如权利要求1所述的图像检测方法,其特征在于,所述识别所述膨胀化图像中的指静脉纹路,对所述指静脉纹路作删除处理,获取去噪图像包括:
    对所述膨胀化图像中的所述像素点进行遍历,获取相同预设像素值的连续像素点构成的纹路;
    针对每条所述纹路,计算该纹路的长度,若所述长度小于预设的第一阈值,则将该纹路中所有像素点的像素值设置为目标像素值,得到所述去噪图像。
  5. 如权利要求1所述的图像检测方法,其特征在于,所述识别所述去噪图像中的手指边缘纹路,对所述手指边缘纹路作延伸处理,得到完整的手指边缘图像包括:
    获取所述去噪图像中的中心线上预设像素值的像素点作为中心像素点;
    以所述中心像素点为起始点,向左对预设幅度范围内的像素点进行遍历,获取相同所述预设像素值的连续像素点构成的左边缘纹路;
    若所述左边缘纹路的长度小于预设的第二阈值,则将左侧近邻点的像素值设置为所述预设像素值,并将所述左侧近邻点添加到所述左边缘纹路后,继续向左进行遍历,直到所述左边缘纹路的长度达到所述第二阈值为止,其中,所述左侧近邻点是指与所述边缘纹路最左边的像素点左侧相邻的像素点;
    以所述中心像素点为所述起始点,向右对所述预设幅度范围内的像素点进行遍历,获取相同所述预设像素值的连续像素点构成的右边缘纹路;
    若所述右边缘纹路的长度小于所述预设的第二阈值,则将右侧近邻点的像素值设置为所述预设像素值,并将所述右侧近邻点添加到所述右边缘纹路后,继续向右进行遍历,直到所述右边缘纹路的长度达到所述第二阈值为止,其中,所述右侧近邻点是指与所述边缘纹路最右边的像素点右侧相邻的像素点;
    将所述左边缘纹路和所述右边缘纹路组成所述手指边缘图像。
  6. 一种图像检测装置,其特征在于,所述图像检测装置包括:
    采集模块,用于从采集设备中获取原始的手指静脉图像;
    变换模块,用于对所述手指静脉图像进行Gabor滤波变换,得到增强图像;
    二值化模块,用于对所述增强图像进行二值化处理,获取二值化图像;
    膨胀化模块,用于对所述二值化图像进行膨胀处理,得到膨胀化图像;
    去噪模块,用于识别所述膨胀化图像中的指静脉纹路,对所述指静脉纹路作删除处理,获取去噪图像;
    延伸模块,用于识别所述去噪图像中的手指边缘纹路,对所述手指边缘纹路作延伸处理,得到完整的手指边缘图像。
  7. 如权利要求6所述的图像检测装置,其特征在于,所述去噪模块包括:
    获取子模块,用于对所述膨胀化图像中的所述像素点进行遍历,获取相同预设像素值的连续像素点构成的纹路;
    筛选子模块,用于针对每条所述纹路,计算该纹路的长度,若所述长度小于预设的第一阈值,则将该纹路中所有像素点的像素值设置为目标像素值,得到所述去噪图像。
  8. 如权利要求6所述的图像检测装置,其特征在于,所述延伸模块包括:
    中心子模块,用于获取所述去噪图像中的中心线上预设像素值的像素点作为中心像素点;
    左边缘子模块,用于以所述中心像素点为起始点,向左对预设幅度范围内的像素点进行遍历,获取相同所述预设像素值的连续像素点构成的左边缘纹路;
    左延伸子模块,用于若所述左边缘纹路的长度小于预设的第二阈值,则将左侧近邻点的像素值设置为所述预设像素值,并将所述左侧近邻点添加到所述左边缘纹路后,继续向 左进行遍历,直到所述左边缘纹路的长度达到所述第二阈值为止,其中,所述左侧近邻点是指与所述边缘纹路最左边的像素点左侧相邻的像素点;
    右边缘子模块,用于以所述中心像素点为所述起始点,向右对所述预设幅度范围内的像素点进行遍历,获取相同所述预设像素值的连续像素点构成的右边缘纹路;
    右延伸子模块,用于若所述右边缘纹路的长度小于所述预设的第二阈值,则将右侧近邻点的像素值设置为所述预设像素值,并将所述右侧近邻点添加到所述右边缘纹路后,继续向右进行遍历,直到所述右边缘纹路的长度达到所述第二阈值为止,其中,所述右侧近邻点是指与所述边缘纹路最右边的像素点右侧相邻的像素点;
    组成子模块,用于将所述左边缘纹路和所述右边缘纹路组成所述手指边缘图像。
  9. 如权利要求6所述的图像检测装置,其特征在于,所述图像检测装置还包括:
    对所述手指静脉图像中的像素点进行遍历,获取每个所述像素点的RGB分量值;
    根据所述像素点的RGB分量值,按照如下公式对所述手指静脉图像作灰度化处理:
    Figure PCTCN2018094399-appb-100010
    其中,x和y为所述手指静脉图像中每个像素点的横坐标和纵坐标,g(x,y)为像素点(x,y)灰度化处理后的灰度值,R(x,y)为所述像素点(x,y)的R通道的颜色分量,G(x,y)为所述像素点(x,y)的G通道的颜色分量,B(x,y)为所述像素点(x,y)的B通道的颜色分量,k 1,k 2,k 3分别为所述R通道,所述G通道和所述B通道对应的占比参数,σ为预设的调节参数。
  10. The image detection apparatus according to claim 6, wherein the transform module is configured to:
    perform the Gabor filter transformation on the finger vein image in accordance with the following formulas:
    (Gabor filter formulas published as images PCTCN2018094399-appb-100011 to PCTCN2018094399-appb-100016)
    where the Gabor filter function is as given in the formulas above, x and y are the abscissa and ordinate of a pixel in the finger vein image, θ_k is the direction perpendicular to the finger vein image, m is the scale level, σ_m is the standard deviation of the m-th scale, f_m is the center frequency of the m-th scale, γ is the spatial aspect ratio, ΔΦ is a preset bandwidth, I(x, y) is the finger vein image, and the output published as image PCTCN2018094399-appb-100018 is the enhanced image.
  11. A computer device, comprising a memory, a processor, and computer-readable instructions stored in the memory and executable on the processor, wherein the processor implements the following steps when executing the computer-readable instructions:
    acquiring an original finger vein image from a collection device;
    performing a Gabor filter transformation on the finger vein image to obtain an enhanced image;
    performing binarization processing on the enhanced image to obtain a binarized image;
    performing expansion processing on the binarized image to obtain an expanded image;
    identifying finger vein patterns in the expanded image and deleting the finger vein patterns to obtain a denoised image;
    identifying the finger edge texture in the denoised image and extending the finger edge texture to obtain a complete finger edge image.
  12. The computer device according to claim 11, wherein before the Gabor filter transformation is performed on the finger vein image to obtain the enhanced image, the processor further implements the following steps when executing the computer-readable instructions:
    traversing the pixels in the finger vein image to obtain the RGB component values of each pixel;
    performing grayscale processing on the finger vein image according to the RGB component values of the pixels in accordance with the following formula:
    (grayscale conversion formula published as image PCTCN2018094399-appb-100019)
    where x and y are the abscissa and ordinate of each pixel in the finger vein image, g(x, y) is the gray value of the pixel (x, y) after grayscale processing, R(x, y) is the color component of the R channel of the pixel (x, y), G(x, y) is the color component of the G channel of the pixel (x, y), B(x, y) is the color component of the B channel of the pixel (x, y), k1, k2 and k3 are the proportion parameters corresponding to the R channel, the G channel and the B channel respectively, and σ is a preset adjustment parameter.
  13. The computer device according to claim 11, wherein performing the Gabor filter transformation on the finger vein image to obtain the enhanced image comprises:
    performing the Gabor filter transformation on the finger vein image in accordance with the following formulas:
    (Gabor filter formulas published as images PCTCN2018094399-appb-100020 to PCTCN2018094399-appb-100025)
    where the function published as image PCTCN2018094399-appb-100026 is the Gabor filter function, x and y are the abscissa and ordinate of a pixel in the finger vein image, θ_k is the direction perpendicular to the finger vein image, m is the scale level, σ_m is the standard deviation of the m-th scale, f_m is the center frequency of the m-th scale, γ is the spatial aspect ratio, ΔΦ is a preset bandwidth, I(x, y) is the finger vein image, and the output published as image PCTCN2018094399-appb-100027 is the enhanced image.
  14. The computer device according to claim 11, wherein identifying the finger vein patterns in the expanded image and deleting the finger vein patterns to obtain the denoised image comprises:
    traversing the pixels in the expanded image to obtain patterns formed by consecutive pixels having the same preset pixel value;
    for each of the patterns, calculating the length of that pattern, and, if the length is less than a preset first threshold, setting the pixel values of all pixels in that pattern to a target pixel value, to obtain the denoised image.
  15. The computer device according to claim 11, wherein identifying the finger edge texture in the denoised image and extending the finger edge texture to obtain the complete finger edge image comprises:
    obtaining the pixels having a preset pixel value on the center line of the denoised image as center pixels;
    taking a center pixel as a starting point, traversing the pixels within a preset amplitude range to the left, to obtain a left edge texture formed by consecutive pixels having the same preset pixel value;
    if the length of the left edge texture is less than a preset second threshold, setting the pixel value of the left neighboring point to the preset pixel value, adding the left neighboring point to the left edge texture, and continuing the traversal to the left until the length of the left edge texture reaches the second threshold, where the left neighboring point is the pixel adjacent to the left of the leftmost pixel of the edge texture;
    taking the center pixel as the starting point, traversing the pixels within the preset amplitude range to the right, to obtain a right edge texture formed by consecutive pixels having the same preset pixel value;
    if the length of the right edge texture is less than the preset second threshold, setting the pixel value of the right neighboring point to the preset pixel value, adding the right neighboring point to the right edge texture, and continuing the traversal to the right until the length of the right edge texture reaches the second threshold, where the right neighboring point is the pixel adjacent to the right of the rightmost pixel of the edge texture;
    composing the finger edge image from the left edge texture and the right edge texture.
  16. One or more non-volatile readable storage media storing computer-readable instructions, wherein the computer-readable instructions, when executed by one or more processors, cause the one or more processors to perform the following steps:
    acquiring an original finger vein image from a collection device;
    performing a Gabor filter transformation on the finger vein image to obtain an enhanced image;
    performing binarization processing on the enhanced image to obtain a binarized image;
    performing expansion processing on the binarized image to obtain an expanded image;
    identifying finger vein patterns in the expanded image and deleting the finger vein patterns to obtain a denoised image;
    identifying the finger edge texture in the denoised image and extending the finger edge texture to obtain a complete finger edge image.
  17. The non-volatile readable storage medium according to claim 16, wherein before the Gabor filter transformation is performed on the finger vein image to obtain the enhanced image, the computer-readable instructions, when executed by the one or more processors, further cause the one or more processors to perform the following steps:
    traversing the pixels in the finger vein image to obtain the RGB component values of each pixel;
    performing grayscale processing on the finger vein image according to the RGB component values of the pixels in accordance with the following formula:
    (grayscale conversion formula published as image PCTCN2018094399-appb-100028)
    where x and y are the abscissa and ordinate of each pixel in the finger vein image, g(x, y) is the gray value of the pixel (x, y) after grayscale processing, R(x, y) is the color component of the R channel of the pixel (x, y), G(x, y) is the color component of the G channel of the pixel (x, y), B(x, y) is the color component of the B channel of the pixel (x, y), k1, k2 and k3 are the proportion parameters corresponding to the R channel, the G channel and the B channel respectively, and σ is a preset adjustment parameter.
  18. The non-volatile readable storage medium according to claim 16, wherein performing the Gabor filter transformation on the finger vein image to obtain the enhanced image comprises:
    performing the Gabor filter transformation on the finger vein image in accordance with the following formulas:
    (Gabor filter formulas published as images PCTCN2018094399-appb-100029 to PCTCN2018094399-appb-100034)
    where the function published as image PCTCN2018094399-appb-100035 is the Gabor filter function, x and y are the abscissa and ordinate of a pixel in the finger vein image, θ_k is the direction perpendicular to the finger vein image, m is the scale level, σ_m is the standard deviation of the m-th scale, f_m is the center frequency of the m-th scale, γ is the spatial aspect ratio, ΔΦ is a preset bandwidth, I(x, y) is the finger vein image, and the output published as image PCTCN2018094399-appb-100036 is the enhanced image.
  19. The non-volatile readable storage medium according to claim 16, wherein identifying the finger vein patterns in the expanded image and deleting the finger vein patterns to obtain the denoised image comprises:
    traversing the pixels in the expanded image to obtain patterns formed by consecutive pixels having the same preset pixel value;
    for each of the patterns, calculating the length of that pattern, and, if the length is less than a preset first threshold, setting the pixel values of all pixels in that pattern to a target pixel value, to obtain the denoised image.
  20. The non-volatile readable storage medium according to claim 16, wherein identifying the finger edge texture in the denoised image and extending the finger edge texture to obtain the complete finger edge image comprises:
    obtaining the pixels having a preset pixel value on the center line of the denoised image as center pixels;
    taking a center pixel as a starting point, traversing the pixels within a preset amplitude range to the left, to obtain a left edge texture formed by consecutive pixels having the same preset pixel value;
    if the length of the left edge texture is less than a preset second threshold, setting the pixel value of the left neighboring point to the preset pixel value, adding the left neighboring point to the left edge texture, and continuing the traversal to the left until the length of the left edge texture reaches the second threshold, where the left neighboring point is the pixel adjacent to the left of the leftmost pixel of the edge texture;
    taking the center pixel as the starting point, traversing the pixels within the preset amplitude range to the right, to obtain a right edge texture formed by consecutive pixels having the same preset pixel value;
    if the length of the right edge texture is less than the preset second threshold, setting the pixel value of the right neighboring point to the preset pixel value, adding the right neighboring point to the right edge texture, and continuing the traversal to the right until the length of the right edge texture reaches the second threshold, where the right neighboring point is the pixel adjacent to the right of the rightmost pixel of the edge texture;
    composing the finger edge image from the left edge texture and the right edge texture.
PCT/CN2018/094399 2018-04-28 2018-07-04 Image detection method and apparatus, computer device and storage medium WO2019205290A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810398765.3 2018-04-28
CN201810398765.3A CN108805023B (zh) 2018-04-28 2018-04-28 Image detection method and apparatus, computer device and storage medium

Publications (1)

Publication Number Publication Date
WO2019205290A1 true WO2019205290A1 (zh) 2019-10-31

Family

ID=64093064

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/094399 WO2019205290A1 (zh) 2018-04-28 2018-07-04 Image detection method and apparatus, computer device and storage medium

Country Status (2)

Country Link
CN (1) CN108805023B (zh)
WO (1) WO2019205290A1 (zh)


Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109522842B (zh) * 2018-11-16 2023-01-17 中国民航大学 Blood vessel network repair method based on finger vein images
CN111260603B (zh) * 2018-11-30 2024-02-02 金风科技股份有限公司 Method and device for identifying the blade tips of a wind turbine
CN109766831A (zh) * 2019-01-09 2019-05-17 深圳市三宝创新智能有限公司 Road color band recognition method and apparatus, computer device and storage medium
CN109902586A (zh) * 2019-01-29 2019-06-18 平安科技(深圳)有限公司 Palm print extraction method and apparatus, storage medium and server
CN110717372A (zh) * 2019-08-13 2020-01-21 平安科技(深圳)有限公司 Identity verification method and apparatus based on finger vein recognition
CN110705341A (zh) * 2019-08-13 2020-01-17 平安科技(深圳)有限公司 Verification method and apparatus based on finger vein images, and storage medium
CN111832423A (zh) * 2020-06-19 2020-10-27 北京邮电大学 Bill information recognition method, apparatus and system
CN112508024A (zh) * 2020-11-11 2021-03-16 广西电网有限责任公司南宁供电局 Intelligent recognition method for stamped characters on transformer electrical nameplates
CN112862703B (zh) * 2021-01-21 2023-06-02 平安科技(深圳)有限公司 Image correction method and apparatus based on mobile photographing, electronic device and medium


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100395770C (zh) * 2005-06-27 2008-06-18 北京交通大学 Hand feature fusion authentication method based on feature relationship measurement
CN104688184B (zh) * 2014-12-05 2017-08-04 南京航空航天大学 Vein imaging method for visible-light skin images

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103679700A (zh) * 2013-10-29 2014-03-26 成都三泰电子实业股份有限公司 Bill image inversion detection system
CN105512656A (zh) * 2014-09-22 2016-04-20 郭进锋 Method for collecting palm vein images
CN104851074A (zh) * 2015-03-26 2015-08-19 温州大学 Non-local neighborhood grayscale image colorization method based on feature similarity
CN106408025A (zh) * 2016-09-20 2017-02-15 西安工程大学 Insulator classification and recognition method for aerial images based on image processing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YU, YUJIE: "Research of Finger Vein Image Acquisition and Recognition Technology Based on FPGA", CHINA MASTER'S THESES FULL-TEXT DATABASE, 15 May 2017 (2017-05-15), ISSN: 1674-0246 *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110866503B (zh) * 2019-11-19 2024-01-05 圣点世纪科技股份有限公司 Abnormality detection method and system for finger vein devices
CN110866503A (zh) * 2019-11-19 2020-03-06 圣点世纪科技股份有限公司 Abnormality detection method and system for finger vein devices
CN110956596A (zh) * 2019-12-09 2020-04-03 深圳元华医疗设备技术有限公司 Image processing method and terminal based on near-infrared imaging
CN110956596B (zh) * 2019-12-09 2023-08-29 深圳元华医疗设备技术有限公司 Image processing method and terminal based on near-infrared imaging
CN113034524A (zh) * 2019-12-25 2021-06-25 深圳怡化电脑股份有限公司 Image edge detection method and apparatus
CN111353957A (zh) * 2020-02-28 2020-06-30 北京三快在线科技有限公司 Image processing method and apparatus, storage medium and electronic device
CN112613523A (zh) * 2020-12-15 2021-04-06 中冶赛迪重庆信息技术有限公司 Method, system, medium and electronic terminal for identifying the steel stream at a converter taphole
CN113155860A (zh) * 2020-12-17 2021-07-23 华能澜沧江水电股份有限公司 Method and system for diagnosing structural damage of water-passing structures based on flow-state video monitoring
CN114693531A (zh) * 2020-12-28 2022-07-01 富泰华工业(深圳)有限公司 Image comparison method and related device
CN114565517A (zh) * 2021-12-29 2022-05-31 骨圣元化机器人(深圳)有限公司 Image denoising method and apparatus for an infrared camera, and computer device
CN114565517B (zh) * 2021-12-29 2023-09-29 骨圣元化机器人(深圳)有限公司 Image denoising method and apparatus for an infrared camera, and computer device
CN117474999A (zh) * 2023-12-25 2024-01-30 惠州市德立电子有限公司 Method and system for locating double-wire winding abnormalities in miniature chip inductors
CN117474999B (zh) * 2023-12-25 2024-04-19 惠州市德立电子有限公司 Method and system for locating double-wire winding abnormalities in miniature chip inductors

Also Published As

Publication number Publication date
CN108805023B (zh) 2023-12-19
CN108805023A (zh) 2018-11-13

Similar Documents

Publication Publication Date Title
WO2019205290A1 (zh) Image detection method and apparatus, computer device and storage medium
WO2019232945A1 (zh) Image processing method and apparatus, computer device and storage medium
US10726557B2 (en) Method and system for preparing text images for optical-character recognition
CN104751142B (zh) Natural scene text detection method based on stroke features
WO2014160433A2 (en) Systems and methods for classifying objects in digital images captured using mobile devices
WO2017088637A1 (zh) Method and apparatus for locating image edges in a natural background
KR20130016213A (ko) Text enhancement of a text image for optical character recognition
WO2016029555A1 (zh) Image interpolation method and apparatus
US10748023B2 (en) Region-of-interest detection apparatus, region-of-interest detection method, and recording medium
US11151402B2 (en) Method of character recognition in written document
CN108389215B (zh) Edge detection method and apparatus, computer storage medium and terminal
WO2020155764A1 (zh) Palm print extraction method and apparatus, storage medium and server
WO2023070593A1 (zh) Line width measurement method and apparatus, computing device, computer program and computer-readable medium
CN113781406B (zh) Scratch detection method and apparatus for electronic components, and computer device
CN106599891A (zh) Fast extraction method for regions of interest in remote sensing images based on scale phase spectrum saliency
CN113673515A (zh) Computer vision object detection algorithm
CN106023105B (zh) Binary image generation method and system for plant leaves
CN115410191B (zh) Text image recognition method, apparatus, device and storage medium
CN111898408A (zh) Fast face recognition method and apparatus
CN110633705A (zh) License plate recognition method and apparatus for low-illumination imaging
CN104408452A (zh) Latin character skew correction method and system based on rotational projection width
CN113379785B (zh) Salient object detection method fusing boundary prior and frequency-domain information
CN115564727A (zh) Method and system for detecting exposure and development abnormality defects
CN111931688A (zh) Ship recognition method and apparatus, computer device and storage medium
CN110648341B (zh) Object boundary detection method based on scale space and subgraphs

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18916638

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS (EPO FORM 1205A DATED 08.02.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18916638

Country of ref document: EP

Kind code of ref document: A1