WO2022041146A1 - Image offset calculation method, fingerprint detection module, apparatus and electronic device - Google Patents
Image offset calculation method, fingerprint detection module, apparatus and electronic device
- Publication number
- WO2022041146A1 (PCT/CN2020/112270)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- fringe
- offset
- model
- fingerprint detection
- Prior art date
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
Definitions
- the present application relates to the technical field of fingerprint identification, and in particular, to an image offset calculation method, a fingerprint detection module, an apparatus and an electronic device.
- the optical fingerprint detection module includes a pixel array and a plurality of light guide channels. Specifically, the end of each light guide channel has one pixel, so the ends of four light guide channels are provided with four pixels; each pixel receives an optical signal in a certain fixed optical path direction, and the four pixels receive optical signals in four optical path directions. Pixels in the pixel array that receive light signals in the same light path direction generate one image. The distance between two adjacent images in the X direction and the Y direction of the two-dimensional plane is called the image offset in the multi-optical-path direction.
- the multi-optical path direction offset can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition. However, how to obtain the multi-optical path direction offset has become an urgent problem to be solved in the multi-optical path direction fingerprint detection technology.
- one of the technical problems solved by the embodiments of the present application is to provide an image offset calculation method, a fingerprint detection module, an apparatus, and an electronic device, so as to overcome at least some of the defects in the prior art.
- an embodiment of the present application provides a method for calculating an image offset.
- the method is applied to a fingerprint detection module in a fingerprint detection device; the fingerprint detection module is arranged below the screen in the direction of incident light, there is an air gap between the screen and the fingerprint detection module, and the screen is provided with a film layer facing the incident light direction.
- the method includes: collecting image data on a 3D fringe model to obtain 3D fringe model image data in multiple optical path directions; performing binarization processing on the 3D fringe model image data, or on the reflection amount data of the 3D fringe model obtained from the 3D fringe model image data, to obtain binarized image data; and calculating the image offset of any two adjacent images at least according to the centroid coordinates of the two binarized images along the diagonal direction among the four binarized images. Obviously, any two adjacent images here do not include two images that adjoin in the diagonal direction.
- an embodiment of the present application provides a fingerprint detection module, the fingerprint detection module is disposed below the screen toward the direction of incident light, and an air gap is formed between the screen and the fingerprint detection module, The screen is provided with a film layer facing the direction of incident light, and the fingerprint detection module adopts the above-mentioned calculation method of image offset to obtain the offset of any two adjacent images for fingerprint detection.
- an embodiment of the present application provides a fingerprint sensing device, the fingerprint sensing device includes a film layer, a screen, an air gap and a fingerprint detection module arranged in sequence toward the incident light direction, and the fingerprint detection module uses the above-mentioned calculation method of image offset to obtain the offset of any two images for fingerprint detection.
- an embodiment of the present application provides an electronic device, the electronic device includes a fingerprint detection device, the fingerprint detection device includes a film layer, a screen, an air gap, and a fingerprint detection module arranged in sequence toward the incident light direction, and the fingerprint detection module adopts the above-mentioned calculation method of image offset to obtain the offset of any two images for fingerprint detection.
- the fingerprint sensing device includes a film layer, a screen, an air gap, and a fingerprint detection module arranged in sequence toward the incident light direction.
- in the embodiments of the present application, image data is collected on a 3D fringe model in a normal test environment to obtain 3D fringe model image data in multiple optical path directions; the 3D fringe model image data, or the reflection amount data of the 3D fringe model obtained from the 3D fringe model image data, is then subjected to binarization processing.
- at least according to the centroid coordinates of the two binarized images along the diagonal direction among the four binarized images, the image offset of any two adjacent images is calculated.
- in this embodiment of the present application, the image offsets of adjacent multi-optical-path directions can be obtained, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping area of fingerprint images for fingerprint anti-counterfeiting detection.
- FIG. 1 is a schematic structural diagram of a fingerprint detection device according to an embodiment of the present application.
- FIG. 2 is a schematic diagram of a multi-optical path direction image of a fingerprint detection device provided by an embodiment of the present application
- FIG. 3 is a flowchart of an image offset calculation method provided by an embodiment of the present application.
- FIG. 4 is a flowchart of an implementation manner of step S103 of an image offset calculation method in FIG. 3 provided by an embodiment of the present application;
- FIG. 5 is a flowchart of another image offset calculation method provided by an embodiment of the present application.
- FIG. 6 is a flowchart of an implementation manner of step S204 of another image offset calculation method in FIG. 5 provided by an embodiment of the present application;
- FIG. 7 is a schematic diagram of the imaging optical paths of the black model and the 3D fringe model in an image offset calculation method provided by an embodiment of the present application;
- FIG. 8 is a flowchart of still another image offset calculation method provided by an embodiment of the present application;
- FIG. 9 is a flowchart of an implementation manner of step S304 of the image offset calculation method in FIG. 8 provided by an embodiment of the present application;
- FIG. 10 is a flowchart of still another image offset calculation method provided by an embodiment of the present application;
- FIG. 11 is a flowchart of still another image offset calculation method provided by an embodiment of the present application.
- the fingerprint detection device can receive incident light from multiple directions, and includes a film layer 11 , a screen 12 , an air gap 13 and a fingerprint detection module 14 arranged in sequence toward the incident light direction.
- P_Film is the thickness of the film layer 11
- P_Oled is the thickness of the screen 12
- P_Gap is the thickness of the air gap 13 .
- the offset d of two adjacent images in the diagonal direction is the sum of the offset distances S1, S2 and S3 produced by refraction through the film layer 11, the screen 12 and the air gap 13.
- the incident angle of the incident light irradiating the film layer is ⁇
- the refraction angle of the incident light in the film layer 11 is ⁇
- the refraction angle in the screen 12 is ⁇
- the refraction angle after entering the air gap 13 is still ⁇ .
- the offset of the two images 21a and 23a along the diagonal direction is d
- the offset in the X direction of the two images 21a and 23a on the two-dimensional plane is dx
- the offset of the two images 21a and 23a in the Y direction on the two-dimensional plane is dy.
- the offset amount d of the two images in the diagonal direction includes the offset amount in the X direction and the offset amount in the Y direction on the two-dimensional plane. Since the incident light passes in sequence through the film layer 11, the screen 12 and the air gap 13 shown in FIG. 1, light reflected by the same detected object 7 (for example, a user's finger, or the 3D fringe model or black model mentioned below) enters optical paths in different directions, and the optical signals of the different optical path directions are received by different pixels of the fingerprint detection module 14 to form different image data.
- the offset d is therefore the accumulation of the offset distances S1, S2 and S3 produced by refraction through the film layer 11, the screen 12 and the air gap 13.
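The Chinese description of the same passage states these relations explicitly as its Equations 1 to 4; they are reproduced here using the thicknesses and angles defined above:

$$d = S_1 + S_2 + S_3$$
$$S_1 = \frac{P_{\mathrm{Film}}}{\tan\alpha},\qquad S_2 = \frac{P_{\mathrm{Oled}}}{\tan\beta},\qquad S_3 = \frac{P_{\mathrm{Gap}}}{\tan\theta}$$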
- the fingerprint detection device includes a film layer 11 , a screen 12 , an air gap 13 and a fingerprint detection module 14 arranged in sequence toward the incident light direction.
- the screen 12 may be used to display various preset fixed patterns.
- the preset fixed pattern includes at least one of circle, square and triangle.
- the embodiment of the present application does not require the size and spacing of the fixed pattern, and the size and spacing of the fixed pattern have no effect on the calculation of the image offset in the embodiment of the present application.
- the fingerprint detection module includes a microlens array (for example, a microlens array of n rows*n columns, where n is a natural number) and a multi-layer light blocking layer.
- the microlens array includes a plurality of microlenses, and each light blocking layer in the multilayer light blocking layer includes a plurality of small holes.
- Each microlens of the microlens array may correspond to light guide channels with multiple directions, for example, light guide channels in four directions, and the light guide channels are optical path channels formed by holes in the multi-layer light blocking layer.
- each light guide channel has one pixel
- the ends of the four light guide channels have four pixels
- the four pixels form a pixel unit
- the fingerprint detection module includes a plurality of the pixel units (i.e., a pixel unit array).
- Fig. 2 only schematically shows one pixel unit in the pixel array of the fingerprint detection module.
- the pixel unit includes four pixels 21 , 22 , 23 , and 24 , and the four pixels 21 , 22 , 23 , and 24 respectively receive four optical signals with fixed optical path directions.
- the pixels in the pixel array of the fingerprint detection module that receive light signals in the same direction form one image. Take the pixel unit shown in FIG. 2 as an example:
- the four pixels 21, 22, 23 and 24 each receive only the optical signal of one fixed optical path direction.
- the pixels that receive the optical signals of the four optical path directions form four images 21a, 22a, 23a and 24a, respectively, and the distance between two adjacent images in the X and Y directions of the two-dimensional plane is called the image offset.
- the two adjacent images refer to two images that are adjacent to each other up and down or left and right, excluding images adjacent to each other in the diagonal direction.
- the calculation of the image offset in the embodiment of the present application is usually completed in the whole-machine testing phase of the electronic device to which the fingerprint detection device is applied, and the image offset obtained in the testing phase is used to generate a fusion map of the fingerprint image, so as to improve the accuracy of fingerprint identification.
- the image offset can also be used to extract feature values of the overlapping area of the fingerprint image for fingerprint anti-counterfeiting detection.
- a test environment in which the fingerprint detection module 14 operates normally is selected, that is, a test environment in which the fingerprint detection module passes the serial peripheral interface (SPI) test, the one-time programmable (OTP) check, the RAM stress test (RST) and the integration test (INT).
- image data is collected for the 3D fringe model, and subsequent calculation processing is performed on the obtained 3D fringe model image data, so that a fault of the fingerprint detection module will not affect the accuracy of the subsequent calculations on the image data.
- the method includes:
- the 3D fringe model is a model object selected to simulate the user's finger for calculating the offset of images in multiple optical path directions, and the fringes on the 3D fringe model are used to simulate the fingerprint of the user's finger.
- the 3D fringe model image data in multiple optical path directions may be processed by low-pass filtering, median filtering, etc.
- the specific processing methods are not limited in the embodiments of the present application.
- S102 Perform binarization processing on the 3D fringe model image data or the reflection amount data of the 3D fringe model obtained according to the 3D fringe model image data, to obtain binarized image data.
- image binarization (thresholding) is the simplest method of image segmentation: the gray value of each pixel in the image is set to 0 or 255, with pixels whose gray value is greater than a critical gray threshold set to the maximum gray value and pixels whose gray value is less than the critical gray threshold set to the minimum gray value, so as to obtain binarized image data.
- this embodiment of the present application performs binarization processing on the 3D fringe model image data, or on the reflection amount data of the 3D fringe model obtained from the 3D fringe model image data, to obtain binarized 3D fringe model image data or binarized reflection amount data of the 3D fringe model.
- image dilation and erosion processing may be used to eliminate outliers in the binarized image data, and the embodiments of the present application do not limit the specific image dilation and erosion processing methods.
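As a minimal illustration of this step (a sketch under stated assumptions, not the patent's implementation), the snippet below binarizes one grayscale frame against a critical threshold and removes isolated points with a morphological opening, which is one common erosion-plus-dilation cleanup; the array sizes are illustrative.

```python
import numpy as np
from scipy import ndimage


def binarize(image: np.ndarray, threshold: float) -> np.ndarray:
    """Set pixels above the critical gray threshold to 255 and the rest to 0."""
    return np.where(image > threshold, 255, 0).astype(np.uint8)


def remove_isolated_points(binary: np.ndarray) -> np.ndarray:
    """Erosion followed by dilation (an opening) removes isolated bright pixels."""
    mask = binary > 0
    opened = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
    return opened.astype(np.uint8) * 255


# Example on a hypothetical per-direction frame.
frame = np.random.randint(0, 256, size=(64, 64)).astype(np.uint8)
clean = remove_isolated_points(binarize(frame, threshold=128))
```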
- step S103 includes:
- S1031: Calculate at least the centroid coordinates of the two binarized images along the diagonal direction among the four binarized images.
- specifically, four pieces of 3D fringe model image data, or four pieces of reflection amount data of the 3D fringe model calculated from them, are obtained and binarized, yielding four binarized pieces of 3D fringe model image data or four binarized pieces of reflection amount image data of the 3D fringe model.
- the two binarized images lying along the diagonal direction are selected and their centroid coordinates are calculated, giving the two centroid coordinates of the two diagonal binarized images.
- in the centroid calculation, x_i is the abscissa of pixel i in one piece of image data, y_i is the ordinate of pixel i in one piece of image data, and m_i is the binary value of pixel i in one piece of image data.
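The patent's centroid formulas (its Equations 6 and 7) are not reproduced in this text; the standard binary-weighted centroid consistent with the variable definitions above would be:

$$x_c = \frac{\sum_i m_i x_i}{\sum_i m_i},\qquad y_c = \frac{\sum_i m_i y_i}{\sum_i m_i}$$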
- dx = x(0) - x(3) and dy = y(0) - y(3), where dx is the image offset of the adjacent multi-optical-path directions in the X direction on the two-dimensional plane, dy is the image offset of the adjacent multi-optical-path directions in the Y direction on the two-dimensional plane, x(0) and x(3) are the X coordinates, on the two-dimensional plane, of the centroids of the two binarized images in the diagonal direction, and y(0) and y(3) are their Y coordinates.
- besides the two binarized images along the diagonal direction, the other binarized images among the four may also be used to calculate the image offset of any two adjacent images.
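A minimal sketch of this calculation in Python, assuming the two diagonally opposite binarized frames are available as NumPy arrays and indexing them 0 and 3 to match the x(0)/x(3) notation above (the function and argument names are illustrative assumptions):

```python
import numpy as np


def centroid(binary: np.ndarray) -> tuple[float, float]:
    """Binary-weighted centroid: every nonzero pixel contributes m_i = 1."""
    ys, xs = np.nonzero(binary)          # row (y) and column (x) indices of set pixels
    return float(xs.mean()), float(ys.mean())


def adjacent_image_offset(binary_img0: np.ndarray, binary_img3: np.ndarray) -> tuple[float, float]:
    """dx = x(0) - x(3), dy = y(0) - y(3) for the two diagonal binarized images."""
    x0, y0 = centroid(binary_img0)
    x3, y3 = centroid(binary_img3)
    return x0 - x3, y0 - y3
```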
- the image offsets of adjacent multi-optical-path directions are obtained, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
- the method includes:
- the screen light is reflected by the object arranged above the screen (a finger, the 3D fringe model, the black model, etc.), and the reflected light is received by the fingerprint detection module as incident light. The 3D fringe model is the model object, selected to simulate the user's finger, that is used for calculating the image offsets in the multiple optical path directions; the fringes on the 3D fringe model simulate the fingerprint of the user's finger.
- the black model as the model object absorbs the light shone on it, so the fingerprint detection module can only receive ambient light. Therefore, by comparing the image obtained with the black model as the model object against the image obtained with the 3D fringe model as the model object, the light emitted by the screen, that is, the screen brightness, can be evaluated.
- the 3D fringe model image data in multiple optical path directions is a frame of 3D fringe model image data in multiple optical path directions
- the black model image data in multiple optical path directions is a frame of black model image data in multiple optical path directions.
- only one frame of 3D fringe model image data in multiple optical path directions and one frame of black model image data in multiple optical path directions are needed to calculate the offset with high accuracy, which also avoids the increase in time consumption caused by collecting multiple frames of images.
- the 3D fringe model image data in multiple optical path directions may be processed by low-pass filtering, median filtering, etc.
- the specific processing methods are not limited in the embodiments of the present application.
- S202 Determine whether the screen brightness is abnormal according to the 3D stripe model image data and the black model image data.
- in one implementation, if the brightness of the 3D fringe model image data is lower than the brightness of the black model image data, the screen brightness is abnormal; otherwise, the screen brightness is normal.
- if the brightness of the screen of the electronic device is abnormal, the accuracy of the collected image data will be affected.
- whether the screen brightness is abnormal is determined by directly comparing the brightness of the 3D fringe model image data with the brightness of the black model image data; this comparison allows the screen brightness abnormality to be judged more accurately.
- when the brightness is judged to be abnormal, abnormal-brightness information may also be returned.
- if the screen brightness is abnormal, no usable image offset can be obtained for generating a fusion map of fingerprint images or extracting feature values of the overlapping areas of fingerprint images. Therefore, when the screen brightness is judged to be abnormal, the calculation of the offset is ended, so as to avoid a situation where the subsequent calculation is time-consuming but cannot provide a usable image offset.
- image binarization (thresholding) is the simplest method of image segmentation: the gray value of each pixel in the image is set to 0 or 255, with pixels whose gray value is greater than a critical gray threshold set to the maximum gray value and pixels whose gray value is less than the critical gray threshold set to the minimum gray value, so as to obtain binarized image data.
- this embodiment of the present application performs binarization processing on the reflection amount data of the 3D fringe model obtained from the 3D fringe model image data, to obtain the binarized 3D fringe model image data or the binarized reflection amount data.
- the step S204 includes:
- S2041: Subtract the black model image data from the 3D fringe model image data to obtain the reflection amount data of the 3D fringe model.
- the light received by the fingerprint detection module is divided into two parts. One part is not reflected by the object on the screen (for example, a finger, the 3D fringe model or the black model) but is cast directly onto the fingerprint detection module by the fixed pattern on the screen; this is called light leakage and is useless interference light. The other part is reflected back to the fingerprint detection module by the object on the screen; this is called reflected light and is useful light.
- since the black model 71 is located above the screen, the black model absorbs the light emitted by the screen and very little of that light is reflected back to the fingerprint detection module 14 through the black model. Therefore, when the black model image data is collected, the light received by the fingerprint detection module 14 can be regarded as consisting entirely of light leakage, that is, useless interference light.
- when the 3D fringe model 72 is located above the screen and image data of the 3D fringe model is collected, the light received by the fingerprint detection module 14 includes both useful light and useless interference light.
- subtracting the black model image data from the 3D fringe model image data is therefore equivalent to subtracting the useless interference light from the useful plus useless light received by the fingerprint detection module 14 when the 3D fringe model is located above the screen, so as to obtain the reflection amount data of the 3D fringe model image data when the 3D fringe model is located above the screen, that is, the data of the useful light.
- S2042: Binarize the reflection amount data of the 3D fringe model using Otsu's method to obtain binarized image data.
- Otsu's method (OTSU), also known as the maximum between-class variance method, automatically selects a global threshold from the histogram statistics of the whole image.
- Otsu's method is used to binarize the reflection amount data of the 3D fringe model; the calculation is simple and fast, and the resulting binarized image data is not affected by the brightness and contrast of the image.
- image dilation and erosion processing may be used to eliminate outliers in the binarized image data, and the embodiments of the present application do not limit the specific image dilation and erosion processing methods.
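As a hedged sketch of steps S2041 and S2042 (not the patent's code), the snippet below subtracts the black-model frame from the fringe-model frame to estimate the reflection data and applies a plain histogram-based Otsu threshold; the 256-bin histogram and the clipping to non-negative values are assumptions of this sketch.

```python
import numpy as np


def reflection_data(fringe: np.ndarray, black: np.ndarray) -> np.ndarray:
    """Useful light = fringe-model frame minus black-model (leakage) frame."""
    return np.clip(fringe.astype(np.int32) - black.astype(np.int32), 0, None)


def otsu_threshold(image: np.ndarray, bins: int = 256) -> float:
    """Pick the threshold that maximizes the between-class variance."""
    hist, edges = np.histogram(image.ravel(), bins=bins)
    prob = hist.astype(np.float64) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(prob)                     # cumulative probability of the lower class
    w1 = 1.0 - w0
    cum_mean = np.cumsum(prob * centers)     # unnormalised cumulative mean
    mu_total = cum_mean[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_total * w0 - cum_mean) ** 2 / (w0 * w1)
    return float(centers[np.nanargmax(between)])


def binarize_reflection(fringe: np.ndarray, black: np.ndarray) -> np.ndarray:
    refl = reflection_data(fringe, black)
    return np.where(refl > otsu_threshold(refl), 255, 0).astype(np.uint8)
```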
- S205: Calculate the image offset of any two adjacent images at least according to the centroid coordinates of the two binarized images along the diagonal direction among the four binarized images.
- specifically, four pieces of 3D fringe model image data, or four pieces of reflection amount data of the 3D fringe model calculated from them, are obtained and binarized, yielding four binarized pieces of 3D fringe model image data or four binarized pieces of reflection amount image data of the 3D fringe model.
- the two binarized images lying along the diagonal direction are selected and their centroid coordinates are calculated, giving the two centroid coordinates of the two diagonal binarized images.
- in the centroid calculation, x_i is the abscissa of pixel i in one piece of image data, y_i is the ordinate of pixel i in one piece of image data, and m_i is the binary value of pixel i in one piece of image data.
- the offsets of adjacent images in the multiple optical path directions are then obtained from the differences of the centroid coordinates in the X direction and the Y direction of the two-dimensional plane: dx = x(0) - x(3) and dy = y(0) - y(3), where dx is the image offset of the adjacent multi-optical-path directions in the X direction on the two-dimensional plane, dy is the image offset in the Y direction, x(0) and x(3) are the X coordinates of the centroids of the two binarized images in the diagonal direction, and y(0) and y(3) are their Y coordinates.
- the image offsets of adjacent multi-optical-path directions are obtained, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
- the method includes:
- S301 Collect image data of the 3D fringe model and the black model respectively, and obtain the 3D fringe model image data in multiple optical path directions and the black model image data in multiple optical path directions.
- the screen light is reflected by the object arranged above the screen (a finger, the 3D fringe model, the black model, etc.), and the reflected light is received by the fingerprint detection module as incident light.
- the 3D fringe model is the model object, selected to simulate the user's finger, that is used for calculating the image offsets in the multiple optical path directions; the fringes on the 3D fringe model simulate the fingerprint of the user's finger.
- the black model as the model object absorbs the light shone on it, so the fingerprint detection module can only receive ambient light. Therefore, by comparing the image obtained with the black model as the model object against the image obtained with the 3D fringe model as the model object, the light emitted by the screen, that is, the screen brightness, can be evaluated.
- the 3D fringe model image data in multiple optical path directions is a frame of 3D fringe model image data in multiple optical path directions
- the black model image data in multiple optical path directions is a frame of black model image data in multiple optical path directions.
- only one frame of 3D fringe model image data in multiple optical path directions and one frame of black model image data in multiple optical path directions are needed to calculate the offset with high accuracy, which also avoids the increase in time consumption caused by collecting multiple frames of images.
- the 3D fringe model image data in multiple optical path directions can be processed by low-pass filtering, median filtering, etc.
- the specific processing methods are not limited in the embodiments of the present application.
- S302. Determine whether the screen brightness is abnormal according to the 3D stripe model image data and the black model image data.
- in one implementation, if the brightness of the 3D fringe model image data is lower than the brightness of the black model image data, the screen brightness is abnormal; otherwise, the screen brightness is normal.
- if the brightness of the screen of the electronic device is abnormal, the accuracy of the collected image data will be affected.
- whether the screen brightness is abnormal is determined by directly comparing the brightness of the 3D fringe model image data with the brightness of the black model image data; this comparison allows the screen brightness abnormality to be judged more accurately.
- in another implementation, if the average brightness of the 3D fringe model image data is less than half of the average brightness of the black model image data, the screen brightness is abnormal; otherwise, the screen brightness is normal.
- since the subsequent step S304 does not need to calculate the reflection amount of the 3D fringe model, only the 3D fringe model image data needs to be binarized.
- this embodiment therefore has lower requirements on screen brightness: it is sufficient that the average brightness of the 3D fringe model image data is greater than or equal to half of the average brightness of the black model image data for the screen brightness to be judged normal.
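A small sketch of the two brightness checks described above, using the per-frame mean as the "brightness" (using the mean for both checks is an assumption; the text only specifies an average for the half-value rule):

```python
import numpy as np


def screen_brightness_abnormal(fringe: np.ndarray, black: np.ndarray,
                               half_rule: bool = True) -> bool:
    """half_rule=True: abnormal if mean(fringe) < 0.5 * mean(black) (this embodiment).
    half_rule=False: abnormal if the fringe frame is darker than the black frame."""
    factor = 0.5 if half_rule else 1.0
    return float(fringe.mean()) < factor * float(black.mean())
```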
- when the brightness is judged to be abnormal, abnormal-brightness information may also be returned.
- if the screen brightness is abnormal, no usable image offset can be obtained for generating a fusion map of fingerprint images or extracting feature values of the overlapping areas of fingerprint images. Therefore, when the screen brightness is judged to be abnormal, the calculation of the offset is ended, so as to avoid a situation where the subsequent calculation is time-consuming but cannot provide a usable image offset.
- image binarization (thresholding) is the simplest method of image segmentation: the gray value of each pixel in the image is set to 0 or 255, with pixels whose gray value is greater than a critical gray threshold set to the maximum gray value and pixels whose gray value is less than the critical gray threshold set to the minimum gray value, so as to obtain binarized image data.
- the embodiment of the present application performs binarization processing on the 3D fringe model image data to obtain the binarized 3D fringe model image data or the binarized reflection amount data of the 3D fringe model.
- the step S304 includes:
- S3041: Determine the maximum gray value of the 3D fringe model image data. To reduce the amount of computation, the average gray value of a preset number of image data points with the largest gray values is taken as the maximum gray value of the 3D fringe model image data.
- for example, the preset number of image data points may be 100.
- S3042: Take the larger of one tenth of the maximum value and a preset value as the threshold for binarization, that is, threshold = MAX(maxData/10, offset), where offset is a preset value determined according to the photosensitivity of the image recognition module.
- S3043. Perform binarization processing on the 3D fringe model image data according to the threshold to obtain binarized image data.
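A minimal sketch of steps S3041 to S3043; the top-100 count and the preset offset value are illustrative assumptions, since the actual offset depends on the photosensitivity of the sensor in use:

```python
import numpy as np


def fringe_threshold(fringe: np.ndarray, preset_offset: float, top_n: int = 100) -> float:
    """maxData = mean gray value of the top_n brightest pixels;
    threshold = MAX(maxData / 10, preset_offset)."""
    max_data = np.sort(fringe.ravel())[-top_n:].mean()
    return max(max_data / 10.0, preset_offset)


def binarize_fringe(fringe: np.ndarray, preset_offset: float) -> np.ndarray:
    t = fringe_threshold(fringe, preset_offset)
    return np.where(fringe > t, 255, 0).astype(np.uint8)
```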
- in this embodiment only the 3D fringe model image data needs to be binarized, and the binarized 3D fringe model image data is then sufficient for calculating the image offsets of the adjacent multi-optical-path directions; the calculation is simple and takes little time.
- since this embodiment of the present application only needs to binarize the 3D fringe model image data and does not need to binarize the reflection amount image data of the 3D fringe model, its requirements on screen brightness are lower: it is sufficient that the average brightness of the 3D fringe model image data is greater than or equal to half of the average brightness of the black model image data for the screen brightness to be judged normal, after which the 3D fringe model image data is binarized to obtain the image offset. Therefore, even if the brightness of the screen of the electronic device is abnormal, a usable image offset can still be obtained through this embodiment of the present application, so its application scenarios are broader.
- image expansion and erosion processing may be used to eliminate outliers in the binarized image data, and the embodiment of the present application does not limit the specific expansion and erosion processing method.
- S305: Calculate the image offset of any two adjacent images at least according to the centroid coordinates of the two binarized images along the diagonal direction among the four binarized images.
- specifically, four pieces of 3D fringe model image data, or four pieces of reflection amount data of the 3D fringe model calculated from them, are obtained and binarized, yielding four binarized pieces of 3D fringe model image data or four binarized pieces of reflection amount image data of the 3D fringe model.
- the two binarized images lying along the diagonal direction are selected and their centroid coordinates are calculated, giving the two centroid coordinates of the two diagonal binarized images.
- in the centroid calculation, x_i is the abscissa of pixel i in one piece of image data, y_i is the ordinate of pixel i in one piece of image data, and m_i is the binary value of pixel i in one piece of image data.
- the offsets of adjacent images in the multiple optical path directions are then obtained from the differences of the centroid coordinates in the X direction and the Y direction of the two-dimensional plane: dx = x(0) - x(3) and dy = y(0) - y(3), where dx is the image offset of the adjacent multi-optical-path directions in the X direction on the two-dimensional plane, dy is the image offset in the Y direction, x(0) and x(3) are the X coordinates of the centroids of the two binarized images in the diagonal direction, and y(0) and y(3) are their Y coordinates.
- the image offsets of adjacent multi-optical-path directions are obtained, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
- the method includes the steps described in any of the above embodiments.
- the method further includes:
- S404: Calculate the theoretical range interval of the image offsets of the adjacent multi-optical-path directions according to the film layer thickness interval, the screen thickness interval, the air gap thickness interval, the refractive index of the film layer and the refractive index of the screen.
- S405: If the image offsets of the adjacent multi-optical-path directions are within the theoretical range interval, the image offsets of the adjacent multi-optical-path directions are usable multi-optical-path-direction image offsets.
- the calculated multi-optical-path-direction image offsets are gated according to whether they fall within the theoretical range interval of the multi-optical-path-direction image offsets, thereby ensuring that usable multi-optical-path-direction image offsets are obtained.
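The sketch below shows one way such a gate could look, under clearly labeled assumptions: the diagonal offset follows Equations 1 to 4 as stated in the description (S = thickness / tan(angle), taken as written there), the refraction angles are derived from the incidence angle and refractive indices via Snell's law, and the relation d = sqrt(dx^2 + dy^2) stands in for the description's Equation 5, which is not reproduced in this text.

```python
import math
from itertools import product


def diagonal_offset(p_film: float, p_oled: float, p_gap: float,
                    alpha: float, beta: float, theta: float) -> float:
    """d = S1 + S2 + S3 with S1 = P_Film/tan(alpha), S2 = P_Oled/tan(beta), S3 = P_Gap/tan(theta)."""
    return p_film / math.tan(alpha) + p_oled / math.tan(beta) + p_gap / math.tan(theta)


def theoretical_range(film_iv, oled_iv, gap_iv, n_film, n_oled, theta):
    """Bound d over the corner cases of the (min, max) thickness intervals.
    Assumption: angles follow Snell's law with air (n = 1) outside the stack."""
    alpha = math.asin(math.sin(theta) / n_film)
    beta = math.asin(math.sin(theta) / n_oled)
    ds = [diagonal_offset(pf, po, pg, alpha, beta, theta)
          for pf, po, pg in product(film_iv, oled_iv, gap_iv)]
    return min(ds), max(ds)


def offset_is_usable(dx: float, dy: float, d_range) -> bool:
    """Assumption: with dx == dy, the diagonal offset is d = sqrt(dx**2 + dy**2)."""
    d = math.hypot(dx, dy)
    return d_range[0] <= d <= d_range[1]
```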
- the image offsets of adjacent multi-optical-path directions are obtained, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
- the method includes the steps described in any of the above embodiments.
- the method further includes:
- S504: Save the image offsets of the adjacent multi-optical-path directions in the file system of the electronic device in which the fingerprint detection device is located.
- by saving the calculated image offsets in the file system of the electronic device, the multi-optical-path-direction image offsets can be used, by calling that file system, to generate a fusion map of fingerprint images, and can also be used to extract feature values of the overlapping areas of the fingerprint images for fingerprint anti-counterfeiting detection.
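The patent does not specify a file format or path; a purely hypothetical persistence sketch (JSON in an arbitrary calibration file) might look like this:

```python
import json
from pathlib import Path

# Hypothetical location and format; the description only says the offsets are
# saved in the file system of the electronic device hosting the fingerprint device.
CALIBRATION_FILE = Path("/data/fingerprint/offset_calibration.json")


def save_offsets(dx: float, dy: float) -> None:
    CALIBRATION_FILE.parent.mkdir(parents=True, exist_ok=True)
    CALIBRATION_FILE.write_text(json.dumps({"dx": dx, "dy": dy}))


def load_offsets() -> tuple[float, float]:
    data = json.loads(CALIBRATION_FILE.read_text())
    return data["dx"], data["dy"]
```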
- the image offsets of adjacent multi-optical-path directions are obtained, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
- An embodiment of the present application further provides a fingerprint detection module. The fingerprint detection module is disposed below the screen in the direction of incident light, an air gap is formed between the screen and the fingerprint detection module, and the screen is provided with a film layer facing the incident light direction.
- the fingerprint detection module adopts the method for calculating the image offsets in the multiple optical path directions described in any of the above embodiments to obtain the adjacent multi-optical-path-direction image offsets for fingerprint detection.
- the image offsets of adjacent multi-optical-path directions are obtained, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
- the embodiment of the present application also provides a fingerprint sensing device. The fingerprint detection device includes a film layer, a screen, an air gap and a fingerprint detection module arranged in sequence along the incident light direction, and the fingerprint detection module adopts the method for calculating the image offsets in the multiple optical path directions described in any of the above embodiments to obtain the adjacent multi-optical-path-direction image offsets for fingerprint detection.
- the image offsets of adjacent multi-optical-path directions are obtained, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
- the embodiment of the present application also provides an electronic device. The electronic device includes a fingerprint detection device, and the fingerprint detection device includes a film layer, a screen, an air gap and a fingerprint detection module arranged in sequence along the incident light direction.
- the fingerprint detection module adopts the method for calculating the image offsets in the multiple optical path directions described in any of the above embodiments to obtain the adjacent multi-optical-path-direction image offsets for fingerprint detection.
- the image offsets of adjacent multi-optical-path directions are obtained, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Life Sciences & Earth Sciences (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Image Input (AREA)
- Collating Specific Patterns (AREA)
Abstract
An image offset calculation method, and a fingerprint detection module, apparatus and electronic device. The method includes: collecting image data of a 3D fringe model to obtain 3D fringe model image data in multiple optical path directions; binarizing the 3D fringe model image data, or the reflection amount data of the 3D fringe model obtained from the 3D fringe model image data, to obtain binarized image data; and calculating the offset of any two adjacent images at least according to the centroid coordinates of the two binarized images along the diagonal direction. The calculated multi-optical-path-direction image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
Description
The present application relates to the technical field of fingerprint identification, and in particular to an image offset calculation method, a fingerprint detection module, an apparatus and an electronic device.
With the development of fingerprint identification technology, multi-optical-path-direction fingerprint detection technology is increasingly used to improve user experience because of its high detection accuracy.
A main component used in multi-optical-path-direction fingerprint detection technology is the optical fingerprint detection module. The optical fingerprint detection module includes a pixel array and a plurality of light guide channels. Specifically, the end of each light guide channel has one pixel, so the ends of four light guide channels are provided with four pixels; each pixel receives an optical signal in a certain fixed optical path direction, and the four pixels receive optical signals in four optical path directions. The pixels in the pixel array that receive optical signals in the same optical path direction generate one image. The distance between two adjacent images in the X and Y directions of the two-dimensional plane is called the multi-optical-path-direction image offset. The multi-optical-path-direction offset can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition. However, how to obtain the multi-optical-path-direction offset has become an urgent problem to be solved in multi-optical-path-direction fingerprint detection technology.
Summary of the Invention
In view of this, one of the technical problems solved by the embodiments of the present application is to provide an image offset calculation method, a fingerprint detection module, an apparatus and an electronic device, so as to overcome at least some of the defects in the prior art.
In a first aspect, an embodiment of the present application provides a method for calculating an image offset. The method is applied to a fingerprint detection module in a fingerprint detection apparatus; the fingerprint detection module is arranged below a screen in the direction of incident light, there is an air gap between the screen and the fingerprint detection module, and the screen is provided with a film layer facing the incident light direction. The method includes: collecting image data of a 3D fringe model to obtain 3D fringe model image data in multiple optical path directions; binarizing the 3D fringe model image data, or the reflection amount data of the 3D fringe model obtained from the 3D fringe model image data, to obtain binarized image data; and calculating the image offset of any two adjacent images at least according to the centroid coordinates of the two binarized images along the diagonal direction among the four binarized images. Obviously, any two adjacent images here do not include two images that adjoin in the diagonal direction.
In a second aspect, an embodiment of the present application provides a fingerprint detection module. The fingerprint detection module is arranged below the screen in the direction of incident light, there is an air gap between the screen and the fingerprint detection module, and the screen is provided with a film layer facing the incident light direction. The fingerprint detection module uses the above-mentioned image offset calculation method to obtain the offset of any two adjacent images for fingerprint detection.
In a third aspect, an embodiment of the present application provides a fingerprint sensing apparatus. The fingerprint sensing apparatus includes a film layer, a screen, an air gap and a fingerprint detection module arranged in sequence along the incident light direction, and the fingerprint detection module uses the above-mentioned image offset calculation method to obtain the offset of any two images for fingerprint detection.
In a fourth aspect, an embodiment of the present application provides an electronic device. The electronic device includes a fingerprint detection apparatus, the fingerprint detection apparatus includes a film layer, a screen, an air gap and a fingerprint detection module arranged in sequence along the incident light direction, and the fingerprint detection module uses the above-mentioned image offset calculation method to obtain the offset of any two images for fingerprint detection.
In the image offset calculation method, fingerprint detection module, fingerprint sensing apparatus and electronic device provided by the embodiments of the present application, the fingerprint sensing apparatus includes a film layer, a screen, an air gap and a fingerprint detection module arranged in sequence along the incident light direction. In the embodiments of the present application, image data of a 3D fringe model is collected in a normal test environment to obtain 3D fringe model image data in multiple optical path directions; the 3D fringe model image data, or the reflection amount data of the 3D fringe model obtained from it, is then binarized, and the image offset of any two adjacent images is calculated at least according to the centroid coordinates of the two binarized images along the diagonal direction among the four binarized images. The embodiments of the present application can obtain the image offsets of adjacent multi-optical-path directions, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
Some specific embodiments of the present application are described in detail below with reference to the accompanying drawings in an exemplary rather than limiting manner. The same reference numerals in the drawings denote the same or similar components or parts. Those skilled in the art should understand that these drawings are not necessarily drawn to scale. In the drawings:
FIG. 1 is a schematic structural diagram of a fingerprint detection apparatus provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of multi-optical-path-direction images of a fingerprint detection apparatus provided by an embodiment of the present application;
FIG. 3 is a flowchart of an image offset calculation method provided by an embodiment of the present application;
FIG. 4 is a flowchart of an implementation of step S103 of the image offset calculation method in FIG. 3;
FIG. 5 is a flowchart of another image offset calculation method provided by an embodiment of the present application;
FIG. 6 is a flowchart of an implementation of step S204 of the image offset calculation method in FIG. 5;
FIG. 7 is a schematic diagram of the imaging optical paths of the black model and the 3D fringe model in an image offset calculation method provided by an embodiment of the present application;
FIG. 8 is a flowchart of still another image offset calculation method provided by an embodiment of the present application;
FIG. 9 is a flowchart of an implementation of step S304 of the image offset calculation method in FIG. 8;
FIG. 10 is a flowchart of still another image offset calculation method provided by an embodiment of the present application;
FIG. 11 is a flowchart of still another image offset calculation method provided by an embodiment of the present application.
Referring to FIG. 1, the embodiments of the present application are applied to a fingerprint detection apparatus. The fingerprint detection apparatus can receive incident light from multiple directions and includes a film layer 11, a screen 12, an air gap 13 and a fingerprint detection module 14 arranged in sequence along the incident light direction. In FIG. 1, P_Film is the thickness of the film layer 11, P_Oled is the thickness of the screen 12, and P_Gap is the thickness of the air gap 13. The offset d of two adjacent images in the diagonal direction is the sum of the offset distances S1, S2 and S3 produced by refraction through the film layer 11, the screen 12 and the air gap 13. The incident angle of the incident light on the film layer is θ, the refraction angle of the incident light in the film layer 11 is α, the refraction angle in the screen 12 is β, and the refraction angle after entering the air gap 13 is again θ.
Referring to FIG. 2, assuming that the offset of the two images 21a and 23a along the diagonal direction is d, the offset of the two images 21a and 23a in the X direction of the two-dimensional plane is dx and their offset in the Y direction is dy. The offset d of the two images in the diagonal direction includes the offset in the X direction and the offset in the Y direction of the two-dimensional plane. Since the incident light passes in sequence through the film layer 11, the screen 12 and the air gap 13 shown in FIG. 1, light reflected by the same detected object 7 (which may be, for example, a user's finger, or the 3D fringe model or the black model mentioned below) enters optical paths in different directions, and the optical signals of the different optical path directions are received by different pixels of the fingerprint detection module 14 to form different image data. The offset d of two adjacent images in the diagonal direction is therefore the accumulation of the offset distances S1, S2 and S3 produced by refraction through the film layer 11, the screen 12 and the air gap 13.
That is, d = S1 + S2 + S3 (Equation 1),
where
S1 = P_Film / tan(α), (Equation 2);
S2 = P_Oled / tan(β), (Equation 3);
S3 = P_Gap / tan(θ), (Equation 4).
Since the orthographic projections, on the pixel plane, of the light guide channels (that is, the optical path directions) above pixels 21 and 23 are perpendicular to those above pixels 22 and 24 in FIG. 2, the offset dx in the X direction of the two-dimensional plane and the offset dy in the Y direction of the two-dimensional plane between two adjacent images among the four images formed by the detected object 7 on pixels 21, 23, 22 and 24 are equal.
However, since the exact values of the thickness of the film layer, the thickness of the screen and the thickness of the air gap cannot be known, and these thicknesses also differ between the electronic devices to which different fingerprint detection apparatuses are applied, the offset dx of two adjacent images in the X direction of the two-dimensional plane and the offset dy of two adjacent images in the Y direction cannot be obtained from Equations 1 to 5 above; that is, the image offset cannot be obtained for subsequent use.
The specific implementation of the embodiments of the present invention is further described below with reference to the accompanying drawings. To facilitate understanding of the embodiments of the present application, the basic technical principles of the embodiments are illustrated by way of example.
An embodiment of the present application provides an image offset calculation method applied to a fingerprint detection apparatus. Referring to FIG. 1, the fingerprint detection apparatus includes a film layer 11, a screen 12, an air gap 13 and a fingerprint detection module 14 arranged in sequence along the incident light direction.
The screen 12 may be used to display various preset fixed patterns. The preset fixed pattern includes at least one of a circle, a square and a triangle. The embodiments of the present application impose no requirement on the size and spacing of the fixed pattern, and the size and spacing of the fixed pattern have no effect on the calculation of the image offset in the embodiments of the present application.
The fingerprint detection module includes a microlens array (for example, a microlens array of n rows by n columns, where n is a natural number) and multiple light blocking layers. The microlens array includes a plurality of microlenses, and each light blocking layer of the multiple light blocking layers includes a plurality of small holes. Below each microlens of the microlens array there may be light guide channels in multiple directions, for example light guide channels in four directions; a light guide channel is an optical path channel formed by the small holes of the multiple light blocking layers. The end of each light guide channel has one pixel, the ends of the four light guide channels have four pixels, the four pixels form one pixel unit, and the fingerprint detection module includes a plurality of such pixel units (that is, a pixel unit array). It should be noted that FIG. 2 only schematically shows one pixel unit in the pixel array of the fingerprint detection module. The pixel unit includes four pixels 21, 22, 23 and 24, which respectively receive optical signals in four fixed optical path directions. The pixels in the pixel array of the fingerprint detection module that receive optical signals in the same direction form one image. Taking the pixel unit shown in FIG. 2 as an example, each of the four pixels 21, 22, 23 and 24 receives only the optical signal of one fixed optical path direction. The pixels in the pixel array that receive the optical signals of the four optical path directions form four images 21a, 22a, 23a and 24a, respectively, and the distance between two adjacent images in the X and Y directions of the two-dimensional plane is called the image offset. Here, two adjacent images refer to two images that are adjacent vertically or horizontally, excluding images that adjoin along the diagonal direction.
Specifically, the calculation of the image offset in the embodiments of the present application is usually completed in the whole-machine testing phase of the electronic device to which the fingerprint detection apparatus is applied, and the image offset obtained in the testing phase is used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition; the image offset can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
A test environment in which the fingerprint detection module 14 operates normally is usually selected, that is, a test environment in which the fingerprint detection module passes the serial peripheral interface (SPI) test, the one-time programmable (OTP) check, the RAM stress test (RST) and the integration test (INT).
In the embodiments of the present application, image data of the 3D fringe model is collected in a test environment in which the fingerprint detection module operates normally, and subsequent calculation processing is performed on the obtained 3D fringe model image data, so that a fault of the fingerprint detection module will not affect the accuracy of the subsequent calculations on the image data.
Referring to FIG. 3, the method includes:
S101: Collect image data of the 3D fringe model to obtain 3D fringe model image data in multiple optical path directions.
Specifically, the 3D fringe model is a model object selected to simulate the user's finger for calculating the image offsets in the multiple optical path directions, and the fringes on the 3D fringe model are used to simulate the fingerprint of the user's finger.
To improve the quality of the collected 3D fringe model image data in the multiple optical path directions, the data may be processed by low-pass filtering, median filtering and the like; the specific processing method is not limited in the embodiments of the present application.
S102: Binarize the 3D fringe model image data, or the reflection amount data of the 3D fringe model obtained from the 3D fringe model image data, to obtain binarized image data.
Specifically, image binarization (thresholding) is the simplest method of image segmentation: the gray value of each pixel in the image is set to 0 or 255, with pixels whose gray value is greater than a critical gray threshold set to the maximum gray value and pixels whose gray value is less than the critical gray threshold set to the minimum gray value, thereby obtaining binarized image data.
In this embodiment of the present application, the 3D fringe model image data, or the reflection amount data of the 3D fringe model obtained from the 3D fringe model image data, is binarized to obtain binarized 3D fringe model image data or binarized reflection amount data of the 3D fringe model.
To further improve the binarized image data, image dilation and erosion may be used to eliminate isolated points in the binarized image data; the specific dilation and erosion method is not limited in the embodiments of the present application.
S103: Calculate the image offset of any two adjacent images at least according to the centroid coordinates (center of mass) of the two binarized images along the diagonal direction among the four binarized images.
In a specific implementation of this embodiment of the present application, referring to FIG. 4, step S103 includes:
S1031: Calculate at least the centroid coordinates of the two binarized images along the diagonal direction among the four binarized images.
Specifically, in this embodiment of the present application, four pieces of 3D fringe model image data, or four pieces of reflection amount data of the 3D fringe model calculated from the four pieces of 3D fringe model image data, are obtained and binarized to obtain four binarized pieces of 3D fringe model image data or four binarized pieces of reflection amount image data of the 3D fringe model. The two binarized pieces of 3D fringe model image data, or the two binarized pieces of reflection amount image data, that lie along the diagonal direction are selected for centroid calculation, yielding the two centroid coordinates of the two diagonal binarized images.
The centroid coordinate calculation is performed as follows, where x_i is the abscissa value of pixel i in one piece of image data, y_i is the ordinate value of pixel i in one piece of image data, and m_i is the binary value of pixel i in one piece of image data.
S1032: Obtain the image offset of any two adjacent images according to the differences of the centroid coordinates in the X direction and the Y direction of the two-dimensional plane.
The specific formulas are:
dx = x(0) - x(3), (Equation 8)
dy = y(0) - y(3), (Equation 9)
where dx is the image offset of the adjacent multi-optical-path directions in the X direction of the two-dimensional plane, dy is the image offset of the adjacent multi-optical-path directions in the Y direction of the two-dimensional plane, x(0) and x(3) are the X coordinates, on the two-dimensional plane, of the two binarized images in the diagonal direction, and y(0) and y(3) are their Y coordinates on the two-dimensional plane.
In this embodiment of the present application, binarized images other than the two along the diagonal direction among the four binarized images may also be used to calculate the image offset of any two adjacent images.
This embodiment of the present application obtains the image offsets of adjacent multi-optical-path directions, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
In another embodiment of the present application, referring to FIG. 5, the method includes:
S201: Collect image data of the 3D fringe model and of the black model respectively, to obtain 3D fringe model image data in multiple optical path directions and black model image data in multiple optical path directions.
Specifically, the screen light is reflected by the object placed above the screen (a finger, the 3D fringe model, the black model, etc.), and the reflected light is received by the fingerprint detection module as incident light. The 3D fringe model is a model object selected to simulate the user's finger for calculating the image offsets in the multiple optical path directions, and the fringes on the 3D fringe model simulate the fingerprint of the user's finger. The black model, as a model object, absorbs the light shone on it, so the fingerprint detection module can only receive ambient light. Therefore, by comparing the image obtained with the black model as the model object against the image obtained with the 3D fringe model as the model object, the light emitted by the screen, that is, the screen brightness, can be evaluated.
Specifically, the 3D fringe model image data in the multiple optical path directions is one frame of 3D fringe model image data in the multiple optical path directions, and the black model image data in the multiple optical path directions is one frame of black model image data in the multiple optical path directions.
In this embodiment of the present application, collecting only one frame of 3D fringe model image data and one frame of black model image data in the multiple optical path directions is sufficient to calculate the offset with high accuracy, and it also avoids the increase in time consumption caused by collecting multiple frames of images.
To improve the quality of the collected 3D fringe model image data in the multiple optical path directions, the data may be processed by low-pass filtering, median filtering and the like; the specific processing method is not limited in the embodiments of the present application.
S202: Determine whether the screen brightness is abnormal according to the 3D fringe model image data and the black model image data.
In a specific implementation of this embodiment of the present application, if the brightness of the 3D fringe model image data is lower than the brightness of the black model image data, the screen brightness is abnormal; otherwise, the screen brightness is normal.
If the brightness of the screen of the electronic device is abnormal, the accuracy of the collected image data is affected. In this embodiment of the present application, whether the screen brightness is abnormal is determined by directly comparing the brightness of the 3D fringe model image data with the brightness of the black model image data; this comparison makes it possible to judge the screen brightness abnormality more accurately.
S203: If the screen brightness is abnormal, end the calculation of the offset.
In this embodiment of the present application, when the brightness is judged to be abnormal, abnormal-brightness information may also be returned.
If the screen brightness is abnormal, no usable image offset can be obtained for generating a fusion map of fingerprint images or for extracting feature values of the overlapping areas of fingerprint images. Therefore, when the screen brightness is judged to be abnormal, the calculation of the offset is ended, avoiding the situation where the subsequent calculation is time-consuming yet cannot provide a usable image offset.
S204: If the screen brightness is normal, binarize the reflection amount data of the 3D fringe model obtained from the 3D fringe model image data, to obtain binarized image data.
Specifically, image binarization (thresholding) is the simplest method of image segmentation: the gray value of each pixel in the image is set to 0 or 255, with pixels whose gray value is greater than a critical gray threshold set to the maximum gray value and pixels whose gray value is less than the critical gray threshold set to the minimum gray value, thereby obtaining binarized image data.
In this embodiment of the present application, the reflection amount data of the 3D fringe model obtained from the 3D fringe model image data is binarized to obtain the binarized 3D fringe model image data or the binarized reflection amount data.
In a specific implementation of this embodiment of the present application, referring to FIG. 6, step S204 includes:
S2041: Subtract the black model image data from the 3D fringe model image data to obtain the reflection amount data of the 3D fringe model.
The light received by the fingerprint detection module consists of two parts. One part is not reflected by the object on the screen (for example, a finger, the 3D fringe model or the black model) but is cast directly onto the fingerprint detection module by the fixed pattern on the screen; this is called light leakage and is useless interference light. The other part is reflected back to the fingerprint detection module by the object on the screen; this is called reflected light and is useful light.
Referring to FIG. 7, when the black model 71 is above the screen, the black model absorbs the light emitted by the screen and very little of that light is reflected back to the fingerprint detection module 14 by the black model. Therefore, when the black model image data is collected, the light received by the fingerprint detection module 14 can be regarded as consisting entirely of light leakage, that is, useless interference light. When the 3D fringe model 72 is above the screen and image data of the 3D fringe model is collected, the light received by the fingerprint detection module 14 includes both useful light and useless interference light.
In this embodiment of the present application, subtracting the black model image data from the 3D fringe model image data is equivalent to subtracting the useless interference light from the useful light plus useless interference light received by the fingerprint detection module 14 when the 3D fringe model is above the screen, thereby obtaining the reflection amount data of the 3D fringe model image data when the 3D fringe model is above the screen, that is, the data of the useful light.
S2042: Binarize the reflection amount data of the 3D fringe model using Otsu's method to obtain binarized image data.
Otsu's method (OTSU), also known as the maximum between-class variance method, automatically selects a global threshold from the histogram statistics of the whole image. In this embodiment of the present application, Otsu's method is used to binarize the reflection amount data of the 3D fringe model; the computation is simple and fast, and the resulting binarized image data is not affected by image brightness and contrast.
To further improve the binarized image data, image dilation and erosion may be used to eliminate isolated points in the binarized image data; the specific dilation and erosion method is not limited in the embodiments of the present application.
S205: Calculate the image offset of any two adjacent images at least according to the centroid coordinates of the two binarized images along the diagonal direction among the four binarized images.
Specifically, in this embodiment of the present application, four pieces of 3D fringe model image data, or four pieces of reflection amount data of the 3D fringe model calculated from the four pieces of 3D fringe model image data, are obtained and binarized to obtain four binarized pieces of 3D fringe model image data or four binarized pieces of reflection amount image data of the 3D fringe model. The two binarized pieces of 3D fringe model image data, or the two binarized pieces of reflection amount image data, that lie along the diagonal direction are selected for centroid calculation, yielding the two centroid coordinates of the two diagonal binarized images.
The centroid coordinate calculation is performed as follows, where x_i is the abscissa value of pixel i in one piece of image data, y_i is the ordinate value of pixel i in one piece of image data, and m_i is the binary value of pixel i in one piece of image data.
The image offsets of the adjacent multi-optical-path directions are obtained according to the differences of the centroid coordinates in the X direction and the Y direction of the two-dimensional plane.
The specific formulas are:
dx = x(0) - x(3), (Equation 8)
dy = y(0) - y(3), (Equation 9)
where dx is the image offset of the adjacent multi-optical-path directions in the X direction of the two-dimensional plane, dy is the image offset of the adjacent multi-optical-path directions in the Y direction of the two-dimensional plane, x(0) and x(3) are the X coordinates, on the two-dimensional plane, of the two binarized images in the diagonal direction, and y(0) and y(3) are their Y coordinates on the two-dimensional plane.
This embodiment of the present application obtains the image offsets of adjacent multi-optical-path directions, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
In still another embodiment of the present application, referring to FIG. 8, the method includes:
S301: Collect image data of the 3D fringe model and of the black model respectively, to obtain 3D fringe model image data in multiple optical path directions and black model image data in multiple optical path directions.
Specifically, the screen light is reflected by the object placed above the screen (a finger, the 3D fringe model, the black model, etc.), and the reflected light is received by the fingerprint detection module as incident light. The 3D fringe model is a model object selected to simulate the user's finger for calculating the image offsets in the multiple optical path directions, and the fringes on the 3D fringe model simulate the fingerprint of the user's finger. The black model, as a model object, absorbs the light shone on it, so the fingerprint detection module can only receive ambient light. Therefore, by comparing the image obtained with the black model as the model object against the image obtained with the 3D fringe model as the model object, the light emitted by the screen, that is, the screen brightness, can be evaluated.
Specifically, the 3D fringe model image data in the multiple optical path directions is one frame of 3D fringe model image data in the multiple optical path directions, and the black model image data in the multiple optical path directions is one frame of black model image data in the multiple optical path directions.
In this embodiment of the present application, collecting only one frame of 3D fringe model image data and one frame of black model image data in the multiple optical path directions is sufficient to calculate the offset with high accuracy, and it also avoids the increase in time consumption caused by collecting multiple frames of images.
To improve the quality of the collected 3D fringe model image data in the multiple optical path directions, the data may be processed by low-pass filtering, median filtering and the like; the specific processing method is not limited in the embodiments of the present application.
S302: Determine whether the screen brightness is abnormal according to the 3D fringe model image data and the black model image data.
In a specific implementation of this embodiment, if the brightness of the 3D fringe model image data is lower than the brightness of the black model image data, the screen brightness is abnormal; otherwise, the screen brightness is normal.
If the brightness of the screen of the electronic device is abnormal, the accuracy of the collected image data is affected. In this embodiment of the present application, whether the screen brightness is abnormal is determined by directly comparing the brightness of the 3D fringe model image data with the brightness of the black model image data; this comparison makes it possible to judge the screen brightness abnormality more accurately.
In another specific implementation of this embodiment of the present application, if the average brightness of the 3D fringe model image data is less than half of the average brightness of the black model image data, the screen brightness is abnormal; otherwise, the screen brightness is normal.
Since the subsequent step S304 does not need to calculate the reflection amount of the 3D fringe model and only the 3D fringe model image data needs to be binarized, this embodiment has lower requirements on screen brightness than embodiments that binarize the reflection amount of the 3D fringe model: it is sufficient that the average brightness of the 3D fringe model image data is greater than or equal to half of the average brightness of the black model image data for the screen brightness to be judged normal.
S303: If the screen brightness is abnormal, end the calculation of the offset.
In this embodiment of the present application, when the brightness is judged to be abnormal, abnormal-brightness information may also be returned.
If the screen brightness is abnormal, no usable image offset can be obtained for generating a fusion map of fingerprint images or for extracting feature values of the overlapping areas of fingerprint images. Therefore, when the screen brightness is judged to be abnormal, the calculation of the offset is ended, avoiding the situation where the subsequent calculation is time-consuming yet cannot provide a usable image offset.
S304: If the screen brightness is normal, binarize the 3D fringe model image data to obtain binarized image data.
Specifically, image binarization (thresholding) is the simplest method of image segmentation: the gray value of each pixel in the image is set to 0 or 255, with pixels whose gray value is greater than a critical gray threshold set to the maximum gray value and pixels whose gray value is less than the critical gray threshold set to the minimum gray value, thereby obtaining binarized image data.
In this embodiment of the present application, the 3D fringe model image data is binarized to obtain the binarized 3D fringe model image data or the binarized reflection amount data of the 3D fringe model.
In a specific implementation of this embodiment of the present application, referring to FIG. 9, step S304 includes:
S3041: Determine the maximum gray value of the 3D fringe model image data.
Specifically, to reduce the amount of computation, the average gray value of a preset number of image data points with the largest gray values is taken as the maximum gray value of the 3D fringe model image data. For example, the preset number of image data points may be 100.
S3042: Take the larger of one tenth of the maximum value and a preset value as the threshold for binarization, the preset value being determined according to the photosensitivity of the image recognition module.
That is, the binarization threshold = MAX(maxData/10, offset), where offset is a preset value determined according to the photosensitivity of the image recognition module.
S3043: Binarize the 3D fringe model image data according to the threshold to obtain binarized image data.
In this embodiment of the present application, only the 3D fringe model image data needs to be binarized, and the binarized 3D fringe model image data is then sufficient for calculating the image offsets of the adjacent multi-optical-path directions; the calculation is simple and takes little time.
Since this embodiment of the present application only needs to binarize the 3D fringe model image data and does not need to binarize the reflection amount image data of the 3D fringe model, its requirements on screen brightness are lower: it is sufficient that the average brightness of the 3D fringe model image data is greater than or equal to half of the average brightness of the black model image data for the screen brightness to be judged normal, after which the 3D fringe model image data is binarized to obtain the image offset. Therefore, even if the brightness of the screen of the electronic device is abnormal, a usable image offset can still be obtained through this embodiment of the present application, and the application scenarios of this embodiment are broader.
To further improve the binarized image data, image dilation and erosion may be used to eliminate isolated points in the binarized image data; the specific dilation and erosion method is not limited in the embodiments of the present application.
S305: Calculate the image offset of any two adjacent images at least according to the centroid coordinates of the two binarized images along the diagonal direction among the four binarized images.
Specifically, in this embodiment of the present application, four pieces of 3D fringe model image data, or four pieces of reflection amount data of the 3D fringe model calculated from the four pieces of 3D fringe model image data, are obtained and binarized to obtain four binarized pieces of 3D fringe model image data or four binarized pieces of reflection amount image data of the 3D fringe model. The two binarized pieces of 3D fringe model image data, or the two binarized pieces of reflection amount image data, that lie along the diagonal direction are selected for centroid calculation, yielding the two centroid coordinates of the two diagonal binarized images.
The centroid coordinate calculation is performed as follows, where x_i is the abscissa value of pixel i in one piece of image data, y_i is the ordinate value of pixel i in one piece of image data, and m_i is the binary value of pixel i in one piece of image data.
The image offsets of the adjacent multi-optical-path directions are obtained according to the differences of the centroid coordinates in the X direction and the Y direction of the two-dimensional plane.
The specific formulas are:
dx = x(0) - x(3), (Equation 8)
dy = y(0) - y(3), (Equation 9)
where dx is the image offset of the adjacent multi-optical-path directions in the X direction of the two-dimensional plane, dy is the image offset of the adjacent multi-optical-path directions in the Y direction of the two-dimensional plane, x(0) and x(3) are the X coordinates, on the two-dimensional plane, of the two binarized images in the diagonal direction, and y(0) and y(3) are their Y coordinates on the two-dimensional plane.
This embodiment of the present application obtains the image offsets of adjacent multi-optical-path directions, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
In still another embodiment of the present application, the method includes the steps described in any of the above embodiments.
Referring to FIG. 10, the method further includes:
S404: Calculate the theoretical range interval of the image offsets of the adjacent multi-optical-path directions according to the film layer thickness interval, the screen thickness interval, the air gap thickness interval, the refractive index of the film layer and the refractive index of the screen.
That is, the film layer thickness interval, the screen thickness interval, the air gap thickness interval, the refractive index of the film layer and the refractive index of the screen are substituted into Equations 1 to 5 above to obtain the theoretical range interval of the offset of two adjacent images in the X direction of the two-dimensional plane and the theoretical range interval of the offset of two adjacent images in the Y direction of the two-dimensional plane.
S405: If the image offsets of the adjacent multi-optical-path directions fall within the theoretical range interval, the image offsets of the adjacent multi-optical-path directions are usable multi-optical-path-direction image offsets.
In this embodiment of the present application, the calculated multi-optical-path-direction image offsets are gated according to whether they fall within the theoretical range interval of the multi-optical-path-direction image offsets, thereby ensuring that usable multi-optical-path-direction image offsets are obtained.
This embodiment of the present application obtains the image offsets of adjacent multi-optical-path directions, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
In still another embodiment of the present application, the method includes the steps described in any of the above embodiments.
Referring to FIG. 11, the method further includes:
S504: Save the image offsets of the adjacent multi-optical-path directions in the file system of the electronic device in which the fingerprint detection apparatus is located.
In this embodiment of the present application, the calculated image offsets are saved in the file system of the electronic device in which the fingerprint detection apparatus is located, so that by calling the file system of the electronic device the multi-optical-path-direction image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
This embodiment of the present application obtains the image offsets of adjacent multi-optical-path directions, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
An embodiment of the present application further provides a fingerprint detection module. The fingerprint detection module is arranged below the screen in the direction of incident light, there is an air gap between the screen and the fingerprint detection module, and the screen is provided with a film layer facing the incident light direction. The fingerprint detection module uses the multi-optical-path-direction image offset calculation method described in any of the above embodiments to obtain the image offsets of the adjacent multi-optical-path directions for fingerprint detection.
This embodiment of the present application obtains the image offsets of adjacent multi-optical-path directions, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
An embodiment of the present application further provides a fingerprint sensing apparatus. The fingerprint detection apparatus includes a film layer, a screen, an air gap and a fingerprint detection module arranged in sequence along the incident light direction, and the fingerprint detection module uses the multi-optical-path-direction image offset calculation method described in any of the above embodiments to obtain the image offsets of the adjacent multi-optical-path directions for fingerprint detection.
This embodiment of the present application obtains the image offsets of adjacent multi-optical-path directions, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
An embodiment of the present application further provides an electronic device. The electronic device includes a fingerprint detection apparatus, the fingerprint detection apparatus includes a film layer, a screen, an air gap and a fingerprint detection module arranged in sequence along the incident light direction, and the fingerprint detection module uses the multi-optical-path-direction image offset calculation method described in any of the above embodiments to obtain the image offsets of the adjacent multi-optical-path directions for fingerprint detection.
This embodiment of the present application obtains the image offsets of adjacent multi-optical-path directions, so that the image offsets can be used to generate a fusion map of fingerprint images to improve the accuracy of fingerprint recognition, and can also be used to extract feature values of the overlapping areas of fingerprint images for fingerprint anti-counterfeiting detection.
Specific embodiments of the present subject matter have thus been described. Other embodiments are within the scope of the appended claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying drawings do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In certain implementations, multitasking and parallel processing may be advantageous.
The embodiments in this specification are described in a progressive manner; for identical or similar parts of the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the other embodiments. In particular, since the system embodiments are substantially similar to the method embodiments, their description is relatively brief, and reference may be made to the relevant parts of the description of the method embodiments.
The above are only embodiments of the present application and are not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present application shall fall within the scope of the claims of the present application.
Claims (14)
- A method for calculating an image offset, characterized in that it is applied to a fingerprint detection module in a fingerprint detection apparatus, the fingerprint detection module being arranged below a screen in the direction of incident light, an air gap being provided between the screen and the fingerprint detection module, and the screen being provided with a film layer facing the incident light direction, the method comprising: collecting image data of a 3D fringe model to obtain 3D fringe model image data in multiple optical path directions; binarizing the 3D fringe model image data, or reflection amount data of the 3D fringe model obtained from the 3D fringe model image data, to obtain binarized image data; and calculating the image offset of any two adjacent images at least according to the centroid coordinates of the two binarized images along the diagonal direction among the four binarized images.
- The method for calculating an image offset according to claim 1, characterized in that the method further comprises: collecting image data of a black model to obtain black model image data in multiple optical path directions; and the method further comprises: determining whether the screen brightness is abnormal according to the 3D fringe model image data and the black model image data; and if the screen brightness is abnormal, ending the calculation of the offset.
- The method for calculating an image offset according to claim 2, characterized in that the method further comprises: if the screen brightness is normal, performing the binarization of the 3D fringe model image data, or of the reflection amount data of the 3D fringe model obtained from the 3D fringe model image data, to obtain binarized image data.
- The method for calculating an image offset according to claim 3, characterized in that determining whether the screen brightness is normal according to the 3D fringe model image data and the black model image data comprises: if the average brightness of the 3D fringe model image data is less than half of the average brightness of the black model image data, the screen brightness is abnormal; otherwise, the screen brightness is normal.
- The method for calculating an image offset according to claim 3, characterized in that determining whether the screen brightness is normal according to the 3D fringe model image data and the black model image data comprises: if the brightness of the 3D fringe model image data is lower than the brightness of the black model image data, the screen brightness is abnormal; otherwise, the screen brightness is normal.
- The method for calculating an image offset according to claim 4 or 5, characterized in that binarizing the 3D fringe model image data, or the reflection amount data of the 3D fringe model obtained from the 3D fringe model image data, to obtain binarized image data comprises: determining the maximum gray value of the 3D fringe model image data; taking the larger of one tenth of the maximum value and a preset value as the threshold for binarization, the preset value being determined according to the photosensitivity of the image recognition module; and binarizing the 3D fringe model image data according to the threshold to obtain binarized image data.
- The method for calculating an image offset according to claim 5, characterized in that binarizing the 3D fringe model image data, or the reflection amount data of the 3D fringe model obtained from the 3D fringe model image data, to obtain binarized image data comprises: subtracting the black model image data from the 3D fringe model image data to obtain the reflection amount data of the 3D fringe model; and binarizing the reflection amount data of the 3D fringe model using Otsu's method to obtain binarized image data.
- The method for calculating an image offset according to any one of claims 1 to 5, characterized in that calculating the image offset of any two adjacent images at least according to the centroid coordinates of the two binarized images along the diagonal direction among the four binarized images comprises: calculating at least the centroid coordinates of the two binarized images along the diagonal direction among the four binarized images; and obtaining the image offset of any two adjacent images according to the differences of the centroid coordinates in the X direction and the Y direction of the two-dimensional plane.
- The method for calculating an image offset according to any one of claims 1 to 5, characterized in that the 3D fringe model image data in the multiple optical path directions is one frame of 3D fringe model image data in the multiple optical path directions; and/or the black model image data in the multiple optical path directions is one frame of black model image data in the multiple optical path directions.
- The method for calculating an image offset according to any one of claims 1 to 5, characterized in that the method further comprises: calculating the theoretical range interval of the image offsets of the adjacent multi-optical-path directions according to the film layer thickness interval, the screen thickness interval, the air gap thickness interval, the refractive index of the film layer and the refractive index of the screen; and if the image offsets of the adjacent multi-optical-path directions fall within the theoretical range interval, the image offsets of the adjacent multi-optical-path directions are usable multi-optical-path-direction image offsets.
- The method for calculating an image offset according to any one of claims 1 to 5, characterized in that the method further comprises: saving the image offsets of the adjacent multi-optical-path directions in the file system of the electronic device in which the fingerprint detection apparatus is located.
- A fingerprint detection module, characterized in that the fingerprint detection module is arranged below the screen in the direction of incident light, an air gap is provided between the screen and the fingerprint detection module, and the screen is provided with a film layer facing the incident light direction, the fingerprint detection module using the image offset calculation method according to any one of claims 1 to 11 to obtain the offset of any two adjacent images for fingerprint detection.
- A fingerprint sensing apparatus, characterized in that the fingerprint detection apparatus includes a film layer, a screen, an air gap and a fingerprint detection module arranged in sequence along the incident light direction, the fingerprint detection module using the image offset calculation method according to any one of claims 1 to 11 to obtain the offset of any two adjacent images for fingerprint detection.
- An electronic device, characterized in that the electronic device includes a fingerprint detection apparatus, the fingerprint detection apparatus includes a film layer, a screen, an air gap and a fingerprint detection module arranged in sequence along the incident light direction, and the fingerprint detection module uses the image offset calculation method according to any one of claims 1 to 11 to obtain the offset of any two adjacent images for fingerprint detection.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2020/112270 WO2022041146A1 (zh) | 2020-08-28 | 2020-08-28 | 图像偏移量计算方法、指纹检测模组、装置及电子设备 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2020/112270 WO2022041146A1 (zh) | 2020-08-28 | 2020-08-28 | 图像偏移量计算方法、指纹检测模组、装置及电子设备 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022041146A1 true WO2022041146A1 (zh) | 2022-03-03 |
Family
ID=80352467
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/112270 WO2022041146A1 (zh) | 2020-08-28 | 2020-08-28 | 图像偏移量计算方法、指纹检测模组、装置及电子设备 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2022041146A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115656978A (zh) * | 2022-10-31 | 2023-01-31 | 哈尔滨工业大学 | 目标条纹图像中光斑位置的获得方法和装置 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103295232A (zh) * | 2013-05-15 | 2013-09-11 | 西安电子科技大学 | 基于直线和区域的sar图像配准方法 |
US20180121707A1 (en) * | 2016-01-07 | 2018-05-03 | Shanghai Oxi Technology Co., Ltd | Optical Fingerprint Module |
CN111095282A (zh) * | 2019-10-18 | 2020-05-01 | 深圳市汇顶科技股份有限公司 | 指纹检测装置和电子设备 |
CN111108511A (zh) * | 2019-07-12 | 2020-05-05 | 深圳市汇顶科技股份有限公司 | 指纹检测装置和电子设备 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109255323B (zh) | 一种指纹识别结构、显示基板和显示装置 | |
CN105517677B (zh) | 深度图/视差图的后处理方法和装置 | |
KR102674646B1 (ko) | 뷰로부터 거리 정보를 획득하는 장치 및 방법 | |
CN110473179B (zh) | 一种基于深度学习的薄膜表面缺陷检测方法、系统及设备 | |
CN104981105A (zh) | 一种快速精确获得元件中心和偏转角度的检测及纠偏方法 | |
CN106524909B (zh) | 三维图像采集方法及装置 | |
TWI765442B (zh) | 瑕疵等級判定的方法及存儲介質 | |
CN109341668A (zh) | 基于折射投影模型和光束追踪法的多相机测量方法 | |
CN112945141A (zh) | 基于微透镜阵列的结构光快速成像方法及系统 | |
WO2022041146A1 (zh) | 图像偏移量计算方法、指纹检测模组、装置及电子设备 | |
CN113192013A (zh) | 一种反光表面缺陷检测方法、系统及电子设备 | |
RU2363018C1 (ru) | Способ селекции объектов на удаленном фоне | |
CN101726264A (zh) | 一种针对投射条纹图像的残差滤波方法 | |
CN116777877A (zh) | 电路板缺陷检测方法、装置、计算机设备及存储介质 | |
CN111583191B (zh) | 基于光场epi傅里叶变换的折射特征检测方法 | |
CN112052769B (zh) | 图像偏移量计算方法、指纹检测模组、装置及电子设备 | |
CN110896469B (zh) | 用于三摄的解像力测试方法及其应用 | |
CN116883981A (zh) | 一种车牌定位识别方法、系统、计算机设备及存储介质 | |
Song et al. | Automatic calibration method based on improved camera calibration template | |
CN113554688B (zh) | 一种基于单目视觉的o型密封圈尺寸测量方法 | |
CN215769752U (zh) | 指纹识别模组及电子设备 | |
CN113033635B (zh) | 一种硬币隐形图文检测方法及装置 | |
CN115047006A (zh) | 一种二极管表面质量检测方法 | |
JP2012118732A (ja) | 3次元形状認識装置及び方法 | |
CN113128499A (zh) | 视觉成像设备的震动测试方法、计算机设备及存储介质 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20950816; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 20950816; Country of ref document: EP; Kind code of ref document: A1