CN117994825A - Method and device for acquiring palm biological characteristics - Google Patents

Method and device for acquiring palm biological characteristics

Info

Publication number
CN117994825A
Authority
CN
China
Prior art keywords
image
value
palm
coordinates
calculating
Prior art date
Legal status
Granted
Application number
CN202410366183.2A
Other languages
Chinese (zh)
Other versions
CN117994825B (en)
Inventor
毛华鹏
Current Assignee
Chengdu Bedit Information Technology Co ltd
Original Assignee
Chengdu Bedit Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Bedit Information Technology Co ltd
Priority to CN202410366183.2A
Publication of CN117994825A
Application granted
Publication of CN117994825B

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention provides a method and a device for acquiring palm biological characteristics, relating to the technical field of palm biometric acquisition. The method comprises the following steps: S1, four ranging sensors are mounted at the four corners of the front panel of an image sensor, and the spatial distance between the palm and each ranging sensor is obtained respectively; S2, an original picture is acquired, the point locations corresponding to the four ranging sensors are divided into two groups, and coordinate deformation is carried out on the basis of the spatial distance difference of each group to obtain corrected point coordinates; S3, the original coordinates and the corrected coordinates of the four points are substituted into a perspective transformation formula, and the perspective transformation matrix is calculated; S4, the original picture is corrected by the perspective transformation matrix to obtain a target corrected picture for recognition. The invention also provides a device for acquiring palm biological characteristics, in which the infrared light supplementing array is integrated on the light supplementing lamp panel, separate from the acquisition main board, so that the light brightness is improved and no additional light guide component is required.

Description

Method and device for acquiring palm biological characteristics
Technical Field
The invention relates to the technical field of biological feature acquisition, in particular to a method and a device for acquiring palm biological features.
Background
In recent years, requirements for identity authentication technology have kept rising, and biometric identification stands out for its convenience and security; among these, recognition based on palm prints and palm veins has developed particularly rapidly. On the one hand, in terms of acquisition mode, traditional contact-type palm acquisition systems suffer from low convenience, low user acceptance and poor adaptability, and these problems can be effectively solved by a non-contact acquisition mode. On the other hand, identifying a person by the vein image beneath the palm skin is far harder to steal or forge than other biometric means, so the security level is higher.
Most current palm vein acquisition devices are based on near-infrared illumination. However, because the palm surface is difficult to keep completely parallel to the image acquisition device during use, imaging quality is affected, recognition easily fails, and the device becomes harder to use.
Disclosure of Invention
Aiming at the recognition difficulty caused by non-parallelism between the palm surface and the image acquisition device, the invention provides a method for acquiring palm biological characteristics. The inclined palm image is transformed by perspective into a flat-laid image, which facilitates fast and accurate biometric recognition.
In order to achieve the above object, the invention adopts the following technical solution:
a method of acquiring palm biometric features, comprising the steps of:
s1, carrying four ranging sensors arranged at four corners on a front panel of an image sensor, and respectively obtaining the space distance between a palm and each ranging sensor;
s2, acquiring an original picture, dividing point location data corresponding to four distance measuring sensors into two groups, wherein the maximum difference value of the space distances is one group, the other two points are the other group, calculating the difference value of the space distances of each group respectively, and carrying out coordinate deformation based on the difference value of the space distances to obtain corrected point location coordinates;
S3, substituting the original coordinates and correction coordinate data of the four point positions into a perspective transformation formula, and calculating to obtain a perspective transformation matrix;
s4, correcting the original picture through perspective transformation, and obtaining a target corrected picture for recognition.
In the method, the original coordinates of the four points are set as (ai, bi) and the corrected coordinates as (a'i, b'i), where i = 1, 2, 3, 4, and the spatial distances between the palm and the ranging sensors are d1, d2, d3, d4 in turn;
if the group with the largest spatial distance difference consists of adjacent points, the spatial distance difference is calculated:
ΔZ12 = |d1 - d2|
The corrected point coordinates are calculated according to the following function:
The spatial distance difference of the other group of points and their corrected coordinates (a'i, b'i), i = 3, 4, are calculated in the same way.
In the method, the original coordinates of the four points are set as (ai, bi) and the corrected coordinates as (a'i, b'i), where i = 1, 2, 3, 4; the spatial distances between the palm and the ranging sensors are d1, d2, d3, d4 in turn;
if the group with the largest spatial distance difference consists of diagonal points, the spatial distance difference is calculated:
ΔZ13 = |d1 - d3|
The corrected point coordinates are calculated according to the following function:
The spatial distance difference of the other group of points and their corrected coordinates (a'i, b'i), i = 2, 4, are calculated in the same way.
The image sensor uses the brightness average of the image as the brightness value of the image, looks up the exposure value to be set in a preset table relating brightness-value intervals to exposure values, and then writes the exposure value into a register of the image sensor.
Preferably, the brightness average is calculated as follows: let the image resolution of the image sensor be W × H, where W is the image width and H is the image height; the image is a single-channel image in which each pixel represents brightness by one value; the image is divided into 20 regions, 5 rows by 4 columns, and the center-point coordinate of each region is calculated by the following formula:
V(x, y) denotes the pixel value at the center point of the designated region, where x is the region row number and y is the region column number;
P(r, c) denotes the pixel value at the designated position of the whole image, where r is the image row number and c is the image column number;
V(x, y) = P(H/10 × x, W/8 × y);
Brightness average = (sum of the pixel values of the 20 region center points) / 20.
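As an illustration of this exposure-control step only, the sketch below maps a brightness average to an exposure value through a lookup table and writes it to a sensor register; the table contents, the register address and the write routine are illustrative assumptions, not values taken from the invention.

# Hedged sketch in Python: brightness-interval -> exposure-value lookup followed by a
# register write. Table entries, register address and the write routine are assumptions.
EXPOSURE_TABLE = [
    (0, 60, 0x0400),     # dark image  -> longer exposure (illustrative values)
    (60, 120, 0x0200),
    (120, 180, 0x0100),
    (180, 256, 0x0080),  # bright image -> shorter exposure
]

def lookup_exposure(brightness_avg):
    """Return the exposure value whose brightness interval contains brightness_avg."""
    for low, high, exposure in EXPOSURE_TABLE:
        if low <= brightness_avg < high:
            return exposure
    return EXPOSURE_TABLE[-1][2]

def write_sensor_register(reg_addr, value):
    """Stand-in for the actual I2C/SPI register write of the image sensor."""
    print("write reg 0x%04X = 0x%04X" % (reg_addr, value))

def adjust_exposure(brightness_avg, exposure_reg=0x3500):   # register address is hypothetical
    write_sensor_register(exposure_reg, lookup_exposure(brightness_avg))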
The ranging sensors perform distance detection once every fixed interval. If the read distance value is within the set threshold, the processor starts image acquisition and transmission and enters a sleep mode after the transmission is completed; otherwise, the processor remains in sleep mode.
Preferably, the interval is 100-200 ms.
The front panel of the image sensor is provided with a light supplementing lamp module, and the module is turned on only during the exposure period of the image sensor. This avoids the lamp-bead heating problem caused by keeping the supplementary light on continuously.
The invention also provides a device for acquiring palm biological characteristics, comprising an image acquisition main board, a light supplementing lamp panel and four ranging sensors. A processor, a memory and the image sensor are integrated on the image acquisition main board; the light supplementing lamp panel is arranged above the image acquisition main board; an infrared light supplementing array and the four ranging sensors are integrated on the light supplementing lamp panel, and an infrared polarizer is arranged above the infrared light supplementing array. The memory, the image sensor and the ranging sensors are each electrically connected to the processor, and the infrared light supplementing array is electrically connected to the image sensor.
Preferably, the emission angle of the infrared light supplementing lamp beads of the infrared light supplementing array is 120°, at which the light-gathering effect of the array is optimal.
Preferably, an infrared filter is disposed above the infrared polarizer.
The beneficial effects of the invention are as follows:
1. According to the invention, the point data are corrected by acquiring the spatial distances between the palm and the four points, so that the mapping of the perspective transformation can be calculated and the inclined palm image can be perspective-transformed into a flat-laid picture. Even if the palm inclination angles differ between acquisition and recognition, the same flat-plane picture is obtained, which avoids recognition failure caused by inconsistent palm inclination angles.
2. The invention corrects the coordinates of the four corner points on the basis of the depth differences and calculates the transformation matrix M for the spatial transformation, i.e. the perspective transformation. Perspective transformation uses the condition that the perspective center, the image point and the object point are collinear: the image-bearing plane (perspective plane) is rotated about the trace line (perspective axis) by a certain angle according to the law of perspective rotation, which changes the original bundle of projecting rays while keeping the projected geometric figure on the image-bearing plane unchanged, so that one plane is projected onto a designated plane by a projection matrix.
3. The ranging sensors detect the distance once every fixed interval; if the read distance value is not within the set threshold, the processor remains in sleep mode, which effectively reduces the power consumption to the μW level.
4. When the external light environment of the acquisition device changes, the acquired palm vein image may be over-exposed or under-exposed, so that effective vein feature values cannot be extracted by the algorithm. The invention takes the brightness average of the image as its brightness value, looks up the exposure value to be set in the preset table relating brightness-value intervals to exposure values, writes the exposure value into the corresponding register of the image sensor, and acquires the next frame at the appropriate brightness, so that even if the external light environment changes, the sensor can quickly capture palm vein images of suitable brightness.
5. The acquisition device integrates the infrared light supplementing array on the light supplementing lamp panel, separate from the acquisition main board, so that the light is not blocked by the inner wall of the housing or the lens seat, the light brightness is improved, and no additional light guide component is required.
Drawings
FIG. 1 is a flow chart for acquiring palm biometric features.
Fig. 2 is an original palm image.
Fig. 3 is a perspective transformed palm image.
Fig. 4 is an exploded view of the device for acquiring palm biological characteristics according to the present invention.
Fig. 5 is a block diagram of the apparatus for acquiring palm biometric features according to the present invention.
Fig. 6 is a block circuit diagram of an apparatus for acquiring palm biometric features of the present invention.
Reference numerals: 1. the infrared light source comprises an infrared filter, 2, an infrared polaroid, 3, a polaroid support, 4, a distance measuring sensor, 5, a light supplementing lamp panel, 6, a lens seat, 7, an image sensor, 8, an image acquisition main board, 9, a shell, 51 and an infrared light supplementing array.
Detailed Description
The invention is further described by the following examples in order to describe its object more clearly and specifically. The examples only illustrate specific implementations of the invention and do not limit its scope of protection.
Example 1
A method of acquiring palm biological characteristics, comprising the following steps:
S1, four ranging sensors are mounted at the four corners of the front panel of an image sensor, and the spatial distance between the palm and each ranging sensor is obtained respectively;
S2, an original picture is acquired, and the point locations corresponding to the four ranging sensors are divided into two groups: the pair with the largest spatial distance difference forms one group and the remaining two points form the other group; the spatial distance difference of each group is calculated, and coordinate deformation is carried out on the basis of the spatial distance difference to obtain corrected point coordinates;
S3, the original coordinates and the corrected coordinates of the four points are substituted into a perspective transformation formula, and the perspective transformation matrix is obtained by calculation;
S4, the original picture is corrected by the perspective transformation, and a target corrected picture is obtained for recognition.
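For orientation only, the following sketch (in Python, using OpenCV) shows how steps S1-S4 can be chained together. The distance readings and the coordinate-correction step are taken as given inputs, since the correction functions are detailed in the later examples; every name apart from the OpenCV calls is a hypothetical placeholder, not part of the invention.

import cv2
import numpy as np

def acquire_corrected_palm(image, distances, src_points, correct_points):
    """Hedged sketch of S1-S4.

    image          -- original palm picture (S2), a numpy array
    distances      -- (d1, d2, d3, d4) read from the four ranging sensors (S1)
    src_points     -- original coordinates of the four sensor point locations
    correct_points -- callable implementing the coordinate correction of S2
                      (grouping by largest distance difference plus deformation);
                      supplied by the caller because its exact form is given
                      separately in the examples
    """
    # S2: corrected point coordinates from the spatial distance differences
    dst_points = correct_points(src_points, distances)

    # S3: perspective transformation matrix from the four point correspondences
    M = cv2.getPerspectiveTransform(np.float32(src_points), np.float32(dst_points))

    # S4: warp the original picture into the target corrected picture for recognition
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, M, (w, h))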
Example 2
This example is based on example 1:
Four ranging sensors are arranged at the 4 vertices of a square with a side length of 20 mm. An original picture is acquired, and the four ranging sensors map to 4 points on the palm picture, whose coordinates are set to (0, 0), (0, 20), (20, 20) and (20, 0). The distances from the 4 points of the inclined palm to the sensors are obtained, the distance values measured by the four ranging sensors being d1, d2, d3, d4. From these 4 distance values, the x, y coordinates that the 4 vertices of the 20 mm x 20 mm square area would have if the palm lay flat are calculated.
The 4 selected points are located at the four corners and may form a square or a rectangle.
The point data corresponding to the four ranging sensors are divided into two groups: the adjacent pair with the largest spatial distance difference forms one group, and the remaining two points form the other group. Let these four points be p1, p2, p3, p4.
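Before the per-group calculations, the grouping rule itself (used here and in Example 3: pick the pair of points with the largest spatial distance difference and note whether it is an adjacent or a diagonal pair) can be sketched as follows; the helper name and the sample distance values are illustrative assumptions only.

from itertools import combinations

# Point indices follow this example: p1=(0,0), p2=(0,20), p3=(20,20), p4=(20,0).
# On this square the diagonal pairs are (p1,p3) and (p2,p4); all other pairs are adjacent.
DIAGONAL_PAIRS = {frozenset((0, 2)), frozenset((1, 3))}   # 0-based indices

def group_points(distances):
    """Return the pair with the largest |di - dj|, the remaining pair,
    and whether the first pair is adjacent or diagonal."""
    i, j = max(combinations(range(4), 2),
               key=lambda pair: abs(distances[pair[0]] - distances[pair[1]]))
    kind = "diagonal" if frozenset((i, j)) in DIAGONAL_PAIRS else "adjacent"
    remaining = tuple(k for k in range(4) if k not in (i, j))
    return (i, j), remaining, kind

# Illustrative distance readings (mm), not measured values:
# group_points([52.0, 55.5, 60.0, 53.0]) -> ((0, 2), (1, 3), 'diagonal')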
The original coordinates of the four points are set as (ai, bi), the corrected coordinates as (a'i, b'i), and the spatial distances between the palm and the ranging sensors as d1, d2, d3, d4, where i = 1, 2, 3, 4:
[p1, p2, p3, p4] = [(a1, b1, d1), (a2, b2, d2), (a3, b3, d3), (a4, b4, d4)] = [(0, 0, d1), (0, 20, d2), (20, 20, d3), (20, 0, d4)];
The spatial distance difference of the first group of points is calculated:
ΔZ12 = |d1 - d2|
The corrected point coordinates are calculated according to the following function:
The spatial distance difference of the second group of points is calculated in the same way:
ΔZ34 = |d3 - d4|
The corrected point coordinates are calculated according to the following function:
Example 3
This example is based on example 1:
Four ranging sensors are arranged at the 4 vertices of a square with a side length of 20 mm. An original picture is acquired, and the four ranging sensors map to 4 points on the palm picture, whose coordinates are set to (0, 0), (0, 20), (20, 20) and (20, 0). The distances from the 4 points of the inclined palm to the sensors are obtained, the distance values measured by the four ranging sensors being d1, d2, d3, d4. From these 4 distance values, the x, y coordinates that the 4 vertices of the 20 mm x 20 mm square area would have if the palm lay flat are calculated.
The point data corresponding to the four ranging sensors are divided into two groups: the diagonal pair with the largest spatial distance difference forms one group, and the remaining two points form the other group. Let these four points be p1, p2, p3, p4.
The original coordinates of the four points are set as (ai, bi), the corrected coordinates as (a'i, b'i), and the spatial distances between the palm and the ranging sensors as d1, d2, d3, d4, where i = 1, 2, 3, 4:
[p1, p2, p3, p4] = [(a1, b1, d1), (a2, b2, d2), (a3, b3, d3), (a4, b4, d4)] = [(0, 0, d1), (0, 20, d2), (20, 20, d3), (20, 0, d4)];
The spatial distance difference of the first group of points is calculated:
ΔZ13 = |d1 - d3|
The corrected point coordinates are calculated according to the following function:
The spatial distance difference of the second group of points is calculated in the same way:
ΔZ24 = |d2 - d4|
The corrected point coordinates are calculated according to the following function:
Example 4
This example is based on example 2:
The perspective transformation projects the picture onto a new viewing plane; the mapping in between can be solved from the original coordinates and the corrected coordinates of the four ranging points. Let the original image coordinates be (u, v) and the perspective-transformed image coordinates be (x, y); the perspective transformation formula is:
(x, y, z)ᵀ = M · (u, v, 1)ᵀ
where the transformation matrix M is of the form 3 × 3 with elements a11 ... a33. Expanding gives:
x = a11·u + a12·v + a13
y = a21·u + a22·v + a23
z = a31·u + a32·v + a33
The original coordinates (ai, bi) of the four ranging points of the original image are substituted for (u, v), and the corrected coordinates (a'i, b'i, di) are substituted for (x, y, z) after the perspective transformation. This is equivalent to writing out a system of equations from which the transformation matrix can be solved. Assuming the transformation matrix is M, the original-image vector is A and the transformed vector is B, so that B = M·A, the formula for solving M is:
M = B·A⁻¹
With the calculated transformation matrix, the new image-plane coordinates are then obtained directly from the original image coordinates:
x' = x/z = (a11·u + a12·v + a13) / (a31·u + a32·v + a33)
y' = y/z = (a21·u + a22·v + a23) / (a31·u + a32·v + a33)
where (u, v) are the original coordinates and (x', y') are the transformed coordinates; a11, a12, a21, a22, a31, a32 describe the rotation component and a13, a23, a33 the translation component.
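To make the matrix-solving step concrete, the sketch below assembles the linear system implied by the equations above (with a33 fixed to 1) and solves it for the remaining eight coefficients; the corrected coordinates used here are made-up illustrative numbers, not results of the method.

import numpy as np

def perspective_matrix(src, dst):
    """Solve the 3x3 perspective matrix M mapping the four src points to the
    four dst points, with a33 fixed to 1 (x' = x/z, y' = y/z as above)."""
    A, b = [], []
    for (u, v), (x, y) in zip(src, dst):
        A.append([u, v, 1, 0, 0, 0, -u * x, -v * x])
        A.append([0, 0, 0, u, v, 1, -u * y, -v * y])
        b += [x, y]
    coeffs = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(coeffs, 1.0).reshape(3, 3)

def apply_perspective(M, u, v):
    x, y, z = M @ np.array([u, v, 1.0])
    return x / z, y / z

src = [(0, 0), (0, 20), (20, 20), (20, 0)]      # original ranging-point coordinates
dst = [(0, 0), (1, 19), (21, 21), (19, -1)]     # illustrative corrected coordinates
M = perspective_matrix(src, dst)
print(np.allclose(apply_perspective(M, 20, 20), (21, 21)))   # True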
The perspective-transformed target corrected picture is shown in Fig. 3. The recognition accuracy is high: the false recognition rate is not more than 0.00001% and the false rejection rate is not more than 1.0%. For palm vein identification in the 1:N comparison mode, the maximum supported user capacity Nmax is 10,000.
Example 5
This example is based on example 1:
The image sensor uses the brightness average of the image as the brightness value of the image, looks up the exposure value to be set in the preset table relating brightness-value intervals to exposure values, and then writes the exposure value into a register of the image sensor.
The brightness average is calculated as follows: the image resolution of the image sensor is 640 × 400, each pixel represents brightness by one value, the image is divided into 20 regions, 5 rows by 4 columns, and the average is obtained by summing the 20 sampled values:
V(x, y) denotes the pixel value at the center point of the designated region, where x is the region row number and y is the region column number;
P(r, c) denotes the pixel value at the designated position of the whole image, where r is the image row number and c is the image column number;
V(x, y) = P(H/10 × x, W/8 × y);
Brightness average = (V(1, 1) + V(1, 2) + ... + V(5, 4)) / 20.
The region division need not be 5 rows by 4 columns: the more rows and columns, the longer the calculation time; the fewer rows and columns, the lower the accuracy. A division into 5 rows and 4 columns is the optimum.
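A minimal sketch of this brightness-average computation for a 640 × 400 single-channel image follows; it samples one pixel per region at the positions given by V(x, y) = P(H/10 × x, W/8 × y) above, and the synthetic test image is purely illustrative.

import numpy as np

def brightness_average(image):
    """Brightness average of a single-channel image: one sample per region of a
    5-row x 4-column grid, taken at P(H/10*x, W/8*y) for x = 1..5, y = 1..4."""
    h, w = image.shape                       # e.g. 400 x 640 in this example
    samples = [
        int(image[h // 10 * x, w // 8 * y])  # P(row, column)
        for x in range(1, 6)                 # region rows x = 1..5
        for y in range(1, 5)                 # region columns y = 1..4
    ]
    return sum(samples) / 20.0

# Synthetic usage example: a uniform mid-grey 400x640 image gives an average of 128.0.
img = np.full((400, 640), 128, dtype=np.uint8)
print(brightness_average(img))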
Example 6
This example is based on example 1:
The ranging sensor detects the distance every 100 ms. If the read distance value is within the set threshold, the processor starts image acquisition and transmission and enters a sleep mode after the transmission is completed; otherwise, the processor remains in sleep mode.
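A sketch of this wake/sleep cycle is given below; the threshold window, the sensor read and the capture routine are placeholders standing in for the device firmware, and the timer wake-up is modelled as a simple sleep.

import random
import time

POLL_INTERVAL_S = 0.1                  # 100 ms in this example (Example 7 uses 200 ms)
DIST_MIN_MM, DIST_MAX_MM = 40, 120     # illustrative threshold window, not from the invention

def read_distance_mm():
    """Placeholder for the ranging-sensor read; simulated here with a random value."""
    return random.uniform(0, 300)

def capture_and_transmit():
    """Placeholder for image acquisition and transmission to the host."""
    print("capture + transmit")

def poll_cycle():
    d = read_distance_mm()
    if DIST_MIN_MM <= d <= DIST_MAX_MM:
        capture_and_transmit()          # acquisition starts only inside the threshold window
    # in either case the processor then returns to sleep until the next timer wake-up

if __name__ == "__main__":
    for _ in range(10):                 # a few cycles; the real device loops indefinitely
        poll_cycle()
        time.sleep(POLL_INTERVAL_S)     # stands in for the low-power sleep between polls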
Example 7
This example is based on example 1:
The ranging sensor detects the distance every 200 ms. If the read distance value is within the set threshold, the processor starts image acquisition and transmission and enters a sleep mode after the transmission is completed; otherwise, the processor remains in sleep mode.
The front panel of the image sensor carries a light supplementing lamp module, which is turned on only during the exposure period of the image sensor. This avoids the lamp-bead heating problem caused by keeping the supplementary light on continuously.
Example 8
As shown in Fig. 4, a device for acquiring palm biological characteristics comprises an image acquisition main board 8, a light supplementing lamp panel 5 and four ranging sensors 4. A processor, a memory and an image sensor 7 are integrated on the image acquisition main board 8; the light supplementing lamp panel 5 is arranged above the image acquisition main board 8; an infrared light supplementing array 51 and the four ranging sensors 4 are integrated on the light supplementing lamp panel 5, and an infrared polarizer 2 is arranged above the infrared light supplementing array 51. The memory, the image sensor 7 and the ranging sensors 4 are each electrically connected to the processor, and the infrared light supplementing array 51 is electrically connected to the image sensor 7.
The acquisition device integrates the infrared light supplementing array 51 on the light supplementing lamp panel 5, separate from the image acquisition main board 8, so that the light is not blocked by the inner wall of the housing 9 or the lens seat 6, the light brightness is improved, and no additional light guide component is required.
The processor starts the ranging sensors 4 to detect the distance once at a regular interval. If the read distance value is not within the set threshold, the processor immediately enters a sleep mode; if the read distance value is within the set threshold, the image sensor 7 is started to acquire and transmit the image, and the processor enters the sleep mode after the transmission is completed. This reduces the power consumption of the module to the μW level. The infrared light supplementing array is electrically connected to the image sensor, the light supplementing lamp module is turned on only during the exposure period of the image sensor, and the image sensor directly controls the switching of the infrared light supplementing array. The structural block diagram of the acquisition device is shown in Fig. 5, and the circuit block diagram is shown in Fig. 6.
A memory is integrated on the image acquisition main board; it can store newly registered palm vein information and can also provide data for comparison with acquired palm veins. The device integrates a USB data transmission module, which can transmit the collected data to other equipment or computers for processing and analysis. The device also integrates a clock generator, which provides a stable operating clock signal for the digital circuits.
Example 9
This example is based on example 8:
The emission angle of the infrared light supplementing lamp beads of the infrared light supplementing array 51 is 120°, at which the light-gathering effect of the array is optimal.
An infrared filter is arranged above the infrared polarizer 2.
The foregoing examples merely illustrate specific embodiments of the invention, which are described in detail but are not to be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make several variations and improvements without departing from the concept of the invention, and these all fall within the scope of protection of the invention.

Claims (10)

1. A method of acquiring palm biological characteristics, comprising the following steps:
S1, four ranging sensors are mounted at the four corners of the front panel of an image sensor, and the spatial distance between the palm and each ranging sensor is obtained respectively;
S2, an original picture is acquired, and the point locations corresponding to the four ranging sensors are divided into two groups: the pair with the largest spatial distance difference forms one group and the remaining two points form the other group; the spatial distance difference of each group is calculated, and coordinate deformation is carried out on the basis of the spatial distance difference to obtain corrected point coordinates;
S3, the original coordinates and the corrected coordinates of the four points are substituted into a perspective transformation formula, and the perspective transformation matrix is obtained by calculation;
S4, the original picture is corrected by the perspective transformation, and a target corrected picture is obtained for recognition.
2. The method of acquiring palm biological characteristics according to claim 1, wherein the original coordinates of the four points are set as (ai, bi) and the corrected coordinates as (a'i, b'i), and the spatial distances between the palm and the ranging sensors are d1, d2, d3, d4 in turn, where i = 1, 2, 3, 4;
if the group with the largest spatial distance difference consists of adjacent points, the spatial distance difference is calculated:
ΔZ12 = |d1 - d2|
the corrected point coordinates are calculated according to the following function:
and the spatial distance difference of the other group of points and their corrected coordinates (a'i, b'i), i = 3, 4, are calculated in the same way.
3. The method of acquiring palm biological characteristics according to claim 1, wherein the original coordinates of the four points are set as (ai, bi) and the corrected coordinates as (a'i, b'i), where i = 1, 2, 3, 4; the spatial distances between the palm and the ranging sensors are d1, d2, d3, d4 in turn;
if the group with the largest spatial distance difference consists of diagonal points, the spatial distance difference is calculated:
ΔZ13 = |d1 - d3|
the corrected point coordinates are calculated according to the following function:
and the spatial distance difference of the other group of points and their corrected coordinates (a'i, b'i), i = 2, 4, are calculated in the same way.
4. The method of acquiring palm biological characteristics according to claim 1, wherein the image sensor uses the brightness average of the image as the brightness value of the image, looks up the exposure value to be set in a preset table relating brightness-value intervals to exposure values, and then writes the exposure value into a register of the image sensor.
5. The method of acquiring palm biological characteristics according to claim 4, wherein the brightness average is calculated as follows: let the image resolution of the image sensor be W × H, where W is the image width and H is the image height; the image is a single-channel image in which each pixel represents brightness by one value; the image is divided into 20 regions, 5 rows by 4 columns, and the center-point coordinate of each region is calculated by the following formula:
V(x, y) denotes the pixel value at the center point of the designated region, where x is the region row number and y is the region column number;
P(r, c) denotes the pixel value at the designated position of the whole image, where r is the image row number and c is the image column number;
V(x, y) = P(H/10 × x, W/8 × y);
Brightness average = (sum of the pixel values of the 20 region center points) / 20.
6. The method of acquiring palm biological characteristics according to claim 1, wherein the ranging sensors perform distance detection once every fixed interval; if the read distance value is within a set threshold, the processor starts image acquisition and transmission and enters a sleep mode after the transmission is completed; otherwise, the processor remains in sleep mode.
7. The method of acquiring palm biological characteristics according to claim 1, wherein the front panel of the image sensor carries a light supplementing lamp module that is turned on only during the exposure period of the image sensor.
8. A device for implementing the method of acquiring palm biological characteristics according to any one of claims 1 to 7, characterized by comprising an image acquisition main board, a light supplementing lamp panel and four ranging sensors, wherein a processor, a memory and the image sensor are integrated on the image acquisition main board; the light supplementing lamp panel is arranged above the image acquisition main board; an infrared light supplementing array and the four ranging sensors are integrated on the light supplementing lamp panel, and an infrared polarizer is arranged above the infrared light supplementing array; the memory, the image sensor and the ranging sensors are each electrically connected to the processor, and the infrared light supplementing array is electrically connected to the image sensor.
9. The apparatus of claim 8, wherein an infrared filter is disposed over the infrared polarizer.
10. The apparatus of claim 8, wherein the emission angle of the infrared light supplementing lamp beads of the infrared light supplementing array is 120°.
CN202410366183.2A 2024-03-28 2024-03-28 Method and device for acquiring palm biological characteristics Active CN117994825B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410366183.2A CN117994825B (en) 2024-03-28 2024-03-28 Method and device for acquiring palm biological characteristics

Publications (2)

Publication Number Publication Date
CN117994825A (en) 2024-05-07
CN117994825B (en) 2024-06-11

Family

ID=90902303

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410366183.2A Active CN117994825B (en) 2024-03-28 2024-03-28 Method and device for acquiring palm biological characteristics

Country Status (1)

Country Link
CN (1) CN117994825B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5685272B2 (en) * 2011-02-15 2015-03-18 富士通フロンテック株式会社 Authentication apparatus, authentication program, and authentication method
US20120281890A1 (en) * 2011-05-06 2012-11-08 Fujitsu Limited Biometric authentication device, biometric information processing device, biometric authentication system, biometric authentication server, biometric authentication client, and biometric authentication device controlling method
WO2022089263A1 (en) * 2020-10-27 2022-05-05 深圳Tcl数字技术有限公司 Display image correction method and device, and computer-readable storage medium
CN115641615A (en) * 2022-11-25 2023-01-24 湖南工商大学 Extraction method of closed palm interested region under complex background

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
KAIJUN ZHOU et al.: "Region of Interest Extraction for Closed Palm with Complex Background", CCBR 2023, 2 December 2023 (2023-12-02), pages 34-45, XP047677428, DOI: 10.1007/978-981-99-8565-4_4 *
毕运波 et al.: "Vision measurement-based detection method for the perpendicularity of countersunk holes", Journal of Zhejiang University (Engineering Science), vol. 51, no. 2, 28 February 2017 (2017-02-28), pages 312-318 *

Also Published As

Publication number Publication date
CN117994825B (en) 2024-06-11

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant