WO2019146906A1 - Appareil de reconnaissance de pupille - Google Patents

Appareil de reconnaissance de pupille Download PDF

Info

Publication number
WO2019146906A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
pupil
row
pixels
column
Prior art date
Application number
PCT/KR2018/015820
Other languages
English (en)
Korean (ko)
Inventor
이호영
임장열
Original Assignee
주식회사 파트론
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 파트론 (Partron Co., Ltd.)
Publication of WO2019146906A1

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18: Eye characteristics, e.g. of the iris
    • G06V40/19: Sensors therefor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18: Eye characteristics, e.g. of the iris
    • G06V40/193: Preprocessing; Feature extraction

Definitions

  • the present invention relates to a pupil recognition apparatus.
  • a password authentication method and an identification number authentication method are known, but these methods are exposed to the risks of theft and loss.
  • the biometric information used for biometric authentication is, in general, a fingerprint, a face, or an iris.
  • among these, the iris authentication method is known to be the best in terms of uniqueness, invariance, and stability among individual identification methods, and is being applied to fields requiring high security.
  • the iris authentication is performed through a process of extracting the iris image of the subject and comparing the iris image already registered with the extracted iris image.
  • in order to recognize the iris, the position of the pupil must first be recognized; the accuracy of the iris recognition method increases when the position of the pupil is accurately recognized.
  • a problem to be solved by the present invention is to accurately extract a pupil region from an eye image of a photographee.
  • a pupil recognition apparatus according to one feature of the present invention includes a storage unit storing final eye image data and a pupil extracting unit connected to the storage unit. The pupil extracting unit extracts, from the stored final eye image data, pixels having a pixel value equal to or greater than a first set value and equal to or less than a second set value greater than the first set value, selects primary preliminary pupil pixels from the extracted pixels to determine a primary preliminary pupil region including the primary preliminary pupil pixels, uses first to fourth reference pixels positioned respectively in the uppermost pixel row, the lowermost pixel row, the leftmost pixel column, and the rightmost pixel column of the primary preliminary pupil region to expand the primary preliminary pupil region in the first and second column directions and in the first and second row directions, determines a secondary preliminary pupil region defined by the first and second columns and the first and second rows, determines a center pixel positioned at the center of the secondary preliminary pupil region, and determines the secondary preliminary pupil pixels included in a circular region centered on the center pixel as the final pupil region.
  • the pupil extracting unit may define, as the first to fourth reference pixels, respectively, the primary preliminary pupil pixel located at the middle of the uppermost pixel row, the primary preliminary pupil pixel located at the middle of the lowermost pixel row, the primary preliminary pupil pixel located at the middle of the leftmost pixel column, and the primary preliminary pupil pixel located at the middle of the rightmost pixel column of the primary preliminary pupil region.
  • the pupil extracting unit may sequentially move the position of a first comparison pixel in column units along the first column direction from the first reference pixel, determine the first comparison pixel at which the difference between the pixel value of the first reference pixel and the pixel value of the first comparison pixel exceeds a set value, and determine the column position of the first comparison pixel immediately before the determined first comparison pixel as the position of the first column. Likewise, it may determine the position of the second column using a second comparison pixel moved in column units along the second column direction from the second reference pixel, the position of the first row using a third comparison pixel moved in row units along the first row direction from the third reference pixel, and the position of the second row using a fourth comparison pixel moved in row units along the second row direction from the fourth reference pixel, in each case taking the position of the comparison pixel immediately before the one whose pixel value difference from its reference pixel exceeds the set value.
  • the pupil extracting unit may extract, from the final eye image data, pixels having a pixel value equal to or greater than the first set value and equal to or less than the second set value greater than the first set value as estimated pupil pixels, and may extract the primary preliminary pupil pixels by removing the estimated pupil pixels that are not positioned consecutively, by a predetermined number, with other neighboring estimated pupil pixels in either the row direction or the column direction.
  • here, the neighboring estimated pupil pixels may themselves be located consecutively with further estimated pupil pixels in the row direction and the column direction.
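The thresholding and connectivity filtering described above can be sketched in Python. This is a minimal illustration rather than the patent's implementation: the image is a plain nested list of pixel values, and the function names, set values, and minimum run length `min_run` are assumptions.

```python
def estimated_pupil_pixels(image, first_set, second_set):
    """Keep the coordinates of pixels whose value lies between the
    first set value (lower limit) and the second set value (upper limit)."""
    return {(r, c)
            for r, row in enumerate(image)
            for c, v in enumerate(row)
            if first_set <= v <= second_set}

def primary_preliminary_pixels(pixels, min_run=2):
    """Keep a pixel only if it belongs to a run of at least `min_run`
    consecutive estimated pupil pixels in the row or column direction."""
    kept = set()
    for r, c in pixels:
        for dr, dc in ((0, 1), (1, 0)):  # row direction, column direction
            run = 1
            for sign in (1, -1):  # scan both ways from the pixel
                k = 1
                while (r + sign * k * dr, c + sign * k * dc) in pixels:
                    run += 1
                    k += 1
            if run >= min_run:
                kept.add((r, c))
                break
    return kept
```

For example, with a 2x3 image and the set range [1, 3], an isolated in-range pixel is dropped while a consecutive pair survives the filter.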
  • the pupil extracting unit may set, as the radius of the circular region, the distance from the center pixel to the farthest one of the first column, the second column, the first row, and the second row of the secondary preliminary pupil region.
  • alternatively, the pupil extracting unit may set, as the radius of the circular region, the distance from the center pixel to the nearest one of the first column, the second column, the first row, and the second row of the secondary preliminary pupil region.
  • according to these features, the primary preliminary pupil pixels for determining the final pupil region are extracted by a simple comparison between pixel values and set values in the final eye image data, so the amount of data to be processed is greatly reduced.
  • thereby, the data processing time is shortened and the extraction time of the final pupil region is reduced.
  • in addition, the secondary preliminary pupil region is extracted through pixel value comparison operations between each predetermined reference pixel and the comparison pixels positioned adjacent to it. This improves the accuracy of the operation of determining the secondary preliminary pupil region used to determine the final pupil region, thereby increasing the accuracy of the final pupil region.
  • FIG. 1 is a block diagram of a pupil recognition apparatus according to an embodiment of the present invention.
  • FIGS. 2A to 2D are operational flowcharts of a pupil extracting unit for extracting a final pupil region in the pupil recognition apparatus according to an embodiment of the present invention.
  • FIGS. 3A and 3B are diagrams for explaining the process of determining an estimated pupil region and a primary preliminary pupil region using the final eye image data according to the operation of the pupil extraction unit of the pupil recognition apparatus according to an embodiment of the present invention; FIG. 3A explains the process of determining the estimated pupil region, and FIG. 3B explains the process of determining the primary preliminary pupil region.
  • FIGS. 4(a) and 4(b) are views showing images related to the primary preliminary pupil region determined by the pupil extracting unit of the pupil recognition apparatus according to an embodiment of the present invention, wherein (a) shows an eye image and (b) shows an image in which the primary preliminary pupil region extracted from the eye image of (a) is output to the information output unit.
  • FIG. 5 is a diagram illustrating first to fourth reference pixels in the primary preliminary pupil region according to the operation of the pupil extracting unit of the pupil recognition apparatus according to an embodiment of the present invention.
  • FIG. 6 is a diagram for explaining the process of determining the secondary preliminary pupil region from the primary preliminary pupil region according to the operation of the pupil extracting unit of the pupil recognition apparatus according to an embodiment of the present invention.
  • FIG. 7 is a diagram showing an image of a secondary preliminary pupil region determined using the eye image shown in FIG. 4 (b).
  • FIG. 8 is a diagram illustrating an image of a final pupil region in the pupil recognition apparatus according to an embodiment of the present invention.
  • the pupil recognition apparatus includes an image acquisition unit 100, a pupil extraction unit 200 connected to the image acquisition unit 100, a storage unit 300 connected to the pupil extraction unit 200, and an information output unit 400 connected to the pupil extraction unit 200.
  • the image acquisition unit 100 captures an image of the person to be photographed.
  • specifically, the image acquisition unit 100 captures an area including an eye in the face of the person to be photographed for pupil recognition, and generates corresponding image data (hereinafter, this image data is referred to as 'original image data').
  • the image data of this example including the original image data has positional information and pixel values for a plurality of pixels arranged in a matrix structure, and the value of the image data for each pixel is a pixel value. Therefore, the image data is a set of pixels made up of a plurality of pixels having the corresponding pixel values.
  • the image capturing unit 100 may include an infrared camera for generating image data by infrared ray photographing in order to prevent red eye phenomenon of the subject during the photographing operation and accurately photograph the eye.
  • the pupil extracting unit 200 extracts a pupil region from the original image data received from the image acquisition unit 100 and includes an eye image data extracting unit 210, a preprocessing unit 220 connected to the eye image data extracting unit 210, and a pupil extracting unit 230 connected to the preprocessing unit 220.
  • the eye image data extraction unit 210 extracts, from the original image data applied from the image acquisition unit 100, image data for the region including the eye (hereinafter, this region is referred to as the eye region, and the extracted data as 'eye image data').
  • the eye image data extracting unit 210 extracts the eye image data from the original image data through a publicly available algorithm such as OpenCV.
  • the preprocessing unit 220 generates the final eye image data by performing a conversion operation on the eye image data before the pupil region is extracted.
  • specifically, the preprocessing unit 220 performs at least one preprocessing operation on the eye image data, such as grayscale conversion, pixel-count conversion, smoothing, or binarization, converting the amount of the eye image data, the size of the pixel values, and the like so that the processing time for extracting the pupil region can be shortened.
  • the preprocessing operation for the eye image data also uses a known processing operation, so a detailed description thereof will be omitted.
  • the preprocessing operation of the present example for the original image data may be omitted, and in this case, the preprocessing unit 220 of this example is also omitted.
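Two of the preprocessing steps named above, grayscale conversion and binarization, can be sketched as follows. The luma weights and the threshold are common defaults, not values taken from the patent, and the nested-list image format is an illustrative assumption.

```python
def to_grayscale(rgb_image):
    """Convert (R, G, B) triples to single intensity values using the
    common ITU-R BT.601 luma weights."""
    return [[int(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_image]

def binarize(gray_image, threshold=128):
    """Map each grayscale value to 0 or 255 around a threshold."""
    return [[255 if v >= threshold else 0 for v in row]
            for row in gray_image]
```

In practice an infrared camera as described above yields a single-channel image, in which case only the binarization (or smoothing) step applies.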
  • the pupil extracting unit 230 extracts image data for a pupil from the final eye image data applied from the preprocessing unit 220 to determine a pupil area.
  • the pupil extracting unit 230 first determines the primary preliminary pupil region using the pixel values of the final eye image data. Then, using first to fourth reference pixels located in the uppermost pixel row, the lowermost pixel row, the leftmost pixel column, and the rightmost pixel column of the primary preliminary pupil region, it sequentially changes the positions of first to fourth comparison pixels whose pixel values are compared with those of the reference pixels, thereby determining a secondary preliminary pupil region in which the primary preliminary pupil region is extended in the first and second column directions and in the first and second row directions.
  • the pupil extracting unit 230 then determines a circular region whose radius extends from the center pixel located at the center of the secondary preliminary pupil region to one of the first column (the column position of the uppermost pixel row of the secondary preliminary pupil region), the second column, the first row (the row position of the leftmost pixel column), and the second row, determines the secondary preliminary pupil pixels included in the circular region as final pupil pixels, and thereby determines the final pupil region.
  • the storage unit 300 connected to the pupil extracting unit 200 is a storage medium for storing data or information necessary for the operation of the pupil recognition apparatus and image data generated during operation, and may be a memory or the like.
  • the storage unit 300 may include at least one of a temporary storage device such as a buffer and a permanent storage device such as a hard disk.
  • the information output unit 400 outputs the corresponding image by the operation of the pupil extracting unit 200 so that the user can visually check the pupil extracting process.
  • the information output unit 400 may be a liquid crystal display (LCD), an organic light emitting display (OLED), or the like.
  • first, the image acquisition unit 100 performs a photographing operation on the face including the eyes to obtain the original image data.
  • the image acquisition unit 100 may photograph the corresponding area (e.g., a face area) in conjunction with the operation of a driving switch (not shown) by a user, or may photograph a person's face when a person is detected in a designated area.
  • the eye image data extracting unit 210 of the pupil extracting unit 200 then extracts image data for the eye region from the original image data, stores the extracted image data in the storage unit 300 as eye image data, and outputs the extracted eye image data to the preprocessing unit 220.
  • the preprocessing unit 220 performs at least one preprocessing operation, such as grayscale conversion, pixel-count conversion, or smoothing, on the input eye image data, stores the result in the storage unit 300, and outputs the preprocessed eye image data (hereinafter referred to as 'final eye image data') to the pupil extraction unit 230 (S13).
  • when the preprocessing unit 220 is not provided, the eye image data extracted by the eye image data extraction unit 210 itself becomes the final eye image data.
  • the pupil extracting unit 230 determines the pupil area using the transmitted final eye image data.
  • the pupil extracting unit 230 removes pixels having a pixel value less than a first set value (lower limit) and pixels having a pixel value exceeding a second set value (upper limit) from the final eye image data, so that only pixels having a pixel value within the set range, i.e., equal to or greater than the first set value and equal to or less than the second set value, are extracted (S14).
  • the first set value is smaller than the second set value; the first set value is determined using the pixel values of pixels belonging to the pupil region, or their average pixel value, and the second set value is set based on the first set value.
  • in this way, image data for areas not included in the pupil region (for example, other eye areas) are primarily removed.
  • the pupil extracting unit 230 then extracts, from the extracted pixels, only the pixels having the first set value, which is the lower limit of the set range (hereinafter, these extracted pixels are referred to as 'estimated pupil pixels'), and stores their position information and pixel values in the storage unit 300 (S15).
  • each estimated pupil pixel, i.e., a pixel having the first set value among the pixels belonging to the set range, is included in the estimated pupil area AR11, an area that can be estimated to be the pupil region.
  • the pupil extracting unit 230 determines, among the estimated pupil pixels, the estimated pupil pixel PX11 located in the uppermost pixel row (the uppermost estimated pupil pixel), the estimated pupil pixels PX12 and PX13 located in the lowermost pixel row (the lowest estimated pupil pixels), the estimated pupil pixel PX14 located in the leftmost pixel column (the leftmost estimated pupil pixel), and the estimated pupil pixel PX15 located in the rightmost pixel column (the rightmost estimated pupil pixel), and obtains from them the four pieces of position information (Xn-9, Xn, Yp, Yp+10) given by their row and column positions.
  • the rectangular area having as its vertices the first to fourth edge pixels PX(Xn-9, Yp), PX(Xn-9, Yp+10), PX(Xn, Yp) and PX(Xn, Yp+10), each defined by one of the pieces of position information (Xn-9, Yp), (Xn-9, Yp+10), (Xn, Yp) and (Xn, Yp+10), is then determined as the estimated pupil area AR11 (S16).
  • next, using the position information of the estimated pupil pixels, the pupil extracting unit 230 removes the estimated pupil pixels that are not positioned consecutively, by a predetermined number, with other immediately adjacent estimated pupil pixels in either the row direction or the column direction, thereby extracting the primary preliminary pupil pixels (S17).
  • neighboring other estimated pupil pixels are also located consecutively with other estimated pupil pixels in the row direction and the column direction.
  • when the estimated pupil pixels belonging to the estimated pupil area AR11 are as shown in FIG. 3A, the estimated pupil pixels PX14 to PX16 are excluded from the primary preliminary pupil pixels.
  • the estimated pupil pixel PX14 has no other estimated pupil pixel immediately adjacent to it in the row direction or the column direction, and the estimated pupil pixel PX15 likewise has no other estimated pupil pixel immediately adjacent to it in the row direction or the column direction.
  • the estimated pupil pixels PX16 and PX17 are located in the same pixel column and are consecutively arranged in the column direction, but they do not have further estimated pupil pixels continuously located in the column direction beyond the predetermined number.
  • therefore, these estimated pupil pixels PX16 and PX17 are also not selected as primary preliminary pupil pixels.
  • in contrast, the estimated pupil pixels PX11 to PX13 have other estimated pupil pixels positioned immediately adjacent to them in the row direction, and these adjacent estimated pupil pixels are themselves in contact with further estimated pupil pixels arranged consecutively in the row direction and the column direction. Therefore, the estimated pupil pixels PX11 to PX13 are selected as primary preliminary pupil pixels.
  • in this way, the primary preliminary pupil pixels are selected from the estimated pupil pixels located in the estimated pupil region AR11.
  • an example of the primary preliminary pupil pixels selected based on the estimated pupil pixels shown in FIG. 3A is shown in FIG. 3B.
  • the pupil extraction unit 230 then uses the position information of the primary preliminary pupil pixels to define the region including the primary preliminary pupil pixels as the primary preliminary pupil region AR12 (S18).
  • the primary preliminary pupil region AR12 is the region defined by the four pixels PX(Xn-9, Yp), PX(Xn-9, Yp+10), PX(Xn, Yp) and PX(Xn, Yp+10), each having one of the four pieces of position information (Xn-9, Yp), (Xn-9, Yp+10), (Xn, Yp) and (Xn, Yp+10) obtained from the row position information (Xn-9, Xn) of the pixels located in the uppermost and lowermost rows and the column position information (Yp, Yp+10) of the pixels located in the leftmost and rightmost columns.
  • when the primary preliminary pupil pixels selected from the estimated pupil pixels belonging to the estimated pupil area AR11 of FIG. 3A are as shown in FIG. 3B, these pixels form the primary preliminary pupil region AR12.
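The step of forming the primary preliminary pupil region from the selected pixels is essentially a bounding-box computation, which can be sketched as follows (the function name and tuple layout are illustrative assumptions):

```python
def primary_preliminary_region(pixels):
    """Return the axis-aligned bounding box of the primary preliminary
    pupil pixels as (min_row, max_row, min_col, max_col)."""
    rows = [r for r, _ in pixels]
    cols = [c for _, c in pixels]
    return min(rows), max(rows), min(cols), max(cols)
```

For example, the pixel set {(2, 3), (4, 7), (3, 5)} yields the region (2, 4, 3, 7).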
  • FIG. 4A shows an eye image output to the information output unit 400 according to the final eye image data.
  • FIG. 4B shows the primary preliminary pupil region AR12 in the eye image of FIG. 4A.
  • the information output unit 400 displays the extracted image.
  • the pupil extracting unit 230 then performs the operation of determining the secondary preliminary pupil area AR13.
  • the pupil extracting unit 230 first determines the first to fourth reference pixels PXr1, PXr2, PXr3, and PXr4, and stores position information on the determined reference pixels PXr1 to PXr4 in the storage unit 300 (S19).
  • specifically, the pupil extracting unit 230 defines, as the first to fourth reference pixels PXr1 to PXr4, respectively, the primary preliminary pupil pixel positioned at the center of the uppermost pixel row, the primary preliminary pupil pixel positioned at the center of the lowermost pixel row, the primary preliminary pupil pixel positioned at the center of the leftmost pixel column, and the primary preliminary pupil pixel positioned at the center of the rightmost pixel column of the primary preliminary pupil region AR12.
  • that is, using the position information (Xn-9, Xn, Yp, Yp+10) of the four primary preliminary pupil pixels PX(Xn-9, Yp), PX(Xn-9, Yp+10), PX(Xn, Yp) and PX(Xn, Yp+10) located at the corner portions of the primary preliminary pupil region AR12, the pupil extracting unit 230 determines the first to fourth reference pixels PXr1, PXr2, PXr3, and PXr4 and stores their position information in the storage unit 300.
  • when no primary preliminary pupil pixel is located exactly at the center of the uppermost pixel row, the lowermost pixel row, the leftmost pixel column, or the rightmost pixel column of the primary preliminary pupil region AR12, the primary preliminary pupil pixel located closest to the middle position of the corresponding pixel row or pixel column is determined as the reference pixel (e.g., PXr3).
  • when the first to fourth reference pixels PXr1 to PXr4 have been determined in this way, the pupil extracting unit 230 determines the secondary preliminary pupil area AR13 by extending the primary preliminary pupil region AR12 in the first and second column directions and in the first and second row directions using the reference pixels PXr1 to PXr4.
  • the pupil extracting unit 230 first determines, from the final eye image data stored in the storage unit 300, the first comparison pixel PXc1 as the pixel located immediately above the first reference pixel PXr1, which lies in the uppermost pixel row of the primary preliminary pupil region AR12, i.e., the pixel in the same position (Xn-3) moved by one column unit in the first column direction (the Y direction in which the column number decreases) (S110).
  • that is, the current first comparison pixel PXc1 (PXr1-1) is located at Yp-1, the position whose column number is decreased by '1' from the first reference pixel PXr1 in the same pixel column.
  • when the difference between the pixel values of the first reference pixel PXr1 and the current first comparison pixel PXc1 is equal to or less than the set value (S111), the pupil extracting unit 230 sets the pixel located immediately above the current first comparison pixel in the same pixel column (e.g., PXr1-2) as the new current first comparison pixel PXc1 (S113) and again determines whether the pixel value difference from the first reference pixel PXr1 is equal to or less than the set value (S111). At this time, the comparison pixel immediately before the new first comparison pixel is stored as the previous comparison pixel and the newly determined first comparison pixel is stored as the current comparison pixel.
  • when the pixel value difference exceeds the set value, the pupil extracting unit 230 determines the column position of the previous first comparison pixel PXc1 as the position of the uppermost column (i.e., the first column position) of the secondary preliminary pupil area AR13 and stores that position in the storage unit 300.
  • that is, using the position information (Xn-3, Yp-3) of the primary preliminary pupil pixel (e.g., PXr1-3) determined as the previous first comparison pixel PXc1, whose pixel value difference from the first reference pixel PXr1 was last equal to or less than the set value, its column position (Yp-3) is determined as the first column position (S114).
  • in this way, the pupil extracting unit 230 sequentially moves the position of the first comparison pixel PXc1 in column units along the first column direction from the first reference pixel PXr1, determines the first comparison pixel (i.e., the current first comparison pixel) whose pixel value difference from the first reference pixel exceeds the set value, and determines the column position of the first comparison pixel immediately before it (i.e., the previous first comparison pixel) as the first column position.
  • next, the pupil extracting unit 230 uses the second reference pixel PXr2 to determine the position of the lowermost column (i.e., the second column position) of the secondary preliminary pupil area AR13.
  • the pupil extracting unit 230 determines, from the final eye image data stored in the storage unit 300, the primary preliminary pupil pixel (PXr2+1) located immediately below the second reference pixel PXr2, which lies in the lowermost pixel row (Yp+10) of the primary preliminary pupil region AR12, moved by one column unit in the second column direction (i.e., the Y direction, opposite to the first column direction, in which the column number increases), as the current second comparison pixel PXc2, and determines whether the difference between the pixel values of the current second comparison pixel PXc2 and the second reference pixel PXr2 exceeds the set value (S115-S117).
  • when the difference between the pixel values of the second comparison pixel PXc2 and the second reference pixel PXr2 is equal to or smaller than the set value (S117), the primary preliminary pupil pixel (PXr2+2) located immediately below the current second comparison pixel (PXr2+1) is set as the new current second comparison pixel PXc2 (S118), and the pixel value comparison with the second reference pixel PXr2 is performed again (S115).
  • in this way, the pupil extracting unit 230 sequentially changes the current second comparison pixel PXc2 and performs the pixel value comparison operation until the pixel value difference between the second comparison pixel PXc2 and the second reference pixel PXr2 exceeds the set value.
  • when the pixel value difference exceeds the set value, the pupil extracting unit 230 sets the column position (Yp+11) of the primary preliminary pupil pixel (e.g., PXr2+1) that is the previous second comparison pixel, determined immediately before the current second comparison pixel PXc2, as the second column position of the secondary preliminary pupil region AR13 (S119).
  • in the same manner, the pupil extracting unit 230 determines the current third comparison pixel PXc3 and the current fourth comparison pixel PXc4 from the third and fourth reference pixels PXr3 and PXr4, and determines the position of the leftmost row (i.e., the first row position) and the position of the rightmost row (i.e., the second row position) of the secondary preliminary pupil region by comparing the pixel values of the current third and fourth comparison pixels PXc3 and PXc4 with those of the third and fourth reference pixels PXr3 and PXr4.
  • however, whereas the first and second comparison pixels PXc1 and PXc2 are determined by decreasing or increasing the position by one column unit from the corresponding reference pixels PXr1 and PXr2 along the first and second column directions, the third and fourth comparison pixels PXc3 and PXc4 are determined by decreasing or increasing the position by one row unit from the corresponding reference pixels PXr3 and PXr4 along the first and second row directions (S120-S124, S125-S129).
  • here, the first row direction is the X direction in which the row number decreases, and the second row direction is the X direction, opposite to the first row direction, in which the row number increases.
  • when the pixel value differences between the primary preliminary pupil pixel (PXr3-3) determined as the current third comparison pixel PXc3 and the third reference pixel PXr3, and between the primary preliminary pupil pixel (PXr4+4) determined as the current fourth comparison pixel PXc4 and the fourth reference pixel PXr4, each exceed the set value, the row positions of the primary preliminary pupil pixels (PXr3-2, PXr4+3) determined as the previous third and fourth comparison pixels immediately before them are determined as the first row position and the second row position of the secondary preliminary pupil area AR13, respectively.
  • The pupil extracting unit 230 extracts the region partitioned by the quadrangle as the secondary preliminary pupil region AR13, and the pixels included in the secondary preliminary pupil region AR13 are stored in the storage unit 300 as secondary preliminary pupil pixels (S130).
  • An example of the secondary preliminary pupil region AR13 extracted from the primary preliminary pupil region AR12 shown in FIG. 3B by the operation of the pupil extracting unit 230 is shown in FIG.
  • The secondary preliminary pupil region AR13 is determined on the basis of the primary preliminary pupil region AR12, but because the degree of expansion in the column direction differs from that in the row direction (that is, between the reference pixels PXr1-PXr4 and the respective comparison pixels PXc1-PXc4), the position of the primary preliminary pupil pixel located at the center of the primary preliminary pupil region AR12 and the position of the secondary preliminary pupil pixel located at the center of the secondary preliminary pupil region AR13 may be different from each other.
  • Accordingly, the accuracy of the extraction of the final pupil region is increased by the operation of the pupil extracting unit 230 according to the present example.
  • The secondary preliminary pupil region AR13 extracted based on the image shown in FIG. 4B can be output to the information output unit 400 as shown in FIG.
  • Using the position information (Xn-9, Xn+2, Yp-3, Yp+11) of the edge pixels located at the four corners of the secondary preliminary pupil region AR13 [PX(Xn-9, Yp-3), PX(Xn-9, Yp+11), PX(Xn+2, Yp-3), PX(Xn+2, Yp+11)], the secondary preliminary pupil pixel located in the middle portion of the secondary preliminary pupil region AR13 is determined as the center pixel PXcent of the secondary preliminary pupil region AR13 (S131).
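A minimal sketch of the center-pixel determination from the four corner positions, assuming the center is taken as the integer midpoint of the corner coordinates (the function name and the rounding rule are illustrative assumptions, not stated in the patent):

```python
def center_pixel(row1, row2, col1, col2):
    """Return the (row, column) position of the pixel in the middle of
    the rectangular region whose corners are (row1, col1), (row1, col2),
    (row2, col1) and (row2, col2)."""
    # integer midpoint of each coordinate pair (assumed rounding rule)
    return ((row1 + row2) // 2, (col1 + col2) // 2)
```

For a region spanning rows 0-10 and columns 0-14, this yields the pixel at (5, 7).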
  • The pupil extracting unit 230 calculates the distances R131 and R132 from the center pixel PXcent to the first and second columns and the distances R133 and R134 to the first and second rows, using the position information of the secondary preliminary pupil pixels included in the secondary preliminary pupil region AR13 (S131).
  • The distances R131, R132, R133, and R134 are the straight-line distances from the center pixel PXcent to the pixels of the first and second columns located in the same row as the center pixel PXcent, and to the pixels of the first and second rows located in the same column as the center pixel PXcent.
  • The pupil extracting unit 230 determines the distance having the greatest value (e.g., R131) among the calculated distances R131, R132, R133, and R134 as the radius of the final pupil region (S133).
  • The pupil extracting unit 230 determines a circular region having the center pixel PXcent as its center point and the determined radius (e.g., R131), determines the secondary preliminary pupil pixels included in the circular region as final pupil pixels, and determines the region including these final pupil pixels as the final pupil region AR20 (S134).
  • Since the final pupil region AR20 is determined using the largest value among the calculated distances R131, R132, R133, and R134, the pupil region can be extracted accurately without any part of the actual pupil region being excluded.
  • Alternatively, the final pupil region AR20 may be determined using the smallest value among the calculated distances R131, R132, R133, and R134. In this case, the accuracy of the final pupil pixels included in the region is increased.
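The radius selection and the circular selection of final pupil pixels can be sketched as follows, assuming straight-line (Euclidean) pixel distance. The `use_max` flag mirrors the two alternatives above: the largest radius so that no actual pupil pixel is excluded, or the smallest radius for higher accuracy of the included pixels. Function and parameter names are illustrative assumptions.

```python
import math

def final_pupil_pixels(region_pixels, center, row1, row2, col1, col2, use_max=True):
    """Determine the final pupil pixels inside a circular region.

    The four candidate radii are the straight-line distances from the
    center pixel to the first/second rows and first/second columns of
    the secondary preliminary pupil region; the largest (or smallest)
    of them is taken as the radius of the final pupil region.
    """
    cr, cc = center
    distances = [abs(cr - row1), abs(cr - row2), abs(cc - col1), abs(cc - col2)]
    radius = max(distances) if use_max else min(distances)

    # keep the secondary preliminary pupil pixels that fall inside the circle
    return [(r, c) for (r, c) in region_pixels
            if math.hypot(r - cr, c - cc) <= radius]
```

For a 5x5 region centered at (2, 2), all four candidate distances equal 2, and the selected pixels form the disc of radius 2 around the center.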
  • The position information and pixel values of the secondary preliminary pupil pixels included in the final pupil region AR20 are stored in the storage unit 300 as the position information and pixel values of the final pupil pixels (S135).
  • The final pupil region AR20 extracted based on the image of FIG. 7 is shown in FIG.
  • image acquiring unit 200 pupil extracting unit
  • eye image data extraction unit 220 preprocessing unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Eye Examination Apparatus (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention relates to a pupil recognition apparatus comprising a pupil extracting unit which: extracts, from final eye image data, pixels having pixel values greater than or equal to a first set value and pixels having pixel values less than or equal to a second set value that is greater than the first set value, then selects primary preliminary pupil pixels from among the pixels having a pixel value equal to the first set value, thereby determining a primary preliminary pupil region containing the primary preliminary pupil pixels; expands the primary preliminary pupil region in the first and second row directions and in the first and second column directions by means of first to fourth reference pixels located respectively in the uppermost pixel column, the lowermost pixel column, the leftmost pixel row, and the rightmost pixel row of the primary preliminary pupil region, thereby determining a secondary preliminary pupil region defined by the first and second rows and the first and second columns; and determines a circular region whose diameter is the distance between a center pixel located at the center of the secondary preliminary pupil region and one of the first row, the second row, the first column, and the second column of the secondary preliminary pupil region, determines the secondary preliminary pupil pixels included in the circular region as final pupil pixels, and determines the region containing the final pupil pixels as a final pupil region.
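One plausible reading of the thresholding step summarized above, sketched in Python under the assumption that the extraction clamps pixel values into the range [first set value, second set value] so that dark pupil pixels saturate at the first set value; the function names and the clamping interpretation are illustrative assumptions, not the patented implementation.

```python
def clip_pixels(img, first_set_value, second_set_value):
    """Clamp every pixel value into [first_set_value, second_set_value]."""
    return [[min(max(v, first_set_value), second_set_value) for v in row]
            for row in img]

def primary_preliminary_pupil_pixels(img, first_set_value, second_set_value):
    """Return positions of pixels whose clamped value equals the first
    set value; these are the primary preliminary pupil pixel candidates."""
    clipped = clip_pixels(img, first_set_value, second_set_value)
    return [(r, c)
            for r, row in enumerate(clipped)
            for c, v in enumerate(row)
            if v == first_set_value]
```

With set values 10 and 100, only pixels at or below 10 in the input are selected as candidates.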
PCT/KR2018/015820 2018-01-23 2018-12-13 Pupil recognition apparatus WO2019146906A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2018-0008425 2018-01-23
KR1020180008425A KR102037779B1 (ko) 2018-01-23 2018-01-23 Pupil recognition apparatus

Publications (1)

Publication Number Publication Date
WO2019146906A1 true WO2019146906A1 (fr) 2019-08-01

Family

ID=67396057

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/015820 WO2019146906A1 (fr) 2018-01-23 2018-12-13 Pupil recognition apparatus

Country Status (2)

Country Link
KR (1) KR102037779B1 (fr)
WO (1) WO2019146906A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040085220A * 2002-02-22 2004-10-07 Pixology Software Limited Method of detecting and correcting red-eye features in a digital image
JP2010170347A * 2009-01-22 2010-08-05 Nec Corp Image processing apparatus, biometric authentication apparatus, image processing method, and program
KR20140106926A * 2013-02-27 2014-09-04 Electronics and Telecommunications Research Institute Pupil detection apparatus and pupil detection method
KR20150070802A * 2013-12-17 2015-06-25 Hyundai Motor Company Pupil detection apparatus and pupil detection method
KR101769741B1 * 2016-09-26 2017-08-21 CrucialTec Co., Ltd. Iris recognition method and iris recognition apparatus using pupil detection

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100397750B1 2000-03-24 2003-09-13 김회율 Real-time pupil detection method for iris recognition
KR100572410B1 2003-03-03 2006-04-18 LG Electronics Inc. Method of estimating a pupil region for iris recognition

Also Published As

Publication number Publication date
KR20190089632A (ko) 2019-07-31
KR102037779B1 (ko) 2019-10-29

Similar Documents

Publication Publication Date Title
WO2017099427A1 Convergent biometric authentication method based on finger knuckle and finger vein, and associated apparatus
WO2015174647A1 User authentication method, device for executing same, and recording medium for storing same
WO2020027607A1 Object detection device and control method
WO2015137635A1 Image capturing apparatus and method for generating an image containing depth information
WO2016163755A1 Method and apparatus for face recognition based on quality measurement
WO2018117353A1 Method for detecting the boundary between the iris and the sclera
US20070116364A1 Apparatus and method for feature recognition
US20080192122A1 Photographing apparatus, method and computer program product
JP3490910B2 Face region detection device
WO2010131435A1 Pattern recognition apparatus and corresponding method configured to recognize an object and another lower-order object
WO2016122068A1 Method for recognizing a tire and associated device
WO2010041836A2 Method for detecting a skin color region using a variable skin color model
WO2019168264A1 Electronic device and control method therefor
KR20130000828A Face feature detection method
WO2016108327A1 Vehicle detection method, database structure for vehicle detection, and database construction method for vehicle detection
WO2016060439A1 Image processing method and apparatus
WO2018139847A1 Personal identification method by facial comparison
CN112949365A Living face recognition system and method
WO2022092743A1 Method for extracting vehicle license plate characters, and license plate character extraction device applying the method
WO2019146906A1 Pupil recognition apparatus
WO2019146907A1 Iris recognition apparatus
WO2014051309A1 Stereo matching apparatus using an image property
WO2014035050A1 Method for adjusting image brightness, device therefor, and stereoscopic camera
WO2018088649A1 Method for detecting reflection
WO2014021490A1 Screen perception testing device, and screen perception testing system using same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18902231

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18902231

Country of ref document: EP

Kind code of ref document: A1