EP3459009A2 - Adaptive quantization method for iris image encoding - Google Patents

Adaptive quantization method for iris image encoding

Info

Publication number
EP3459009A2
Authority
EP
European Patent Office
Prior art keywords
iris
mask
image
converted
code
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP17824490.1A
Other languages
German (de)
English (en)
Other versions
EP3459009A4 (fr)
Inventor
Mikhail Vladimirovich KOROBKIN
Vladimir Alekseevich EREMEEV
Aleksei Mikhailovich FARTUKOV
Gleb Andreevich ODINOKIKH
Vitaly Sergeevich GNATYUK
Aleksei Bronislavovich DANILEVICH
Dae-Kyu Shin
Ju-woan YOO
Kwang-Hyun Lee
Hee-Jun Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority claimed from PCT/KR2017/007066 external-priority patent/WO2018008934A2/fr
Publication of EP3459009A2 publication Critical patent/EP3459009A2/fr
Publication of EP3459009A4 publication Critical patent/EP3459009A4/fr
Withdrawn legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/28Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G06V10/449Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/197Matching; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/70Multimodal biometrics, e.g. combining information from different biometric modalities

Definitions

  • The present disclosure relates to a method and an apparatus for iris recognition. More particularly, the present disclosure relates to an adaptive quantization method for iris image encoding.
  • Electronic devices may store information, such as addresses, call history, messages, and the like, as well as personal information, such as location information of a person, memos, financial transactions, and the like.
  • Accordingly, electronic devices may provide various security functions. In particular, methods of maintaining the security of electronic devices by using biometric information of a user have become widespread. Such methods may include fingerprint recognition, face recognition, iris recognition, and the like.
  • Iris recognition is a security verification technique that uses features of the iris, which vary from person to person.
  • Iris recognition may be performed by a camera without direct contact with the user's body.
  • Iris recognition technology on mobile devices faces many difficulties, for example, severely changing environmental conditions (e.g., indoors/outdoors, clear weather/cloudy weather, wearing of glasses or contact lenses), limitations in device performance that do not allow iris recognition in real time (central processing unit (CPU), random access memory (RAM), camera resolution, and the like), and difficulties in interaction with a user (interactions are sometimes inconvenient for the user).
  • A workflow in iris recognition according to the related art may include quantization of an iris image for iris code matching.
  • Inconsistent bits occurring in quantization, for example, from variation in environmental conditions, may increase the false rejection rate (FRR) and decrease the accuracy and robustness of iris recognition.
  • An aspect of the present disclosure is to provide an apparatus and method for iris recognition.
  • A certain part of an iris image may provide inconsistent visual information, and the performance of iris recognition may degrade due to that part. Therefore, in order to improve the performance of iris recognition, the part providing the inconsistent visual information may be blocked during the iris recognition processes.
  • a user recognition method using an iris includes generating a first mask for blocking a non-iris object area of an iris image, generating a converted iris image, in which the non-iris object area is blocked according to the first mask, generating a second mask for additionally blocking an inconsistent area, in which quantization results of the converted iris image are inconsistent, by adaptively transforming the first mask according to features of the converted iris image, obtaining an iris code by quantizing pixels included in the iris image, obtaining a converted iris code, in which portions corresponding to the non-iris object area and the inconsistent area are blocked, by applying the second mask to the iris code, and recognizing a user by matching a reference iris code, stored by the user in advance, to the converted iris code.
  • a user recognition device by using an iris includes a mask generator configured to generate a first mask for blocking a non-iris object area of an iris image, to generate a converted iris image, in which the non-iris object area is blocked according to the first mask, and to generate a second mask for additionally blocking an inconsistent area, in which quantization results of the converted iris image are inconsistent, by transforming the first mask adaptively according to features of the converted iris image, an iris code generator configured to obtain an iris code by quantizing pixels included in the iris image, and to obtain a converted iris code, in which portions corresponding to the non-iris object area and the inconsistent area are blocked, by applying the second mask to the iris code, and an iris scanner configured to recognize a user by matching a reference iris code, stored by the user in advance, to the converted iris code.
  • a computer program product includes a computer-readable storage medium, wherein the computer-readable storage medium includes instructions for performing each process in the user recognition method by using the iris.
  • Embodiments of the present disclosure search for and remove the inconsistent bits by using adaptive critical values during or before iris code matching or database registration, in order to improve the accuracy and the robustness of the iris code matching.
  • FIG. 1 is a flowchart illustrating iris recognition processes according to an embodiment of the present disclosure
  • FIG. 2 is a diagram showing segmenting an iris image from an eye image according to an embodiment of the present disclosure
  • FIGS. 3 and 4 illustrate normalizing an iris image according to various embodiments of the present disclosure
  • FIG. 5 is a diagram of a first mask and an iris image to which the first mask is applied according to an embodiment of the present disclosure
  • FIG. 6 is a diagram of a converted iris image, in which a non-iris object region is blocked by applying a first mask according to an embodiment of the present disclosure
  • FIG. 7 is a diagram of a converted iris image that is binarized in order to emphasize characteristics of an iris according to an embodiment of the present disclosure
  • FIG. 8 is a diagram of generating an iris feature vector according to a Gabor filter and binarization according to an embodiment of the present disclosure
  • FIG. 9 is a diagram of an inconsistent area regarding an iris feature vector according to an embodiment of the present disclosure.
  • FIG. 10 illustrates determining a critical value according to a block quota according to an embodiment of the present disclosure
  • FIG. 11 illustrates generating a second mask by transforming a first mask according to features of a converted iris image according to an embodiment of the present disclosure
  • FIG. 12 illustrates obtaining a converted iris code by applying a second mask to an iris code according to an embodiment of the present disclosure
  • FIG. 13 is a diagram of a user recognition device by using an iris according to an embodiment of the present disclosure.
  • FIG. 14 is a diagram of a user device including a user recognition device according to an embodiment of the present disclosure.
  • a user recognition method that uses an iris comprises generating a first mask for blocking a non-iris object area of an iris image, generating a converted iris image, in which the non-iris object area is blocked according to the first mask, generating a second mask for additionally blocking an inconsistent area, in which quantization results of the converted iris image are inconsistent, by adaptively transforming the first mask according to features of the converted iris image, obtaining an iris code by quantizing pixels included in the iris image, obtaining a converted iris code, in which portions corresponding to the non-iris object area and the inconsistent area are blocked, by applying the second mask to the iris code, and recognizing a user by matching a reference iris code, stored by the user in advance, to the converted iris code.
  • A user recognition device by using an iris comprises a mask generator, an iris code generator, and an iris scanner.
  • the mask generator is configured to generate a first mask for blocking a non-iris object area of an iris image, generate a converted iris image, in which the non-iris object area is blocked according to the first mask, and generate a second mask for additionally blocking an inconsistent area, in which quantization results of the converted iris image are inconsistent, by transforming the first mask adaptively according to features of the converted iris image.
  • the iris code generator is configured to obtain an iris code by quantizing pixels included in the iris image, and obtain a converted iris code, in which portions corresponding to the non-iris object area and the inconsistent area are blocked, by applying the second mask to the iris code.
  • the iris scanner is configured to recognize a user by matching a reference iris code, stored by the user in advance, to the converted iris code.
  • A computer program product comprises at least one non-transitory computer-readable storage medium.
  • the computer-readable storage medium comprises instructions for performing each process in the user recognition method by using the iris.
  • The term “module” or “unit” may perform at least one function or operation, and may be implemented as hardware, software, or a combination thereof.
  • “A plurality of modules” or “a plurality of units” may be implemented as at least one processor (not shown) via combination with one or more other modules, except for a “module” or “unit” that needs to be implemented as certain hardware.
  • the term "and/or" includes any and all combinations of one or more of the associated listed items. Expressions, such as "at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
  • An electronic device may include at least one of, for example, a smart phone, a tablet personal computer (PC), a mobile phone, a video phone, an electronic book reader (e-book reader), a desktop PC, a laptop PC, a personal digital assistant (PDA), a portable multimedia player (PMP), a moving picture experts group (MPEG-1) audio layer-3 (MP3) player, a mobile medical device, a camera, and a wearable device (e.g., a head-mounted display (HMD), for example, electronic glasses, electronic clothing, an electronic bracelet, an electronic necklace, an electronic accessory, an electronic tattoo, or a smart watch).
  • FIG. 1 is a flowchart illustrating iris recognition processes according to an embodiment of the present disclosure.
  • The iris recognition process according to the embodiment of the present disclosure may include the following operations S101 to S106.
  • a first mask for blocking a non-iris object area in an iris image is generated.
  • an eye image may be extracted from a face image captured by an internal or external camera of an electronic device.
  • the face image denotes an image including the whole face of a user or a partial face including at least an eye portion.
  • the eye image corresponds to the eye region in the face image.
  • the electronic device for obtaining the face image includes a mobile terminal, a mobile phone, a desktop computer, a smart watch, and the like, but is not limited thereto.
  • the face image may be captured as a full-color image or a monochrome image.
  • the iris image is an image of an iris and objects adjacent to the iris included in the eye.
  • the iris image is segmented from the eye image, and in this specification, the term "segmentation" denotes emphasizing or selecting a certain object from an image.
  • the iris image may include such objects as a pupil, an eyelid, eyelashes, a sclera, an iris, and the like.
  • FIG. 2 shows an example of an iris image segmentation.
  • FIG. 2 is a diagram showing segmenting an iris image from an eye image according to an embodiment of the present disclosure.
  • an eye image 210 and an iris image 220 are illustrated.
  • the left side of FIG. 2 shows segmenting of the iris image 220 from the eye image 210 included in a face image 200.
  • the right side of FIG. 2 shows the iris image 220 emphasized in grey according to the result of segmentation.
  • an area of the iris image 220 is determined by excluding a circular area that is determined according to the size of the pupil from a circular area determined according to the size of the iris. Therefore, the shape of the area of the iris image 220 is a ring.
  • a pattern of the iris is determined by analyzing muscles of the iris, which are spread radially from the pupil. Therefore, if the ring-shaped iris image is transformed into a rectangular iris image, it may be easier to analyze the pattern of the iris.
  • the above transformation of the iris image is defined as a normalization of the iris image.
  • FIGS. 3 and 4 illustrate normalization of the iris image.
  • FIGS. 3 and 4 illustrate normalizing an iris image according to various embodiments of the present disclosure.
  • a location of each pixel in a ring-shaped iris image 300 may be expressed as (r, ⁇ ) according to polar coordinates.
  • r denotes a distance from a center to the pixel
  • θ denotes a direction of the pixel from the center.
  • the ring-shaped iris image 300 may be normalized by corresponding r to a vertical axis and ⁇ to a horizontal axis, as shown on the right side of FIG. 3.
  • A and B in the ring-shaped iris image 300 may correspond to A and B in a rectangular iris image 310.
  • Other pixels in the ring-shaped iris image 300 also correspond to the rectangular iris image 310 according to (r, ⁇ ).
  • the rectangular iris image 310 that has been normalized is generated.
  • Referring to FIG. 4, an example of the normalization of the iris image is illustrated.
  • the left side of FIG. 4 shows a ring-shaped iris image 400.
  • the right side of FIG. 4 shows an iris image 410 that is obtained by normalizing the ring-shaped iris image 400.
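  • The normalization described above can be sketched in code. The following is a minimal illustration only, assuming a grayscale eye image stored as a NumPy array and concentric circular pupil and iris boundaries; the function and parameter names are illustrative and not taken from the patent.

```python
import numpy as np

def normalize_iris(eye, pupil_center, pupil_radius, iris_radius,
                   out_height=32, out_width=128):
    """Unwrap a ring-shaped iris region into a rectangular image.

    Each output row corresponds to a radius r between the pupil and
    iris boundaries; each column corresponds to an angle theta, so a
    pixel at polar position (r, theta) in the ring maps to a pixel in
    the rectangle.
    """
    cy, cx = pupil_center
    rect = np.zeros((out_height, out_width), dtype=eye.dtype)
    for i in range(out_height):
        # radius grows from the pupil boundary to the iris boundary
        r = pupil_radius + (iris_radius - pupil_radius) * i / (out_height - 1)
        for j in range(out_width):
            theta = 2 * np.pi * j / out_width
            y = int(round(cy + r * np.sin(theta)))
            x = int(round(cx + r * np.cos(theta)))
            if 0 <= y < eye.shape[0] and 0 <= x < eye.shape[1]:
                rect[i, j] = eye[y, x]  # nearest-neighbor sampling
    return rect
```

A real implementation would typically use sub-pixel interpolation and pupil/iris circles that need not be concentric; this sketch keeps only the polar-to-rectangular mapping the text describes.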
  • When obstacles, such as the eyelid or the eyelashes, are included in the iris image, the performance of iris recognition may degrade. Therefore, it is necessary to remove the obstacles, such as the eyelid or the eyelashes, from the iris image.
  • an area corresponding to the obstacles, such as the eyelid or the eyelashes is defined as the non-iris object area.
  • an area corresponding to the iris is defined as an iris object area.
  • a mask for inactivating the non-iris object area may be generated.
  • the mask for removing the non-iris object area is defined as a first mask.
  • the term 'first mask' is just defined for convenience of description, and may be defined by other terms.
  • FIG. 5 is a diagram of a first mask and an iris image to which the first mask is applied according to an embodiment of the present disclosure.
  • a black portion in the first mask 500 corresponds to the non-iris object area. Therefore, a part of the iris image corresponding to the black portion is not used in iris matching.
  • a white portion in the first mask 500 corresponds to the iris object area. Therefore, a part of the iris image corresponding to the white portion may be used in matching to a corresponding portion of a reference iris image that is stored in advance.
  • the reference iris image denotes an image showing an iris pattern of the user, which is stored in an internal or external database in advance.
  • The first mask 500 may be expressed as bits. Whether to block a pixel in the iris image may be expressed by at least one bit. When the one bit is 0, the pixel corresponding to the one bit is excluded from the iris matching. On the other hand, when the one bit is 1, the pixel corresponding to the one bit may be used in the iris matching. If each of the pixels in the iris image is expressed by a complex number including a real part and an imaginary part, whether to block the pixel in the iris image may be expressed by two bits.
  • the first mask 500 may be applied equally to the reference image that is compared with the iris image, as well as the iris image.
  • a combined mask that is obtained by combining the first mask 500 with the reference mask may be applied equally to the iris image and the reference image.
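  • Comparing two iris codes under a combined mask is commonly done with a fractional Hamming distance counted only over bits both masks mark usable. The sketch below is a hedged illustration of that idea, not the patent's matching procedure; the 0.32 decision threshold is an assumed, illustrative value.

```python
import numpy as np

def masked_hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fractional Hamming distance over bits valid in both masks.

    code_a/code_b: binary iris codes (arrays of 0/1).
    mask_a/mask_b: 1 = usable bit, 0 = blocked (non-iris object
    area or inconsistent area).
    """
    code_a, code_b = np.asarray(code_a), np.asarray(code_b)
    valid = (np.asarray(mask_a) == 1) & (np.asarray(mask_b) == 1)  # combined mask
    n = int(np.count_nonzero(valid))
    if n == 0:
        return 1.0  # no comparable bits: treat as maximally distant
    disagreements = int(np.count_nonzero(code_a[valid] != code_b[valid]))
    return disagreements / n

def is_same_user(distance, threshold=0.32):
    # threshold is illustrative; real systems tune it empirically
    return distance < threshold
```

Blocking an inconsistent bit removes it from both the numerator and the denominator, so an unreliable bit neither helps nor hurts the match score.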
  • FIG. 6 is a diagram of a converted iris image, in which a non-iris object region is blocked by applying a first mask according to an embodiment of the present disclosure.
  • FIG. 6 shows a converted iris image 600, in which the non-iris object area is blocked by applying the first mask.
  • In the converted iris image 600, only the iris object area remains because the non-iris object area is blocked.
  • the iris pattern of the converted iris image may be quantized for iris matching.
  • Quantization converts a physical quantity having continuous values into a physical quantity having discrete values.
  • The quantization process may include a binarization process that expresses numerical values by bits having values of 0 and 1, and may be a part of an encoding process. As an example of binarization, when a pixel value of the converted iris image is less than 0, the pixel value is determined to be 0, and when the pixel value of the converted iris image is greater than 0, the pixel value may be determined to be 1. Therefore, as a result of binarization, the pixels may be expressed by bits.
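  • The sign-based binarization just described can be sketched as follows; this is a minimal illustration assuming pixel values stored in a NumPy array (values exactly equal to 0 are mapped to 0 here, a boundary detail the text leaves open).

```python
import numpy as np

def binarize(values):
    """Quantize continuous pixel values to bits: <= 0 -> 0, > 0 -> 1."""
    return (np.asarray(values) > 0).astype(np.uint8)
```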
  • the iris pattern of one user may be captured a little differently depending on the health condition of the user and a photographing environment, and the like. If the iris pattern of the reference iris image and the iris pattern of the iris image are different from each other due to the influence of the photographing environment, a user verification device may recognize the iris image of the same user as an iris image of another person. Therefore, fine variation in the iris pattern caused by the photographing environment may be removed by binarizing the converted iris image.
  • When a pixel value is close to 0, the binarization result of the pixel value may not be consistent.
  • For example, a pixel having a pixel value between -1 and 1 may be binarized into 0 or 1 according to the peripheral environment.
  • a bit of the pixel having inconsistent result of binarization is defined as an inconsistent bit. Therefore, in order to improve robustness of the iris matching, there is a need to additionally block the pixel that is determined to include the inconsistent bit during the iris matching process.
  • FIG. 7 is a diagram of a converted iris image that is binarized in order to emphasize characteristics of an iris according to an embodiment of the present disclosure.
  • a converted iris image 700 that is binarized is illustrated.
  • a grey portion in the binarized converted iris image 700 denotes an area that is binarized into the value 1.
  • a white portion in the binarized converted iris image 700 denotes an area that is binarized into the value 0. Since the pixels at a boundary between the grey portion and the white portion are likely to have a value close to 0, the pixel values may be binarized into 0 or into 1 according to the peripheral environment. Therefore, with respect to the pixels at the boundary between the grey portion and the white portion, it is determined whether the binarization result is consistent, and it is necessary to additionally block the pixels, the binarization results of which are determined inconsistent, from the iris matching.
  • the first mask 500 is adaptively transformed according to characteristics of the converted iris image to generate a second mask that additionally blocks an inconsistent area, where the quantization result or binarization result of the converted iris image is not consistent.
  • the inconsistent area includes the pixels, the quantization or binarization results of which are likely to be changed according to the peripheral environment.
  • the inconsistent area is determined based on the pixels of the converted iris image, and the first mask 500 is transformed to additionally block the inconsistent area, as well as the non-iris object area. According to the transformation result, the second mask blocking both the non-iris object area and the inconsistent area is generated.
  • the pixels of the converted iris image may each have a real number value.
  • the real number value of the pixel may be determined based on a grey level corresponding to the pixel of the iris image.
  • the pixels in the converted iris image may each be expressed by a complex number.
  • the pixel value expressed by the complex number may be determined based on a grey level corresponding to the pixel of the iris image.
  • the grey levels of the pixels in the converted iris image may be expressed as a set of complex numbers representing amplitude information and phase information of the iris pattern according to a Gabor filter.
  • The amplitude information, which is affected by the light source or camera gain, is removed, and only the phase information representing the signs of the complex numbers corresponding to the pixels of the converted iris image may be used.
  • The phase information may be extracted by techniques or filters other than the above Gabor filter, including a filter based on a Fourier transform, a filter based on a wavelet transform, and the like.
  • FIG. 8 is a diagram of generating an iris feature vector according to a Gabor filter and binarization according to an embodiment of the present disclosure.
  • a graph 800 of a method of determining an iris feature vector according to the Gabor filter and binarization is illustrated.
  • the grey level of the pixel may be expressed by the complex number according to the Gabor filter.
  • The amplitude information representing the magnitude of the complex number is removed, and only the phase information representing the sign of the complex number is used. Therefore, the complex number value of the pixel obtained by using the Gabor filter is binarized to generate the iris feature vector.
  • Re denotes a real part of the complex number value
  • Im denotes an imaginary part of the complex number value.
  • For example, when both the real part and the imaginary part of the pixel value are greater than 0, the iris feature vector is determined to be {1, 1}.
  • When the pixel value is -4+1.2j, the real part of the pixel value is less than 0 and the imaginary part is greater than 0, and thus, the iris feature vector is determined to be {-1, 1}.
  • An iris code that will be described below is determined to be a set of iris feature vectors.
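  • The quadrant coding described above reduces to taking the signs of the real and imaginary parts of each complex Gabor response. A minimal sketch (the function name is illustrative, not from the patent):

```python
def feature_vector(z):
    """Two-level phase quantization of a complex Gabor response.

    Amplitude is discarded; only the signs of the real and imaginary
    parts, i.e., the phase quadrant of z, are kept.
    """
    return (1 if z.real > 0 else -1, 1 if z.imag > 0 else -1)
```

The iris code is then the concatenation of these per-pixel vectors.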
  • the inconsistent area of the converted iris image may include a first inconsistent area about the real part of the pixels and a second inconsistent area about the imaginary part of the pixels.
  • the first inconsistent area includes the pixels, the real parts of which are close to 0 and have inconsistent binarization results.
  • the second inconsistent area includes the pixels, the imaginary parts of which are close to 0 and have inconsistent binarization results.
  • the first inconsistent area and the second inconsistent area are determined independently from each other. Therefore, the first inconsistent area and the second inconsistent area may be different from each other.
  • In order to determine the inconsistent area, a range of pixel values corresponding to the inconsistent area is set by a critical value. For example, when the critical value is 0.5, pixels having pixel values ranging from -0.5 to 0.5 are included in the inconsistent area. In other words, when the absolute value of the pixel value is less than the critical value, the pixel is included in the inconsistent area.
  • the first critical value and the second critical value may be set to be equal to each other.
  • When the pixel of the iris image is expressed by a complex number, the pixel may be included in the first inconsistent area when the absolute value of the real part of the pixel value is less than the first critical value.
  • Likewise, the pixel may be included in the second inconsistent area when the absolute value of the imaginary part of the pixel value is less than the second critical value.
  • For example, when the absolute value of the real part of the pixel value is 0.5 and the first critical value is 0.6, the pixel is included in the first inconsistent area.
  • When the absolute value of the imaginary part of the pixel value is 0.6 and the second critical value is 0.5, the pixel is not included in the second inconsistent area.
  • FIG. 9 is a diagram of an inconsistent area regarding an iris feature vector according to an embodiment of the present disclosure.
  • a region A 910 includes pixels having the real part, the absolute value of which is less than a first critical value 912.
  • a region B 920 includes pixels having the imaginary part, the absolute value of which is less than a second critical value 922.
  • the first inconsistent area is determined according to the location of the pixel corresponding to the region A 910
  • the second inconsistent area is determined according to the location of the pixel corresponding to the region B 920.
  • a pixel corresponding to a region C 930 is not blocked according to the first critical value 912 and the second critical value 922, and thus, may be used in the iris matching.
  • the critical value may be determined adaptively to the converted iris image.
  • the critical value may be a fixed value regardless of the converted iris image.
  • both the first and second critical values may be determined adaptively to the converted iris image.
  • both the first and second critical values may be fixed values determined in advance.
  • Alternatively, only the first critical value may be determined adaptively to the iris image while the second critical value is a fixed value set in advance, or vice versa.
  • the critical value may be determined according to a block quota that represents a ratio of pixels to be blocked.
  • A minimum number of pixels to be blocked is determined according to the pixels of the converted iris image and the block quota. According to the minimum number determined above, the pixels are included in the inconsistent area in ascending order of the absolute values of their pixel values.
  • For example, when the block quota is 20% and the number of pixels in the converted iris image is one hundred, at least twenty (20) pixels have to be blocked. Therefore, twenty pixels are selected in ascending order of the absolute values of the pixel values, and the largest value among the absolute values of the twenty pixels is determined to be the critical value. In addition, it is determined that all the pixels having absolute values equal to or less than the critical value are included in the inconsistent area. According to an embodiment of the present disclosure, one of ordinary skill in the art may change the method of setting the critical value so that only the pixels having absolute values strictly less than the critical value are included in the inconsistent area.
  • FIG. 10 illustrates determining a critical value according to a block quota according to an embodiment of the present disclosure.
  • the pixels are arranged in an ascending order of the absolute values. If the block quota is 7% and the number of pixels in the converted iris image is one hundred, it is determined that at least seven pixels (1010) are to be blocked. Therefore, the absolute value 0.2 of a pixel 1012, which has the seventh smallest absolute value, is determined to be the critical value. In addition, the ten pixels (1020) having absolute values that are equal to or less than 0.2 are determined to be included in the inconsistent area.
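The quota-based selection of the critical value described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the function names and the tie-handling are assumptions:

```python
import math

def critical_value(pixel_values, block_quota):
    # Sort absolute values in ascending order and take the k-th smallest,
    # where k is the minimum number of pixels the block quota requires
    # to be blocked.
    k = max(1, math.ceil(len(pixel_values) * block_quota))
    ordered = sorted(abs(v) for v in pixel_values)
    return ordered[k - 1]

def inconsistent_area(pixel_values, critical):
    # Pixels whose absolute values are equal to or less than the critical
    # value are marked as belonging to the inconsistent area.
    return [abs(v) <= critical for v in pixel_values]
```

With a block quota of 40% over the five pixel values [0.05, -0.1, 0.2, 0.9, -0.5], k = 2 and the critical value is 0.1, so the first two pixels are marked inconsistent. As in the FIG. 10 example, more pixels than the quota requires may end up blocked when several pixels share the critical value.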
  • the block quota may be determined in advance through statistical experiments. When the block quota is reduced, the area used in the iris matching increases and the false acceptance rate may decrease, but the inconsistent area excluded from the iris matching shrinks and the false rejection rate may increase. On the other hand, when the block quota is increased, the inconsistent area excluded from the iris matching grows and the false rejection rate may decrease, but the area used in the iris matching shrinks and the false acceptance rate may increase. Therefore, an appropriate block quota has to be used in setting the critical value.
  • the false rejection rate cited above denotes the rate of negative verification results occurring in definitely affirmative cases, for example, cases where the eyes of the same person are compared.
  • the false acceptance rate cited above denotes the rate of positive verification results occurring in definitely negative cases, for example, cases where the eyes of different persons are compared.
  • the false rejection rate and the false acceptance rate have to be maintained low in order to improve the robustness and accuracy in the iris matching.
  • the false rejection rate and the false acceptance rate that are allowable in the iris matching may be determined differently according to a required accuracy of the iris recognition based on the field to which the iris recognition is applied. Therefore, the block quota may be determined based on the allowable false rejection rate and the false acceptance rate.
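The two error rates discussed above could be measured over a labeled set of verification outcomes as sketched below. The data layout (one boolean acceptance decision per comparison) is an assumption for illustration:

```python
def error_rates(genuine_accepted, impostor_accepted):
    # genuine_accepted: acceptance decisions for same-eye comparisons
    # impostor_accepted: acceptance decisions for different-eye comparisons
    frr = sum(1 for a in genuine_accepted if not a) / len(genuine_accepted)
    far = sum(1 for a in impostor_accepted if a) / len(impostor_accepted)
    return frr, far
```

For example, one rejection out of four genuine comparisons gives a false rejection rate of 0.25, and one acceptance out of four impostor comparisons gives a false acceptance rate of 0.25.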
  • a first block quota applied to setting of the first critical value with respect to the real part of the pixel and a second block quota applied to setting of the second critical value with respect to the imaginary part of the pixel may be separately set.
  • the first block quota and the second block quota may be set equal to each other, or may be set differently according to experimental results.
  • the first mask is transformed according to the determined inconsistent area to generate the second mask.
  • the first mask is generated to exclude the non-iris object area from the iris image in the iris matching.
  • the second mask is generated to additionally exclude the inconsistent area including the inconsistent bit from the iris object area in the iris matching.
  • FIG. 11 illustrates generating a second mask by transforming a first mask according to features of a converted iris image according to an embodiment of the present disclosure.
  • In a first mask 1110, whether each of the pixels in the iris image is included in the non-iris object area is expressed as 0 or 1. A pixel corresponding to an element determined to be 0 is not used in the iris matching, and a pixel corresponding to an element determined to be 1 is used in the iris matching.
  • the elements corresponding to the pixels included in a non-iris object area 1112 are determined to be 0.
  • the elements corresponding to the pixels included in an iris object area 1114 are determined to be 1. Therefore, when the first mask 1110 is applied to the iris image, a converted iris image in which the non-iris object area 1112 is blocked is obtained.
  • An image map 1120 expresses grey levels of the pixels included in the converted iris image.
  • the pixels included in the iris object area 1114 may be expressed by the grey levels.
  • the critical value is determined according to the grey levels of the pixels included in the iris object area 1114 and the block quota set in advance. In FIG. 11, the critical value is determined to be 0.2 as an example.
  • pixels having grey levels whose absolute values are equal to or less than the critical value 0.2 are determined to be included in the inconsistent area 1122.
  • a second mask 1130 is generated by changing the elements, in the first mask 1110, corresponding to the inconsistent area 1122 from 1 to 0. Therefore, in the second mask 1130, elements corresponding to the pixels included in the non-iris object area 1112 or the inconsistent area 1122 are determined to be 0. In addition, elements corresponding to pixels that are not included in the non-iris object area 1112 or the inconsistent area 1122 are determined to be 1. Therefore, when the second mask 1130 is applied to the converted iris image, a second converted iris image, in which the inconsistent area 1122 is additionally blocked, may be obtained. In addition, when the second mask 1130 is applied to an iris code that will be described below, a converted iris code in which codes corresponding to the inconsistent area 1122 are blocked may be generated.
  • FIG. 11 only illustrates the process of generating the second mask 1130 in a case where the pixel values are expressed by real numbers, for convenience of description. Therefore, if the pixel value is expressed by a complex number, a second mask for the real part and a second mask for the imaginary part may be separately generated. Otherwise, one second mask may include information about the blocking of the real part and the imaginary part.
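The transformation of the first mask into the second mask illustrated in FIG. 11 can be sketched as follows, for the real-valued case the figure shows. This is a hedged illustration under the assumption that masks and pixel values are flat sequences of equal length:

```python
def make_second_mask(first_mask, converted_values, critical):
    # An element stays 1 only when it was 1 in the first mask (iris object
    # area) and the pixel's absolute value exceeds the critical value;
    # otherwise it is set to 0 and the pixel is blocked in the iris matching.
    return [1 if m == 1 and abs(v) > critical else 0
            for m, v in zip(first_mask, converted_values)]
```

With a critical value of 0.2, an element of the first mask that is already 0 stays 0, and an element that is 1 is changed to 0 when the corresponding pixel falls into the inconsistent area.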
  • the elements included in the iris image are quantized to obtain an iris code.
  • the iris code may be obtained by quantizing pixel values expressed by the complex numbers in the iris image.
  • binarization, that is, a kind of quantization, may be applied to the complex number value of each pixel, wherein the complex number value is generated by using a Gabor filter or the like.
  • the phase information of the complex numbers corresponding to the pixels may be extracted as the feature of the iris pattern.
  • the phase information of the complex number is expressed as an iris feature vector, and an iris code is determined to be a set of iris feature vectors of the pixels.
  • the iris code may be obtained by binarizing pixel values expressed by the complex numbers in the iris image into two bits.
  • one bit may be generated by binarizing a real part of the complex number value.
  • the bit obtained by binarizing the real number value is defined as a real number binarization bit.
  • the other bit in the above two bits may be generated by binarizing an imaginary part of the complex number value.
  • the bit obtained by binarizing the imaginary number value is defined as an imaginary number binarization bit.
  • the iris code may be obtained by binarizing pixel values expressed by the complex numbers in the converted iris image into two bits.
  • the iris code may be obtained by binarizing the pixel values expressed by real numbers in the iris image or the converted iris image into two bits.
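The two-bit binarization described above can be sketched as a Daugman-style phase quantization. The sign convention (a bit set when the corresponding part is non-negative) is an assumption for illustration and is not taken from the patent text:

```python
def binarize_pixel(z):
    # One bit from the sign of the real part (real number binarization bit)
    # and one from the sign of the imaginary part (imaginary number
    # binarization bit); the ">= 0" convention is an assumption.
    real_bit = 1 if z.real >= 0 else 0
    imag_bit = 1 if z.imag >= 0 else 0
    return real_bit, imag_bit

def iris_code(pixels):
    # The iris code is the sequence of two-bit codes over all pixels.
    return [binarize_pixel(z) for z in pixels]
```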
  • the second mask is applied to the iris code, and then, a converted iris code in which codes corresponding to the non-iris object area and the inconsistent area are blocked is obtained.
  • the real number binarization bits and the imaginary number binarization bits of the iris code are separately blocked according to the second mask, and then, a converted iris code may be obtained.
  • when the real number binarization bits are not inconsistent bits but the imaginary number binarization bits are inconsistent bits, only the imaginary number binarization bits are blocked and the real number binarization bits may be used in the iris matching.
  • the second mask may be expressed by bits like the first mask. If each of the pixels in the iris image is expressed by a complex number including a real part and an imaginary part, whether to block the pixel in the iris image may be expressed by two bits. Like the first mask, a bit expressed as 0 denotes that a corresponding pixel or a part of the corresponding pixel is blocked in the iris matching, and a bit expressed as 1 denotes that a corresponding pixel or a part of the corresponding pixel is not blocked in the iris matching.
  • FIG. 12 illustrates obtaining a converted iris code by applying a second mask to an iris code according to an embodiment of the present disclosure.
  • the iris code 1200 includes codes corresponding to grey levels of the pixels included in the iris image. Each code is expressed by two bits; the left bit in the two bits is a real number binarization bit and the right bit is an imaginary number binarization bit.
  • the iris code 1200 may be divided into a real part iris code 1202 and an imaginary part iris code 1204.
  • the real part iris code 1202 only includes the real number binarization bit and the imaginary part iris code 1204 only includes the imaginary number binarization bit.
  • the second mask 1210 about the real part is applied to the real part iris code 1202.
  • the real number binarization bit of a pixel that corresponds to an element of the second mask 1210 about the real part having a value of 0 is blocked in the iris matching.
  • the second mask 1212 about the imaginary part is applied to the imaginary part iris code 1204.
  • the imaginary number binarization bit of a pixel that corresponds to an element of the second mask 1212 about the imaginary part having a value of 0 is blocked in the iris matching.
  • Results of the blocking by the second mask 1210 about the real part and the second mask 1212 about the imaginary part are combined to obtain a converted iris code 1220.
  • Blocking according to the second mask 1210 about the real part and blocking according to the second mask 1212 about the imaginary part may be performed independently of each other. Therefore, for a given pixel, the one bit of its code that is not blocked may still be used in the iris matching.
  • a code a 1230 having a value of '10' is blocked by both the second mask 1210 about the real part and the second mask 1212 about the imaginary part. Therefore, the code a 1230 is expressed as '--' that represents that the real number binarization bit and the imaginary number binarization bit are all blocked in the converted iris code 1220.
  • a code b 1232 having a value of '10' is blocked by neither the second mask 1210 about the real part nor the second mask 1212 about the imaginary part. Therefore, the code b 1232 is expressed as '10' in the converted iris code 1220, that is, the same value as in the iris code 1200.
  • a code c 1234 having a value '01' is not blocked by the second mask 1210 about the real part, but is blocked by the second mask 1212 about the imaginary part. Therefore, the code c 1234 is expressed as '0-', representing that only the imaginary number binarization bit is blocked, in the converted iris code 1220.
  • a code d 1236 having a value '01' is not blocked by the second mask 1212 about the imaginary part, but is blocked by the second mask 1210 about the real part. Therefore, the code d 1236 is expressed as '-1', representing that only the real number binarization bit is blocked, in the converted iris code 1220.
  • FIG. 12 illustrates the method of blocking the code by using the second masks separately according to the real part and the imaginary part, but the second mask may be applied to the iris code without separating the real part and the imaginary part.
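The rendering of the converted iris code in FIG. 12, where a blocked bit appears as '-', can be sketched per pixel as follows. The layout assumption here is that the left character carries the real number binarization bit and the right character the imaginary number binarization bit:

```python
def converted_code(real_bit, imag_bit, real_mask, imag_mask):
    # A mask element of 0 blocks the corresponding bit, which is
    # rendered as '-' as in FIG. 12; a mask element of 1 keeps the bit.
    left = str(real_bit) if real_mask else '-'
    right = str(imag_bit) if imag_mask else '-'
    return left + right
```

This reproduces the four cases of FIG. 12: a fully blocked code becomes '--', an unblocked code is unchanged, and a code with one blocked part becomes '0-' or '-1'.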
  • the user is recognized by matching a reference code stored in advance by the user to the converted iris code.
  • the converted iris code that is obtained by removing the inconsistent bits of the inconsistent area from the iris code is matched to the reference iris code that is stored by the user in advance in an external or internal iris code database, in order to verify whether the iris codes match each other or whether the iris code is real.
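The matching of the converted iris code against the reference iris code could be performed with a masked Hamming distance, as sketched below. The normalized-distance formulation and the fallback value for an empty overlap are assumptions in the style of classic iris-recognition literature, not taken from the patent text:

```python
def masked_hamming(code_a, mask_a, code_b, mask_b):
    # Compare only the bit positions left unblocked by BOTH masks.
    valid = [i for i in range(len(code_a)) if mask_a[i] and mask_b[i]]
    if not valid:
        return 1.0  # nothing comparable: treat as maximal distance
    disagreements = sum(1 for i in valid if code_a[i] != code_b[i])
    return disagreements / len(valid)
```

A distance below a chosen threshold would then count as a successful match; the threshold itself would be tuned to the allowable false rejection and false acceptance rates discussed above.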
  • the user may execute various applications, for example, from a social network service to a mobile banking application, requiring various security levels.
  • If the matching fails or the iris is not verified to be real, the user's access to the application requiring security is denied.
  • the storage of the reference iris code in advance may be performed by using the function of the iris recognition device that is described in the present specification.
  • Fields to which the suggested iris recognition method is applied may include iris-based bio verification for border control, a technique of using the iris as a bio-passport, computer log-in, a technique of using the iris as a bio password, a wireless device-based verification technique, a technique of safely accessing a bank account or a banking application via an automatic teller machine (ATM) or a mobile application, ticket-less travel, authentication of the right to a service, indoor access control (home, an office, a lab, a fridge, and the like), a driver's license, other personal verification authority, forensics, birth certificates, tracking of a missing person or a person designated as a tracking target, car engine start and unlock, an anti-theft device, an anti-terrorism device (e.g., airport security inspection), financial transaction (e-commerce and e-finance) security, Internet security, confidential information access control, keys, cards, and "bio-recognition key encryption" used as a personal identification number (PIN) or a password (stabil
  • the block quota for the iris recognition in a banking application may be higher than that for the iris recognition in a social network application.
  • the block quota may vary depending on the accuracy required of the iris recognition in the field to which the iris recognition is applied.
  • FIG. 13 is a diagram of a user recognition device by using an iris according to an embodiment of the present disclosure.
  • a user recognition device 1300 may include a mask generator 1310, an iris code generator 1320, and an iris scanner 1330.
  • the mask generator 1310, the iris code generator 1320, and the iris scanner 1330 are shown as separate components, but may be combined as one component in some embodiments.
  • the mask generator 1310, the iris code generator 1320, and the iris scanner 1330 are shown as components located in one device, but devices performing functions of the mask generator 1310, the iris code generator 1320, and the iris scanner 1330 may not be necessarily adjacent to each other physically. Therefore, the mask generator 1310, the iris code generator 1320, and the iris scanner 1330 may be dispersed in some embodiments.
  • the mask generator 1310, the iris code generator 1320, and the iris scanner 1330 may be implemented by one processor. Otherwise, according to the various embodiments of the present disclosure, the above components may be implemented by a plurality of processors.
  • the mask generator 1310 may generate the first mask for blocking the non-iris object area in the iris image. In addition, the mask generator 1310 may generate a converted iris image, in which the non-iris object area is blocked according to the first mask.
  • the mask generator 1310 may adaptively transform the first mask according to features of the converted iris image to generate a second mask that additionally blocks an inconsistent area, in which quantization results of the converted iris image are not consistent.
  • the mask generator 1310 may obtain complex values representing grey levels of the pixels in the converted iris image.
  • the mask generator 1310 may determine a critical value for determining the inconsistent bits according to the complex number value and the block quota.
  • the critical value may be determined respectively for the real part and the imaginary part of the complex number value.
  • the block quota for determining the critical value may be set in advance respectively for the real part and the imaginary part of the complex number value.
  • the mask generator 1310 may determine whether the real part of the pixel is to be blocked in the iris matching by comparing an absolute value of the real part of the pixel expressed by the complex number with the critical value. Likewise, the mask generator 1310 may determine whether the imaginary part of the pixel is to be blocked in the iris matching by comparing an absolute value of the imaginary part of the pixel expressed by the complex number with the critical value. The mask generator 1310 may transform the second mask for blocking the inconsistent bit according to the critical value.
  • the mask generator 1310 may determine whether to block the pixel in the iris matching by comparing an absolute value of the pixel with the critical value, in a case where the pixel is expressed by the real number, not by the complex number. Since there is no imaginary part in the pixel, the mask generator 1310 does not generate a mask with respect to the imaginary part.
  • the iris code generator 1320 may obtain the iris code by quantizing the pixels included in the iris image.
  • the iris code generator 1320 may obtain the iris code by binarizing complex number values corresponding to the pixels in the iris image.
  • the iris code generator 1320 may obtain a converted iris code, in which codes corresponding to the non-iris object area and the inconsistent area are blocked, by applying the second mask to the iris code.
  • the iris code generator 1320 may obtain the converted iris code by separately performing the blocking processes of the real number binarization bits and the imaginary number binarization bits in the iris code.
  • the iris scanner 1330 may recognize the user by matching the reference iris code, stored by the user in advance, to the converted iris code.
  • At least one of the mask generator 1310, the iris code generator 1320, and the iris scanner 1330 may be implemented as a software module.
  • the software module may be stored in a non-transitory computer-readable recording medium.
  • the at least one software module may be provided by an operating system (OS) or by a certain application. Alternatively, a part of the at least one software module may be provided by the OS and the remaining part by a certain application.
  • the user recognition device 1300 of FIG. 13 may perform each of the functions and processes regarding the iris recognition described with reference to FIG. 1.
  • FIG. 14 is a diagram of a user device including a user recognition device according to an embodiment of the present disclosure.
  • a user device 1400 may include a processor 1401, a display 1402, an infrared ray (IR) camera 1403, a memory 1404, and a keyboard 1405.
  • the user device 1400 may include a smartphone, a tablet PC, a portable phone, a video phone, an E-book reader, a desktop PC, a laptop PC, a PDA, a PMP, an MP3 player, a mobile medical device, a camera, and a wearable device (e.g., an HMD, for example, electronic glasses, electronic clothing, an electronic bracelet, an electronic necklace, an electronic accessory, an electronic tattoo, or a smart watch).
  • the processor 1401 makes the camera 1403 capture an image, processes the image according to the method described in the present specification, and stores information in the memory 1404.
  • the processor 1401 may perform functions of the components illustrated in FIG. 13. Although the processor 1401 is shown as a single processor in FIG. 14, a plurality of processors may be provided according to various embodiments.
  • the display 1402 displays information to the user.
  • the display 1402 may display images, a user interface for capturing images, an iris matching result, and other necessary information.
  • the display 1402 may be a touch sensitive display.
  • the camera 1403 includes IR illumination, and is configured to perform an image capturing process as instructed by the processor 1401.
  • the camera 1403 may include another type of light source. It should be understood that the method according to the present disclosure may be modified to use another type of light in a frame capturing process.
  • the memory 1404 is configured to store information.
  • the memory 1404 may store additional information about captured images, processed images, and images (e.g., iris code, mask, reference iris code, and the like).
  • the keyboard 1405 is used by the user to control the device.
  • the keyboard 1405 may be used to control the image capturing process.
  • the keyboard 1405 is not limited to a physical keyboard, but may be a virtual keyboard used in a touch sensitive display.
  • the above-described user device is provided to perform one or more processes from among the processes included in one of the methods described in the present specification.
  • the user may generate the reference iris code in advance by using the user device 1400.
  • the user device 1400 may capture a face image in order to extract an eye image of the user, may process the eye image via at least some of the processes in the above-described method, and may store the processed eye image in the memory in order to use the eye image in a next iris matching process.
  • one real number representing the grey level of the pixel in the iris image is binarized into one bit, instead of the two bits applied to the complex number value as described above.
  • the iris code is a bit expression of the iris image, and the bit expression is obtained through the encoding process.
  • a pair of bits generated from the pixel value expressed by the complex number corresponds to one point in an original image (according to binarization of the complex number).
  • another technique of extracting features and encoding may be used. For example, a local binary pattern (LBP) transformation may be used. According to the LBP transformation, the iris image is transformed into an integer matrix (8 bits or 16 bits according to a selected type of LBP).
  • transformation of one real number value or the complex number value into one or two discrete values may be understood under the concept of quantization.
  • transformation of the numerical value of the image intensity into the bit type may be understood under the concept of encoding.
  • the encoding may include all the processes from the extraction of features to the process about the final formation of the code that will be stored in the memory.
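The LBP transformation mentioned above can be sketched in its basic 8-bit, 3×3 form. The neighbour ordering and the "not darker than the centre" comparison are conventional assumptions; concrete LBP variants differ in these details:

```python
def lbp8(img, y, x):
    # 8-bit local binary pattern: each of the eight neighbours of (y, x)
    # contributes one bit, set when the neighbour is not darker than the
    # centre pixel; the clockwise ordering chosen here is an assumption.
    center = img[y][x]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dy, dx) in enumerate(offsets):
        if img[y + dy][x + dx] >= center:
            code |= 1 << bit
    return code
```

A uniform neighbourhood yields the code 255 (all bits set), while a centre pixel brighter than all of its neighbours yields 0.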
  • The embodiments of the present disclosure are described with reference to FIGS. 1 to 14.
  • Although the present disclosure has been described in language specific to structural features or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
  • the present disclosure is not limited by the order of processes in the method, and the order may be modified by one of ordinary skill in the art without undue technical difficulty. Some or all of the processes of the method may be sequentially or simultaneously executed.
  • Some embodiments may be embodied in a storage medium including instruction code executable by a computer or processor, such as a program module executed by the computer.
  • the computer-readable storage medium may be any available medium that may be accessed by a computer, and includes volatile and non-volatile media and removable and non-removable media.
  • the computer-readable medium may include both a computer storage medium and a communication medium.
  • the computer storage medium may include volatile and non-volatile media and removable and non-removable media that are implemented using any method or technology for storing information, such as computer-readable instructions, a data structure, a program module, or other types of data.
  • the communication medium typically includes computer-readable instructions, a data structure, a program module, or other data of modulated data signal, such as carrier waves, or other transmission mechanisms, and includes an arbitrary information transfer medium.
  • some embodiments may be implemented as a computer program or a computer program product including instructions executable by a computer.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Ophthalmology & Optometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Collating Specific Patterns (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A user recognition method using an iris is provided. The user recognition method includes generating a first mask for blocking a non-iris object area of an iris image, generating a converted iris image in which the non-iris object area is blocked according to the first mask, generating a second mask for additionally blocking an inconsistent area in which quantization results of the converted iris image are inconsistent, by adaptively transforming the first mask according to features of the converted iris image, obtaining an iris code by quantizing the pixels included in the iris image, obtaining a converted iris code in which the parts corresponding to the non-iris object area and the inconsistent area are blocked, by applying the second mask to the iris code, and recognizing a user by matching a reference iris code, stored in advance by the user, to the converted iris code.
EP17824490.1A 2016-07-07 2017-07-04 Procédé de quantification adaptative pour codage d'image d'iris Withdrawn EP3459009A4 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
RU2016127451A RU2628201C1 (ru) 2016-07-07 2016-07-07 Способ адаптивного квантования для кодирования изображения радужной оболочки
KR1020170068659A KR102329128B1 (ko) 2016-07-07 2017-06-01 홍채 이미지 부호화를 위한 적응적 양자화 방법
PCT/KR2017/007066 WO2018008934A2 (fr) 2016-07-07 2017-07-04 Procédé de quantification adaptative pour codage d'image d'iris

Publications (2)

Publication Number Publication Date
EP3459009A2 true EP3459009A2 (fr) 2019-03-27
EP3459009A4 EP3459009A4 (fr) 2019-07-03

Family

ID=59641733

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17824490.1A Withdrawn EP3459009A4 (fr) 2016-07-07 2017-07-04 Procédé de quantification adaptative pour codage d'image d'iris

Country Status (4)

Country Link
EP (1) EP3459009A4 (fr)
KR (1) KR102329128B1 (fr)
CN (1) CN109416734B (fr)
RU (1) RU2628201C1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11434251B2 (en) 2017-11-28 2022-09-06 Scg Chemicals Co., Ltd. Magnesium compound, method for producing the same and use thereof

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2667790C1 (ru) 2017-09-01 2018-09-24 Самсунг Электроникс Ко., Лтд. Способ автоматической регулировки экспозиции для инфракрасной камеры и использующее этот способ вычислительное устройство пользователя
RU2670798C9 (ru) * 2017-11-24 2018-11-26 Самсунг Электроникс Ко., Лтд. Способ аутентификации пользователя по радужной оболочке глаз и соответствующее устройство
KR102122830B1 (ko) * 2018-10-10 2020-06-15 고려대학교 산학협력단 분할 프래질 비트를 활용한 홍채인식 장치 및 방법
US10832053B2 (en) 2018-12-18 2020-11-10 Advanced New Technologies Co., Ltd. Creating an iris identifier to reduce search space of a biometric system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6744909B1 (en) * 1999-08-19 2004-06-01 Physical Optics Corporation Authentication system and method
KR101161803B1 (ko) * 2005-05-25 2012-07-03 삼성전자주식회사 가버 필터 및 그것의 필터링 방법, 그리고 그것을 이용한영상 처리 방법
JP4650386B2 (ja) * 2006-09-29 2011-03-16 沖電気工業株式会社 個人認証システム及び個人認証方法
WO2009041963A1 (fr) * 2007-09-24 2009-04-02 University Of Notre Dame Du Lac Reconnaissance de l'iris à l'aide d'informations de cohérence
CN101246588B (zh) * 2008-03-20 2011-04-13 复旦大学 彩色图像超复数空间的自适应水印算法
US8411910B2 (en) * 2008-04-17 2013-04-02 Biometricore, Inc. Computationally efficient feature extraction and matching iris recognition
CN102884570B (zh) * 2010-04-09 2015-06-17 杜比国际公司 基于mdct的复数预测立体声编码

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11434251B2 (en) 2017-11-28 2022-09-06 Scg Chemicals Co., Ltd. Magnesium compound, method for producing the same and use thereof

Also Published As

Publication number Publication date
EP3459009A4 (fr) 2019-07-03
RU2628201C1 (ru) 2017-08-15
KR102329128B1 (ko) 2021-11-22
KR20180006284A (ko) 2018-01-17
CN109416734B (zh) 2023-11-14
CN109416734A (zh) 2019-03-01

Similar Documents

Publication Publication Date Title
EP3459009A2 (en) Adaptive quantization method for iris image encoding
WO2015174647A1 (en) User authentication method, device for executing same, and recording medium for storing same
WO2020022703A1 (en) Data masking method and data scrambling device using same
WO2020130309A1 (en) Image masking device and image masking method
WO2019033572A1 (en) Blocked-face detection method, device, and storage medium
WO2016163755A1 (en) Quality measurement-based face recognition method and apparatus
CN111814194B (en) Privacy protection-based image processing method, apparatus, and electronic device
WO2013048160A1 (en) Face recognition method, apparatus, and computer-readable recording medium for executing the method
WO2022124701A1 (en) Method for producing labeled image from original image while preventing leakage of private information of the original image, and server using same
CN111783146B (en) Privacy protection-based image processing method, apparatus, and electronic device
WO2022086147A1 (en) Method for training and testing user learning network used to recognize obfuscated data created by obfuscating original data to protect personal information, and user learning device and testing device using same
WO2023096445A1 (en) Method for generating obfuscated image to be used in training learning network, and labeling device using same
Rai et al. Software development framework for real-time face detection and recognition in mobile devices
Barni et al. Iris deidentification with high visual realism for privacy protection on websites and social networks
CN111191521A (en) Face liveness detection method, apparatus, computer device, and storage medium
WO2016108562A1 (en) Fingerprint information encoding and recognition system, and operating method thereof
WO2018008934A2 (en) Adaptive quantization method for iris image encoding
Dhruva et al. Novel algorithm for image processing based hand gesture recognition and its application in security
WO2022097766A1 (en) Method and device for restoring masked region
Senthilkumar et al. Suspicious human activity detection in classroom examination
Hongo et al. Personal authentication with an iris image captured under visible-light condition
Koppikar et al. Face liveness detection to overcome spoofing attacks in face recognition system
WO2023075183A1 (en) Deep learning-based contactless palm print recognition system and method
WO2019117379A1 (en) Eye image-based biometric authentication device and method in a wearable display device
WO2023204449A1 (en) Learning method and learning device for training obfuscation network capable of obfuscating original data for privacy, and testing method and testing device using same

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20181219

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

A4 Supplementary search report drawn up and despatched

Effective date: 20190531

RIC1 Information provided on ipc code assigned before grant

Ipc: G06K 9/00 20060101AFI20190524BHEP

Ipc: G06K 9/32 20060101ALI20190524BHEP

Ipc: G06K 9/38 20060101ALI20190524BHEP

Ipc: G06K 9/46 20060101ALI20190524BHEP

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20200203

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Free format text: PREVIOUS MAIN CLASS: G06K0009000000

Ipc: G06V0010250000

RIC1 Information provided on ipc code assigned before grant

Ipc: G06V 40/18 20220101ALI20230131BHEP

Ipc: G06V 10/44 20220101ALI20230131BHEP

Ipc: G06V 10/25 20220101AFI20230131BHEP

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20230320

RIN1 Information on inventor provided before grant (corrected)

Inventor name: LEE, HEE-JUN

Inventor name: LEE, KWANG-HYUN

Inventor name: YOO, JU-WOAN

Inventor name: SHIN, DAE-KYU

Inventor name: DANILEVICH, ALEKSEI BRONISLAVOVICH

Inventor name: GNATYUK, VITALY SERGEEVICH

Inventor name: ODINOKIKH, GLEB ANDREEVICH

Inventor name: FARTUKOV, ALEKSEI MIKHAILOVICH

Inventor name: EREMEEV, VLADIMIR ALEKSEEVICH

Inventor name: KOROBKIN, MIKHAIL VLADIMIROVICH

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20230801