WO2011070660A1 - Détecteur de capacité et procédé de formation d'une image biologique - Google Patents

Détecteur de capacité et procédé de formation d'une image biologique

Info

Publication number
WO2011070660A1
WO2011070660A1 (PCT/JP2009/070621, JP2009070621W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
sensor
capacitance
biological
unit
Prior art date
Application number
PCT/JP2009/070621
Other languages
English (en)
Japanese (ja)
Inventor
幸弘 安孫子 (Yukihiro Abiko)
Original Assignee
富士通株式会社 (Fujitsu Limited)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士通株式会社 (Fujitsu Limited)
Priority to PCT/JP2009/070621 priority Critical patent/WO2011070660A1/fr
Priority to JP2011545019A priority patent/JP5472319B2/ja
Priority to EP09852056.2A priority patent/EP2511871B1/fr
Publication of WO2011070660A1 publication Critical patent/WO2011070660A1/fr
Priority to US13/490,102 priority patent/US8787631B2/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00: General purpose image data processing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12: Fingerprints or palmprints
    • G06V 40/13: Sensors therefor
    • G06V 40/1306: Sensors therefor, non-optical, e.g. ultrasonic or capacitive sensing

Definitions

  • the embodiment disclosed herein relates to a capacitance sensor and a biological image generation method using the capacitance sensor.
  • Capacitance sensors detect changes in capacitance caused by the contact or proximity of a living body to the sensor surface, and thereby detect contact of the living body or convert the degree of unevenness of the living body's surface into an electrical signal. Capacitance sensors are widely used as, for example, touch sensors, touch pads, touch panels, or fingerprint sensors (see, for example, Patent Documents 1 to 5).
  • the touch sensor is a sensor that detects that a living body such as a finger has touched the sensor surface.
  • the touch pad detects a position where the living body contacts the sensor surface, and outputs the movement amount as a relative coordinate when the living body moves while contacting the sensor surface.
  • The touch panel is a device in which a display device, such as a liquid crystal display, is combined with a touch-input sensor.
  • The fingerprint sensor obtains an image by converting the unevenness of the finger skin, such as the ridges and valleys formed on the finger surface, into pixel values corresponding to the distance between the finger skin and the sensor.
  • a sensor that realizes a touchpad function using a fingerprint sensor has also been proposed.
  • a living body that is in contact with or close to the sensor surface functions as an electrode facing an electrode provided in the sensor. That is, a change in capacitance occurs due to the accumulation of charge between the electrode and the living body.
  • the capacitance sensor detects a change in the capacitance.
  • If skin tissue fragments or sweat remain on the sensor surface, or if water droplets condense on it, these also act as conductors, so the capacitance between these conductors and the electrodes changes. The capacitance sensor therefore erroneously detects the skin tissue fragments, sweat, or water droplets as a living body.
  • It is an object of the present specification to provide a capacitance sensor capable of preventing a conductor other than a living body from being erroneously detected as a living body even when such a conductor adheres to the sensor surface, and a biological image generation method using such a capacitance sensor.
  • According to one aspect, a capacitance sensor has a plurality of electrodes, each of which outputs an electrical signal corresponding to a capacitance determined by the distance between the surface of the capacitance sensor and a conductor. Among the plurality of electrodes, electrodes having a first parasitic capacitance and electrodes having a second parasitic capacitance different from the first parasitic capacitance are arranged according to a predetermined arrangement pattern that is different from the biometric information of the body part read by the capacitance sensor.
  • According to another aspect, a biological image generation method is provided that uses a capacitance sensor having a plurality of electrodes, each of which outputs an electrical signal corresponding to a capacitance determined by the distance between the surface of the sensor and a conductor, in which electrodes having a first parasitic capacitance and electrodes having a second parasitic capacitance different from the first parasitic capacitance are arranged according to a predetermined arrangement pattern that is different from the biometric information of the body part to be read.
  • This biological image acquisition method generates an image in which each of a plurality of electrodes corresponds to one pixel and the pixel value of each pixel is a value corresponding to an electrical signal output from the corresponding electrode among the plurality of electrodes.
  • This method then extracts, as a foreground region, a region containing pixels for which a conductor is present at a distance capable of forming a capacitor with any of the plurality of electrodes; detects, by a correlation calculation between the foreground region and a filter corresponding to the predetermined arrangement pattern, a non-biological region in which a non-living conductor is attached to the sensor surface of the capacitance sensor; calculates an index that becomes smaller as the area of the non-biological region occupying the foreground region becomes larger; and discards the image if the index is below a predetermined threshold.
  • The capacitance sensor and the biological image generation method disclosed in this specification can prevent a conductor other than a living body from being erroneously detected as a living body even when such a conductor adheres to the sensor surface.
  • FIG. 1 is a schematic configuration diagram of a capacitance sensor according to one embodiment.
  • FIG. 2 is a schematic plan view of a sensor unit of the capacitance sensor.
  • FIG. 3 is a schematic side cross-sectional view of a capacitive sensor, according to one embodiment.
  • FIG. 4A is a diagram illustrating an outline of an image generated when a finger is placed on the surface of the sensor unit.
  • FIG. 4B is a diagram illustrating an outline of an image generated after the finger is released from the surface of the sensor unit.
  • FIG. 5 is a functional block diagram of the processing unit of the capacitance sensor according to one embodiment.
  • FIG. 6 is a diagram illustrating an example of a filter used for detection of a non-biological region.
  • FIG. 7 is a diagram illustrating an operation flowchart of the biological image generation process.
  • FIG. 8 is a schematic side cross-sectional view of a capacitive sensor according to another embodiment.
  • FIG. 9 is a functional block diagram of a processing unit of a swipe-type capacitive sensor according to another embodiment.
  • FIG. 10 is a diagram illustrating an operation flowchart of the reading start timing determination process controlled by the processing unit.
  • FIG. 11 is a diagram illustrating an operation flowchart of a reading end timing determination process controlled by the processing unit.
  • Hereinafter, a capacitance sensor used as an area sensor for acquiring a biological image representing biometric information according to one embodiment will be described with reference to the drawings.
  • this capacitance sensor utilizes the fact that the change in capacitance when a conductor other than a living body adheres to the sensor surface is smaller than the change in capacitance when the living body contacts or approaches the sensor surface. Thus, when a conductor other than the living body adheres to the sensor surface, the distribution pattern of the parasitic capacitance is reproduced on the living body image acquired by the sensor.
  • the capacitance sensor determines whether to output or discard the biological image depending on whether or not a parasitic capacitance distribution pattern is detected on the biological image.
  • the biometric information to be read is a fingerprint.
  • the biometric information to be read may be a palm print.
  • FIG. 1 shows a schematic configuration diagram of a capacitance sensor.
  • the capacitance sensor 1 includes a sensor unit 2, a storage unit 3, a processing unit 4, and an interface unit 5.
  • the capacitance sensor 1 generates a biological image corresponding to the two-dimensional distribution of unevenness on the surface of the user's finger in contact with or close to the sensor unit 2 by the processing unit 4, and the biological image is generated via the interface unit 5. Output.
  • the sensor unit 2 is a two-dimensional sensor, and outputs a change in capacitance at each position on the sensor as an electrical signal at that position.
  • FIG. 2 is a schematic plan view of the sensor unit 2.
  • The sensor unit 2 includes a plurality of electrodes 21-ij arranged in a two-dimensional array, m in the horizontal direction and n in the vertical direction (where 1 ≤ i ≤ m, 1 ≤ j ≤ n, and m and n are integers of 2 or more).
  • Each electrode 21-ij is made of a conductor such as copper or gold, and such a conductor is formed in a square shape.
  • the width of each electrode is preferably smaller than the average width of the fingerprint ridges, for example, 0.1 mm.
  • Each electrode 21-ij forms a capacitor with the conductor when a conductor such as a finger is placed on the surface of the sensor unit 2.
  • the capacitor has a capacitance according to the distance between the electrode 21-ij and the conductor, and a charge according to the capacitance is accumulated in the electrode 21-ij. Therefore, an electrical signal corresponding to the capacitance is read from each electrode 21-ij, and the electrical signal is amplified by an operational amplifier (not shown) and then converted into a digital signal by an analog-digital converter (not shown). And is sent to the processing unit 4.
  • reference electrodes 22-1 to 22-4 for applying a reference potential to a finger that is in contact with or close to the sensor unit 2 are arranged. Each of the reference electrodes 22-1 to 22-4 is connected to a power source (not shown).
  • FIG. 3 is a schematic side sectional view of the sensor unit 2 taken along the line A-A′ shown in FIG. 2.
  • An insulating protective layer 23 such as glass or resin is provided on the surface of each electrode 21-ij.
  • a water repellent coating is applied to the surface 24 of the protective layer 23.
  • Each electrode 21-ij is disposed on a substrate 25 formed of an insulator such as resin.
  • Next, the difference between the capacitance detected by each electrode when a finger is placed on the sensor unit 2 and the capacitance detected when a conductor other than a living body adheres to the surface 24 of the sensor unit 2 will be described.
  • The capacitance C_total between each electrode 21-ij and a conductor, such as a finger, placed on the surface of the sensor unit 2 is given by equation (1), where C_f is the capacitance between the conductor and the electrode 21-ij and C_p is the parasitic capacitance of the electrode 21-ij.
  • A protective layer 23 is present between the electrode 21-ij and the conductor placed on the surface of the sensor unit 2. The capacitor formed between the conductor and the electrode 21-ij is therefore a series connection of the capacitor between the conductor and the surface 24 of the sensor unit 2 and the capacitor between the surface 24 of the sensor unit 2 and the electrode 21-ij. The capacitance C_f is thus expressed by equation (2), where C_0 is the capacitance between the conductor and the surface 24 of the sensor unit 2 and C_c is the capacitance between the surface 24 of the sensor unit 2 and the electrode 21-ij. From equations (1) and (2), the capacitance C_total between the conductor and the electrode 21-ij is expressed by equation (3).
  • With d_0 the distance between the conductor and the surface 24 of the sensor unit 2, d_c the thickness of the protective layer 23, ε_0 the permittivity of vacuum, ε_c the dielectric constant of the protective layer 23, and A the area of the electrode, the capacitance C_total is expressed by equation (4).
  • The second term in equation (4) can be ignored when the area of the conductor facing the electrode 21-ij is equal to or larger than the area of the electrode 21-ij. Therefore, when a finger larger than the electrode 21-ij is placed on the surface 24 of the sensor unit 2, the capacitance C_total between the finger surface and the electrode 21-ij becomes larger as the distance d_0 between the surface of the finger and the surface 24 of the sensor unit 2 becomes smaller. The capacitance sensor 1 can therefore generate a biological image having pixel values corresponding to the unevenness of the finger surface.
  • On the other hand, when a small conductor such as a skin tissue fragment or a water droplet adheres to the surface 24 of the sensor unit 2, equation (4) can be rewritten as equation (5), where A′ is the area of the portion of the attached conductor that faces the electrode 21-ij and the distance d_0 is regarded as 0. In this case the second term cannot be ignored, and the capacitance C_total takes a different value depending on the parasitic capacitance C_p.
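  • The equations (1) to (5) referred to above are reproduced as images in the published document and do not appear in this text. A plausible reconstruction, consistent with the surrounding definitions (the parasitic capacitance C_p in parallel with C_f, a series connection of C_0 and C_c, and parallel-plate approximations for C_0 and C_c), is the following sketch; the exact published forms may differ.

```latex
% Hedged reconstruction of equations (1)-(5); the published equations are
% rendered as images in the original document and may differ in detail.
\begin{align}
C_{\mathrm{total}} &= C_f + C_p \tag{1}\\
C_f &= \frac{C_0 C_c}{C_0 + C_c} \tag{2}\\
C_{\mathrm{total}} &= \frac{C_0 C_c}{C_0 + C_c} + C_p \tag{3}\\
\intertext{With $C_0 = \varepsilon_0 A / d_0$ and $C_c = \varepsilon_0 \varepsilon_c A / d_c$ (parallel-plate approximation):}
C_{\mathrm{total}} &= \frac{\varepsilon_0 \varepsilon_c A}{\varepsilon_c d_0 + d_c} + C_p \tag{4}\\
\intertext{For a small attached conductor of facing area $A'$ with $d_0 \to 0$:}
C_{\mathrm{total}} &= \frac{\varepsilon_0 \varepsilon_c A'}{d_c} + C_p \tag{5}
\end{align}
```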
  • The parasitic capacitance of each electrode 21-ij varies depending on the distance between the electrode and the surface 24 of the sensor unit 2, and in this embodiment that distance varies depending on the position of the electrode. Therefore, when minute skin tissue fragments or water droplets adhere to the surface of the sensor unit 2, an arrangement pattern of electrodes having two types of parasitic capacitance appears on the image.
  • the electrode arrangement pattern is different from the fingerprint ridge pattern. Therefore, it can be easily identified whether the pattern shown on the biological image is due to the fingerprint ridge or the electrode arrangement pattern.
  • the resolution of the electrode arrangement pattern is higher than the resolution of the fingerprint pattern. Therefore, even when water droplets or the like are attached along the ridges of the fingerprint contacting the surface of the sensor unit 2, the electrode arrangement pattern can be observed on the biological image.
  • FIG. 4A is a diagram showing an outline of a biological image generated when a finger is placed on the surface 24 of the sensor unit 2, and FIG. 4B is a diagram showing an outline of a biological image generated after the finger is separated from the surface 24 of the sensor unit 2.
  • In the biological image 400 shown in FIG. 4A, the black lines 401 represent fingerprint ridges.
  • When a finger is placed on the surface 24 of the sensor unit 2, the capacitance C_total takes a value corresponding to the distance between the surface of the finger and the surface 24 of the sensor unit 2, regardless of the parasitic capacitance of each electrode. Therefore, in the biological image 400, the portions corresponding to the fingerprint ridges 401 are black and the other portions are white. On the other hand, after the biological image 400 shown in FIG. 4A is generated and the finger is separated from the sensor unit 2, minute skin tissue fragments, sweat, and the like may remain on the surface 24 of the sensor unit 2 along the ridges.
  • In this embodiment, electrodes 21-ij in every other row have the same parasitic capacitance. Therefore, in the biological image 410 shown in FIG. 4B, a pattern 411 in which a ridge pattern and a horizontal stripe pattern are superimposed appears.
  • The storage unit 3 includes, for example, a semiconductor memory.
  • The storage unit 3 temporarily stores the biological image generated by the processing unit 4 until the biological image is output via the interface unit 5 or discarded.
  • The processing unit 4 includes one or more processors and their peripheral circuits. The processing unit 4 creates a biological image representing the user's fingerprint based on the electrical signals of the electrodes acquired from the sensor unit 2, and evaluates the quality of the biological image based on whether or not the electrode arrangement pattern is observed on it. When the quality of the generated biological image is good, the processing unit 4 outputs the biological image to another device (not shown) connected via the interface unit 5. When the quality of the generated biological image is poor, the processing unit 4 discards the biological image and outputs a message prompting the user to have the fingerprint read again to the other device connected via the interface unit 5.
  • the interface unit 5 has an interface circuit for connecting another device and the capacitance sensor 1.
  • the interface unit 5 outputs the biological image or message received from the processing unit 4 to a device connected to the capacitance sensor 1.
  • FIG. 5 is a functional block diagram of the processing unit 4.
  • the processing unit 4 includes an image generation unit 41, a foreground region extraction unit 42, a non-biological region detection unit 43, and a quality determination unit 44.
  • Each of these units included in the processing unit 4 is a functional module implemented by a computer program executed on a processor included in the processing unit 4.
  • these units included in the processing unit 4 may be mounted on the capacitance sensor 1 as firmware.
  • The image generation unit 41 arranges the electrical signals read from the electrodes 21-ij of the sensor unit 2 two-dimensionally according to the arrangement of the electrodes and converts the electrical signal of each electrode into the value of the corresponding pixel, thereby generating a biological image representing the biometric information read by the sensor unit 2. The image generation unit 41 then outputs the generated biological image to the foreground region extraction unit 42 and the non-biological region detection unit 43, and stores it in the storage unit 3.
  • From the biological image generated by the image generation unit 41, the foreground region extraction unit 42 extracts, as a foreground region, the region in which a conductor, such as a living body, a skin tissue fragment, or sweat, is in contact with or close to the surface 24 of the sensor unit 2 and forms a capacitor with any of the electrodes.
  • the image generation unit 41 may generate a biological image such that the closer the distance between the conductor and the electrode of the sensor unit 2 is, the smaller the pixel value corresponding to the electrode is. In this case, the direction of the inequality regarding the pixel value in the following description is reversed.
  • each pixel has a value corresponding to the unevenness on the surface of the finger, so that the variation in the pixel value in the foreground area is relatively large.
  • the foreground area extraction unit 42 determines, for example, a threshold value for classifying each pixel in the biological image into a foreground area and a background area based on the pixel value itself and variations in the pixel value.
  • the foreground area extraction unit 42 divides the biological image into a plurality of small areas.
  • the width and height of each small region are preferably about several times the average ridge pitch so that a plurality of ridges are included in one small region.
  • The foreground region extraction unit 42 calculates the standard deviation or variance of the pixel values included in each small region, and then calculates a standard deviation σ′ taking the standard deviation or variance of each small region as a variable. The foreground region extraction unit 42 sets the value obtained by multiplying σ′ by a predetermined coefficient C_σ as a threshold T_σ relating to pixel-value variation. The coefficient C_σ is determined in advance by experiment, for example. Further, the foreground region extraction unit 42 sets the value obtained by adding a predetermined offset C_μ to the average value μ of the pixel values of all pixels in the biological image as a threshold T_μ relating to the pixel value itself.
  • The offset C_μ is determined in advance by experiment, for example. Alternatively, the foreground region extraction unit 42 may determine the threshold T_μ by obtaining a histogram of the pixel values of all pixels in the biological image and applying discriminant analysis to it.
  • the foreground area extraction unit 42 calculates a standard deviation ⁇ and an average value ⁇ of pixel values for each small area.
  • the foreground area extraction unit 42 sets a small area that satisfies the following conditions as the foreground area.
  • the foreground area extraction unit 42 may use only a small area that satisfies both conditions shown in Equation (6) as a foreground area.
  • Note that, before determining the thresholds T_σ and T_μ, the foreground region extraction unit 42 may apply preprocessing to the entire biological image, such as noise removal using a Gaussian filter, contrast correction, or gradation conversion.
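  • For illustration only, the following sketch implements the small-region thresholding described above in Python with NumPy. The function name, the block size, the assumption that a closer conductor gives a larger pixel value, and the assumption that equation (6) accepts a region when either condition (variation or mean above its threshold) holds are all choices made for the example, not details taken from the publication.

```python
import numpy as np

def extract_foreground(image, block=16, c_sigma=1.0, c_mu=10.0):
    """Classify block-sized regions of a biometric image as foreground or background.

    Assumes larger pixel values mean the conductor is closer to the electrode;
    reverse the comparison on `mu` if the image uses the opposite convention.
    Returns a boolean mask with one entry per block (True = foreground).
    """
    h, w = image.shape
    rows, cols = h // block, w // block
    blocks = image[:rows * block, :cols * block].reshape(rows, block, cols, block)
    blocks = blocks.swapaxes(1, 2).astype(np.float64)   # shape (rows, cols, block, block)

    sigma = blocks.std(axis=(2, 3))                     # per-block standard deviation
    mu = blocks.mean(axis=(2, 3))                       # per-block mean

    t_sigma = c_sigma * sigma.std()                     # threshold on pixel-value variation
    t_mu = image.mean() + c_mu                          # threshold on the pixel value itself

    # A block is foreground if it shows enough variation or a high enough mean
    # (use `&` instead of `|` to require both conditions, as the text allows).
    return (sigma >= t_sigma) | (mu >= t_mu)
```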
  • the foreground area extraction unit 42 notifies the non-biological area detection unit 43 of information representing the foreground area.
  • the information representing the foreground area can be, for example, a binary image in which the pixels included in the foreground area and the pixels included in the background area have different values.
  • the information representing the foreground area may be coordinates of any corner or centroid of the small area extracted as the foreground area.
  • the non-living area detection unit 43 detects, as a non-living area, an area corresponding to a portion where a non-living conductor is attached to the surface 24 of the sensor unit 2 in the foreground area.
  • the non-biological region detection unit 43 performs a correlation calculation between the foreground region and the filter according to the following expression using a filter corresponding to the arrangement pattern of the electrodes having different parasitic capacitances.
  • Here, I′(i, j) represents the correlation value at the pixel of interest (i, j), I(i + k, j + l) represents the value of the pixel (i + k, j + l) included in the foreground region, and F(k, l) represents the value of the filter at coordinates (k, l).
  • FIG. 6 shows an example of a filter used for detection of a non-biological region.
  • In this example, the filter 600 has 3 horizontal pixels × 5 vertical pixels.
  • Since electrodes with high and low parasitic capacitance are arranged alternately row by row, each pixel in the odd rows of the filter 600 has the value w_0, and each pixel in the even rows has the value -3w_0/2.
  • w_0 is set to 1 or 1/15, for example.
  • The non-biological region detection unit 43 calculates a correlation value for each pixel included in the foreground region, and treats pixels whose correlation value exceeds a predetermined threshold in absolute value as pixels included in the non-biological region. The predetermined threshold is determined in advance by experiment and is set, for example, to 1/4 of the total absolute value of the filter coefficients. The non-biological region detection unit 43 then obtains the total number N_f of pixels included in the foreground region and the total number N_n of pixels of the non-biological region within the foreground region, and notifies the quality determination unit 44 of N_f and N_n.
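  • A minimal sketch of this correlation step, assuming the 3 × 5 filter of FIG. 6 and a plain sum-of-products correlation as implied by the definitions of I′(i, j), I(i + k, j + l), and F(k, l). The threshold of one quarter of the filter's total absolute value follows the text; the function name, the use of SciPy, and the per-pixel foreground mask are assumptions made for the example.

```python
import numpy as np
from scipy.ndimage import correlate

def detect_non_biological(image, foreground_mask, w0=1.0):
    """Return a mask of foreground pixels matching the electrode arrangement pattern.

    The filter mirrors FIG. 6: odd rows have value w0, even rows -3*w0/2,
    so it sums to zero and responds to the horizontal-stripe parasitic pattern.
    `foreground_mask` is a per-pixel boolean mask of the foreground region.
    """
    filt = np.empty((5, 3))
    filt[0::2, :] = w0           # 1st, 3rd, and 5th rows
    filt[1::2, :] = -1.5 * w0    # 2nd and 4th rows

    # I'(i, j) = sum_k sum_l I(i + k, j + l) * F(k, l)
    corr = correlate(image.astype(np.float64), filt, mode="nearest")

    threshold = 0.25 * np.abs(filt).sum()   # 1/4 of the filter's total absolute value
    non_bio = (np.abs(corr) > threshold) & foreground_mask

    n_f = int(foreground_mask.sum())        # pixels in the foreground region
    n_n = int(non_bio.sum())                # foreground pixels in the non-biological region
    return non_bio, n_f, n_n
```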
  • The quality determination unit 44 evaluates the quality of the biological image created by the image generation unit 41. For this purpose, the quality determination unit 44 calculates a quality index Q based on the total number N_f of pixels included in the foreground region and the total number N_n of pixels of the non-biological region within the foreground region, for example according to equation (8). In equation (8), N_n can take a value from 0 to N_f, so the quality index Q lies in the range 0 to 1, and the larger N_n is, that is, the larger the non-biological region within the foreground region is, the lower the quality index Q becomes. The quality determination unit 44 then compares the quality index Q with the quality threshold T_q.
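  • Equation (8) itself is reproduced as an image in the published document. A form consistent with the stated properties (N_n ranges from 0 to N_f, Q lies between 0 and 1, and Q decreases as N_n grows) would be the following; the published form may differ.

```latex
% Hedged reconstruction of equation (8).
Q = 1 - \frac{N_n}{N_f}, \qquad 0 \le Q \le 1 \tag{8}
```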
  • If the quality index Q is less than the quality threshold T_q, the quality determination unit 44 determines that the biological image is defective.
  • If the quality index Q is equal to or greater than the threshold T_q, the quality determination unit 44 determines that the biological image is good.
  • The quality determination unit 44 may also determine that the biological image is defective when the total number N_f of pixels included in the foreground region is less than a predetermined area threshold T_s.
  • The quality threshold T_q and the area threshold T_s are set according to how the biological image generated by the capacitance sensor 1 is used. For example, when the biological image is used for biometric authentication, authentication is likely to fail if a non-biological region exists in the image, and the biometric region must also be large enough for the biometric authentication device to reliably detect the feature points of the fingerprint. Therefore, the quality threshold T_q is set to 0.9, for example, and the area threshold T_s is set to 1/2 of the total number of pixels of the biological image, for example.
  • The quality determination unit 44 notifies the processing unit 4 of the determination result for the biological image.
  • If the biological image is determined to be good, the processing unit 4 reads the biological image from the storage unit 3 and outputs it to the other device connected via the interface unit 5.
  • If the quality determination unit 44 determines that the image is defective, the processing unit 4 discards the biological image stored in the storage unit 3 and outputs a message prompting the user to have the fingerprint read again, together with a message indicating that the sensor surface may be dirty, to the other device connected via the interface unit 5.
  • If the capacitance sensor 1 has a display unit (not shown) such as a liquid crystal display, the processing unit 4 may notify the user via the display unit that reading of the biometric information has failed and that the sensor surface may be dirty.
  • FIG. 7 shows an operation flowchart of the biological image generation process controlled by the processing unit 4.
  • the processing unit 4 acquires an electrical signal corresponding to the capacitance of each electrode 21-ij from the sensor unit 2 (step S101).
  • The image generation unit 41 of the processing unit 4 generates a biological image having pixel values corresponding to the capacitance of each electrode 21-ij (step S102).
  • the image generation unit 41 outputs the generated biological image to the foreground region extraction unit 42 and the non-biological region detection unit 43 of the processing unit 4, respectively.
  • the image generation unit 41 stores the biological image in the storage unit 3.
  • the foreground area extraction unit 42 extracts a foreground area from the biological image (step S103). Then, the foreground area extraction unit 42 notifies the non-biological area detection unit 43 of information representing the foreground area.
  • The non-biological region detection unit 43 detects the non-biological region by performing a correlation operation between the foreground region in the biological image and a filter corresponding to the arrangement pattern of electrodes having different parasitic capacitances (step S104). The non-biological region detection unit 43 then calculates the number of pixels N_f in the foreground region and the number of pixels N_n in the non-biological region (step S105), and notifies the quality determination unit 44 of the processing unit 4 of N_f and N_n.
  • The quality determination unit 44 calculates the quality index Q from N_f and N_n (step S106), and then determines whether the quality index Q is greater than or equal to the quality threshold T_q and the number of pixels N_f in the foreground region is greater than or equal to T_s (step S107). If the quality index Q is less than the quality threshold T_q or the number of pixels N_f in the foreground region is less than T_s (No in step S107), the quality determination unit 44 determines that the biological image is defective and notifies the processing unit 4 of the determination result. The processing unit 4 discards the biological image (step S108).
  • the processing unit 4 outputs a message for prompting rereading of the fingerprint and a message indicating the possibility that the sensor surface is dirty to other devices connected via the interface unit 5.
  • On the other hand, if the quality index Q is greater than or equal to the quality threshold T_q and the number of pixels N_f in the foreground region is greater than or equal to T_s (Yes in step S107), the quality determination unit 44 determines that the biological image is good.
  • The quality determination unit 44 notifies the processing unit 4 of the determination result.
  • the processing unit 4 outputs the biological image to another device connected via the interface unit 5 (step S109). After step S108 or S109, the processing unit 4 ends the biological image generation process.
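  • Putting steps S101 to S109 together, the control flow can be sketched as follows. The sketch reuses the illustrative helpers above; sensor.read_image() is an assumed interface, and the quality index uses the assumed form of equation (8).

```python
import numpy as np

def generate_biometric_image(sensor, block=16, t_q=0.9, t_s_fraction=0.5):
    """Sketch of the flow of FIG. 7 (steps S101 to S109) using the helpers above."""
    image = sensor.read_image()                       # S101 to S102: signals -> pixel values

    # S103: foreground extraction (block mask expanded to a per-pixel mask).
    fg_blocks = extract_foreground(image, block=block)
    fg_pixels = np.zeros(image.shape, dtype=bool)
    rows, cols = fg_blocks.shape
    fg_pixels[:rows * block, :cols * block] = np.repeat(
        np.repeat(fg_blocks, block, axis=0), block, axis=1)

    # S104 to S105: non-biological region detection and pixel counts.
    _, n_f, n_n = detect_non_biological(image, fg_pixels)

    # S106: quality index (assumed form of equation (8)).
    q = 1.0 - n_n / n_f if n_f else 0.0

    # S107: quality and area checks; S109 output, S108 discard and prompt re-read.
    if q >= t_q and n_f >= t_s_fraction * image.size:
        return image
    return None
```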
  • As described above, in this capacitance sensor, electrodes having different parasitic capacitances are arranged according to a pattern different from the pattern of the biometric information to be read. Consequently, when a conductor other than a living body adheres to the sensor surface, the arrangement pattern of electrodes having the same parasitic capacitance appears on the generated biological image. By detecting this arrangement pattern, the capacitance sensor can detect an image quality defect caused by a conductor other than a living body adhering to the sensor surface, and can thus avoid outputting a biological image in which tissue fragments, sweat, or the like attached to the sensor surface would be erroneously detected as biometric information.
  • the arrangement pattern of electrodes having different parasitic capacitances is not limited to the above embodiment.
  • the arrangement pattern of the electrodes having different parasitic capacitances may be different from the pattern of the biological information to be read.
  • For example, when the biological information to be read is a fingerprint, electrodes having different parasitic capacitances may be arranged in a checkered pattern.
  • electrodes having different parasitic capacitances may be arranged in a vertical stripe shape.
  • the electrodes having different parasitic capacitances may have a random pattern.
  • a plurality of electrodes having three or more different parasitic capacitances may be arranged according to an arrangement pattern different from the pattern of biological information to be read.
  • In any of these cases, the non-biological region detection unit of the processing unit can detect the non-biological region by performing a correlation operation between a filter having the same pattern as the electrode arrangement pattern and the foreground region in the image, for example as sketched below.
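  • For illustration only, zero-mean filters matching such alternative arrangement patterns (horizontal or vertical stripes, a checkered pattern) could be built along the lines below; the sizes, weights, and function names are assumptions rather than details from the publication.

```python
import numpy as np

def stripe_filter(rows=5, cols=3, w0=1.0, vertical=False):
    """Filter for electrodes whose parasitic capacitance alternates row by row
    (column by column when vertical=True), analogous to the filter 600 of FIG. 6.
    The negative weight is chosen so that the filter sums to zero (rows >= 2)."""
    n_pos = (rows + 1) // 2                    # rows carrying the value +w0
    n_neg = rows // 2                          # rows carrying the balancing negative value
    w_neg = -w0 * n_pos / n_neg
    f = np.where(np.arange(rows)[:, None] % 2 == 0, w0, w_neg) * np.ones((1, cols))
    return f.T if vertical else f

def checkerboard_filter(size=4, w0=1.0):
    """Zero-mean filter for electrodes arranged in a checkered pattern:
    +w0 on one colour of the checkerboard, -w0 on the other (size even)."""
    i, j = np.indices((size, size))
    return np.where((i + j) % 2 == 0, w0, -w0)
```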
  • FIG. 8 is a schematic cross-sectional side view of a sensor unit formed so as to have different parasitic capacitance for each electrode according to another embodiment.
  • each electrode 21-ij is placed on the substrate 25 so that the distance between each electrode 21-ij and the surface 24 of the protective layer 23 is equal.
  • A capacitor is also formed between a conductor 26 shown in FIG. 8 and each electrode 21-i(2k-1) arranged in an odd-numbered row.
  • As a result, the parasitic capacitance C_c1 generated between each electrode 21-i(2k-1) in the odd-numbered rows and the surface of the protective layer 23 differs from the parasitic capacitance C_c0 generated between each electrode 21-i(2k) in the even-numbered rows and the surface of the protective layer 23.
  • the non-biological region detection unit of the processing unit may detect a non-biological region from the entire biological image by performing a correlation operation with the filter corresponding to the electrode arrangement pattern on the entire biological image.
  • In this way, the processing unit can discard a biological image in which skin tissue fragments, sweat, or the like adhere to the entire surface of the sensor unit and the electrode arrangement pattern appears over a wide range of the portions not in contact with the living body.
  • This capacitance sensor can be used for various purposes.
  • this capacitance sensor may be used as a biometric authentication device.
  • the storage unit of the capacitance sensor stores, as a registered biometric image, an image obtained by capturing the biometric information of the user together with the user identification information registered in advance.
  • the processing unit performs pattern matching between the biometric image determined to be good by the quality determination unit and the registered biometric image.
  • Alternatively, the processing unit may extract feature points from each of the biometric image determined to be good by the quality determination unit and the registered biometric image, and execute matching between the feature points (minutiae matching).
  • the processing unit authenticates the user as a registered user corresponding to the registered biometric information when the similarity calculated by executing pattern matching or minutia matching is equal to or greater than a predetermined threshold. Then, the processing unit transmits a signal indicating that the authentication has been successful and the identification information of the registered user to other devices connected via the interface unit.
  • this electrostatic capacitance sensor may be used as a touch sensor.
  • In this case, the processing unit only needs to be able to accurately determine whether or not the finger has touched the sensor unit of the capacitance sensor. Therefore, the quality threshold T_q and the area threshold T_s used by the quality determination unit of the processing unit are set lower than the corresponding thresholds used when a biometric image is used for biometric authentication. For example, the quality threshold T_q is set to 0.5, and the area threshold T_s is set to 1/10 of the total number of pixels of the biological image. If the processing unit determines that the biometric image is good, it outputs a signal indicating that the capacitance sensor has been touched to the other device connected via the interface unit.
  • this capacitance sensor may be used as a touch pad or a touch panel.
  • In this case, the capacitance sensor detects the pressed position on the sensor surface. When the biological image is determined to be good, the processing unit obtains the coordinates of the centroid of the pixels in the foreground region that are not included in the non-biological region, and outputs these centroid coordinates as the pressed position to the other device connected via the interface unit.
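  • A small sketch of the centroid computation just described, assuming per-pixel boolean masks for the foreground and non-biological regions such as those produced by the earlier sketches:

```python
import numpy as np

def pressed_position(foreground_mask, non_bio_mask):
    """Centroid (row, col) of the foreground pixels not belonging to the
    non-biological region, returned as the pressed position; None if no such pixel."""
    valid = foreground_mask & ~non_bio_mask
    ys, xs = np.nonzero(valid)
    if ys.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())
```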
  • the capacitance sensor may be a swipe sensor that reads a fingerprint in a wider range than the sensor by moving the finger relative to the sensor.
  • In this case as well, the capacitance sensor can have the same configuration as described above.
  • In the following, the direction in which a body part such as a finger is slid is referred to as the vertical direction, and the direction orthogonal to it as the horizontal direction.
  • FIG. 9 is a functional block diagram of the processing unit 4.
  • the processing unit 4 includes an image generation unit 41, a foreground region extraction unit 42, a non-biological region detection unit 43, a quality determination unit 44, a reading start / end timing determination unit 46, and an image combination unit 47.
  • In FIG. 9, each functional block of the processing unit 4 is assigned the same reference number as the corresponding functional block of the processing unit 4 shown in FIG. 5. The differences from the processing unit 4 shown in FIG. 5 are described below.
  • The image generation unit 41 sequentially generates partial images at a predetermined reading interval, with each electrode corresponding to one pixel and the value of each pixel corresponding to the signal read from the corresponding electrode. The image generation unit 41 then passes each partial image to the foreground region extraction unit 42, the non-biological region detection unit 43, the reading start/end timing determination unit 46, and the image combining unit 47. For example, when the sensor unit 2 includes a plurality of electrodes arranged in a line along the horizontal direction, each partial image is an image of one horizontal line.
  • When the sensor unit 2 includes electrodes arranged in a two-dimensional array, each partial image is a two-dimensional image corresponding to the size of that array.
  • the foreground area extraction unit 42 extracts a foreground area from each partial image. Then, the foreground region extraction unit 42 passes information indicating the foreground region to the non-biological region detection unit 43.
  • The non-biological region detection unit 43 detects the non-biological region from the foreground region of each partial image, calculates the number of pixels included in the foreground region and the number of pixels included in the non-biological region for each partial image, and passes these numbers, together with information indicating the corresponding partial image, to the reading start/end timing determination unit 46.
  • The foreground region extraction unit 42 and the non-biological region detection unit 43 can extract the foreground region and the non-biological region, respectively, by performing the same processing as the corresponding units of the processing unit 4 illustrated in FIG. 5.
  • The filter used for detecting the non-biological region can be, for example, one row of the filter 600 shown in FIG. 6 rotated by 90° so that its pixels are arranged horizontally.
  • the reading start / end timing determination unit 46 determines the timing to start reading biometric information and the timing to end reading biometric information.
  • Here, the reading start timing is the acquisition time of the first partial image used to create the biological image containing the biometric information, and the reading end timing is the acquisition time of the last partial image used to create the biological image. Note that the capacitance sensor also acquires partial images with the sensor unit 2 before the reading start timing and after the reading end timing.
  • In a swipe-type sensor, when a living body such as a finger is brought close to the sensor at the start of reading, a pattern corresponding to a smudge or residual fingerprint can appear on the partial image, for example due to water vapor generated when sweat evaporates from the living body, even if the living body does not touch the sensor surface.
  • When the reading of the biometric information is completed, sweat from the living body has been rubbed directly onto the sensor surface and adheres to it. Therefore, even after the living body is separated from the sensor, a pattern corresponding to dirt or a residual fingerprint appears on the partial image.
  • Therefore, the reading start/end timing determination unit 46 determines the timing for starting and ending the reading of the biometric information so that the area occupied on the biological image by dirt or the like attached to the sensor surface does not increase.
  • For example, the reading start/end timing determination unit 46 calculates the quality index Q for each partial image according to equation (8) and compares the quality index Q with a predetermined reading start threshold. The reading start/end timing determination unit 46 sets, as the reading start timing, the acquisition time of the partial image whose quality index Q first exceeds the reading start threshold. Further, the reading start/end timing determination unit 46 sets, as the reading end timing, the acquisition time of the partial image whose quality index Q first falls below a predetermined reading end threshold among the partial images acquired after the reading start timing. The reading start/end timing determination unit 46 notifies the processing unit 4 of the reading start timing and the reading end timing.
  • When the reading start timing is notified, the processing unit 4 temporarily stores the partial images acquired thereafter in the storage unit 3, while discarding the partial images acquired before the reading start timing is notified. When the reading end timing is notified, the processing unit 4 passes the partial images acquired up to the reading end timing and stored in the storage unit 3 to the image combining unit 47.
  • The reading start threshold is preferably set lower than the reading end threshold. If the user is not accustomed to having biometric information read by the capacitance sensor, or conversely is so accustomed to it that the slide is started too early, the living body tends to start sliding before it is in close contact with the sensor. If the capacitance sensor starts reading only after the living body and the sensor are in close contact, the range of the living body that is read may therefore be narrowed. In particular, when the living body is slid quickly, it moves a large distance before close contact is detected, and there is a risk that feature points useful for matching will fall outside the read range. The reading start threshold is therefore set to about 0.3 to 0.5, for example.
  • the reading start / end timing determination unit 46 may determine the reading start timing based on the width of contact between the living body and the sensor instead of the quality index Q or together with the quality index Q.
  • the width in which the living body and the sensor are in contact can be the maximum length of the remaining areas excluding the non-living area from the foreground area.
  • When the width over which the living body and the sensor are in contact exceeds a contact width threshold, the reading start/end timing determination unit 46 sets the acquisition time of that partial image as the reading start timing.
  • Alternatively, when the contact width of a partial image is larger than the contact width threshold and its quality index Q is higher than the reading start threshold, the reading start/end timing determination unit 46 sets the acquisition time of that partial image as the reading start timing.
  • the contact width threshold is lower than the average width of the living body to be read, and is set to, for example, 1/4 to 1/2 of the average width of the living body.
  • The reading start/end timing determination unit 46 may also set two-stage conditions for determining the reading start timing. For example, in the first-stage condition, the reading start threshold and/or the contact width threshold are set lower than in the second-stage condition. When the quality index Q or the contact width obtained from a partial image satisfies the first-stage condition, the reading start/end timing determination unit 46 temporarily stores that partial image and the subsequent partial images in the storage unit 3.
  • Thereafter, when the quality index Q or the contact width of any of the partial images satisfies the second-stage condition, the processing unit 4 outputs a signal indicating that reading of the biometric information has started to the other device connected via the interface unit 5. Further, when the processing unit 4 receives notification of the reading end timing from the reading start/end timing determination unit 46, it reads the partial images acquired from the time the first-stage condition was satisfied until the reading end timing from the storage unit 3 and passes them to the image combining unit 47.
  • When the reading of the biometric information is completed, the living body has been rubbed against the surface of the sensor, so skin tissue fragments and sweat adhere to the surface. A pattern corresponding to these therefore appears in the partial images acquired around the time the reading ends. If the reading end threshold is set low, the reading start/end timing determination unit 46 may consequently be unable to determine the reading end timing appropriately even though the slide over the sensor surface has actually ended. Moreover, even if a contact detector for the living body is provided near the sensor unit, its detection surface is not the sensor surface itself, so it is difficult to determine from it whether the living body is still in contact with the sensor surface.
  • the reading end threshold is set to a relatively high value, for example, about 0.5 to 0.8 so that the reading end can be reliably determined.
  • The reading start/end timing determination unit 46 may also obtain the maximum value Q_max of the quality index among the partial images acquired after the reading start timing and use it to determine the reading end threshold. In this case, for example, the reading end threshold is set to 1/2 of Q_max.
  • If the reading start/end timing determination unit 46 determined the reading end timing based only on the quality of a single partial image, there would be a risk that the acquisition time of a partial image degraded by a streak-like scratch or smudge on the sensor surface would be taken as the reading end timing. Therefore, in order to determine the reading end timing, the reading start/end timing determination unit 46 may instead detect that the quality index Q of a plurality of partial images acquired continuously over a predetermined period after the reading start timing has remained at or below the reading end threshold.
  • In this case, the reading start/end timing determination unit 46 may set, as the reading end timing, the acquisition time of a partial image whose quality index is equal to or lower than the reading end threshold among the plurality of partial images acquired continuously over the predetermined period. By using the condition that the quality index Q remains below the reading end threshold for a certain period as the determination condition for the reading end timing, the reading start/end timing determination unit 46 can prevent the reading end timing from being detected erroneously.
  • the predetermined period is set to 0.1 to 0.3 msec, for example.
  • The quality determination unit 44 sums the numbers of pixels included in the foreground region and the numbers of pixels included in the non-biological region obtained for each of the plurality of partial images acquired from the reading start timing to the reading end timing.
  • By substituting the total number of pixels in the foreground region and the total number of pixels in the non-biological region into equation (8), the quality determination unit 44 obtains the quality index Q of the biological image that will be created by combining the plurality of partial images acquired from the reading start timing to the reading end timing.
  • If the quality index Q is less than the quality threshold T_q, the quality determination unit 44 determines that the biological image is defective.
  • If the quality index Q is equal to or greater than the quality threshold T_q, the quality determination unit 44 determines that the biological image is good. Also in this embodiment, the quality determination unit 44 may determine that the image is defective when the total number of pixels included in the foreground region is less than the predetermined area threshold T_s. The quality threshold T_q and the area threshold T_s are determined in the same manner as in the above embodiment.
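  • As an illustration, the combined quality index over the partial images acquired between the reading start and end timings can be computed by summing the per-image pixel counts before applying the assumed form of equation (8):

```python
def combined_quality(per_image_counts):
    """per_image_counts: iterable of (n_f, n_n) pairs, one pair per partial image."""
    total_f = sum(n_f for n_f, _ in per_image_counts)
    total_n = sum(n_n for _, n_n in per_image_counts)
    return 1.0 - total_n / total_f if total_f else 0.0
```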
  • the image combining unit 47 generates a biological image by combining a plurality of partial images acquired from the reading start timing to the reading end timing in the vertical direction in time order.
  • If the biological image is determined to be good, the image combining unit 47 outputs the biological image to the other device connected via the interface unit 5.
  • If the biological image is determined to be defective, the image combining unit 47 discards it.
  • FIG. 10 shows an operation flowchart of the reading start timing determination process controlled by the processing unit 4.
  • the processing unit 4 acquires an electrical signal corresponding to the capacitance of each electrode from the sensor unit 2 (step S301).
  • the image generating unit 41 of the processing unit 4 generates a partial image having a pixel value corresponding to the capacitance of each electrode every time an electrical signal of each electrode is acquired from the sensor unit 2 (step S302). Then, the image generation unit 41 passes the generated partial image to the foreground region extraction unit 42, the non-biological region detection unit 43, the reading start / end timing determination unit 46, and the image combination unit 47 of the processing unit 4.
  • the foreground area extraction unit 42 extracts a foreground area from the partial image.
  • the non-living area detection unit 43 extracts the non-living area from the foreground area (step S303). Then, the non-living area detection unit 43 passes the number of pixels in the foreground area and the number of pixels in the non-living area to the reading start / end timing determination unit 46 of the processing unit 4.
  • The reading start/end timing determination unit 46 calculates the quality index Q of the partial image from the area of the non-biological region occupying the foreground region (step S304), and determines whether or not the quality index Q is equal to or higher than the reading start threshold (step S305). When the quality index Q is less than the reading start threshold (No in step S305), the reading start/end timing determination unit 46 discards the partial image (step S306), and the processing unit 4 repeats the processing from step S301.
  • On the other hand, when the quality index Q is equal to or higher than the reading start threshold (Yes in step S305), the reading start/end timing determination unit 46 determines the acquisition time of the partial image corresponding to that quality index Q as the reading start timing. The processing unit 4 then stores the partial image in the storage unit 3 (step S307) and ends the reading start timing determination process.
  • FIG. 11 shows an operation flowchart of a reading end timing determination process controlled by the processing unit 4.
  • the processing unit 4 acquires an electrical signal corresponding to the capacitance of each electrode from the sensor unit 2 (step S401).
  • the image generation unit 41 of the processing unit 4 generates a partial image having a pixel value corresponding to the capacitance of each electrode each time an electrical signal of each electrode is acquired from the sensor unit 2 (step S402). Then, the image generation unit 41 passes the generated partial image to the foreground region extraction unit 42, the non-biological region detection unit 43, the reading start / end timing determination unit 46, and the image combination unit 47 of the processing unit 4.
  • the foreground area extraction unit 42 extracts a foreground area from the partial image.
  • the non-living area detection unit 43 extracts the non-living area from the foreground area (step S403). Then, the non-living area detection unit 43 passes the number of pixels in the foreground area and the number of pixels in the non-living area to the reading start / end timing determination unit 46 of the processing unit 4.
  • The reading start/end timing determination unit 46 calculates the quality index Q of the partial image from the area of the non-biological region occupying the foreground region (step S404), and determines whether or not the quality index Q is equal to or less than the reading end threshold (step S405). When the quality index Q is greater than the reading end threshold (No in step S405), the reading start/end timing determination unit 46 stores the partial image in the storage unit 3 (step S406), and the processing unit 4 repeats the processing from step S401.
  • On the other hand, when the quality index Q is equal to or less than the reading end threshold (Yes in step S405), the reading start/end timing determination unit 46 determines the acquisition time of the partial image corresponding to that quality index Q as the reading end timing. The processing unit 4 then passes the partial images stored in the storage unit 3 to the image combining unit 47.
  • the image combining unit 47 generates a biological image by combining a plurality of partial images acquired from the reading start timing to the reading end timing in the vertical direction in time order. (Step S407). Then, the processing unit 4 ends the reading end timing determination process.
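  • The two flowcharts can be read as a single acquisition loop. The sketch below combines them; the thresholds reflect the example values given in the text (a reading start threshold of about 0.3 to 0.5 and a reading end threshold of about 0.5 to 0.8), while sensor.read_partial() and quality_of() are assumed helpers standing in for steps S301 to S304 and S401 to S404.

```python
import numpy as np

def acquire_swipe_image(sensor, start_threshold=0.4, end_threshold=0.6):
    """Sketch of the reading start/end timing flow of FIGS. 10 and 11."""
    stored = []

    # FIG. 10: wait for the first partial image whose Q reaches the start threshold.
    while True:
        partial = sensor.read_partial()              # S301 to S302
        if quality_of(partial) >= start_threshold:   # S305
            stored.append(partial)                   # S307: reading start timing
            break
        # S306: discard partial images acquired before the start timing

    # FIG. 11: keep storing partial images until Q drops to the end threshold.
    while True:
        partial = sensor.read_partial()              # S401 to S402
        if quality_of(partial) <= end_threshold:     # S405: reading end timing
            break
        stored.append(partial)                       # S406

    # S407: combine the stored partial images vertically, in time order.
    return np.vstack(stored)
```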
  • As described above, by detecting the arrangement pattern of electrodes having different parasitic capacitances, this capacitance sensor can appropriately determine the reading start timing and the reading end timing even if a conductor other than a living body is attached to the sensor surface. Therefore, this capacitance sensor can reduce the non-biological region included in the biological image and can generate a biological image suitable for use in biometric authentication.
  • Since the capacitance sensor can start the biometric authentication process immediately after the reading of the biometric information is completed, the waiting time required for biometric authentication can be kept short.
  • the processor of the device connected to the capacitance sensor may execute the function of the processing unit of each of the above embodiments.
  • In this case, the capacitance sensor amplifies the signal from each electrode, performs analog-to-digital conversion, and then outputs the digitized signal of each electrode in a predetermined order to the device connected via the interface unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Input (AREA)
  • Measurement Of Length, Angles, Or The Like Using Electric Or Magnetic Means (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Position Input By Displaying (AREA)

Abstract

A capacitance sensor has a plurality of electrodes, each of which outputs an electrical signal corresponding to the capacitance determined by the distance between the surface of the capacitance sensor and a conductor. Among the electrodes, the electrode having a first parasitic capacitance and the electrode having a second parasitic capacitance different from the first parasitic capacitance are arranged according to a predetermined arrangement pattern that is different from the biometric information of the region to be read by the capacitance sensor.
PCT/JP2009/070621 2009-12-09 2009-12-09 Détecteur de capacité et procédé de formation d'une image biologique WO2011070660A1 (fr)
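One way to picture the arrangement pattern described in the abstract: because the two parasitic capacitances shift the output of their electrodes even when nothing touches the sensor, the known pattern remains visible in the background of a captured frame and is disturbed where a conductor makes contact. The sketch below is purely illustrative; the binarized expected_pattern, the pixel comparison and the agreement threshold are assumptions, not the criterion disclosed in this application.

```python
# Illustrative check of whether the predetermined electrode arrangement pattern
# is still visible in a captured frame. `frame` and `expected_pattern` are
# same-sized 2D lists of 0/1 values; threshold and binarization are assumed.

AGREEMENT_THRESHOLD = 0.9  # assumed example value


def pattern_visible(frame, expected_pattern):
    total = 0
    matches = 0
    for frame_row, pattern_row in zip(frame, expected_pattern):
        for pixel, expected in zip(frame_row, pattern_row):
            total += 1
            if pixel == expected:
                matches += 1
    # If the pattern is largely intact, little or nothing is covering the
    # sensor surface in that region; a disturbed pattern indicates contact.
    return total > 0 and matches / total >= AGREEMENT_THRESHOLD
```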

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/JP2009/070621 WO2011070660A1 (fr) 2009-12-09 2009-12-09 Détecteur de capacité et procédé de formation d'une image biologique
JP2011545019A JP5472319B2 (ja) 2009-12-09 2009-12-09 静電容量センサ及び生体画像生成方法
EP09852056.2A EP2511871B1 (fr) 2009-12-09 2009-12-09 Détecteur de capacité et procédé de formation d'une image biologique
US13/490,102 US8787631B2 (en) 2009-12-09 2012-06-06 Capacitive sensor and biometric image generating method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2009/070621 WO2011070660A1 (fr) 2009-12-09 2009-12-09 Détecteur de capacité et procédé de formation d'une image biologique

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/490,102 Continuation US8787631B2 (en) 2009-12-09 2012-06-06 Capacitive sensor and biometric image generating method

Publications (1)

Publication Number Publication Date
WO2011070660A1 true WO2011070660A1 (fr) 2011-06-16

Family

ID=44145231

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/070621 WO2011070660A1 (fr) 2009-12-09 2009-12-09 Détecteur de capacité et procédé de formation d'une image biologique

Country Status (4)

Country Link
US (1) US8787631B2 (fr)
EP (1) EP2511871B1 (fr)
JP (1) JP5472319B2 (fr)
WO (1) WO2011070660A1 (fr)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130175153A1 (en) * 2012-01-06 2013-07-11 Egalax_Empia Technology Inc. Thin capacitive touch panel
JP2014035606A (ja) * 2012-08-08 2014-02-24 Alps Electric Co Ltd 入力装置
JP2014092963A (ja) * 2012-11-05 2014-05-19 Fujitsu Ltd 接触状態検出装置、方法及びプログラム
JP2017519566A (ja) * 2014-06-17 2017-07-20 クゥアルコム・インコーポレイテッドQualcomm Incorporated 静電容量プロフィールを使用するバイオメトリックをベースにしたセキュリティのための方法および装置
JP2019503751A (ja) * 2015-12-23 2019-02-14 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. 抽出された信号の品質を測定する装置及び方法
JP2020027661A (ja) * 2018-08-10 2020-02-20 三星電子株式会社Samsung Electronics Co.,Ltd. タッチ・指紋複合センサ、及びそれを含む電子装置
JP2020047284A (ja) * 2015-09-18 2020-03-26 日本電気株式会社 指紋撮像システム、指紋撮像装置、画像処理装置、指紋撮像方法、及び記録媒体
CN112955899A (zh) * 2018-08-22 2021-06-11 傲迪司威生物识别公司 用于提高传感器图像质量的系统和方法

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110248920A1 (en) * 2010-04-09 2011-10-13 Microsoft Corporation Keyboard with hinged keys and display functionality
US9846799B2 (en) 2012-05-18 2017-12-19 Apple Inc. Efficient texture comparison
US9202099B2 (en) 2012-06-29 2015-12-01 Apple Inc. Fingerprint sensing and enrollment
US9715616B2 (en) * 2012-06-29 2017-07-25 Apple Inc. Fingerprint sensing and enrollment
KR20140060181A (ko) * 2012-11-09 2014-05-19 삼성전자주식회사 데이터 공유 시스템에서의 데이터 공유 방법 및 이를 위한 장치들
US9088282B2 (en) * 2013-01-25 2015-07-21 Apple Inc. Proximity sensors with optical and electrical sensing capabilities
US10068120B2 (en) 2013-03-15 2018-09-04 Apple Inc. High dynamic range fingerprint sensing
JP2015023723A (ja) * 2013-07-22 2015-02-02 船井電機株式会社 給電装置及び非給電対象物検知方法
JP6167733B2 (ja) * 2013-07-30 2017-07-26 富士通株式会社 生体特徴ベクトル抽出装置、生体特徴ベクトル抽出方法、および生体特徴ベクトル抽出プログラム
US9390306B2 (en) * 2013-09-06 2016-07-12 Apple Inc. Finger biometric sensor including circuitry for acquiring finger biometric data based upon finger stability and related methods
US9679183B2 (en) * 2013-12-20 2017-06-13 Apple Inc. Finger biometric sensor including drive signal level updating and related methods
US10565925B2 (en) 2014-02-07 2020-02-18 Samsung Electronics Co., Ltd. Full color display with intrinsic transparency
US10453371B2 (en) 2014-02-07 2019-10-22 Samsung Electronics Co., Ltd. Multi-layer display with color and contrast enhancement
US10375365B2 (en) * 2014-02-07 2019-08-06 Samsung Electronics Co., Ltd. Projection system with enhanced color and contrast
US10554962B2 (en) 2014-02-07 2020-02-04 Samsung Electronics Co., Ltd. Multi-layer high transparency display for light field generation
TWM493712U (zh) * 2014-08-01 2015-01-11 Superc Touch Corp 具有遮罩功能的感應電極之生物辨識裝置
KR102255351B1 (ko) * 2014-09-11 2021-05-24 삼성전자주식회사 홍채 인식 방법 및 장치
CN107710671B (zh) 2015-04-30 2020-06-12 德山真旭 终端装置及计算机可读存储介质
JP6886429B2 (ja) * 2018-06-04 2021-06-16 株式会社東海理化電機製作所 生体情報センサ装置
TWI679431B (zh) * 2018-07-31 2019-12-11 友達光電股份有限公司 指紋感測裝置及其檢測方法
US11398104B2 (en) 2018-09-05 2022-07-26 Fingerprint Cards Anacatum Ip Ab Optical fingerprint sensor module and method for operating optical fingerprint sensor module
US10810450B2 (en) * 2018-09-13 2020-10-20 Blackberry Limited Methods and systems for improved biometric identification
KR20220043999A (ko) * 2020-09-29 2022-04-06 삼성디스플레이 주식회사 표시장치 및 표시장치의 구동 방법
FR3143795A3 (fr) * 2022-12-20 2024-06-21 Idemia France Carte à capteur biométrique

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5325442A (en) 1990-05-18 1994-06-28 U.S. Philips Corporation Fingerprint sensing device and recognition system having predetermined electrode activation
JP2000337813A (ja) * 1999-05-25 2000-12-08 Sony Corp 静電容量式指紋センサおよびその製造方法
JP2001056204A (ja) 1999-08-19 2001-02-27 Sony Corp 静電容量式指紋センサ
JP2001222706A (ja) 1999-12-30 2001-08-17 Stmicroelectronics Inc 指紋検知の改良
JP2001525067A (ja) 1997-05-13 2001-12-04 ベリディコム,インコーポレイテッド 調節可能な利得を有する容量性指紋センサ
JP2005030901A (ja) * 2003-07-11 2005-02-03 Alps Electric Co Ltd 容量センサ
JP2005505065A (ja) 2001-10-03 2005-02-17 スリーエム イノベイティブ プロパティズ カンパニー 複数のタッチ入力を区別するタッチパネルシステムおよび方法
JP2006071579A (ja) * 2004-09-06 2006-03-16 Alps Electric Co Ltd 容量検出回路及び容量検出方法
JP2006162345A (ja) * 2004-12-03 2006-06-22 Alps Electric Co Ltd 容量検出型センサ

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5963656A (en) * 1996-09-30 1999-10-05 International Business Machines Corporation System and method for determining the quality of fingerprint images
US6512381B2 (en) 1999-12-30 2003-01-28 Stmicroelectronics, Inc. Enhanced fingerprint detection
EP1530949B1 (fr) * 2002-09-13 2010-11-10 Fujitsu Limited Instrument et procede de biodetection et dispositif d'identification dote d'une fonction de biodetection
WO2005124659A1 (fr) * 2004-06-18 2005-12-29 Fingerprint Cards Ab Element de capteur d'empreinte digitale
WO2006098176A1 (fr) * 2005-03-15 2006-09-21 Sharp Kabushiki Kaisha Substrat matriciel actif et dispositif d’affichage utilisant ledit substrat

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5325442A (en) 1990-05-18 1994-06-28 U.S. Philips Corporation Fingerprint sensing device and recognition system having predetermined electrode activation
JP2001525067A (ja) 1997-05-13 2001-12-04 ベリディコム,インコーポレイテッド 調節可能な利得を有する容量性指紋センサ
JP2000337813A (ja) * 1999-05-25 2000-12-08 Sony Corp 静電容量式指紋センサおよびその製造方法
JP2001056204A (ja) 1999-08-19 2001-02-27 Sony Corp 静電容量式指紋センサ
JP2001222706A (ja) 1999-12-30 2001-08-17 Stmicroelectronics Inc 指紋検知の改良
JP2005505065A (ja) 2001-10-03 2005-02-17 スリーエム イノベイティブ プロパティズ カンパニー 複数のタッチ入力を区別するタッチパネルシステムおよび方法
JP2005030901A (ja) * 2003-07-11 2005-02-03 Alps Electric Co Ltd 容量センサ
JP2006071579A (ja) * 2004-09-06 2006-03-16 Alps Electric Co Ltd 容量検出回路及び容量検出方法
JP2006162345A (ja) * 2004-12-03 2006-06-22 Alps Electric Co Ltd 容量検出型センサ

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130175153A1 (en) * 2012-01-06 2013-07-11 Egalax_Empia Technology Inc. Thin capacitive touch panel
JP2014035606A (ja) * 2012-08-08 2014-02-24 Alps Electric Co Ltd 入力装置
JP2014092963A (ja) * 2012-11-05 2014-05-19 Fujitsu Ltd 接触状態検出装置、方法及びプログラム
JP2017519566A (ja) * 2014-06-17 2017-07-20 クゥアルコム・インコーポレイテッドQualcomm Incorporated 静電容量プロフィールを使用するバイオメトリックをベースにしたセキュリティのための方法および装置
JP2020047284A (ja) * 2015-09-18 2020-03-26 日本電気株式会社 指紋撮像システム、指紋撮像装置、画像処理装置、指紋撮像方法、及び記録媒体
US11004185B2 (en) 2015-09-18 2021-05-11 Nec Corporation Fingerprint capture system, fingerprint capture device, image processing apparatus, fingerprint capture method, and storage medium
US11501422B2 (en) 2015-09-18 2022-11-15 Nec Corporation Fingerprint capture system, fingerprint capture device, image processing apparatus, fingerprint capture method, and storage medium
US11779245B2 (en) 2015-09-18 2023-10-10 Nec Corporation Fingerprint capture system, fingerprint capture device, image processing apparatus, fingerprint capture method, and storage medium
JP2019503751A (ja) * 2015-12-23 2019-02-14 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. 抽出された信号の品質を測定する装置及び方法
JP7041061B2 (ja) 2015-12-23 2022-03-23 コーニンクレッカ フィリップス エヌ ヴェ 抽出された信号の品質を測定する装置及び方法
JP7041061B6 (ja) 2015-12-23 2022-05-30 コーニンクレッカ フィリップス エヌ ヴェ 抽出された信号の品質を測定する装置及び方法
JP2020027661A (ja) * 2018-08-10 2020-02-20 三星電子株式会社Samsung Electronics Co.,Ltd. タッチ・指紋複合センサ、及びそれを含む電子装置
JP7481813B2 (ja) 2018-08-10 2024-05-13 三星電子株式会社 タッチ・指紋複合センサ、及びそれを含む電子装置
CN112955899A (zh) * 2018-08-22 2021-06-11 傲迪司威生物识别公司 用于提高传感器图像质量的系统和方法

Also Published As

Publication number Publication date
US20120250949A1 (en) 2012-10-04
EP2511871B1 (fr) 2018-05-09
EP2511871A4 (fr) 2016-12-07
JPWO2011070660A1 (ja) 2013-04-22
JP5472319B2 (ja) 2014-04-16
EP2511871A1 (fr) 2012-10-17
US8787631B2 (en) 2014-07-22

Similar Documents

Publication Publication Date Title
JP5472319B2 (ja) 静電容量センサ及び生体画像生成方法
JP4883185B2 (ja) 生体情報読取装置,生体情報読取方法,及び生体情報読取プログラム
JP3859673B2 (ja) 生体情報取得装置および生体情報による認証装置
US7203347B2 (en) Method and system for extracting an area of interest from within a swipe image of a biological surface
JP4088625B2 (ja) 生体検知装置および方法並びに生体検知機能を有する認証装置
JP5196010B2 (ja) 生体情報登録装置、生体情報登録方法及び生体情報登録用コンピュータプログラムならびに生体認証装置、生体認証方法及び生体認証用コンピュータプログラム
KR100564915B1 (ko) 정전용량방식 지문센서 및 이를 이용한 지문 센싱방법
EP1966739B1 (fr) Detection d'informations biometriques utilisant un imageur de type a balayage
US20130028488A1 (en) Biometric information processing device, biometric-information processing method, and computer-readable storage medium
JP4427039B2 (ja) 生体情報取得装置および生体情報による認証装置
JP2005004632A (ja) 画像照合装置、および画像照合方法
CN109690560B (zh) 具有不同电容式配置的指纹感测
KR101537211B1 (ko) 상이한 지문 입력 방식을 지원하는 지문 인증 방법 및 전자 장치
EP3364339B1 (fr) Dispositif, procédé et support de stockage lisible par ordinateur non transitoire pour authentification biométrique
US20210334495A1 (en) Display device including a fingerprint sensor and a method of driving the same
US20060078178A1 (en) Swipe sensor
JP2002279413A (ja) 擬似指紋判別装置および指紋照合装置
JP6364828B2 (ja) 生体認証装置および携帯型電子装置
CN110574038B (zh) 从指纹图像中提取指纹特征数据
KR20150139396A (ko) 상이한 지문 입력 방식을 지원하는 지문 인증 방법 및 전자 장치
TWI836240B (zh) 光學指紋識別方法及利用其之觸控顯示器和資訊處理裝置
Cho et al. A method for fingerprint enrollment by finger rubbing
Aithal A critical study on fingerprint image sensing and acquisition technology
CN115620347A (zh) 生物感测辨识系统及其感测装置与感测方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09852056

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2011545019

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 2009852056

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE