WO2020263875A1 - Fake finger detection using ridge features - Google Patents

Fake finger detection using ridge features

Info

Publication number
WO2020263875A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
finger
signal strength
fingerprint
difference
Prior art date
Application number
PCT/US2020/039208
Other languages
English (en)
Inventor
Sina AKHBARI
Lingtao Wang
Mei-Lin Chan
Nikhil APTE
Original Assignee
Invensense, Inc.
Priority date
Filing date
Publication date
Application filed by Invensense, Inc. filed Critical Invensense, Inc.
Publication of WO2020263875A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1382 Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger
    • G06V40/1394 Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger using acquisition arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/13 Sensors therefor
    • G06V40/1306 Sensors therefor non-optical, e.g. ultrasonic or capacitive sensing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1347 Preprocessing; Feature extraction
    • G06V40/1359 Extracting features related to ridge properties; Determining the fingerprint type, e.g. whorl or loop
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 Fingerprints or palmprints
    • G06V40/1382 Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger

Definitions

  • Fingerprint sensors have become ubiquitous in mobile devices as well as other devices (e.g., locks on cars and buildings) and applications for authenticating a user’s identity. They provide a fast and convenient way for the user to unlock a device, provide authentication for payments, etc. It is essential that fingerprint sensors operate at a level of security that, at a minimum, reduces the potential for circumvention of security of fingerprint authentication. For instance, fake fingers having fake or spoofed fingerprints can be used to attempt to circumvent fingerprint authentication at fingerprint sensors.
  • FIG. 1 is a block diagram of an example electronic device 100 upon which described embodiments may be implemented.
  • FIG. 2 illustrates a block diagram of an example fingerprint sensing system for determining whether a fingerprint image was generated using a real finger or a fake finger, according to some embodiments.
  • FIGs. 3A and 3B illustrate block diagrams of a difference quantifier, according to some embodiments.
  • FIG. 4 illustrates example fingerprint images captured at an ultrasonic fingerprint sensor using different times of flight, according to embodiments.
  • FIG. 5A illustrates example fingerprint images and graphs of spatial ridge frequency of a real finger, according to embodiments.
  • FIG. 5B illustrates example fingerprint images and graphs of spatial ridge frequency of a fake finger, according to embodiments.
  • FIG. 6A illustrates example fingerprint images and a combined graph of spatial ridge frequency of a real finger, according to embodiments.
  • FIG. 6B illustrates example fingerprint images and a combined graph of spatial ridge frequency of a fake finger, according to embodiments.
  • FIG. 7 illustrates an example process for determining whether a finger is a real finger at an ultrasonic fingerprint sensor, according to some embodiments.
  • FIG. 8 illustrates an example process for quantifying a difference in width of ridges of different fingerprint images, according to some embodiments.
  • FIG. 9 illustrates an example process for quantifying a difference in width of ridges of different fingerprint images, according to other embodiments.
  • Embodiments described herein may be discussed in the general context of processor- executable instructions residing on some form of non-transitory processor-readable medium, such as program modules, executed by one or more computers or other devices.
  • program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types.
  • the functionality of the program modules may be combined or distributed as desired in various embodiments.
  • a single block may be described as performing a function or functions; however, in actual practice, the function or functions performed by that block may be performed in a single component or across multiple components, and/or may be performed using hardware, using software, or using a combination of hardware and software.
  • various illustrative components, blocks, modules, logic, circuits, and steps have been described generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
  • the example fingerprint sensing system and/or mobile electronic device described herein may include components other than those shown, including well-known components.
  • Various techniques described herein may be implemented in hardware, software, firmware, or any combination thereof, unless specifically described as being implemented in a specific manner. Any features described as modules or components may also be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a non-transitory processor-readable storage medium comprising instructions that, when executed, perform one or more of the methods described herein.
  • the non-transitory processor-readable data storage medium may form part of a computer program product, which may include packaging materials.
  • the non-transitory processor-readable storage medium may comprise random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, other known storage media, and the like.
  • the techniques additionally, or alternatively, may be realized at least in part by a processor-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer or other processor.
  • Such instructions may be executed by one or more processors, such as one or more motion processing units (MPUs), sensor processing units (SPUs), host processor(s) or core(s) thereof, digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), application specific instruction set processors (ASIPs), field programmable gate arrays (FPGAs), a programmable logic controller (PLC), a complex programmable logic device (CPLD), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein, or other equivalent integrated or discrete logic circuitry.
  • processor can refer to substantially any computing processing unit or device comprising, but not limited to comprising, single-core processors; single-processors with software multithread execution capability; multi-core processors; multi-core processors with software multithread execution capability; multi-core processors with hardware multithread technology; parallel platforms; and parallel platforms with distributed shared memory.
  • processors can exploit nano-scale architectures such as, but not limited to, molecular and quantum-dot based transistors, switches and gates, in order to optimize space usage or enhance performance of user equipment.
  • a processor may also be implemented as a combination of computing processing units.
  • a general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of an SPU/MPU and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with an SPU core, MPU core, or any other such configuration.
  • Discussion begins with a description of a device including a fingerprint sensor, upon which described embodiments can be implemented.
  • An example fingerprint sensor and system for determining whether a fingerprint image is generated using a real finger or a fake finger is then described, in accordance with various embodiments.
  • Example operations of a fingerprint sensor for determining whether a fingerprint image is generated using a real finger or a fake finger based on features of finger ridges of the captured fingerprint images are then described.
  • Fingerprint sensors are used for user authentication in electronic devices, such as mobile electronic devices and applications operating on mobile electronic devices, and in locks for accessing cars or buildings, to protect against unauthorized access to the devices and/or applications. Authentication of a fingerprint at a fingerprint sensor is performed before providing access to a device and/or application. In order to circumvent fingerprint authentication, attempts can be made to copy or spoof fingerprints of an authorized user using a fake or artificial finger. As such, fingerprint sensors should be capable of distinguishing real fingers from fake, artificial, or even dead fingers, also referred to herein as performing “spoof detection” or “fake finger detection”. A “spoofed” fingerprint is a fake or artificial fingerprint that is used to attempt to circumvent security measures requiring fingerprint authentication.
  • an artificial finger may be used to gain unauthorized access to the electronic device or application by making an unauthorized copy of the fingerprint of an authorized user, e.g., “spoofing” an actual fingerprint.
  • the spoof detection may be performed by analyzing fingerprint images captured by the fingerprint sensor, e.g., performing biometric analysis of the fingerprint images, or looking at any characteristics that can help distinguish a fake/spoof fingerprint from a real fingerprint. These characteristics may be static features or dynamic features which have a certain time dependency because they change over time.
  • Embodiments described herein provide methods and systems for determining whether a finger interacting with a fingerprint sensor, for purposes of authentication, is a real finger or a fake finger based on observed features of finger ridges of the captured fingerprint images.
  • Observed features of finger ridges may refer to the width of ridges of the ridge/valley pattern of captured fingerprint images.
  • a feature of the finger ridge may include a profile of the ridge, and how the profile changes based on depth and/or deformation.
  • capturing multiple fingerprint images using different times of flight allows for detecting ridge features that are indicative of whether a finger is a real finger or a fake finger.
  • Embodiments described herein provide for determining whether a finger is a real finger at an ultrasonic fingerprint sensor.
  • a first image of a fingerprint pattern is captured at an ultrasonic fingerprint sensor, wherein the first image is based on ultrasonic signals corresponding to a first time of flight range.
  • a second image of the fingerprint pattern is captured at the ultrasonic fingerprint sensor, wherein the second image is based on ultrasonic signals corresponding to a second time of flight range, the second time of flight range being delayed compared to the first time of flight range.
  • a difference in a width of ridges of the fingerprint pattern in the first image compared to the width of ridges of the fingerprint pattern in the second image is quantified. Based on the quantification of the difference, a probability whether the finger is a real finger is determined.
  • quantifying the difference in the width of the ridges of the fingerprint pattern in the first image compared to the width of ridges of the fingerprint pattern in the second image includes determining a difference between the first image and the second image. In one embodiment, a difference image is generated by subtracting the second image from the first image (or vice versa).
  • a first signal strength is determined at a first spatial frequency range of at least one of the first image and the second image and a second signal strength is determined at a second spatial frequency range of the difference between the first image and the second image.
  • the first signal strength corresponds to a maximum in signal strength of the first spatial frequency range and the second signal strength corresponds to a maximum in signal strength of the second spatial frequency range.
  • the second frequency range is distributed around a frequency substantially double a main frequency contribution of the first frequency range.
  • the first spatial frequency range corresponds to a spatial ridge frequency range of the fingerprint pattern.
  • the first signal strength is compared to the second signal strength.
  • a ratio of the second signal strength to the first signal strength is determined. Provided the ratio satisfies a ratio range threshold, it is determined that the finger is a real finger. In one embodiment, the ratio range threshold is above 0.8. Based on the comparing, the probability that the finger is a real finger is determined. In one embodiment, the probability that the finger is a real finger is based on the ratio of the second signal strength to the first signal strength, wherein the probability that the finger is a real finger increases as the ratio of the second signal strength to the first signal strength increases.
  • determining the second signal strength at the second spatial frequency range of the difference between the first image and the second image includes determining the second signal strength at the second spatial frequency of the difference image.
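  • The following is a minimal Python/NumPy sketch of this spectral quantification, not the patented implementation: the function names, the assumed pixel pitch (pitch_mm), the search bands, and the way the 0.8 ratio threshold is applied are all illustrative assumptions based on the description above.

```python
import numpy as np

def main_ridge_frequency(image, fmin=1.0, fmax=4.0, pitch_mm=0.05):
    """Return (frequency, strength) of the dominant spatial ridge frequency,
    searched within [fmin, fmax] line pairs per millimeter."""
    spectrum = np.abs(np.fft.rfft(image - image.mean(), axis=1)).mean(axis=0)
    freqs = np.fft.rfftfreq(image.shape[1], d=pitch_mm)   # cycles (lp) per mm
    band = (freqs >= fmin) & (freqs <= fmax)
    idx = np.argmax(np.where(band, spectrum, 0.0))
    return freqs[idx], spectrum[idx]

def band_strength(image, center, rel_width=0.25, pitch_mm=0.05):
    """Maximum spectral strength in a band centered on `center` (lp/mm)."""
    spectrum = np.abs(np.fft.rfft(image - image.mean(), axis=1)).mean(axis=0)
    freqs = np.fft.rfftfreq(image.shape[1], d=pitch_mm)
    band = (freqs >= center * (1 - rel_width)) & (freqs <= center * (1 + rel_width))
    return spectrum[band].max()

def spectral_ratio(first_image, second_image, pitch_mm=0.05):
    """Ratio of the difference-image strength near 2*f0 to the first-image
    strength at its main ridge frequency f0; higher suggests a real finger."""
    diff = first_image - second_image
    f0, s1 = main_ridge_frequency(first_image, pitch_mm=pitch_mm)
    s2 = band_strength(diff, 2.0 * f0, pitch_mm=pitch_mm)
    return s2 / s1

# Illustrative decision using the ratio range threshold described above (0.8):
# ratio = spectral_ratio(first_image, second_image)
# probability_real = min(1.0, ratio / 0.8)
```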
  • quantifying the difference in the width of the ridges of the fingerprint pattern in the first image compared to the width of ridges of the fingerprint pattern in the second image includes determining a first average width of ridges of the fingerprint pattern of the first image and determining a second average width of ridges of the fingerprint pattern of the second image.
  • a difference between the first average width and the second average width is quantified. In some embodiments, a ratio of the first average width to the second average width is determined.
  • determining the probability whether the finger is a real finger includes comparing the ratio of the first average width to the second average width to a width range threshold.
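  • A complementary sketch of this ridge-width variant, again in Python/NumPy; the assumption that ridges are darker than the mean gray level, the row-wise run-length measurement, and the example width range threshold are simplifications rather than the patent's specific algorithm.

```python
import numpy as np

def average_ridge_width(image):
    """Estimate the average ridge width (in pixels) by measuring the lengths
    of dark runs along each row of the fingerprint image."""
    binary = image < image.mean()              # ridges assumed darker than valleys
    widths = []
    for row in binary:
        padded = np.concatenate(([False], row, [False]))
        edges = np.flatnonzero(np.diff(padded.astype(int)))
        starts, ends = edges[0::2], edges[1::2]  # run starts and ends alternate
        widths.extend(ends - starts)
    return float(np.mean(widths)) if widths else 0.0

def width_ratio(first_image, second_image):
    """Ratio of average ridge width at the surface time of flight to the
    delayed time of flight; for a real finger the ridges narrow, so ratio > 1."""
    w1 = average_ridge_width(first_image)
    w2 = average_ridge_width(second_image)
    return w1 / w2 if w2 else 0.0

# Illustrative decision against an assumed width range threshold, e.g. 1.1:
# probability_real = 1.0 if width_ratio(first_image, second_image) > 1.1 else 0.0
```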
  • FIG. 1 is a block diagram of an example electronic device 100.
  • electronic device 100 may be implemented as a device or apparatus, such as a handheld mobile electronic device.
  • a mobile electronic device may be, without limitation, a mobile telephone (e.g., a smartphone, cellular phone, a cordless phone running on a local network, or any other cordless telephone handset), a wired telephone (e.g., a phone attached by a wire), a personal digital assistant (PDA), a video game player, video game controller, a Head Mounted Display (HMD), a virtual or augmented reality device, a navigation device, an activity or fitness tracker device (e.g., bracelet, clip, band, or pendant), a smart watch or other wearable device, a mobile internet device (MID), a personal navigation device (PND), a digital still camera, a digital video camera, a portable music player, a portable video player, a portable multi-media player, a remote control, or the like.
  • electronic device 100 may be implemented as a fixed electronic device, such as and without limitation, an electronic lock, a doorknob, a car start button, an automated teller machine (ATM), etc.
  • electronic device 100 is capable of reading fingerprints.
  • electronic device 100 may include a host processor 110, a host bus 120, a host memory 130, and a sensor processing unit 170. Some embodiments of electronic device 100 may further include one or more of a display device 140, an interface 150, a transceiver 160 (all depicted in dashed lines), and/or other components.
  • electrical power for electronic device 100 is provided by a mobile power source such as a battery (not shown), when not being actively charged.
  • Host processor 110 can be one or more microprocessors, central processing units (CPUs), DSPs, general purpose microprocessors, ASICs, ASIPs, FPGAs or other processors which run software programs or applications, which may be stored in host memory 130, associated with the functions and capabilities of electronic device 100.
  • Host bus 120 may be any suitable bus or interface, such as a universal serial bus (USB), a universal asynchronous receiver/transmitter (UART) serial bus, a suitable advanced microcontroller bus architecture (AMBA) interface, an Inter-Integrated Circuit (I2C) bus, a serial digital input output (SDIO) bus, a serial peripheral interface (SPI), or other equivalent.
  • host processor 110, host memory 130, display 140, interface 150, transceiver 160, sensor processing unit (SPU) 170, and other components of electronic device 100 may be coupled communicatively through host bus 120 in order to exchange commands and data.
  • different bus configurations may be employed as desired.
  • additional buses may be used to couple the various components of electronic device 100, such as by using a dedicated bus between host processor 110 and memory 130.
  • Host memory 130 can be any suitable type of memory, including but not limited to electronic memory (e.g., read only memory (ROM), random access memory, or other electronic memory), hard disk, optical disk, or some combination thereof.
  • Multiple layers of software can be stored in host memory 130 for use with/operation upon host processor 110.
  • an operating system layer can be provided for electronic device 100 to control and manage system resources in real time, enable functions of application software and other layers, and interface application programs with other software and functions of electronic device 100.
  • a user experience system layer may operate upon or be facilitated by the operating system.
  • the user experience system may comprise one or more software application programs such as menu navigation software, games, device function control, gesture recognition, image processing or adjusting, voice recognition, navigation software, communications software (such as telephony or wireless local area network (WLAN) software), and/or any of a wide variety of other software and functional interfaces for interaction with the user can be provided.
  • multiple different applications can be provided on a single electronic device 100, and in some of those embodiments, multiple applications can run simultaneously as part of the user experience system.
  • the user experience system, operating system, and/or the host processor 110 may operate in a low-power mode (e.g., a sleep mode) where very few instructions are processed. Such a low-power mode may utilize only a small fraction of the processing power of a full-power mode (e.g., an awake mode) of the host processor 110.
  • Display 140, when included, may be a liquid crystal device, (organic) light emitting diode device, or other display device suitable for creating and visibly depicting graphic images and/or alphanumeric characters recognizable to a user.
  • Display 140 may be configured to output images viewable by the user and may additionally or alternatively function as a viewfinder for a camera. It should be appreciated that display 140 is optional, as various electronic devices, such as electronic locks, doorknobs, car start buttons, etc., may not require a display device.
  • Interface 150, when included, can be any of a variety of different devices providing input and/or output to a user, such as audio speakers, touch screen, real or virtual buttons, joystick, slider, knob, printer, scanner, computer network I/O device, or other connected peripherals.
  • Transceiver 160, when included, may be one or more of a wired or wireless transceiver which facilitates receipt of data at electronic device 100 from an external transmission source and transmission of data from electronic device 100 to an external recipient.
  • transceiver 160 comprises one or more of: a cellular transceiver, a wireless local area network transceiver (e.g., a transceiver compliant with one or more Institute of Electrical and Electronics Engineers (IEEE) 802.11 specifications for wireless local area network communication), a wireless personal area network transceiver (e.g., a transceiver compliant with one or more IEEE 802.15 specifications for wireless personal area network communication), and a wired serial transceiver (e.g., a universal serial bus for wired communication).
  • Electronic device 100 also includes a general purpose sensor assembly in the form of integrated Sensor Processing Unit (SPU) 170 which includes sensor processor 172, memory 176, a fingerprint sensor 178, and a bus 174 for facilitating communication between these and other components of SPU 170.
  • SPU 170 may include at least one additional sensor 180 (shown as sensor 180-1 , 180-2, ... 180-n) communicatively coupled to bus 174.
  • at least one additional sensor 180 is a force or pressure sensor (e.g. a touch sensor) configured to determine a force or pressure or a temperature sensor configured to determine a temperature at electronic device 100.
  • the force or pressure sensor may be disposed within, under, or adjacent to fingerprint sensor 178.
  • In some embodiments, all of the components illustrated in SPU 170 may be embodied on a single integrated circuit. It should be appreciated that SPU 170 may be manufactured as a stand-alone unit (e.g., an integrated circuit) that may exist separately from a larger electronic device and is coupled to host bus 120 through an interface (not shown). It should be appreciated that, in accordance with some embodiments, SPU 170 can operate independent of host processor 110 and host memory 130 using sensor processor 172 and memory 176.
  • Sensor processor 172 can be one or more microprocessors, CPUs, DSPs, general purpose microprocessors, ASICs, ASIPs, FPGAs or other processors which run software programs, which may be stored in memory 176, associated with the functions of SPU 170. It should also be appreciated that fingerprint sensor 178 and additional sensor 180, when included, may also utilize processing and memory provided by other components of electronic device 100, e.g., host processor 110 and host memory 130.
  • Bus 174 may be any suitable bus or interface to include, without limitation, a peripheral component interconnect express (PCIe) bus, a universal serial bus (USB), a universal asynchronous receiver/transmitter (UART) serial bus, a suitable advanced microcontroller bus architecture (AMBA) interface, an Inter-Integrated Circuit (I2C) bus, a serial digital input output (SDIO) bus, a serial peripheral interface (SPI), or other equivalent.
  • sensor processor 172 may be communicatively coupled through bus 174 in order to exchange data.
  • memory 176 may be communicatively coupled through bus 174 in order to exchange data.
  • fingerprint sensor 178 may be communicatively coupled through bus 174 in order to exchange data.
  • Memory 176 can be any suitable type of memory, including but not limited to electronic memory (e.g., read only memory (ROM), random access memory, or other electronic memory). Memory 176 may store algorithms or routines or other instructions for processing data received from fingerprint sensor 178 and/or one or more sensor 180, as well as the received data either in its raw form or after some processing. Such algorithms and routines may be implemented by sensor processor 172 and/or by logic or processing capabilities included in fingerprint sensor 178 and/or sensor 180.
  • a sensor 180 may comprise, without limitation: a temperature sensor, a humidity sensor, an atmospheric pressure sensor, an infrared sensor, a radio frequency sensor, a navigation satellite system sensor (such as a global positioning system receiver), an acoustic sensor (e.g., a microphone), an inertial or motion sensor (e.g., a gyroscope, accelerometer, or magnetometer) for measuring the orientation or motion of the sensor in space, or other type of sensor for measuring other physical or environmental factors.
  • sensor 180-1 may comprise an acoustic sensor
  • sensor 180-2 may comprise a temperature sensor
  • sensor 180-n may comprise a motion sensor
  • fingerprint sensor 178 and/or one or more sensors 180 may be implemented using a microelectromechanical system (MEMS) that is integrated with sensor processor 172 and one or more other components of SPU 170 in a single chip or package. It should be appreciated that fingerprint sensor 178 may be disposed behind display 140.
  • fingerprint sensor 178 and/or one or more sensors 180 may be disposed externally to SPU 170 in various embodiments.
  • fingerprint sensor 178 can be any type of fingerprint sensor, including without limitation, an ultrasonic sensor, an optical sensor, a camera, etc.
  • FIG. 2 illustrates a block diagram of an example fingerprint sensing system 200 for determining whether a fingerprint image was generated using a real finger or a fake finger, according to some embodiments.
  • Fingerprint sensing system 200 is configured to determine whether a finger is a real finger or a fake finger using ridge features (e.g., width) from fingerprint images captured at the ultrasonic fingerprint sensor. It should be appreciated that fingerprint sensing system 200 can be implemented as hardware, software, or any combination thereof. It should also be appreciated that fingerprint image capture 210, difference quantifier 220, and fake finger determiner 230 may be separate components, may be comprised within a single component, or may be comprised in various combinations of multiple components (e.g., difference quantifier 220 and fake finger determiner 230 may be comprised within a single component), in accordance with some embodiments.
  • Fingerprint images 215 are captured at fingerprint image capture 210.
  • fingerprint image capture 210 is an ultrasonic sensor (e.g., a sensor capable of transmitting and receiving ultrasonic signals).
  • the fingerprint sensor is operable to emit and detect ultrasonic waves (also referred to as ultrasonic signals or ultrasound signals).
  • An array of ultrasonic transducers (e.g., piezoelectric micromachined ultrasonic transducers) may be used to transmit and receive the ultrasonic waves, where the ultrasonic transducers of the array are capable of performing both the transmission and receipt of the ultrasonic waves.
  • the emitted ultrasonic waves are reflected from any objects in contact with (or in front of) the fingerprint sensor, and these reflected ultrasonic waves, or echoes, are then detected.
  • Where the object is a finger, the waves are reflected from different features of the finger, such as the surface features on the skin, the fingerprint, or features present in deeper layers of the finger (e.g., the dermis).
  • Examples of surface features of a finger are ridges and valleys of a fingerprint, e.g., the ridge/valley pattern of the finger.
  • the reflection of the sound waves from the ridge/valley pattern enables the fingerprint sensor to produce a fingerprint image that may be used for identification of the user.
  • At least two fingerprint images 215 are captured at an ultrasonic fingerprint sensor at different times of flight
  • operating parameters of an ultrasonic fingerprint sensor can be controlled, allowing for image capture at different times of flight.
  • an adjustment of timing of transmission of the ultrasonic signals for ultrasonic transducers of an ultrasonic fingerprint sensor can change the time of flight.
  • a first fingerprint image is captured at a finger surface time of flight (e.g., a time of flight for imaging at a contact surface of the ultrasonic transducer) and a second fingerprint image is captured at a delayed time of flight (e.g., 50-150 nanoseconds) relative to the first image.
  • If the finger used for generating the fingerprint images is a real finger, it is typically observed (either visually or analytically) that ridges of the second fingerprint image are narrower than ridges of the first fingerprint image. This effect (ridge narrowing) is typically not observed in fake fingers.
  • the time of flight delay may be selected for an optimum ridge narrowing effect, and may be dependent on the user. As such, the delay in time of flight may be determined for the user during enrollment.
  • the first image is captured with the optimal time of flight (e.g., calibrated time of flight for an optimum image) for measuring at the sensor surface, and the signal integration windows may be optimized for the fake finger detection.
  • For fingerprint imaging, the integration window may be large (e.g., 50-200 ns), while for the fake finger detection a shorter integration window (e.g., <50 ns) may be used.
  • the signal integration window for the first and second image may be different.
  • Embodiments described herein focus on the use of a first and second image. However, more images may be used at different times of flight, and the methods described herein may then be applied in a similar manner on the plurality of images.
  • the shorter integration window provides more depth resolution. It should be appreciated that the actual integration windows may depend on the ultrasonic fingerprint sensor stack, thickness and material, and acoustic properties of the specific ultrasonic fingerprint sensor design.
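  • Purely by way of illustration, the two acquisitions could be described by a small configuration structure such as the Python sketch below; the field names and the nanosecond values are assumptions drawn from the ranges mentioned above, not parameters of any particular sensor interface.

```python
from dataclasses import dataclass

@dataclass
class CaptureConfig:
    """Hypothetical per-image acquisition settings for the two captures."""
    tof_delay_ns: int      # receive-window delay relative to the surface time of flight
    integration_ns: int    # length of the signal integration window

# First image: surface time of flight, wide integration window (e.g., 50-200 ns).
surface_capture = CaptureConfig(tof_delay_ns=0, integration_ns=150)

# Second image: delayed time of flight (e.g., 50-150 ns later) and a shorter
# integration window (e.g., <50 ns) for better depth resolution.
delayed_capture = CaptureConfig(tof_delay_ns=100, integration_ns=40)
```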
  • the beam focusing for the two images may be the same, and may be adapted to focus on the top of the sensor stack. In other embodiments, the beam focusing may be different, where the focusing for the second image is at a certain depth in the finger or a certain depth from the sensor surface.
  • the capturing of the first and second images may be initiated whenever a change in signal is detected. For example, to capture the images when the user presses the finger on the sensor, the image capture may be started as soon as an object or finger starts interacting with the sensor. For an ultrasonic sensor with an array of ultrasonic transducers, a subset of transducers may be active in a low power mode, and as soon as a finger starts interacting with the sensor, the full sensor may be activated to capture the sequence of images.
  • a change in signal may occur as the pressure of the finger is reduced, and this may initiate the image capture. In some embodiments, a background image is captured before a finger contacts the sensor, where the background image can be used to enhance image quality by subtracting it from the captured fingerprint images.
  • the change of contact state may be determined by the fingerprint sensor itself, or it may be detected by a second sensor associated with the fingerprint sensor.
  • a pressure sensor, a force sensor, or a touch sensor may be positioned near, below, or above the fingerprint sensor, and this additional sensor may be used to detect a change in contact state that initiates the capturing of the image sequence.
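  • A schematic sketch of this trigger logic follows; `read_subset`, `capture_full_image`, and the threshold are hypothetical stand-ins for sensor-specific calls, since the patent does not define a software interface for them.

```python
def wait_and_capture(read_subset, capture_full_image, threshold, n_images=2):
    """Poll a low-power subset of transducers; once the signal level changes
    enough (finger touching down or lifting), capture the full image sequence."""
    baseline = read_subset()                  # background / no-finger level
    while True:
        level = read_subset()
        if abs(level - baseline) > threshold:
            return [capture_full_image(index) for index in range(n_images)]
```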
  • the fingerprint images 215 can include any number of fingerprint images. In some embodiments, two fingerprint images 215 are captured using different times of flight. In some embodiments, fingerprint images 215 include at least two fingerprint images, but it should be appreciated that more than two images can be captured.
  • the fingerprint images 215 forwarded to difference quantifier 220 may include two or more images captured during a steady state while the finger is contacting the ultrasonic fingerprint sensor. The embodiments described herein may then be applied in a similar manner on the plurality of images to determine a change in ridge width as a function of time of flight.
  • Fingerprint images 215 are received at difference quantifier 220, which is configured to quantify a difference of ridge features (e.g., ridge width) of the fingerprint images 215.
  • the (difference of) ridge features may be determined as a function of time of flight. For instance, ridge narrowing as a function of time of flight can be taken as an indicator for the probability that the finger is a real or fake finger. In general, the depth profile of the ridges can be used as an indicator as to the probability that the finger is a real or fake finger. To determine the probability, embodiments herein quantify the ridge narrowing (e.g., profile change).
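  • Where more than two images are captured, the narrowing can be summarized as a trend of ridge width versus time-of-flight delay, as in the sketch below; the least-squares fit and the `width_fn` callback (for example, an average-ridge-width estimator like the one sketched earlier) are illustrative assumptions.

```python
import numpy as np

def narrowing_slope(images, tof_delays_ns, width_fn):
    """Fit ridge width (pixels) against time-of-flight delay (ns); a clearly
    negative slope indicates ridge narrowing with depth, as expected for a
    real finger."""
    widths = np.array([width_fn(img) for img in images])
    delays = np.asarray(tof_delays_ns, dtype=float)
    slope, _intercept = np.polyfit(delays, widths, 1)
    return slope

# Illustrative score that grows as the narrowing becomes more pronounced:
# score = max(0.0, min(1.0, -narrowing_slope(images, delays, average_ridge_width) / 0.01))
```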
  • FIG. 3A illustrates a block diagram of a difference quantifier 220, according to an embodiment.
  • Fingerprint images 215 are received at difference quantifier 220.
  • At difference determiner 310, a difference between the fingerprint images is determined. In one embodiment, difference determiner 310 includes difference image generator 312, which is configured to generate a difference image by subtracting one of the fingerprint images 215 from another fingerprint image 215. In one embodiment, where there are two fingerprint images 215 captured with different times of flight, the later captured fingerprint image 215 is subtracted from the earlier captured fingerprint image 215.
  • Difference determiner 310 outputs difference 315.
  • difference 315 is a difference image.
  • the second image is subtracted from the first image, and the spatial frequency spectrum of the first image and of the difference image is compared. If the finger is real, due to ridge narrowing of the second image, the ridges of the difference image are observably split. Therefore, the difference image has substantially twice the main spatial frequency compared to the first and second images alone.
  • Signal strength determiner 320 receives difference 315 and at least one of fingerprint images 215, and is configured to determine signal strengths corresponding to spatial frequency values or ranges. For example, signal strength determiner 320 determines a maximum signal strength at a first spatial frequency of at least one of fingerprint images 215 and a maximum signal strength at a second spatial frequency of difference 315. The first spatial frequency of each of the fingerprint images 215 should be substantially the same. In some embodiments, the second spatial frequency range is distributed around a frequency substantially double a main frequency contribution of the first spatial frequency range. In some embodiments, signal strength determiner 320 determines the maximum signal strength for at least one of fingerprint images 215, and identifies the corresponding spatial frequency (within the first spatial frequency range). Signal strength determiner 320 then determines the maximum signal strength for difference 315 at the second spatial frequency range, where the second spatial frequency range is distributed around a frequency substantially double the frequency contribution of the first frequency range.
  • the signal strengths 325 are output to signal strength comparer 330.
  • Signal strength comparer 330 is configured to compare the signal strengths 325 for at least one fingerprint image 215 and difference 315.
  • Output 335 is generated as a result of the comparison, where output 335 includes a probability whether the finger is a real finger or a fake finger.
  • the ratio of the signal strength of difference 315 to the signal strength of a fingerprint image 215 is determined. The ratio is then compared to a ratio range threshold.
  • the probability that the finger is a real finger is based on the ratio, wherein the probability that the finger is a real finger increases as the ratio increases.
  • the ratio range threshold is greater than 0.8.
  • output 335 is transmitted to fake finger determiner 230 as difference quantification 225.
  • FIG. 3B illustrates a block diagram of difference quantifier 220, according to another embodiment. Fingerprint images 215 are received at difference quantifier 220. At ridge width determiner 360, the average ridge widths 365 of each of fingerprint images 215 are determined.
  • Ridge width difference quantifier 370 receives ridge widths 365 and quantifies a difference between the ridge widths 365. The change of width may be determined as a function of the time of flight. In one embodiment, ridge width difference quantifier 370 determines a ratio 375 of the ridge widths 365. Ratio comparer 380 receives ratio 375 and compares the ratio to a width range threshold. In some embodiments, the probability that the finger is a real finger is based on the ratio. In some embodiments, output 385 is transmitted to fake finger determiner 230 as difference quantification 225.
  • difference quantification 225 is output 335 of FIG. 3A. In other embodiments, difference quantification 225 is output 385 of FIG. 3B. In some embodiments, difference quantification includes a probability whether the finger is a real finger or a fake finger. In some embodiments, fake finger determiner 230 receives input from other types of spoof detection or fake finger detection, and makes the determination 235 as to whether the finger is a fake finger based on multiple inputs.
  • Each type of spoof detection method may produce a probability of whether the finger is a real finger, and fake finger determiner may combine the probabilities to output a decision whether the finger is a real finger.
  • the output may also contain a confidence, based on the one or more spoof detection methods.
  • the different probabilities from different methods may have different weights, which may depend on the user. These weights may have been determined during enrollment, where it may be determined which spoof detection works best for the user, and may even depend on the context.
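  • A minimal sketch of such a fusion step, assuming each method already reports a probability in [0, 1] and that per-user weights were established during enrollment; the weighted-average combiner, the threshold, and the confidence measure are illustrative choices, not the patent's specific scheme.

```python
def combine_spoof_probabilities(probabilities, weights=None, threshold=0.5):
    """Combine per-method probabilities that the finger is real into a single
    decision, a fused probability, and a confidence value."""
    if weights is None:
        weights = [1.0] * len(probabilities)
    total = sum(weights)
    p_real = sum(w * p for w, p in zip(weights, probabilities)) / total
    is_real = p_real >= threshold
    confidence = abs(p_real - threshold) / max(threshold, 1.0 - threshold)
    return is_real, p_real, confidence

# e.g., ridge-feature method reporting 0.9 and another liveness check reporting
# 0.7, with enrollment-derived weights favoring the first method:
# decision, p_real, confidence = combine_spoof_probabilities([0.9, 0.7], [0.7, 0.3])
```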
  • FIG. 4 illustrates example fingerprint images 410 and 420 of a real finger captured at an ultrasonic fingerprint sensor using different times of flight, according to embodiments.
  • Fingerprint image 410 is captured at a first time of flight, where the first time of flight is the standard (e.g., calibrated) time of flight for capturing an image at a surface of the ultrasonic fingerprint sensor.
  • Fingerprint image 420 is captured at a second time of flight, where the second time of flight is delayed (e.g., 15 to 150 nanoseconds) relative to the first time of flight.
  • the ridges of fingerprint images 410 and 420 are different, indicating a change in ridge profile (e.g., ridge width), with the ridges narrowing in fingerprint image 420 relative to fingerprint image 410.
  • the observed width of the ridges narrows at the delayed time of flight in fingerprint image 420.
  • the ridges of the fingerprint pattern are dark in the image and the valleys are white, because the ultrasonic signals are more absorbed at the ridges and more reflected at the valleys due to the air gap present at the position of the valleys. In some embodiments, the ridge profile changes illustrating ridge narrowing at delayed times of flight can be used to determine whether the finger is a real finger or a fake finger.
  • FIG. 5A illustrates example 500 of fingerprint images and graphs of spatial ridge frequency of a real finger, according to embodiments.
  • Example 500 illustrates first fingerprint image 510, second fingerprint image 520, and difference image 530, of a real finger, where the ridges are indicated in the fingerprint images as the darker patterns.
  • First fingerprint image 510 is captured at a first time of flight
  • second fingerprint image 520 is captured at a delayed time of flight relative to the first time of flight.
  • Difference image 530 is generated by subtracting second fingerprint image 520 from first fingerprint image 510.
  • Graph 515 illustrates a graph of spectrogram power (e.g., signal strength) versus frequency of first fingerprint image 510, in which the maximum power corresponds to a spatial frequency range centered at approximately 2.4 line pairs per millimeter (lpmm).
  • Graph 525 illustrates a graph of spectrogram power versus frequency of second fingerprint image 520, in which the maximum power also corresponds to a spatial frequency range centered at approximately 2.4 lpmm.
  • Graph 535 illustrates a graph of spectrogram power versus frequency of difference image 530, in which the maximum power corresponds to a spatial frequency range centered at approximately 4.8 lpmm.
  • the frequency of detected ridges in difference image 530 is substantially double the frequency of detected ridges in first fingerprint image 510 and second fingerprint image 520, indicative of the finger being a real finger.
  • Where the finger is a real finger, the ridges of difference image 530 are observably split. Therefore, difference image 530 has substantially twice the main spatial frequency compared to one of first fingerprint image 510 and second fingerprint image 520. Therefore, example 500 illustrates that first fingerprint image 510 and second fingerprint image 520 were generated using a real finger.
  • the ratio of the signal strength of difference image 530 to the signal strength of one of first fingerprint image 510 and second fingerprint image 520 is determined. The ratio is then compared to a ratio range threshold. In some embodiments, the probability that the finger is a real finger is based on the ratio, wherein the probability that the finger is a real finger increases as the ratio increases. The reasoning is that for real fingers the change in width as a function of time of flight is more observable, and therefore the signal strength of the second spatial frequency range is higher.
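  • The frequency doubling can be reproduced with a synthetic one-dimensional ridge profile, as in the sketch below; the 2.4 lpmm pitch matches the example above, while the sampling step and the two ridge widths are illustrative assumptions.

```python
import numpy as np

dx = 0.01                        # sampling step of the synthetic profile (mm)
x = np.arange(0.0, 10.0, dx)     # 10 mm long, one-dimensional profile
period = 1.0 / 2.4               # ridge period for a 2.4 lpmm pattern

def ridge_profile(width_mm):
    """Square-wave ridge pattern with ridges of the given width, centered in
    each period."""
    phase = (x % period) - period / 2.0
    return (np.abs(phase) < width_mm / 2.0).astype(float)

first = ridge_profile(0.22)      # wider ridges (surface time of flight)
second = ridge_profile(0.16)     # narrower ridges (delayed time of flight)
diff = first - second            # each ridge leaves two thin edge strips

def peak_freq(signal):
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=dx)
    return freqs[np.argmax(spectrum)]

print(peak_freq(first), peak_freq(second), peak_freq(diff))
# Approximately 2.4, 2.4, and 4.8 lpmm: when the ridges narrow between the two
# captures, the difference peaks at about twice the ridge frequency.
```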
  • FIG. 5B illustrates example 550 of fingerprint images and graphs of spatial ridge frequency of a fake finger, according to embodiments.
  • Example 550 illustrates first fingerprint image 560, second fingerprint image 570, and difference image 580, of a fake finger, where the ridges are indicated in the fingerprint images as the darker patterns.
  • First fingerprint image 560 is captured at a first time of flight
  • second fingerprint image 570 is captured at a delayed time of flight relative to the first time of flight.
  • Difference image 580 is generated by subtracting second fingerprint image 570 from first fingerprint image 560.
  • Graph 565 illustrates a graph of spectrogram power versus frequency of first fingerprint image 560, in which the maximum power corresponds to a spatial frequency range centered at approximately 2.1 line pairs per millimeter (lpmm).
  • Graph 575 illustrates a graph of spectrogram power versus frequency of second fingerprint image 570, in which the maximum power also corresponds to a spatial frequency range centered at approximately 2.1 lpmm.
  • Graph 585 illustrates a graph of spectrogram power versus frequency of difference image 580, in which a maximum in power at around twice the main frequency contribution of the first and/or second image is difficult to determine.
  • the frequency of detected ridges in difference image 580 is not substantially double the frequency of detected ridges in first fingerprint image 560 or second fingerprint image 570, indicative of the finger being a fake finger. Moreover, there is no maximum or peak signal strength at or near 4.2 lpmm of difference image 580. Where the finger is a fake finger, the ridges of difference image 580 are not clearly observable. In particular, there is no observable ridge splitting in difference image 580, and no observable ridge narrowing between first fingerprint image 560 and second fingerprint image 570. Therefore, example 550 illustrates that first fingerprint image 560 and second fingerprint image 570 were generated using a fake finger.
  • FIG. 6A illustrates example 600 of fingerprint images and a composite graph of spatial ridge frequency of a real finger, according to embodiments.
  • Example 600 illustrates first fingerprint image 610, second fingerprint image 620, and difference image 630, of a real finger, where the ridges are indicated in the fingerprint images as the darker patterns.
  • First fingerprint image 610 is captured at a first time of flight
  • second fingerprint image 620 is captured at a delayed time of flight relative to the first time of flight.
  • Difference image 630 is generated by subtracting second fingerprint image 620 from first fingerprint image 610.
  • Graph 615 illustrates a graph of normalized power (e.g., signal strength) versus frequency of first fingerprint image 610, second fingerprint image 620, and difference image 630.
  • First fingerprint image 610 corresponds to a ridge frequency of approximately 2.5 lpmm, as indicated at point 614 of line 612.
  • Second fingerprint image 620 corresponds to a ridge frequency of approximately 2.3 lpmm, as indicated at point 624 of line 622.
  • Difference image 630 corresponds to a maximum ridge frequency of approximately 4.6 lpmm, as indicated at point 634 of line 632.
  • the frequency of detected ridges in difference image 630 is substantially double the frequency of detected ridges in first fingerprint image 610 and second fingerprint image 620, indicative of the finger being a real finger.
  • Where the finger is a real finger, the ridges of difference image 630 are observably split. Therefore, difference image 630 has substantially twice the main spatial frequency compared to one of first fingerprint image 610 and second fingerprint image 620. Therefore, example 600 illustrates that first fingerprint image 610 and second fingerprint image 620 were generated using a real finger.
  • the ratio of the signal strength of difference image 630 to the signal strength of one of first fingerprint image 610 and second fingerprint image 620 is determined. The ratio is then compared to a ratio range threshold. In some embodiments, the probability that the finger is a real finger is based on the ratio, wherein the probability that the finger is a real finger increases as the ratio increases. The reasoning is that for real fingers the change in width as a function of time of flight is more observable, and therefore the signal strength of the second spatial frequency range is higher.
  • FIG. 6B illustrates example 650 of fingerprint images and a composite graph of spatial ridge frequency of a fake finger, according to embodiments.
  • Example 650 illustrates first fingerprint image 660, second fingerprint image 670, and difference image 680, of a fake finger, where the ridges are indicated in the fingerprint images as the darker patterns.
  • First fingerprint image 660 is captured at a first time of flight
  • second fingerprint image 670 is captured at a delayed time of flight relative to the first time of flight.
  • Difference image 680 is generated by subtracting second fingerprint image 670 from first fingerprint image 660.
  • Graph 690 illustrates a graph of normalized power (e.g., signal strength) versus frequency of first fingerprint image 660, second fingerprint image 670, and difference image 680.
  • First fingerprint image 660 and second fingerprint image 670 correspond to a ridge frequency of approximately 2.2 lpmm, as indicated at point 664 of line 662 and point 674 of line 672, respectively.
  • Difference image 680 corresponds to a maximum ridge frequency of approximately 1.4 lpmm, as indicated at point 684 of line 682.
  • the frequency of detected ridges in difference image 680 is not substantially double the frequency of detected ridges in first fingerprint image 660 or second fingerprint image 670, indicative of the finger being a fake finger.
  • example 650 illustrates that first fingerprint image 660 and second fingerprint image 670 were generated using a fake finger.
  • FIGs. 7 through 9 illustrate example processes for determining whether a finger is a real finger at a fingerprint sensor, according to some embodiments. Procedures of the example processes will be described with reference to elements and/or components of various figures described herein. It is appreciated that in some embodiments, the procedures may be performed in a different order than described, that some of the described procedures may not be performed, and/or that one or more additional procedures to those described may be performed.
  • the flow diagram includes some procedures that, in various embodiments, are carried out by one or more processors (e.g., a host processor or a sensor processor) under the control of computer-readable and computer-executable instructions that are stored on non-transitory computer-readable storage media. It is further appreciated that one or more procedures described in the flow diagrams may be implemented in hardware, or a combination of hardware with firmware and/or software.
  • a first image of a fingerprint pattern is captured at an ultrasonic fingerprint sensor, wherein the first image is based on ultrasonic signals corresponding to a first time of flight range.
  • a second image of the fingerprint pattern is captured at the ultrasonic fingerprint sensor, wherein the second image is based on ultrasonic signals corresponding to a second time of flight range, the second time of flight range being delayed compared to the first time of flight range.
  • At procedure 730, a difference in a width of ridges of the fingerprint pattern in the first image compared to the width of ridges of the fingerprint pattern in the second image is quantified.
  • procedure 730 is performed according to flow diagram 800 of FIG. 8.
  • a difference between the first image and the second image is determined. In one embodiment, as shown at procedure 812, a difference image is generated by subtracting the second image from the first image.
  • a first signal strength is determined at a first spatial frequency range of at least one of the first image and the second image.
  • a second signal strength is determined at a second spatial frequency range of the difference between the first image and the second image.
  • the second signal strength is determined at the second spatial frequency range of the difference image between the first image and the second image. In one embodiment, the first signal strength corresponds to a maximum in signal strength of the first spatial frequency range and the second signal strength corresponds to a maximum in signal strength of the second spatial frequency range.
  • the second frequency range is distributed around a frequency substantially double a main frequency contribution of the first frequency range. In one embodiment, the first spatial frequency range corresponds to a spatial ridge frequency range of the fingerprint pattern.
  • the first signal strength is compared to the second signal strength.
  • a ratio of the second signal strength to the first signal strength is determined. Provided the ratio satisfies a ratio range threshold, as shown at procedure 854, it is determined that the finger is a real finger. In one embodiment, the ratio range threshold is above 0.8.
  • the probability that the finger is a real finger is determined. In one embodiment, the probability that the finger is a real finger is based on the ratio of the second signal strength to the first signal strength, wherein the probability that the finger is a real finger increases as the ratio of the second signal strength to the first signal strength increases.
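  • Tying the procedures together, a high-level Python sketch of the flow might look as follows; `capture_image`, the `spectral_ratio_fn` quantifier (such as the one sketched earlier), the 100 ns delay, and the mapping from ratio to probability are all assumptions, since the patent does not prescribe a specific software structure.

```python
def is_real_finger(capture_image, spectral_ratio_fn, ratio_threshold=0.8):
    """High-level flow: capture two images at different times of flight,
    quantify ridge narrowing spectrally, and map the result to a decision."""
    first = capture_image(tof_delay_ns=0)     # surface time of flight
    second = capture_image(tof_delay_ns=100)  # delayed time of flight
    ratio = spectral_ratio_fn(first, second)
    probability_real = min(1.0, ratio / ratio_threshold)
    return probability_real >= 1.0, probability_real
```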
  • procedure 730 is performed according to flow diagram 900 of FIG. 9.
  • At procedure 910 of flow diagram 900, a first average width of ridges of the fingerprint pattern of the first image is determined.
  • At procedure 920, a second average width of ridges of the fingerprint pattern of the second image is determined.
  • At procedure 930, a difference between the first average width and the second average width is quantified.
  • a ratio of the first average width to the second average width is determined.
  • determining the probability whether the finger is a real finger includes comparing of the ratio of the first average width and the second average width to a width range threshold.
  • a method for determining whether a finger is a real finger at a fingerprint sensor comprising:
  • the second image is based on ultrasonic signals corresponding to a second time of flight range, the second time of flight range being delayed compared to the first time of flight range;
  • An ultrasonic fingerprint sensor device comprising:
  • a processor configured to:
  • the second image is based on ultrasonic signals corresponding to a second time of flight range, the second time of flight range being delayed compared to the first time of flight range; quantify a difference in a width of ridges of the fingerprint pattern in the first image compared to the width of ridges of the fingerprint pattern in the second image;
  • a non-transitory computer readable storage medium having computer readable program code stored thereon for causing a computer system to perform a method for determining whether a finger is a real finger at a fingerprint sensor
  • the second image is based on ultrasonic signals corresponding to a second time of flight range, the second time of flight range being delayed compared to the first time of flight range;
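
As a concrete illustration of the frequency-domain comparison of flow diagram 800 described above, the following Python sketch computes a difference image, estimates the signal strength in a ridge-frequency band of the first image and in a band around double that frequency in the difference image, and checks the ratio of the two. It is a minimal sketch under stated assumptions, not the patented implementation: the helper names (radial_spectrum, band_peak, frequency_ratio_check), the assumed ridge-frequency band edges, and the use of a radially averaged FFT spectrum are illustrative choices; only the "above 0.8" ratio threshold is taken from the summary above.

    # Minimal sketch; assumptions noted in the text above. NumPy only.
    import numpy as np

    def radial_spectrum(image):
        """Radially averaged magnitude spectrum indexed by spatial frequency (cycles/pixel)."""
        image = np.asarray(image, dtype=float)
        mag = np.abs(np.fft.fftshift(np.fft.fft2(image - image.mean())))
        h, w = image.shape
        y, x = np.indices((h, w))
        r = np.hypot((y - h // 2) / h, (x - w // 2) / w)   # normalized radial frequency of each bin
        edges = np.linspace(0.0, 0.5, 129)
        idx = np.digitize(r.ravel(), edges) - 1
        spectrum = np.zeros(len(edges) - 1)
        for b in range(len(edges) - 1):
            sel = idx == b
            if sel.any():
                spectrum[b] = mag.ravel()[sel].mean()
        centers = 0.5 * (edges[:-1] + edges[1:])
        return centers, spectrum

    def band_peak(centers, spectrum, lo, hi):
        """Maximum signal strength within the spatial frequency band [lo, hi)."""
        sel = (centers >= lo) & (centers < hi)
        return float(spectrum[sel].max()) if sel.any() else 0.0

    def frequency_ratio_check(first_image, second_image,
                              ridge_band=(0.05, 0.15),  # assumed ridge-frequency band (cycles/pixel)
                              ratio_threshold=0.8):     # "above 0.8" per the summary above
        first_image = np.asarray(first_image, dtype=float)
        second_image = np.asarray(second_image, dtype=float)
        diff_image = first_image - second_image          # difference image (procedure 812)
        c1, s1 = radial_spectrum(first_image)
        c2, s2 = radial_spectrum(diff_image)
        first_strength = band_peak(c1, s1, *ridge_band)  # first signal strength
        second_strength = band_peak(c2, s2, 2 * ridge_band[0], 2 * ridge_band[1])  # band around double the ridge frequency
        ratio = second_strength / max(first_strength, 1e-12)
        return ratio, ratio > ratio_threshold            # True suggests a real finger

Per the summary, the probability that the finger is real increases with this ratio, so in practice the hard boolean could be replaced by a monotonic mapping of the ratio to a probability.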
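
The ridge-width comparison of flow diagram 900 can be sketched in the same spirit. In the Python fragment below, the binarization rule (treating pixels darker than the image mean as ridge pixels), the row-wise run-length measurement, the helper names, and the 0.9 to 1.1 width range are assumptions made for illustration; the disclosure itself only specifies that the ratio of the first average width to the second average width is compared to a width range threshold.

    # Minimal sketch; assumptions noted in the text above. NumPy only.
    import numpy as np

    def average_ridge_width(image):
        """Mean run length, in pixels, of ridge pixels across image rows.
        Ridges are assumed to be darker than the image mean."""
        image = np.asarray(image, dtype=float)
        ridge_mask = image < image.mean()
        runs = []
        for row in ridge_mask:
            run = 0
            for is_ridge in row:
                if is_ridge:
                    run += 1
                elif run:
                    runs.append(run)
                    run = 0
            if run:
                runs.append(run)
        return float(np.mean(runs)) if runs else 0.0

    def ridge_width_check(first_image, second_image, width_range=(0.9, 1.1)):
        """Compare average ridge widths of the two time-of-flight images."""
        w1 = average_ridge_width(first_image)    # first average width (procedure 910)
        w2 = average_ridge_width(second_image)   # second average width (procedure 920)
        ratio = w1 / max(w2, 1e-12)              # quantified difference expressed as a ratio (procedure 930)
        lo, hi = width_range
        return ratio, lo <= ratio <= hi          # True: ratio satisfies the assumed width range threshold

As with the frequency-domain check, the in-range or out-of-range outcome would typically feed the probability determination rather than serve as a final accept or reject decision.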

Abstract

According to the invention, in a method performed to determine whether a finger is a real finger at an ultrasonic fingerprint sensor, a first image of a fingerprint pattern is captured at the ultrasonic fingerprint sensor, the first image being based on ultrasonic signals corresponding to a first time of flight range. A second image of the fingerprint pattern is captured at the ultrasonic fingerprint sensor, the second image being based on ultrasonic signals corresponding to a second time of flight range, the second time of flight range being delayed compared to the first time of flight range. A difference in a width of ridges of the fingerprint pattern in the first image compared to the width of ridges of the fingerprint pattern in the second image is quantified. Based on the quantified difference, a probability that the finger is a real finger is determined.
PCT/US2020/039208 2019-06-24 2020-06-23 Détection de faux doigt à l'aide de caractéristiques de crête WO2020263875A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201962865810P 2019-06-24 2019-06-24
US62/865,810 2019-06-24
US16/909,917 2020-06-23
US16/909,917 US11188735B2 (en) 2019-06-24 2020-06-23 Fake finger detection using ridge features

Publications (1)

Publication Number Publication Date
WO2020263875A1 true WO2020263875A1 (fr) 2020-12-30

Family

ID=74038879

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/039208 WO2020263875A1 (fr) 2019-06-24 2020-06-23 Détection de faux doigt à l'aide de caractéristiques de crête

Country Status (2)

Country Link
US (1) US11188735B2 (fr)
WO (1) WO2020263875A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220014441A (ko) * 2020-07-27 2022-02-07 삼성디스플레이 주식회사 지문 인증 장치, 이를 포함하는 표시 장치, 및 지문 인증 방법

Family Cites Families (259)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07121158B2 (ja) 1987-01-19 1995-12-20 オムロン株式会社 超音波探触子
AU3361095A (en) 1994-08-05 1996-03-04 Acuson Corporation Method and apparatus for transmit beamformer system
US5585546A (en) 1994-10-31 1996-12-17 Hewlett-Packard Company Apparatus and methods for controlling sensitivity of transducers
US5575286A (en) 1995-03-31 1996-11-19 Siemens Medical Systems, Inc. Method and apparatus for generating large compound ultrasound image
US5808967A (en) 1996-10-07 1998-09-15 Rowe-Deines Instruments Incorporated Two-dimensional array transducer and beamformer
US5867302A (en) 1997-08-07 1999-02-02 Sandia Corporation Bistable microelectromechanical actuator
US6289112B1 (en) 1997-08-22 2001-09-11 International Business Machines Corporation System and method for determining block direction in fingerprint images
US6071239A (en) 1997-10-27 2000-06-06 Cribbs; Robert W. Method and apparatus for lipolytic therapy using ultrasound energy
US5911692A (en) 1998-01-20 1999-06-15 General Electric Company Sparse two-dimensional wideband ultrasound transducer arrays
US6350652B1 (en) 1998-10-23 2002-02-26 Stmicroelectronics S.R.L. Process for manufacturing nonvolatile memory cells with dimensional control of the floating gate regions
US6483932B1 (en) 1999-08-19 2002-11-19 Cross Match Technologies, Inc. Method and apparatus for rolled fingerprint capture
WO2001021072A1 (fr) 1999-09-17 2001-03-29 Hitachi Medical Corporation Sonde a ultrasons et appareil de diagnostic a ultrasons la contenant
US6292576B1 (en) * 2000-02-29 2001-09-18 Digital Persona, Inc. Method and apparatus for distinguishing a human finger from a reproduction of a fingerprint
US6428477B1 (en) 2000-03-10 2002-08-06 Koninklijke Philips Electronics, N.V. Delivery of theraputic ultrasound by two dimensional ultrasound array
JP2003527906A (ja) 2000-03-23 2003-09-24 クロス マッチ テクノロジーズ, インコーポレイテッド 圧電識別デバイスおよびそのアプリケーション
US7067962B2 (en) 2000-03-23 2006-06-27 Cross Match Technologies, Inc. Multiplexer for a piezo ceramic identification device
US6571444B2 (en) 2001-03-20 2003-06-03 Vermon Method of manufacturing an ultrasonic transducer
US6582372B2 (en) 2001-06-22 2003-06-24 Koninklijke Philips Electronics N.V. Ultrasound system for the production of 3-D images
US6527723B2 (en) 2001-06-26 2003-03-04 Koninklijke Philips Electronics N.V. Variable multi-dimensional apodization control for ultrasonic transducers
US6500120B1 (en) 2001-07-31 2002-12-31 Koninklijke Philips Electronics N.V. Beamforming system using analog random access memory
FR2835981B1 (fr) 2002-02-13 2005-04-29 Commissariat Energie Atomique Microresonateur mems a ondes acoustiques de volume accordable
US6676602B1 (en) 2002-07-25 2004-01-13 Siemens Medical Solutions Usa, Inc. Two dimensional array switching for beamforming in a volume
US6958255B2 (en) 2002-08-08 2005-10-25 The Board Of Trustees Of The Leland Stanford Junior University Micromachined ultrasonic transducers and method of fabrication
JP3859673B2 (ja) 2002-09-17 2006-12-20 富士通株式会社 生体情報取得装置および生体情報による認証装置
JP4386683B2 (ja) 2002-09-30 2009-12-16 富士フイルム株式会社 超音波送受信装置及び超音波送受信方法
US7116805B2 (en) 2003-01-07 2006-10-03 Avagotechnologies Ecbu Ip (Singapore) Pte. Ltd. Fingerprint verification device
DE602004030900D1 (de) 2003-01-15 2011-02-17 Univ Virginia Effizientes ultraschallsystem für die zweidimensionale c-scan-darstellung und verwandte verfahren
US7313053B2 (en) 2003-03-06 2007-12-25 General Electric Company Method and apparatus for controlling scanning of mosaic sensor array
US6865140B2 (en) 2003-03-06 2005-03-08 General Electric Company Mosaic arrays using micromachined ultrasound transducers
JP2005045691A (ja) 2003-07-24 2005-02-17 Taiyo Yuden Co Ltd 圧電振動装置
US7539963B2 (en) 2003-10-24 2009-05-26 Fujitsu Microelectronics Limited Semiconductor device group and method for fabricating the same, and semiconductor device and method for fabricating the same
KR100561851B1 (ko) 2003-11-18 2006-03-16 삼성전자주식회사 지문 인식 센서 및 그 제조 방법
US7109642B2 (en) 2003-11-29 2006-09-19 Walter Guy Scott Composite piezoelectric apparatus and method
US7030536B2 (en) 2003-12-29 2006-04-18 General Electric Company Micromachined ultrasonic transducer cells having compliant support structure
US7052464B2 (en) 2004-01-01 2006-05-30 General Electric Company Alignment method for fabrication of integrated ultrasonic transducer array
EP1708135B1 (fr) 2004-01-13 2011-05-11 Fujitsu Ltd. Unite d'authentification utilisant des informations relatives a l'organisme
US8358815B2 (en) 2004-04-16 2013-01-22 Validity Sensors, Inc. Method and apparatus for two-dimensional finger motion tracking and control
DE102004022838A1 (de) 2004-05-08 2005-12-01 Forschungszentrum Karlsruhe Gmbh Ultraschallwandler sowie Verfahren zur Herstellung desselben
JP4575738B2 (ja) 2004-09-29 2010-11-04 富士フイルム株式会社 超音波画像境界抽出方法及び超音波画像境界抽出装置、並びに、超音波撮像装置
US7739912B2 (en) 2004-10-07 2010-06-22 Ultra-Scan Corporation Ultrasonic fingerprint scanning utilizing a plane wave
US7243547B2 (en) 2004-10-13 2007-07-17 Honeywell International Inc. MEMS SAW sensor
US7257241B2 (en) 2005-01-07 2007-08-14 Motorola, Inc. Dynamic thresholding for a fingerprint matching system
KR100747446B1 (ko) 2005-03-07 2007-08-09 엘지전자 주식회사 휴대단말기의 지문인식 장치 및 방법
CA2607916A1 (fr) 2005-05-18 2006-11-23 Kolo Technologies, Inc. Transducteurs mecaniques microelectriques
US7433034B1 (en) 2005-06-17 2008-10-07 Nanometrics Incorporated Darkfield defect inspection with spectral contents
GB0513253D0 (en) 2005-06-29 2005-08-03 Oceanscan Ltd Improved acoustic sensor and method
US8182428B2 (en) 2005-07-26 2012-05-22 Surf Technology As Dual frequency band ultrasound transducer arrays
US7564172B1 (en) 2005-08-03 2009-07-21 Kolo Technologies, Inc. Micro-electro-mechanical transducer having embedded springs
WO2007020694A1 (fr) 2005-08-18 2007-02-22 Fujitsu Limited Dispositif semi-conducteur et son procédé de fabrication
KR100825773B1 (ko) 2005-08-23 2008-04-28 삼성전자주식회사 방향 추정 방법 및 장치
US20070073135A1 (en) 2005-09-13 2007-03-29 Warren Lee Integrated ultrasound imaging and ablation probe
JP4896542B2 (ja) 2006-02-24 2012-03-14 富士フイルム株式会社 パターン膜の製造方法
US7615834B2 (en) 2006-02-28 2009-11-10 The Board Of Trustees Of The Leland Stanford Junior University Capacitive micromachined ultrasonic transducer(CMUT) with varying thickness membrane
JP4839099B2 (ja) 2006-03-03 2011-12-14 オリンパスメディカルシステムズ株式会社 マイクロマシンプロセスにより製造された超音波振動子、超音波振動子装置、その体腔内超音波診断装置、及びその制御方法
JP4757071B2 (ja) 2006-03-27 2011-08-24 富士通株式会社 指紋認証装置および情報処理装置
US20070230754A1 (en) 2006-03-30 2007-10-04 Jain Anil K Level 3 features for fingerprint matching
WO2008066956A2 (fr) 2006-05-25 2008-06-05 Ultra-Scan Corporation Lecteur d'objet biométrique comportant un dispositif de manipulation d'ondes ultrasonores
US20100030076A1 (en) 2006-08-01 2010-02-04 Kobi Vortman Systems and Methods for Simultaneously Treating Multiple Target Sites
KR101335200B1 (ko) 2006-11-03 2013-11-29 리써치 트라이앵글 인스티튜트 굴곡 모드 압전 트랜스듀서를 사용하는 보강된 초음파 촬영 프로브
CN101190133B (zh) 2006-11-28 2011-05-18 深圳迈瑞生物医疗电子股份有限公司 超声波诊断系统中宽波束的发射方法和装置
US8018010B2 (en) 2007-04-20 2011-09-13 The George Washington University Circular surface acoustic wave (SAW) devices, processes for making them, and methods of use
US8096951B2 (en) 2007-06-28 2012-01-17 General Electric Company Transmit beamforming in 3-dimensional ultrasound
WO2009016606A2 (fr) 2007-07-31 2009-02-05 Koninklijke Philips Electronics, N.V. Transducteurs à ultrasons micro-usinés capacitifs ayant un diélectrique à constante diélectrique élevée
US20100256498A1 (en) 2007-11-16 2010-10-07 Hiroki Tanaka Ultrasonic imaging device
JP2009182838A (ja) 2008-01-31 2009-08-13 Kyoto Univ 弾性波トランスデューサ、弾性波トランスデューサアレイ、超音波探触子、超音波撮像装置
US8531915B2 (en) 2008-04-20 2013-09-10 Stalix Llc Acoustic and ultrasonic concealed object detection
US20090274343A1 (en) 2008-05-05 2009-11-05 Sonavation, Inc. Dynamic optimization of a biometric sensor
US8515135B2 (en) 2008-05-06 2013-08-20 Sonavation, Inc. PLL adjustment to find and maintain resonant frequency of piezo electric finger print sensor
US8335356B2 (en) 2008-05-08 2012-12-18 Sonavation, Inc. Mechanical resonator optimization using shear wave damping
US20090279745A1 (en) 2008-05-08 2009-11-12 Sonavation, Inc. Method and System for Image Resolution Improvement of Biometric Digit Imprint Sensors Using Staggered Rows
US8805031B2 (en) 2008-05-08 2014-08-12 Sonavation, Inc. Method and system for acoustic impediography biometric sensing
US9024507B2 (en) 2008-07-10 2015-05-05 Cornell University Ultrasound wave generating apparatus
JP5206218B2 (ja) 2008-08-20 2013-06-12 富士通株式会社 指紋画像取得装置、指紋認証装置、指紋画像取得方法及び指紋認証方法
JP5628178B2 (ja) 2008-09-16 2014-11-19 コーニンクレッカ フィリップス エヌ ヴェ 容量性マイクロマシン超音波トランスデューサ
AU2009310362A1 (en) 2008-11-03 2010-05-06 Cross Match Technologies, Inc. Apparatus and method for the identification of fake fingerprints
US8255698B2 (en) 2008-12-23 2012-08-28 Motorola Mobility Llc Context aware biometric authentication
US10129656B2 (en) 2009-01-30 2018-11-13 Avago Technologies International Sales Pte. Limited Active temperature control of piezoelectric membrane-based micro-electromechanical devices
TWI515664B (zh) 2009-03-23 2016-01-01 索納遜公司 用於壓電陶瓷辨識裝置的改良多工器
US20100239751A1 (en) 2009-03-23 2010-09-23 Sonavation, Inc. Sea of Pillars
US8508103B2 (en) 2009-03-23 2013-08-13 Sonavation, Inc. Piezoelectric identification device and applications thereof
US8703040B2 (en) 2009-06-19 2014-04-22 Sonavation, Inc. Method for manufacturing a piezoelectric ceramic body
JP2011040467A (ja) 2009-08-07 2011-02-24 Toshiba Corp 半導体装置
US10496871B2 (en) 2009-11-10 2019-12-03 Nec Corporation Fake-finger determination device, fake-finger determination method, and fake-finger determination program
US8433110B2 (en) 2009-12-11 2013-04-30 Sonavation, Inc. Pulse-rate detection using a fingerprint sensor
JP5665016B2 (ja) * 2009-12-22 2015-02-04 日本電気株式会社 偽指判定装置
CN103109252A (zh) 2010-05-14 2013-05-15 索纳维森股份有限公司 使用声学超声阻抗描记术的用于指向装置的方法和系统
US8357981B2 (en) 2010-05-28 2013-01-22 Avago Technologies Wireless Ip (Singapore) Pte. Ltd. Transducer devices having different frequencies based on layer thicknesses and method of fabricating the same
US8942438B2 (en) 2010-07-19 2015-01-27 The University Of Maryland, College Park Method and apparatus for authenticating swipe biometric scanners
US8311514B2 (en) 2010-09-16 2012-11-13 Microsoft Corporation Prevention of accidental device activation
WO2012051305A2 (fr) 2010-10-13 2012-04-19 Mau Imaging, Inc. Appareil interne de sonde à ouvertures multiples et systèmes de câbles
KR20130127980A (ko) 2010-10-19 2013-11-25 소나베이션, 인크. 어쿠스틱 임피디오그래피를 사용한 지문 센서 장치, 방법, 전기적 시스템
WO2012061740A2 (fr) 2010-11-04 2012-05-10 Sonavation, Inc. Capteur d'empreintes digitales tactile utilisant des composites piézo à 1 : 3 et le principe d'impédiographie acoustique
US9259961B2 (en) 2010-12-10 2016-02-16 Palo Alto Research Center Incorporated Large-area ultrasound contact imaging
CN103493510B (zh) 2011-02-15 2016-09-14 富士胶卷迪马蒂克斯股份有限公司 使用微圆顶阵列的压电式换能器
US8891334B2 (en) 2011-03-04 2014-11-18 Georgia Tech Research Corporation Compact, energy-efficient ultrasound imaging probes using CMUT arrays with integrated electronics
US8900148B2 (en) 2011-03-09 2014-12-02 Fujifilm Corporation Ultrasound diagnostic apparatus
US20120238876A1 (en) 2011-03-18 2012-09-20 Fujifilm Corporation Ultrasound diagnostic apparatus and method of producing ultrasound image
TW201306337A (zh) 2011-04-08 2013-02-01 Sonavation Inc 用於在壓電陣列上沈積材料之系統及方法
KR101761818B1 (ko) 2011-08-23 2017-08-04 삼성전자주식회사 전기음향 변환기 및 그 제조 방법
WO2013072803A1 (fr) 2011-11-17 2013-05-23 Koninklijke Philips Electronics N.V. Cellule de transducteur micro-usinée capacitive pré-écrasée avec région écrasée de forme annulaire
US8836472B2 (en) 2011-11-23 2014-09-16 Blackberry Limited Combining navigation and fingerprint sensing
KR101288178B1 (ko) 2011-11-30 2013-07-19 삼성전기주식회사 지문 인식 센서 및 지문 인식 방법
KR101320138B1 (ko) 2011-11-30 2013-10-23 삼성전기주식회사 지문 인식 센서 및 그 제조 방법
US8723399B2 (en) 2011-12-27 2014-05-13 Massachusetts Institute Of Technology Tunable ultrasound transducers
TWI585657B (zh) 2012-02-02 2017-06-01 高通公司 具有顯示監視器之超音波觸控感應器
US20130271628A1 (en) 2012-04-17 2013-10-17 Skybox Imaging, Inc. Sensor dark pixel offset estimation
US8767512B2 (en) 2012-05-01 2014-07-01 Fujifilm Dimatix, Inc. Multi-frequency ultra wide bandwidth transducer
US9454954B2 (en) 2012-05-01 2016-09-27 Fujifilm Dimatix, Inc. Ultra wide bandwidth transducer with dual electrode
US8913801B2 (en) 2012-06-29 2014-12-16 Apple Inc. Enrollment using synthetic fingerprint image and fingerprint sensing systems
CN104620128B (zh) 2012-08-10 2017-06-23 毛伊图像公司 多孔径超声探头的校准
US20140060196A1 (en) 2012-08-31 2014-03-06 General Electric Company Ultrasonic testing apparatus
US9660170B2 (en) 2012-10-26 2017-05-23 Fujifilm Dimatix, Inc. Micromachined ultrasonic transducer arrays with multiple harmonic modes
US10497747B2 (en) 2012-11-28 2019-12-03 Invensense, Inc. Integrated piezoelectric microelectromechanical ultrasound transducer (PMUT) on integrated circuit (IC) for fingerprint sensing
US10726231B2 (en) 2012-11-28 2020-07-28 Invensense, Inc. Integrated piezoelectric microelectromechanical ultrasound transducer (PMUT) on integrated circuit (IC) for fingerprint sensing
US9218472B2 (en) 2012-12-20 2015-12-22 Google Technology Holdings LLP Piezo based fingerprint sensor structure
JP6212870B2 (ja) 2013-01-28 2017-10-18 セイコーエプソン株式会社 超音波デバイス、超音波プローブ、電子機器および超音波画像装置
EP2954458A4 (fr) 2013-02-06 2016-11-09 Sonavation Inc Dispositif de détection biométrique pour l'imagerie tridimensionnelle de structures sous-cutanées intégrées dans un tissu de doigt
US9096422B2 (en) 2013-02-15 2015-08-04 Fujifilm Dimatix, Inc. Piezoelectric array employing integrated MEMS switches
US9245165B2 (en) 2013-03-15 2016-01-26 Google Technology Holdings LLC Auxiliary functionality control and fingerprint authentication based on a same user input
US9818020B2 (en) 2013-04-02 2017-11-14 Precise Biometrics Ab Fingerprint pore analysis for liveness detection
US10580243B2 (en) 2013-04-16 2020-03-03 Imageware Systems, Inc. Conditional and situational biometric authentication and enrollment
US20140355387A1 (en) 2013-06-03 2014-12-04 Qualcomm Incorporated Ultrasonic receiver with coated piezoelectric layer
CN111626111B (zh) 2013-07-16 2024-03-08 加利福尼亚大学董事会 Mut指纹id系统
US9984270B2 (en) 2013-08-05 2018-05-29 Apple Inc. Fingerprint sensor in an electronic device
WO2015023981A1 (fr) 2013-08-15 2015-02-19 Rowe Technologies, Inc. Appareil transducteur de sous-réseau et procédés
US10806431B2 (en) 2013-09-25 2020-10-20 Massachusetts Institute Of Technology Application specific integrated circuit with column-row-parallel architecture for ultrasonic imaging
US20150082890A1 (en) 2013-09-26 2015-03-26 Intel Corporation Biometric sensors for personal devices
US9475093B2 (en) 2013-10-03 2016-10-25 Fujifilm Dimatix, Inc. Piezoelectric ultrasonic transducer array with switched operational modes
US9967100B2 (en) 2013-11-05 2018-05-08 Samsung Electronics Co., Ltd Method of controlling power supply for fingerprint sensor, fingerprint processing device, and electronic device performing the same
SG10201407632UA (en) 2013-11-26 2015-06-29 Agency Science Tech & Res Transducer and method for forming the same
EP3080686A1 (fr) 2013-12-12 2016-10-19 Qualcomm Incorporated Transducteurs ultrasonores micromécaniques et affichage
KR20150068846A (ko) 2013-12-12 2015-06-22 삼성전자주식회사 초음파 진단 장치 및 그 제어방법
US20160326477A1 (en) 2013-12-20 2016-11-10 Jose FERNANDEZ-ALCON Organomimetic devices and methods of use and manufacturing thereof
KR101700998B1 (ko) 2014-01-02 2017-01-31 삼성전기주식회사 지문 감지 센서 및 이를 포함하는 전자 기기
US9224030B2 (en) 2014-01-10 2015-12-29 Qualcomm Incorporated Sensor identification
US9817108B2 (en) 2014-01-13 2017-11-14 Qualcomm Incorporated Ultrasonic imaging with acoustic resonant cavity
US20150206738A1 (en) 2014-01-21 2015-07-23 Sematech, Inc. Surface Cleaning Method and Apparatus Using Surface Acoustic Wave Devices
WO2015112452A1 (fr) 2014-01-24 2015-07-30 The Regents Of The University Of California Transducteurs piézoélectriques arrondis
US9336346B2 (en) 2014-01-30 2016-05-10 Qualcomm Technologies International, Ltd. Integral fabrication of asymmetric CMOS transistors for autonomous wireless state radios and sensor/actuator nodes
KR102171082B1 (ko) 2014-02-06 2020-10-28 삼성전자주식회사 지문 처리 방법 및 그 전자 장치
WO2015120132A1 (fr) 2014-02-07 2015-08-13 The Regents Of The University Of California Réglage de la fréquence d'accord et/ou poursuite en fréquence d'un système mécanique de faible sensibilité à la traversée électrique
US9945818B2 (en) 2014-02-23 2018-04-17 Qualcomm Incorporated Ultrasonic authenticating button
WO2015130809A1 (fr) 2014-02-25 2015-09-03 Lumidigm, Inc. Détection d'usurpation d'adresse par bioimpédance
EP3110628B1 (fr) 2014-02-28 2019-07-03 The Regents of the University of California Diaphragme d'épaisseur variable pour un transducteur ultrasonore micro-usiné piézoélectrique robuste à large bande
WO2015134816A1 (fr) 2014-03-06 2015-09-11 Qualcomm Incorporated Imagerie ultrasonore multispectrale
US10503948B2 (en) * 2014-03-06 2019-12-10 Qualcomm Incorporated Multi-spectral ultrasonic imaging
CN106457475A (zh) 2014-03-14 2017-02-22 康宁股份有限公司 嵌入玻璃的传感器及其制造过程
KR102283922B1 (ko) 2014-04-02 2021-07-30 삼성디스플레이 주식회사 터치 센서
JP6342695B2 (ja) 2014-04-18 2018-06-13 ラピスセミコンダクタ株式会社 半導体装置、表示システム、検出方法、及び検出プログラム
WO2015171224A1 (fr) 2014-05-09 2015-11-12 Chirp Microsystems, Inc. Transducteur à ultrasons micro-usiné utilisant de multiples matériaux piézoélectriques
KR102212632B1 (ko) 2014-05-12 2021-02-08 삼성전자주식회사 지문 인식 방법 및 이를 수행하는 전자 장치
US10107645B2 (en) 2014-05-30 2018-10-23 Fujifilm Dimatix, Inc. Piezoelectric transducer device with flexible substrate
DE102014107819A1 (de) 2014-06-03 2016-01-14 Ge Sensing & Inspection Technologies Gmbh Verfahren zur zerstörungsfreien Prüfung eines Prüflings mittels Ultraschall sowie Vorrichtung hierzu
CN105225217B (zh) 2014-06-23 2018-04-10 株式会社理光 基于深度的背景模型更新方法和系统
CA2950919A1 (fr) 2014-07-08 2016-01-14 Qualcomm Incorporated Transducteur ultrasonore piezoelectrique et son procede
WO2016011172A1 (fr) 2014-07-16 2016-01-21 Chirp Microsystems Transducteurs ultrasonores micro-usinés piézoélectriques utilisant deux substrats liés
US9230150B1 (en) 2014-07-28 2016-01-05 Google Technology Holdings LLC Finger print sensor and auxiliary processor integration in an electronic device
KR20160023154A (ko) 2014-08-21 2016-03-03 삼성전자주식회사 초음파 변환기
US9665763B2 (en) 2014-08-31 2017-05-30 Qualcomm Incorporated Finger/non-finger determination for biometric sensors
US9582705B2 (en) 2014-08-31 2017-02-28 Qualcomm Incorporated Layered filtering for biometric sensors
EP3757884A1 (fr) 2014-09-08 2020-12-30 InvenSense, Inc. Transducteur ultrasonore piézoélectrique micro-usiné (pmut) intégré sur circuit intégré (ic) pour détection d'empreintes digitales
US9613246B1 (en) 2014-09-16 2017-04-04 Apple Inc. Multiple scan element array ultrasonic biometric scanner
US9904836B2 (en) 2014-09-30 2018-02-27 Apple Inc. Reducing edge effects within segmented acoustic imaging systems
US9984271B1 (en) 2014-09-30 2018-05-29 Apple Inc. Ultrasonic fingerprint sensor in display bezel
US9747488B2 (en) 2014-09-30 2017-08-29 Apple Inc. Active sensing element for acoustic imaging systems
US9607203B1 (en) 2014-09-30 2017-03-28 Apple Inc. Biometric sensing device with discrete ultrasonic transducers
FR3026877B1 (fr) 2014-10-03 2018-01-05 Commissariat A L'energie Atomique Et Aux Energies Alternatives Capteur d'empreintes digitales ou palmaires
KR20160041516A (ko) 2014-10-08 2016-04-18 삼성전자주식회사 빔포밍 장치 및 이를 포함하는 초음파 진단장치
US9995821B2 (en) 2014-10-15 2018-06-12 Qualcomm Incorporated Active beam-forming technique for piezoelectric ultrasonic transducer array
WO2016061406A1 (fr) 2014-10-15 2016-04-21 Qualcomm Incorporated Mosaïque de superpixels de transducteurs piézoélectriques à ultrasons pour formation de faisceau 2d
US9734381B2 (en) 2014-12-17 2017-08-15 Northrop Grumman Systems Corporation System and method for extracting two-dimensional fingerprints from high resolution three-dimensional surface data obtained from contactless, stand-off sensors
US9582102B2 (en) 2015-01-27 2017-02-28 Apple Inc. Electronic device including finger biometric sensor carried by a touch display and related methods
KR102277155B1 (ko) 2015-01-29 2021-07-14 삼성전자주식회사 지문 인식을 통해 사용자를 인증하는 방법 및 이를 위한 전자 장치
KR102338864B1 (ko) 2015-02-12 2021-12-13 삼성전자주식회사 전자 장치 및 전자 장치에서의 지문 등록 방법
US9939972B2 (en) 2015-04-06 2018-04-10 Synaptics Incorporated Matrix sensor with via routing
US10229304B2 (en) 2015-06-05 2019-03-12 Synaptics Incorporated Finger detection with auto-baseline tracking
US9424456B1 (en) 2015-06-24 2016-08-23 Amazon Technologies, Inc. Ultrasonic fingerprint authentication based on beam forming
US10387704B2 (en) 2015-06-29 2019-08-20 Qualcomm Incorporated Method and apparatus for enabling the touchscreen display of a mobile device
US10339178B2 (en) 2015-06-30 2019-07-02 Samsung Electronics Co., Ltd. Fingerprint recognition method and apparatus
CN107924259B (zh) 2015-06-30 2021-09-24 辛纳普蒂克斯公司 用于显示集成的具有1-tft像素架构的有源矩阵电容性指纹传感器
US9672409B2 (en) 2015-07-03 2017-06-06 Fingerprint Cards Ab Apparatus and computer-implemented method for fingerprint based authentication
US11538126B2 (en) 2015-07-30 2022-12-27 The Government of the United States of America, as represented by the Secretary of Homeland Security Identity verification system and method
US9959444B2 (en) 2015-09-02 2018-05-01 Synaptics Incorporated Fingerprint sensor under thin face-sheet with aperture layer
US10146981B2 (en) 2015-09-10 2018-12-04 Qualcomm Incorporated Fingerprint enrollment and matching with orientation sensor input
US10261804B2 (en) 2015-09-11 2019-04-16 Qualcomm Incorporated Gradual power wake-up mechanism
US10275638B1 (en) 2015-09-29 2019-04-30 Apple Inc. Methods of biometric imaging of input surfaces
US20170100091A1 (en) 2015-10-08 2017-04-13 General Electric Company Ultrasound system and method for use with a heat-affected region
US10497748B2 (en) 2015-10-14 2019-12-03 Qualcomm Incorporated Integrated piezoelectric micromechanical ultrasonic transducer pixel and array
US10635878B2 (en) 2015-10-23 2020-04-28 Shenzhen GOODIX Technology Co., Ltd. Optical fingerprint sensor with force sensing capability
US10682118B2 (en) 2015-10-30 2020-06-16 General Electric Company Ultrasound system and method for analyzing cardiac periodicity
US9626549B1 (en) 2015-11-16 2017-04-18 MorphoTrak, LLC Derived virtual quality parameters for fingerprint matching
CN105511625B (zh) 2015-12-15 2019-02-12 小米科技有限责任公司 屏幕的唤醒方法及装置
US9983656B2 (en) 2015-12-31 2018-05-29 Motorola Mobility Llc Fingerprint sensor with power saving operating modes, and corresponding devices, systems, and methods
US10262188B2 (en) 2016-02-15 2019-04-16 Qualcomm Incorporated Liveness and spoof detection for ultrasonic fingerprint sensors
US10296145B2 (en) 2016-03-03 2019-05-21 Invensense, Inc. Determining force applied to an ultrasonic sensor
KR101661629B1 (ko) 2016-03-11 2016-09-30 주식회사 베프스 Pzt 무결정 합금 도금액 및 이를 사용한 pzt 무결정 합금 도금방법
KR101661634B1 (ko) 2016-03-11 2016-09-30 주식회사 베프스 복수의 압전 소자를 선택적으로 활성화 시키는 방법 및 이를 위한 생체정보 인식장치
US9898640B2 (en) 2016-05-02 2018-02-20 Fingerprint Cards Ab Capacitive fingerprint sensing device and method for capturing a fingerprint using the sensing device
US10325915B2 (en) 2016-05-04 2019-06-18 Invensense, Inc. Two-dimensional array of CMOS control elements
US10315222B2 (en) 2016-05-04 2019-06-11 Invensense, Inc. Two-dimensional array of CMOS control elements
US10656255B2 (en) 2016-05-04 2020-05-19 Invensense, Inc. Piezoelectric micromachined ultrasonic transducer (PMUT)
US10670716B2 (en) 2016-05-04 2020-06-02 Invensense, Inc. Operating a two-dimensional array of ultrasonic transducers
US10445547B2 (en) 2016-05-04 2019-10-15 Invensense, Inc. Device mountable packaging of ultrasonic transducers
US9955325B2 (en) 2016-05-06 2018-04-24 Qualcomm Incorporated Personal medical device interference mitigation
US10441975B2 (en) 2016-05-10 2019-10-15 Invensense, Inc. Supplemental sensor modes and systems for ultrasonic transducers
US10452887B2 (en) 2016-05-10 2019-10-22 Invensense, Inc. Operating a fingerprint sensor comprised of ultrasonic transducers
US10632500B2 (en) 2016-05-10 2020-04-28 Invensense, Inc. Ultrasonic transducer with a non-uniform membrane
US10539539B2 (en) 2016-05-10 2020-01-21 Invensense, Inc. Operation of an ultrasonic sensor
US10600403B2 (en) 2016-05-10 2020-03-24 Invensense, Inc. Transmit operation of an ultrasonic sensor
US10408797B2 (en) 2016-05-10 2019-09-10 Invensense, Inc. Sensing device with a temperature sensor
US10706835B2 (en) 2016-05-10 2020-07-07 Invensense, Inc. Transmit beamforming of a two-dimensional array of ultrasonic transducers
US10562070B2 (en) 2016-05-10 2020-02-18 Invensense, Inc. Receive operation of an ultrasonic sensor
US11673165B2 (en) 2016-05-10 2023-06-13 Invensense, Inc. Ultrasonic transducer operable in a surface acoustic wave (SAW) mode
US9785819B1 (en) 2016-06-30 2017-10-10 Synaptics Incorporated Systems and methods for biometric image alignment
US10599911B2 (en) 2016-07-20 2020-03-24 Cypress Semiconductor Corporation Anti-spoofing protection for fingerprint controllers
DE102016114188A1 (de) 2016-08-01 2018-02-01 JENETRIC GmbH Vorrichtung und Verfahren zur Direktaufnahme von Abdrücken abgerollter Finger
US10235552B2 (en) 2016-10-12 2019-03-19 Qualcomm Incorporated Hybrid capacitive and ultrasonic sensing
US10032063B2 (en) 2016-10-14 2018-07-24 Identification International, Inc. System and method for generating a representation of variations in elevation of friction ridges in a friction ridge pattern
US10410034B2 (en) 2016-11-07 2019-09-10 Qualcomm Incorporated Ultrasonic biometric system with harmonic detection
US10068124B2 (en) 2016-11-10 2018-09-04 Synaptics Incorporated Systems and methods for spoof detection based on gradient distribution
KR102330327B1 (ko) 2016-11-29 2021-11-24 삼성전자주식회사 압력에 기초한 지문 인식 방법 및 장치
IT201600131844A1 (it) 2016-12-28 2018-06-28 St Microelectronics Srl Trasduttore ultrasonico piezoelettrico microlavorato (pmut) e metodo di fabbricazione del pmut
KR20180081356A (ko) 2017-01-06 2018-07-16 삼성전자주식회사 지문 이미지의 왜곡을 처리하는 방법 및 장치
US20180206820A1 (en) 2017-01-26 2018-07-26 Carestream Health, Inc. Ultrasound apparatus and method
US10607055B2 (en) * 2017-02-06 2020-03-31 Fingerprint Cards Ab Method for authenticating a finger of a user of an electronic device
JP6618938B2 (ja) 2017-02-10 2019-12-11 株式会社東芝 トランスデューサおよびトランスデューサアレイ
US10515255B2 (en) 2017-03-24 2019-12-24 Qualcomm Incorporated Fingerprint sensor with bioimpedance indicator
WO2018194506A1 (fr) 2017-04-19 2018-10-25 Fingerprint Cards Ab Procédé d'authentification d'empreintes digitales en utilisant une valeur de force
US9953205B1 (en) 2017-04-28 2018-04-24 The Board Of Trustees Of The Leland Stanford Junior University Acoustic biometric touch scanner
US10846501B2 (en) 2017-04-28 2020-11-24 The Board Of Trustees Of The Leland Stanford Junior University Acoustic biometric touch scanner
KR102354415B1 (ko) 2017-05-12 2022-01-21 삼성전자주식회사 전자 장치 및 전자 장치 제어 방법
US10140499B1 (en) 2017-05-25 2018-11-27 Synaptics Incorporated Systems and methods for touch and press detection using a sensor
US10474862B2 (en) 2017-06-01 2019-11-12 Invensense, Inc. Image generation in an electronic device using ultrasonic transducers
KR20180135242A (ko) 2017-06-12 2018-12-20 주식회사 하이딥 단말 및 그 제어 방법
US20180373913A1 (en) 2017-06-26 2018-12-27 Qualcomm Incorporated Ultrasonic fingerprint sensor for under-display applications
US10569302B2 (en) 2017-06-26 2020-02-25 Qualcomm Incorporated Biometric sensor with force detection and ultrasonic imaging capability
US10643052B2 (en) 2017-06-28 2020-05-05 Invensense, Inc. Image generation in an electronic device using ultrasonic transducers
US10733606B2 (en) 2017-07-07 2020-08-04 Bank Of America Corporation Biometric authentication for paper-based transactions
US10461124B2 (en) 2017-08-07 2019-10-29 Invensense, Inc. Ultrasonic sensing device
US10699097B2 (en) 2017-09-15 2020-06-30 Cross Match Technologies, Inc. System, method, and apparatus for acquiring rolled-equivalent fingerprint images
MY191624A (en) 2017-09-29 2022-07-04 Silterra Malaysia Sdn Bhd Monolithic integration of pmut on cmos
US11086453B2 (en) 2017-09-29 2021-08-10 Qualcomm Incorporated Layer for inducing varying delays in ultrasonic signals propagating in ultrasonic sensor
WO2019074423A1 (fr) 2017-10-13 2019-04-18 Fingerprint Cards Ab Procédé et système d'amélioration d'image d'empreinte digitale
US10691781B2 (en) 2017-10-30 2020-06-23 Qualcomm Incorporated Apparatus and method for device security
WO2019100329A1 (fr) 2017-11-24 2019-05-31 深圳市汇顶科技股份有限公司 Procédé d'élimination d'arrière-plan, module d'image et système d'identification d'empreinte digitale optique
US10997388B2 (en) 2017-12-01 2021-05-04 Invensense, Inc. Darkfield contamination detection
WO2019109010A1 (fr) 2017-12-01 2019-06-06 Invensense, Inc. Suivi de fond noir
US20190188442A1 (en) 2017-12-01 2019-06-20 Invensense, Inc. Correcting a fingerprint image
US11623246B2 (en) 2018-02-26 2023-04-11 Invensense, Inc. Piezoelectric micromachined ultrasound transducer device with piezoelectric barrier layer
US10755067B2 (en) 2018-03-22 2020-08-25 Invensense, Inc. Operating a fingerprint sensor comprised of ultrasonic transducers
US10846502B2 (en) 2018-04-20 2020-11-24 Invensense, Inc. Ultrasonic fingerprint sensor with a non-uniform contact layer
US11651610B2 (en) 2018-05-31 2023-05-16 Qualcomm Incorporated Heart rate and respiration rate measurement using a fingerprint sensor
FR3084945B1 (fr) 2018-08-10 2020-08-07 Idemia Identity & Security France Procede d'analyse d'une empreinte
CN109255323B (zh) 2018-09-04 2021-01-22 京东方科技集团股份有限公司 一种指纹识别结构、显示基板和显示装置
WO2020102564A1 (fr) 2018-11-14 2020-05-22 Invensense, Inc. Dispositif transducteur ultrasonore micro-usiné piézoélectrique à deux électrodes
US10936843B2 (en) 2018-12-28 2021-03-02 Invensense, Inc. Segmented image acquisition
US10817694B2 (en) 2019-03-22 2020-10-27 Qualcomm Incorporated Reducing background signal in imaging sensors
US10909348B2 (en) 2019-04-28 2021-02-02 Novatek Microelectronics Corp. Optical fingerprint sensing device and operation method thereof
CN110286738B (zh) 2019-06-29 2021-06-08 Oppo广东移动通信有限公司 指纹采集方法及相关产品

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160070967A1 (en) * 2014-09-05 2016-03-10 Qualcomm Incorporated Multi-Stage Liveness Determination
EP3086261A2 (fr) * 2015-04-21 2016-10-26 Samsung Electronics Co., Ltd. Procédé et appareil pour la détection d'empreintes digitales
WO2017053877A2 (fr) * 2015-09-26 2017-03-30 Qualcomm Incorporated Dispositifs et procédés d'imagerie à ultrasons
US20190018123A1 (en) * 2017-07-17 2019-01-17 Invensense, Inc. Defective ultrasonic transducer detection in an ultrasonic sensor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TANG HAO-YEN ET AL: "11.2 3D ultrasonic fingerprint sensor-on-a-chip", 2016 IEEE INTERNATIONAL SOLID-STATE CIRCUITS CONFERENCE (ISSCC), IEEE, 31 January 2016 (2016-01-31), pages 202 - 203, XP032873571, ISBN: 978-1-4673-9466-6, [retrieved on 20160223], DOI: 10.1109/ISSCC.2016.7417977 *

Also Published As

Publication number Publication date
US11188735B2 (en) 2021-11-30
US20200401783A1 (en) 2020-12-24

Similar Documents

Publication Publication Date Title
US11635840B2 (en) Determining touch applied to an ultrasonic sensor
US11513205B2 (en) System and method associated with user authentication based on an acoustic-based echo-signature
KR102367761B1 (ko) 바이오메트릭 인식을 위한 시스템들 및 방법들
EP3526715B1 (fr) Détection hybride capacitive et ultrasonore
US9436864B2 (en) Electronic device performing finger biometric pre-matching and related methods
CN107223254B (zh) 用于隐藏设置处理的方法、用户装置和存储介质
US20220043144A1 (en) Acoustic multipath correction
US10783346B2 (en) Enhancing quality of a fingerprint image
US11151355B2 (en) Generation of an estimated fingerprint
US20210097257A1 (en) System and method for fingerprint authentication
US10366271B2 (en) Method and apparatus for authenticating fingerprints using reflected wave
CN111414119A (zh) 用于生物特征认证系统的方法、系统和装置
US11216681B2 (en) Fake finger detection based on transient features
KR20110100666A (ko) 얼굴 자세 추정을 제공하기 위한 방법, 장치 및 컴퓨터 판독 가능한 저장 매체
Zhou et al. Multi-modal face authentication using deep visual and acoustic features
US11328165B2 (en) Pressure-based activation of fingerprint spoof detection
US11188735B2 (en) Fake finger detection using ridge features
JP7229254B2 (ja) 超音波を使用したユーザ認証制御
US11392789B2 (en) Fingerprint authentication using a synthetic enrollment image
CN112863523B (zh) 语音防伪方法、装置、终端设备及存储介质
US20220392249A1 (en) Deep finger ultrasonic sensor
US20220019754A1 (en) Multipath reflection correction
US20210295012A1 (en) Fingerprint sensing apparatus with in-sensor fingerprint enrollment and verification
US11232549B2 (en) Adapting a quality threshold for a fingerprint image
KR20210074749A (ko) 라이브니스 검사 방법 및 라이브니스 검사 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20742533

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20742533

Country of ref document: EP

Kind code of ref document: A1