SE1950681A1 - Ultrasonic imaging device and method for image acquisition in the ultrasonic device - Google Patents

Ultrasonic imaging device and method for image acquisition in the ultrasonic device

Info

Publication number
SE1950681A1
Authority
SE
Sweden
Prior art keywords
ultrasonic
image
data
target area
touch
Prior art date
Application number
SE1950681A
Other languages
Swedish (sv)
Inventor
Farzan Ghavanini
Hamed Bouzari
Original Assignee
Fingerprint Cards Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fingerprint Cards Ab filed Critical Fingerprint Cards Ab
Priority to SE1950681A priority Critical patent/SE1950681A1/en
Priority to CN202080041870.7A priority patent/CN113994392A/en
Priority to US17/615,126 priority patent/US11972628B2/en
Priority to EP20823519.2A priority patent/EP3983937A4/en
Priority to PCT/SE2020/050550 priority patent/WO2020251445A1/en
Publication of SE1950681A1 publication Critical patent/SE1950681A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12 Fingerprints or palmprints
    • G06V 40/13 Sensors therefor
    • G06V 40/1306 Sensors therefor non-optical, e.g. ultrasonic or capacitive sensing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/043 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • G06F 3/0436 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves in which generating transducers and detecting transducers are attached to a single acoustic waves transmission substrate
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0093 Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B 5/6898 Portable consumer electronic devices, e.g. music players, telephones, tablet computers

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

Method for image acquisition in an ultrasonic biometric imaging device (100), the method comprising: determining (200) a target area (107) of a touch surface (104); by a plurality of ultrasonic transducers (106) arranged at a periphery of the touch surface, emitting (202) a shaped ultrasonic beam towards the target area using transmit beamforming; by the ultrasonic transducers, receiving (204) reflected ultrasonic echo signals defined by received RF-data, the reflected ultrasonic echo signals resulting from interactions with an object in contact with the touch surface at the target area; subtracting (206) background RF-data from the received RF-data to form a clean image; performing (208) receive side beamforming to form a reconstructed image from the clean image; and for a plurality of reconstructed images resulting from a plurality of emitted ultrasonic beams for a given target area, adding (210) the plurality of reconstructed images to form a summed image.

Description

ULTRASONIC IMAGING DEVICE AND METHOD FOR IMAGE ACQUISITION IN THE ULTRASONIC DEVICE

Field of the Invention

The present invention relates to an ultrasonic imaging device and to a method for image acquisition in an ultrasonic device. In particular, the present invention relates to forming an image based on ultrasonic reflections in the imaging device.
Background of the Invention

Biometric systems are widely used as means for increasing the convenience and security of personal electronic devices, such as mobile phones etc. Fingerprint sensing systems in particular are now included in a large proportion of all newly released personal communication devices, such as mobile phones.
Due to their excellent performance and relatively low cost, capacitive fingerprint sensors have been used in an overwhelming majority of all biometric systems.
Among other fingerprint sensing technologies, ultrasonic sensing also has the potential to provide advantageous performance, such as the ability to acquire fingerprint (or palmprint) images from very moist fingers etc.
One class of ultrasonic fingerprint systems of particular interest comprises systems in which acoustic signals are transmitted along a surface of a device element to be touched by a user, and a fingerprint (palmprint) representation is determined based on received acoustic signals resulting from the interaction between the transmitted acoustic signals and an interface between the device member and the user's skin.
Such ultrasonic fingerprint sensing systems, which are, for example, generally described in US 2017/0053151, may provide for controllable resolution and allow for a larger sensing area, which may be optically transparent, without the cost of the fingerprint sensing system necessarily scaling with the sensing area, thereby allowing integration of ultrasonic fingerprint sensors in a display of a device.
However, current solutions struggle to provide a high-resolution fingerprint with a large coverage area of the full in-display screen, as it is difficult to handle and process the large amount of RF-data generated for each touch event and thereby apply the image reconstruction and matching procedures required.
Accordingly, there is a need for improved methods and systems for large area fingerprint imaging using ultrasonic technology.
Summary

In view of the above-mentioned and other drawbacks of the prior art, it is an object of the present invention to provide an improved method and system for image acquisition in an ultrasonic biometric imaging device using beamforming.
According to a first aspect of the invention, there is provided a method for image acquisition in an ultrasonic biometric imaging device. The method comprises: determining a target area of a touch surface; by a plurality of ultrasonic transducers arranged at a periphery of the touch surface, emitting a shaped ultrasonic beam towards the target area using transmit beamforming; by the ultrasonic transducers, receiving reflected ultrasonic echo signals defining received RF-data, the reflected ultrasonic echo signals resulting from interactions with an object in contact with the touch surface at the target area; subtracting background RF-data from the received RF-data to form a clean image; performing receive side beamforming to form a reconstructed image from the clean image; and for a plurality of reconstructed images resulting from a plurality of emitted ultrasonic beams for a given target area, adding the plurality of reconstructed images to form a summed image.
The present method is aimed at acquiring an image of a biometric feature such as a fingerprint or palmprint when a finger or a palm is placed in contact with the touch surface. The touch surface may for example be a surface of a display cover glass in a smartphone, tablet or the like. However, the described method can equally well be implemented in other devices, such as an interactive TV, meeting-table, smart-board, information terminal or any other device having a cover structure where ultrasonic waves can propagate. Since the transducers are arranged at the periphery of the active touch surface, the described method can also be employed in e.g. an interactive shop window or a display cabinet in a store, museum or the like. The biometric object may in some applications be the cheek or ear of a user.
The step of forming a shaped ultrasonic beam may also be referred to as transmit side beamforming, where the beamforming is based on the detected target area of the touch surface. Transmit beamforming may mean using a number of transducer elements in a transmit step so that, by adjusting transmission delays of the respective transducers, a focused, defocused, or unfocused ultrasonic beam is generated and emitted towards the target area.
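As a purely illustrative, non-limiting sketch (not part of the application as filed), the transmit delays used to focus at a target point may be computed as follows. The use of Python/NumPy, the linear array geometry and the propagation speed of 3000 m/s are assumptions made for this example only.

    import numpy as np

    def transmit_delays(element_xy, focus_xy, c=3000.0):
        """Firing delays (in seconds) that focus a row of elements at focus_xy.

        element_xy : (N, 2) array of transducer element positions in metres.
        focus_xy   : (2,) target point on the touch surface in metres.
        c          : assumed acoustic propagation speed in the cover structure (m/s).
        """
        # Time of flight from each element to the focal point.
        tof = np.linalg.norm(element_xy - focus_xy, axis=1) / c
        # The farthest element fires first so that all wavefronts arrive together.
        return tof.max() - tof

    # Example: 16 elements in a row along one edge, focusing 20 mm in front of the row centre.
    elements = np.column_stack([np.linspace(0.0, 15e-3, 16), np.zeros(16)])
    delays = transmit_delays(elements, np.array([7.5e-3, 20e-3]))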
The ultrasonic transducers typically comprise a piezoelectric material generating an ultrasonic signal in response to an electric field applied across the material by means of the top and bottom electrodes. In principle, it is also possible to use other types of ultrasonic transducers, such as capacitive micromachined ultrasonic transducers (CMUT). The ultrasonic transducers will be described herein as transceivers being capable of both transmitting and receiving ultrasonic signals. However, it is also possible to form a system comprising individual and separate ultrasonic transmitters and receivers.
The device is further considered to comprise ultrasonic transducer control circuitry configured to control the transmission and reception of ultrasonic signals and considered to comprise appropriate signal processing circuitry required for extracting an image from the received ultrasonic echo signals.
The present invention is based on the realization that a method for image acquisition including both transmit and receive beamforming provides a number of advantages over other approaches where a single element is engaged for each transmit-receive (pulse-echo) operation.
By using a focused beam, a higher lateral resolution can be achieved. Another advantageous result of focusing is that the energy aimed at the target area is maximized since less energy is dispersed. The same is true for receive side beamforming, which will lead to an increased signal-to-noise ratio (SNR), in turn resulting in an increased penetration depth of the emitted ultrasonic beam.
Furthermore, the described method results in an improved SNR as a result of the summation of a plurality of reconstructed images, which acts to suppress random and uncorrelated noise. Moreover, in a device using the described method, the SNR requirements of the analog-to-digital converters (ADCs) may be lowered since the individual channels of the ADC will not dictate the SNR of the final image.

If a focused beam is used, the energy is more concentrated towards the target and even better sensitivity will be achieved.

If the beam is defocused or unfocused, the acoustic energy propagates in many different directions and therefore has a lower penetration depth, but it can cover the finger area with a much lower number of repetitions and may thus be used if the target area is close to the transducers.
According to one embodiment of the invention, the method may further comprise forming a final image by taking the envelope of the summed image. By taking the envelope of the summed image, RF-data values of the image, which may be both positive and negative, are transformed into only positive values. The summed image comprising positive and negative values may for example be used by a fingerprint matching algorithm adapted for handling raw data. However, in some applications it may be desirable to acquire a more visually accurate representation of a fingerprint, which can be achieved by taking the envelope of the summed image as described above.
According to one embodiment of the invention, the method may further comprise converting the received RF-data to in-phase quadrature complex data. Converting the received RF-data to in-phase quadrature complex data may comprise using the Hilbert transform. Furthermore, converting the received RF-data to in-phase quadrature data makes it straightforward to add the plurality of reconstructed images in-phase to form a summed image. Since the noise of the received RF-data can be considered random, the likelihood of the noise being in-phase from one received set of image data resulting from one ultrasonic beam to the next is very low. Thus, by adding the data in-phase the noise can be significantly suppressed. However, due to the nature of the received RF-data, which can have both positive and negative values, interaction of the RF-data belonging to different ultrasonic beams can generate constructive and destructive effects: constructive whenever values of the same sign are added to each other, and destructive whenever positive values are added to negative values.
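By way of illustration only, the conversion of real RF-data to in-phase quadrature (IQ) data using the Hilbert transform may be sketched as below; Python/SciPy and the (elements x samples) data layout are assumptions of this example.

    import numpy as np
    from scipy.signal import hilbert

    def rf_to_iq(rf_data, axis=-1):
        """Convert real RF-data to complex IQ-data (the analytic signal).

        rf_data : real array, e.g. of shape (n_elements, n_samples); the Hilbert
        transform is applied along the fast-time (sample) axis, so the real part
        is the in-phase component and the imaginary part the quadrature component.
        """
        return hilbert(rf_data, axis=axis)

Reconstructed images represented by such IQ-data can then be summed directly, which corresponds to the in-phase (coherent) addition described above.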
According to one embodiment of the invention, the method may comprise adding the plurality of reconstructed images out-of-phase to form a summed image. As described above, in-phase summation of images may lead to interference effects, and in situations where such effects are particularly disadvantageous, or when strong interference is observed, it may be desirable to add the images using out-of-phase data. Out-of-phase addition can help to increase the contrast in the resulting summed image by ensuring that the RF-values are always added together without their phase information.
According to one embodiment of the invention, the number of ultrasonic transducers used for receiving the ultrasonic echo signals is the same as the number of ultrasonic transducers used for emitting the shaped ultrasonic beam. Thereby, there is no need to use additional transducers for echo signal reception even though the propagation direction of the echo signals cannot be exactly known.
According to one embodiment of the invention, the method may further comprise controlling a resolution of the final image by controlling the number of emitted ultrasonic beams used for forming a summed image. In a first attempt at forming an image, every second emitted ultrasonic beam may be used for forming a summed image, and if the resolution turns out to be insufficient, all of the emitted ultrasonic beams may be used.
According to one embodiment of the invention, determining the target area comprises receiving information describing the target area from a touch sensing arrangement configured to detect a location of an object in contact with the touch surface. The step of determining the touch area can for example be performed using the capacitive elements of a capacitive touchscreen or by using only the ultrasonic transducers. By determining the target area, e.g. the position of the finger on the screen, the ultrasonic beam can be aimed and transmitted towards the target location. Using an ultrasonic approach to detect the position of the finger on the screen may be a preferred choice since it allows the ultrasonic imaging system to operate as a stand-alone system. When using the ultrasonic system, an unfocused beam (plane wave) is transmitted to cover the whole region of the touch surface, e.g. a display cover glass, and by analyzing the received echoes the position of the finger on the display can be determined with sufficient accuracy. It is here assumed that there is only one finger in contact with the touch surface. For multiple fingers on the display, the transmit procedure is the same, i.e. an unfocused beam is transmitted. However, the received signals require further analysis and a different approach where more than one location for the finger position is expected.
According to a second aspect of the invention, there is provided an ultrasonic biometric imaging device comprising: a cover structure comprising a touch surface; a plurality of ultrasonic transducers arranged at a periphery of the touch surface, the plurality of ultrasonic transducers being configured to emit a shaped ultrasonic beam towards a target area using transmit beamforming and to receive reflected ultrasonic echo signals defining received RF-data, the reflected ultrasonic echo signals resulting from reflections by an object in contact with the touch surface at the target area; and a biometric imaging control unit. The biometric imaging control unit is configured to: subtract background RF-data from the received RF-data to form a clean image; perform receive side beamforming to form a reconstructed image from the clean image; for a plurality of reconstructed images resulting from a plurality of emitted ultrasonic beams for a given target area, add the plurality of reconstructed images to form a summed image; and form a final image by taking the envelope of the summed image.
According to one embodiment of the invention, the plurality of transducers may be arranged in a single row on a single side of the touch surface. Since transducers arranged in a row can be controlled by means of beamforming to generate an ultrasonic beam aimed at a specific target location, it is sufficient to provide transducers on a single side of the touch surface while still being able to acquire an image of a target area anywhere on the touch surface.
According to one embodiment of the invention, the ultrasonic imaging device may further comprise a touch sensing arrangement configured to detect a location of an object in contact with the touch surface. The touch sensing arrangement may for example comprise a plurality of touch sensing elements located under the cover structure, such as in a capacitive touch panel. The touch sensing arrangement may also comprise the same ultrasonic transducers used for acquiring the biometric image.
Additional effects and features of the second aspect of the invention are largely analogous to those described above in connection with the first aspect of the invention.
Further features of, and advantages with, the present invention will become apparent when studying the appended claims and the following description. The skilled person realizes that different features of the present invention may be combined to create embodiments other than those described in the following, without departing from the scope of the present invention.
Brief Description of the Drawings

These and other aspects of the present invention will now be described in more detail, with reference to the appended drawings showing an example embodiment of the invention, wherein:

Fig. 1A schematically illustrates a display arrangement comprising a biometric imaging device according to an embodiment of the invention;

Fig. 1B is a cross section view of a display arrangement comprising a biometric imaging device according to an embodiment of the invention;

Fig. 2 is a flow chart outlining the general steps of a method for acquiring an image according to an embodiment of the invention;

Figs. 3A-B schematically illustrate features of a method and system according to an embodiment of the invention; and

Figs. 4A-C schematically illustrate features of a biometric imaging device according to an embodiment of the invention.
Detailed Description of Example Embodiments

In the present detailed description, various embodiments of the system and method according to the present invention are mainly described with reference to a biometric imaging device adapted to form an image of a finger placed on a display glass of a smartphone. It should however be noted that the described technology may be implemented in a range of different applications.
Fig. 1A schematically illustrates a biometric imaging device 100 integrated in an electronic device in the form of a smartphone 103. The illustrated smartphone 103 comprises a display panel having a cover structure 102 in the form of a cover glass 102. The cover glass 102 defines an exterior surface 104 configured to be touched by a finger 105, herein referred to as the touch surface 104. The cover structure 102 is here illustrated as a transparent cover glass of a type commonly used in a display panel of the smartphone 103. However, the cover structure 102 may equally well be a non-transparent cover plate as long as the acoustic properties of the cover structure 102 allow for propagation of ultrasound energy.
The display arrangement further comprises a plurality of ultrasonic transducers 106 connected to the cover structure 102 and located at the periphery of the cover structure 102. Accordingly, the ultrasonic transducers 106 are here illustrated as being non-overlapping with an active sensing area 104 of the biometric imaging device formed by the ultrasonic transducers 106 and the cover structure 102. However, the ultrasonic transducers 106 may also be arranged and configured such that they overlap an active sensing area. Fig. 1A illustrates an example distribution of the transducers 106 where the transducers 106 are evenly distributed around the periphery of the cover structure 102 along all sides of the display panel. However, other transducer distributions are equally possible, such as arranging the transducers 106 on one, two or three sides of the display panel, and also irregular distributions are possible.
Fig. 1B is a cross section view of the cover structure 102, where it is illustrated that the ultrasonic transducers 106 are arranged underneath the cover structure 102 and attached to the bottom surface 118 of the cover structure 102. The ultrasonic transducer 106 is a piezoelectric transducer comprising a first electrode 108 and second electrode 110 arranged on opposing sides of a piezoelectric element 112 such that, by controlling the voltage of the two electrodes 108, 110, an ultrasonic signal can be generated which propagates into the cover structure 102.
The pitch of the transducers may be between half the wavelength of the emitted signal and 1.5 times the wavelength, where the wavelength of the transducer is related to the size of the transducer. For an application where it is known that beam steering will be required, the pitch may preferably be half the wavelength so that grating lobes are located outside of an active imaging area. A pitch approximately equal to the wavelength of the emitted signal may be well suited for applications where no beam steering is required since the grating lobes will be close to the main lobe. The wavelength of the transducer should be approximately equal to the size of the features that are to be detected, which in the case of fingerprint imaging means using a wavelength in the range of 50-300 µm. An ultrasonic transducer 106 can have different configurations depending on the type of transducer and also depending on the specific transducer package used. Accordingly, the size and shape of the transducer as well as electrode configurations may vary. It is furthermore possible to use other types of devices for the generation of ultrasonic signals, such as micromachined ultrasonic transducers (MUTs), including both capacitive (cMUTs) and piezoelectric types (pMUTs).
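For illustration only, the pitch bounds implied by the design rules above can be worked out for the stated wavelength range; the snippet below is simple arithmetic and assumes nothing beyond the 50-300 µm range and the half-to-1.5-wavelength rule mentioned in the text.

    # Illustrative arithmetic only: pitch bounds for the stated wavelength range.
    for wavelength_um in (50, 300):
        lo, hi = 0.5 * wavelength_um, 1.5 * wavelength_um
        print(f"wavelength {wavelength_um} um: pitch {lo:.0f}-{hi:.0f} um "
              f"(about {lo:.0f} um if beam steering is required)")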
Moreover, suitable control circuitry 114 is required for controlling the transducer to emit an acoustic signal having the required properties with respect to e.g. amplitude, pulse shape and timing. However, such control circuitry for ultrasonic transducers is well known to the skilled person and will not be discussed in detail herein.
Each ultrasonic transducer 106 is configured to transmit an acoustic signal ST propagating in the cover structure 102 and to receive a reflected ultrasonic signal SR having been influenced by an object 105, here represented by a finger 105, in contact with the sensing surface 104.
The acoustic interaction signals SR are presently believed to mainly be due to so-called contact scattering at the contact area between the cover structure 102 and the skin of the user (finger 105). The acoustic interaction at the point of contact between the finger 105 and the cover structure 102 may also give rise to refraction, diffraction, dispersion and dissipation of the acoustic transmit signal ST. Accordingly, the interaction signals SR are advantageously analyzed based on the described interaction phenomena to determine properties of the finger 105 based on the received ultrasonic signal. For simplicity, the received ultrasonic interaction signals SR will henceforth be referred to as reflected ultrasonic echo signals SR.
Accordingly, the ultrasonic transducers 106 and associated control circuitry 114 are configured to determine properties of the object based on the received ultrasonic echo signal SR. The plurality of ultrasonic transducers 106 are connected to and controlled by ultrasonic transducer control circuitry 114. The control circuitry 114 for controlling the transducers 106 may be embodied in many different ways. The control circuitry 114 may for example be one central control unit 114 responsible for determining the properties of the acoustic signals ST to be transmitted, and for analyzing the subsequent interaction signals SR. Moreover, each transducer 106 may additionally comprise control circuitry for performing specified actions based on a received command.
The control unit 114 may include a microprocessor, microcontroller, programmable digital signal processor or another programmable device. The control unit 114 may also, or instead, include an application specific integrated circuit, a programmable gate array or programmable array logic, a programmable logic device, or a digital signal processor. Where the control unit 114 includes a programmable device such as the microprocessor, microcontroller or programmable digital signal processor mentioned above, the processor may further include computer executable code that controls operation of the programmable device. The functionality of the control circuitry 114 may also be integrated in control circuitry used for controlling the display panel or other features of the smartphone 103.
Fig. 2 is a flow chart outlining the general steps of a method for image acquisition in an ultrasonic biometric imaging device 100 according to an embodiment of the invention. The method will be described with reference to the device 100 illustrated in Figs. 1A-B.
The first step comprises determining 200 a target area 107 of the touch surface 104. Determining the target area 107 may comprise receiving information describing the target area 107 from a touch sensing arrangement configured to detect a location of an object in contact with the touch surface. The touch sensing arrangement may for example be a capacitive touch panel in a display panel, or it may be formed by the ultrasonic transducers.
Once the target area is determined, a shaped ultrasonic beam is emitted 202 by the plurality of ultrasonic transducers 106 towards the target area 107 using transmit beamforming. The ultrasonic beam is thus emitted by a plurality of ultrasonic transducers 106 arranged at a periphery of the touch surface 104 towards a selected subarea of the touch surface. Transmit beamforming can be performed by controlling the firing delays of the respective transducers, i.e. the specific time when a pulse from a specific transducer is emitted. Thereby a focused, defocused, or unfocused ultrasonic beam is generated and emitted towards the target area 107.
Next, the ultrasonic transducers receive 204 reflected ultrasonic echo signals defined by the received RF-data. As discussed above, the reflected ultrasonic echo signals SR result from interactions with an object in contact with the touch surface at the target area.

In order to more clearly distinguish the echo signal SR in the received RF-data, background RF-data is subtracted 206 from the received RF-data to form what is here referred to as a clean image. The subtraction of the background RF-data from the acquired RF-data can be done either in the raw RF-data or after a receive side beamforming procedure, which will be described in further detail below. For subtraction of background RF-data in the RF-data domain, the response of each individual transducer element is stored and a corresponding background measurement for each transducer element is subtracted from the acquired RF-data. It should be noted that all operations are performed in the digital domain, meaning that AD-conversion is performed before subtraction of the background RF-data, and that the background RF-data needs to be available in digital form. The resulting image after subtraction of background RF-data is herein referred to as a clean image.
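By way of illustration only, background subtraction in the RF-data domain may be sketched as below; the Python/NumPy code and the (n_elements, n_samples) data layout are assumptions of this example and not details of the application.

    import numpy as np

    def subtract_background(rf_data, background_rf):
        """Per-element background subtraction in the RF-data domain.

        rf_data, background_rf : arrays of shape (n_elements, n_samples) holding
        the digitized echo data and a previously stored background capture for
        the same transmit event and target area. Returns the clean RF-data used
        for receive side beamforming.
        """
        if rf_data.shape != background_rf.shape:
            raise ValueError("background capture must match the acquisition layout")
        return rf_data - background_rf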
The background RF-data may be acquired in different ways. The background data may for example be acquired by capturing an image of the entire touch surface, either at regular intervals or when it is anticipated that a finger will be placed on the touch surface, for example if prompted by an application in the device. However, capturing an image of the touch surface requires acquiring and storing large amounts of data, and if possible it is desirable to only acquire background data of a subarea of the touch surface corresponding to the target area. This in turn requires prior knowledge of where on the touch surface the finger will be placed.

In a device comprising a capacitive touch screen, it can be possible to use a so-called hover mode of the capacitive touch screen to determine the target area before the actual contact takes place. In the hover mode, the proximity of a finger can be detected, the target area can be anticipated, and background RF-data for the anticipated target area can be acquired prior to image acquisition. It would however in principle also be possible to acquire the background noise after the touch has taken place, i.e. when the user removes the finger, even though this may limit the possible implementations of the image acquisition device.
Receive side beamforming to form a reconstructed image from the clean image can be performed 208 either before or after the subtraction of background RF-data described above. The receive side beamforming is performed dynamically by adjusting the delay values of the received echo signals so that they are "focused" at every single imaging pixel. The received signals are focused at any imaging point, which is repeated until a full image is generated. In general, an example implementation of receive side beamforming, referred to as delay-and-sum beamforming, can be described by three steps:

1) The delay from the focal point to each imaging point, as well as back to each receiving element, is estimated.

2) The estimated delay is used in an interpolation step to estimate the RF-data value. The interpolation is used since the delay might fall between two samples. For example, a spline interpolation may be used.

3) The RF amplitudes are summed across all receive channels.
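A minimal, non-limiting sketch of these three delay-and-sum steps is given below. Python/NumPy, linear interpolation (instead of spline), the data layout, the precomputed transmit times of flight and the propagation speed are assumptions of this example only.

    import numpy as np

    def das_reconstruct(clean_rf, element_xy, pixel_xy, t_tx, fs, c=3000.0):
        """Delay-and-sum receive beamforming over a set of imaging points.

        clean_rf   : (n_elements, n_samples) background-subtracted RF-data.
        element_xy : (n_elements, 2) receive element positions in metres.
        pixel_xy   : (n_pixels, 2) imaging point positions in metres.
        t_tx       : (n_pixels,) assumed transmit time of flight to each point (s).
        fs         : ADC sampling rate in Hz.
        c          : assumed propagation speed in the cover structure (m/s).
        """
        n_elements, n_samples = clean_rf.shape
        t_axis = np.arange(n_samples) / fs
        image = np.zeros(len(pixel_xy))
        for p, (point, t0) in enumerate(zip(pixel_xy, t_tx)):
            # Step 1: total delay = transmit path to the point + return path to each element.
            delays = t0 + np.linalg.norm(element_xy - point, axis=1) / c
            # Step 2: interpolate each channel at its delay (the delay may fall between samples).
            samples = [np.interp(d, t_axis, clean_rf[e]) for e, d in enumerate(delays)]
            # Step 3: sum the interpolated RF amplitudes across all receive channels.
            image[p] = np.sum(samples)
        return image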
The method further comprises adding 210 a plurality of reconstructed images resulting from a plurality of emitted ultrasonic beams for a given target area to form a summed image. The number of transmit events required for capturing the target area can be estimated based on the relation between the width of the transmitted beam at the target area and the width of the target area. Accordingly, for a focused emitted beam, a larger number of emitted beams is typically required compared to when using an unfocused or defocused beam, assuming that the width of the transmitted beam at the target area is lower than the width of the target area.
The reconstructed images for each transmit event may be either coherently or incoherently added together, i.e. in-phase or out-of-phase, depending on whether there is a need to reduce the noise in the image (achieved by in-phase addition) or whether it is desirable to increase the contrast of the image (can be achieved by out-of-phase addition).

In-phase addition of the reconstructed images can be achieved by converting the received RF-data into in-phase quadrature complex data, IQ-data, thereby making the phase information available. Thereby, reconstructed images represented by IQ-data will subsequently be added in-phase (coherently). However, if the reconstructed images are to be added out-of-phase (incoherently), IQ-data is not needed.

Out-of-phase combining can help to increase the contrast by making sure that the impulse values are always added together without their phase information, i.e. whether they are positive or negative values.
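For illustration, coherent (in-phase) and incoherent (out-of-phase) compounding of the reconstructed images may be sketched as follows; the Python/NumPy code and the representation of each reconstructed image as an array (complex IQ-data for the coherent case) are assumptions of this example.

    import numpy as np

    def compound_images(reconstructed, coherent=True):
        """Combine per-transmit reconstructed images into one summed image.

        reconstructed : sequence of equally shaped arrays, one per transmit event;
        complex IQ-images for coherent (in-phase) compounding, or images whose
        magnitude is used for incoherent (out-of-phase) compounding.
        """
        stack = np.stack(reconstructed)
        if coherent:
            # In-phase summation: uncorrelated noise tends to cancel.
            return stack.sum(axis=0)
        # Out-of-phase summation: phase information discarded, contrast increased.
        return np.abs(stack).sum(axis=0)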
A final image is formed 212 by taking the envelope of the summed image. The final values for every imaging pixel can be either positive or negative due to the nature of the RF-values. However, it is preferred to show the full image based on the brightness of the image. In the RF-values, large values, both positive and negative, represent a strong reflectivity, and values close to zero represent low reflectivity. Accordingly, envelope detection can be used to convert the original representation into values only in the positive range. However, it should be noted that the step of taking the envelope of the image is optional and that in some applications it is possible to derive sufficient information directly from the summed image.
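A minimal sketch of this optional envelope detection step is shown below for illustration; Python/SciPy and the use of the Hilbert transform for a real-valued summed image are assumptions of this example.

    import numpy as np
    from scipy.signal import hilbert

    def envelope(summed_image, axis=-1):
        """Envelope detection of the summed image.

        For complex IQ-data the magnitude is the envelope; for real RF-data the
        magnitude of the analytic signal along the given axis is used, yielding
        only positive values suitable for display.
        """
        if np.iscomplexobj(summed_image):
            return np.abs(summed_image)
        return np.abs(hilbert(summed_image, axis=axis))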
Figs. 3A-B schematically illustrate examples of transmit beamforming, where Fig. 3A illustrates a defocused beam 300 and Fig. 3B illustrates a focused beam 302. A row of ultrasonic transducers 106 is here placed to the left of the touch surface 104, and in Fig. 3A a plurality of virtual sources 304a-c are illustrated to the left of the transducers 106. In Fig. 3B, the virtual sources 306a-c are instead located on the right-hand side of the row of transducers 106.
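As a non-limiting illustration of the virtual source concept in Figs. 3A-B, firing delays may be derived from a virtual source position as sketched below; the Python/NumPy code, the sign convention and the propagation speed are assumptions of this example.

    import numpy as np

    def virtual_source_delays(element_xy, source_xy, focused, c=3000.0):
        """Firing delays derived from a virtual source position.

        A virtual source behind the row of elements (cf. Fig. 3A) is assumed to give
        a defocused, diverging beam; a virtual source in front of the row (cf. Fig. 3B)
        is assumed to give a focused beam. c is an assumed propagation speed (m/s).
        """
        t = np.linalg.norm(element_xy - source_xy, axis=1) / c
        if focused:
            # Focused beam: the elements farthest from the focal point fire first.
            return t.max() - t
        # Defocused beam: the elements nearest the virtual source fire first.
        return t - t.min()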
Fig. 4A is a graph showing the intensity profile 400 of a beamformed shaped ultrasonic transmit beam ST having a focal point 402 approximately at the center of the image, corresponding to a target area.
Fig. 4B is a graph showing the intensity profile 404 of the beamformed received reflected echo signals SR having a focal point 404 approximately at the center of the image, i.e. at the same location as the focal point 402 of the transmit signal.
Fig. 4C is a graph illustrating the combination of transmit and receive beamforming, forming a combined focus point 408 corresponding to a virtual target area. Accordingly, efficient biometric imaging at the target area 107 can be achieved by the combination of transmit and receive beamforming.
Fig. 4A illustrates a focused beam, and the same reasoning applies also when emitting a defocused or unfocused beam, with the difference that the resulting focus point will be larger. Thereby, since the focus point is larger, fewer transmissions will be required for covering the target area, but the resolution will be correspondingly lower. It is thus possible to select whether to use a focused, unfocused or defocused emitted beam based on the requirements of imaging speed vs imaging resolution.
The spatial resolution of the system refers to the ability to resolve points that are very close to each other. In the described system, the lateral resolution (x-axis) and the axial resolution (y-axis) are preferably the same. This will make sure that the total resolution is uniform and symmetrical in both directions. The spatial resolution can be represented by a point spread function (PSF), and in the present case the PSF will be substantially circular. Biometric image acquisition requires a spatial resolution which is sufficiently high to resolve the features of the biometric object, e.g. to resolve the ridges and valleys of a fingerprint. However, the described method and system may also be used in applications where a much lower resolution is required, e.g. in a touch detection system.
By using the described system it is possible to form a touch detection system or a touch tracking system where only a few transducers located on one side of a touch area are required for detecting a touch event by an object such as a finger or the tip of a stylus. The complexity of the system can thereby be very much reduced since it is only required to detect and track a single point. Thereby, there is no need to beamform or reconstruct an image, but only to find the position of the maximum amplitude on the screen and keep tracking that point with high accuracy. The size of the elements could then be relatively large as, again, there is no need to generate an image, so the signal amplitude can also be relatively high. The processing of the RF-data can be handled with a small processor in the ADC ASIC or inside the host CPU.
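By way of illustration, single-touch tracking by locating the maximum amplitude may be sketched as below; the Python/NumPy code and the existence of a precomputed amplitude map over the touch area are assumptions of this example.

    import numpy as np

    def track_touch(amplitude_map):
        """Locate a single touch as the position of the maximum echo amplitude.

        amplitude_map : 2D array of echo amplitudes mapped onto the touch area
        (how this map is obtained is device specific and assumed here).
        Returns the (row, column) index of the peak.
        """
        return np.unravel_index(np.argmax(np.abs(amplitude_map)), amplitude_map.shape)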
Even though the invention has been described with reference to specific exemplifying embodiments thereof, many different alterations, modifications and the like will become apparent to those skilled in the art. Also, it should be noted that parts of the method and system may be omitted, interchanged or arranged in various ways, the method and system yet being able to perform the functionality of the present invention.
Additionally, variations to the disclosed embodiments can be understood and effected by the skilled person in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims (16)

1. Method for image acquisition in an ultrasonic biometric imaging device (100), the method comprising:

determining (200) a target area (107) of a touch surface (104);

by a plurality of ultrasonic transducers (106) arranged at a periphery of the touch surface, emitting (202) a shaped ultrasonic beam towards the target area using transmit beamforming;

by the ultrasonic transducers, receiving (204) reflected ultrasonic echo signals defined by received RF-data, the reflected ultrasonic echo signals resulting from interactions with an object in contact with the touch surface at the target area;

subtracting (206) background RF-data from the received RF-data to form a clean image;

performing (208) receive side beamforming to form a reconstructed image from the clean image; and

for a plurality of reconstructed images resulting from a plurality of emitted ultrasonic beams for a given target area, adding (210) the plurality of reconstructed images to form a summed image.
2. The method according to claim 1, further comprising forming (212) a final image by taking the envelope of the summed image.
3. The method according to claim 1 or 2, further comprising converting the received RF-data to in-phase quadrature complex data.
4. The method according to claim 3, wherein converting the received RF-data to in-phase quadrature complex data comprises using the Hilbert transform.
5. The method according to any one of the preceding claims, wherein adding the plurality of reconstructed images to form a summed image comprises adding the plurality of images in-phase.
6. The method according to claim 1 or 2, wherein adding the plurality of reconstructed images to form a summed image comprises adding the plurality of images out-of-phase.
7. The method according to any one of the preceding claims, wherein the number of ultrasonic transducers used for receiving the ultrasonic echo signals is the same as the number of ultrasonic transducers used for emitting the shaped ultrasonic beam.
8. The method according to any one of the preceding claims, further comprising controlling a resolution of the final image by controlling the number of emitted ultrasonic beams used for forming a summed image.
9. The method according to any one of the preceding claims, wherein determining the target area comprises receiving information describing the target area from a touch sensing arrangement configured to detect a location of an object in contact with the touch surface.
10. An ultrasonic biometric imaging device comprising:

a cover structure (103) comprising a touch surface (104);

a plurality of ultrasonic transducers arranged at a periphery of the touch surface, the plurality of ultrasonic transducers being configured to emit a shaped ultrasonic beam towards a target area using transmit beamforming and to receive reflected ultrasonic echo signals defining received RF-data, the reflected ultrasonic echo signals resulting from reflections by an object in contact with the touch surface at the target area; and

a biometric imaging control unit (114) configured to:

subtract background RF-data from the received RF-data to form a clean image;

perform receive side beamforming to form a reconstructed image from the clean image; and

for a plurality of reconstructed images resulting from a plurality of emitted ultrasonic beams for a given target area, add the plurality of reconstructed images to form a summed image.
11. The ultrasonic imaging device according to claim 10, wherein the plurality of transducers are arranged in a single row on a single side of the touch surface.
12. The ultrasonic imaging device according to claim 10, wherein the plurality of transducers are arranged and configured such that a resolution in an x-direction is the same as a resolution in a y-direction, the x- and y-directions defining the plane of the touch area.
13. The ultrasonic imaging device according to any one of claims 10 to 12, further comprising a touch sensing arrangement configured to detect a location of an object in contact with the touch surface.
14. The ultrasonic imaging device according to any one of claims 10 to 12, wherein the touch sensing arrangement comprises a plurality of touch sensing elements located under the cover structure.
15. An electronic user device comprising an ultrasonic biometric imaging device according to any one of claims 10 to 14.
16. The electronic user device according to claim 15, wherein the cover structure is a display panel.
SE1950681A 2019-06-10 2019-06-10 Ultrasonic imaging device and method for image acquisition in the ultrasonic device SE1950681A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
SE1950681A SE1950681A1 (en) 2019-06-10 2019-06-10 Ultrasonic imaging device and method for image acquisition in the ultrasonic device
CN202080041870.7A CN113994392A (en) 2019-06-10 2020-06-01 Ultrasonic imaging device and method for acquiring image in ultrasonic device
US17/615,126 US11972628B2 (en) 2019-06-10 2020-06-01 Ultrasonic imaging device and method for image acquisition in the ultrasonic device
EP20823519.2A EP3983937A4 (en) 2019-06-10 2020-06-01 Ultrasonic imaging device and method for image acquisition in the ultrasonic device
PCT/SE2020/050550 WO2020251445A1 (en) 2019-06-10 2020-06-01 Ultrasonic imaging device and method for image acquisition in the ultrasonic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1950681A SE1950681A1 (en) 2019-06-10 2019-06-10 Ultrasonic imaging device and method for image acquisition in the ultrasonic device

Publications (1)

Publication Number Publication Date
SE1950681A1 true SE1950681A1 (en) 2020-12-11

Family

ID=74086361

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1950681A SE1950681A1 (en) 2019-06-10 2019-06-10 Ultrasonic imaging device and method for image acquisition in the ultrasonic device

Country Status (1)

Country Link
SE (1) SE1950681A1 (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1046928A2 (en) * 1999-04-23 2000-10-25 General Electric Company Method and apparatus for flow imaging using coded excitation
EP1884197A1 (en) * 2005-05-20 2008-02-06 Hitachi Medical Corporation Image diagnosing device
US20150055821A1 (en) * 2013-08-22 2015-02-26 Amazon Technologies, Inc. Multi-tracker object tracking
US20150189136A1 (en) * 2014-01-02 2015-07-02 Samsung Electro-Mechanics Co., Ltd. Fingerprint sensor and electronic device including the same
US20190094981A1 (en) * 2014-06-14 2019-03-28 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
WO2017052836A1 (en) * 2015-09-24 2017-03-30 Qualcomm Incorporated Receive-side beam forming for an ultrasonic image sensor
US10198610B1 (en) * 2015-09-29 2019-02-05 Apple Inc. Acoustic pulse coding for imaging of input surfaces
US20180055369A1 (en) * 2016-08-31 2018-03-01 Qualcomm Incorporated Layered sensing including rf-acoustic imaging
WO2019125273A1 (en) * 2017-12-21 2019-06-27 Fingerprint Cards Ab Display arrangement comprising ultrasonic biometric sensing system and method for manufacturing the display arrangement

Similar Documents

Publication Publication Date Title
US20220071601A1 (en) Systems and methods for improving ultrasound image quality by applying weighting factors
US8998812B2 (en) Ultrasound method and probe for electromagnetic noise cancellation
US11096671B2 (en) Sparkle artifact detection in ultrasound color flow
US11972628B2 (en) Ultrasonic imaging device and method for image acquisition in the ultrasonic device
EP3199251B1 (en) Ultrasonic transducer and ultrasonic probe including the same
US11432806B2 (en) Information processing apparatus, information processing method, and storage medium
WO2021103493A1 (en) Shear wave-based imaging method, system and apparatus
US20150099960A1 (en) Ultrasonic probe and medical apparatus including the same
US10448925B2 (en) Ultrasonic diagnostic apparatus and method for reducing clutter
US20150025383A1 (en) Ultrasonic imaging apparatus and control method thereof
US20160199031A1 (en) Matching member and ultrasound probe including the same
US20230404539A1 (en) Noise reduction for ultrasound operations
US20110245676A1 (en) Method and apparatus for ultrasound signal acquisition and processing
US9320498B2 (en) Twinkle artifact suppression in ultrasound color flow
US20220237940A1 (en) Ultrasonic imaging device and method for image acquisition in the ultrasonic device
SE1950681A1 (en) Ultrasonic imaging device and method for image acquisition in the ultrasonic device
CN110013276A (en) The calibration of ARFI imaging
US11830276B2 (en) Ultrasonic biometric imaging system and method for controlling the ultrasonic biometric imaging system
US20200081107A1 (en) Methods and systems for filtering ultrasound image clutter
JP2016527020A5 (en)
Kuc Forming maps of targets having multiple reflectors with a biomimetic audible sonar
CN115607185A (en) Ultrasonic imaging method and ultrasonic imaging system

Legal Events

Date Code Title Description
NAV Patent application has lapsed