WO2018162305A1 - Location device and system for locating an acoustic sensor

Location device and system for locating an acoustic sensor

Info

Publication number
WO2018162305A1
Authority
WO
WIPO (PCT)
Prior art keywords
location
acoustic sensor
transmit
signals
ultrasound
Prior art date
Application number
PCT/EP2018/054999
Other languages
French (fr)
Inventor
Man Nguyen
Hua Xie
Sheng-Wen Huang
Carolina AMADOR CARRASCAL
Vijay Thakur SHAMDASANI
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips N.V.
Priority to US16/492,446 (US11747456B2)
Priority to EP18707716.9A (EP3592240B1)
Priority to JP2019548395A (JP7167045B2)
Priority to CN201880017100.1A (CN110392553B)
Publication of WO2018162305A1


Classifications

    • A61B 8/0841: Detecting organic movements or changes, e.g. tumours, cysts, swellings, involving detecting or locating foreign bodies or organic structures, for locating instruments
    • A61B 5/06: Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 8/4254: Details of probe positioning or probe attachment to the patient, involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient, using sensors mounted on the probe
    • G01S 7/52036: Details of receivers using analysis of echo signal for target characterisation (ultrasound systems adapted to short-range imaging)
    • G01S 7/52085: Details related to the ultrasound signal acquisition, e.g. scan sequences (ultrasound systems adapted to short-range imaging)
    • A61B 2034/2063: Acoustic tracking systems, e.g. using ultrasound (surgical navigation tracking techniques)
    • A61B 2090/378: Surgical systems with images on a monitor during operation, using ultrasound
    • A61B 2090/3925: Markers, e.g. radio-opaque or breast lesions markers; ultrasonic markers
    • A61B 5/6852: Arrangements of detecting, measuring or recording means mounted on an invasive device brought in contact with an internal body part; catheters


Abstract

A location device is provided for determining the location of an acoustic sensor. A location process makes use of a plurality of transmit beams (wherein a beam is defined as a transmission from all transducers of an ultrasound array), with a frequency analysis to identify if there is a signal reflected from the acoustic sensor. A location is obtained from the plurality of frequency analyses.

Description

Location device and system for locating an acoustic sensor
FIELD OF THE INVENTION
This invention relates to a device and method for locating an acoustic sensor, for example for locating an implanted object within a subject. This is of interest for guidewire, catheter or needle tip tracking, and hence guided vascular access generally.
BACKGROUND OF THE INVENTION
Needles, catheters and other interventional tools are often difficult to visualize under ultrasound due to their specular nature and unfavorable incidence angles. One solution for marking a needle tip under ultrasound guidance is to embed a small ultrasound sensor at the tip of the needle. Such a sensor receives the direct ultrasound signals that impinge upon it as imaging beams from an ultrasound imaging probe sweep the field of view. A sensor may also be implanted in the body, for monitoring a condition in the body.
An acoustic sensor device in general has a membrane which deforms in response to an external stimulus, and has a resonance frequency. It can receive and emit acoustic signals with a certain frequency spectra. The resonance frequencies of the sensor device are dependent on the characteristics of the device, for example the internal pressure of the membrane acoustic sensor, or size and material of the device. The external environment also affects the device resonance frequencies. As a result, information about the external environment can be extracted from the resonance frequencies generated by the sensor.
The acoustic sensor device may function solely for location, by generating an identifiable signal at its location. However, by calibrating the frequency response of the sensor, information about the external environment (such as the pressure in a fluid flow field) can also be encoded in the signal received from the acoustic sensor. For example, for pressure sensors, the relationship between the device resonance frequency and the ambient pressure can be calibrated. Based on the frequency detected, the ambient pressure around the device can be determined.
Different ways of using the signals received from the sensor to highlight the position of the sensor in the ultrasound image have been proposed. These rely on time-of-flight of ultrasound from the imaging probe to the sensor for estimating the range coordinate of the sensor, and on the intensity of the received signals as the imaging beams sweep the field of view to recover the lateral coordinate.
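As an illustration of this known coordinate recovery (not part of the claimed invention), a minimal sketch assuming a fixed speed of sound and hypothetical per-beam arrival times and amplitudes:

```python
import numpy as np

SPEED_OF_SOUND = 1540.0  # m/s, value assumed for soft tissue

def estimate_sensor_coordinates(arrival_times_s, peak_amplitudes, beam_angles_rad):
    """Illustrative recovery of the range and lateral coordinates of a tip sensor.

    arrival_times_s : one-way arrival time of the direct signal on each imaging beam
    peak_amplitudes : received signal amplitude on each beam
    beam_angles_rad : steering angle of each beam (0 = straight ahead)
    """
    # The beam with the strongest received signal gives the lateral (angular) coordinate.
    best_beam = int(np.argmax(peak_amplitudes))
    # One-way time of flight gives the range coordinate along that beam
    # (a pulse-echo measurement would instead use c * t / 2).
    range_m = SPEED_OF_SOUND * arrival_times_s[best_beam]
    return range_m, beam_angles_rad[best_beam]

# Hypothetical example values
times = np.array([65e-6, 64e-6, 63e-6])
amps = np.array([0.1, 0.9, 0.3])
angles = np.deg2rad([-5.0, 0.0, 5.0])
print(estimate_sensor_coordinates(times, amps, angles))
```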
The acoustic sensor device is often implanted with other interventional devices such as stents or prosthetic heart valves. As a result, it is challenging to locate the device under B-mode ultrasound.
Ultrasound array transducers may be configured as a one-dimensional (1D) array for imaging a two dimensional (2D) image plane, or as a two dimensional (2D) array of transducer elements for imaging a three dimensional region. A 2D array comprises elements extending in both azimuth and elevation directions which can be operated fully independently to both focus and steer beams in any azimuth or elevation direction. These arrays can be configured in either flat or curved orientations.
Each element of the transducer array is individually controlled on transmit and receive using transmit and receive beamforming. A 2D array may have 100-200 rows of elements in one dimension and 100-200 columns of elements in the other dimension, totaling thousands of individual elements. To cope with the number of elements, a microbeamformer integrated circuit can be attached to the transducer array which performs partial beamforming of groups of elements referred to as patches. The individually delayed and summed signals of each patch are conducted over a standard size cable to the ultrasound system beamformer where the summed signal from each patch is applied to a channel of the system beamformer, which completes the beamforming operation.
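A simplified sketch of the two-stage (patch, then system) summation described above; the array size, patch size and delay handling are hypothetical, and in practice the delays are split between the microbeamformer and the system beamformer:

```python
import numpy as np

def microbeamform(element_signals, delays_samples, patch_size=64):
    """Two-stage delay-and-sum: partial sums per patch, then a system-level sum.

    element_signals : (n_elements, n_samples) raw RF data per transducer element
    delays_samples  : integer delay (in samples) applied to each element
    patch_size      : number of elements grouped into one microbeamformer patch
    """
    n_elements, n_samples = element_signals.shape
    delayed = np.zeros_like(element_signals)
    for i in range(n_elements):
        d = delays_samples[i]
        delayed[i, d:] = element_signals[i, :n_samples - d]

    # Stage 1: each patch is summed in the probe, reducing the cable channel count.
    patch_sums = [delayed[i:i + patch_size].sum(axis=0)
                  for i in range(0, n_elements, patch_size)]
    # Stage 2: the system beamformer sums one channel per patch.
    return np.sum(patch_sums, axis=0)

rf = np.random.randn(256, 1024)          # hypothetical 256-element array
delays = np.random.randint(0, 8, 256)    # hypothetical per-element delays
beamsum = microbeamform(rf, delays)
print(beamsum.shape)
```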
It is known to make use of a 2D ultrasound transducer array to provide imaging of a volumetric region of interest by electronically modifying the beamforming (instead of mechanically moving the probe), which has advantages of high frame rates, efficient workflow, and robust elevation focusing. Parallel beamforming approaches allow fast imaging. The electronic control of the beamforming also enables plane wave beams, diverging wave beams as well as focusing beams to be transmitted.
There remains a need for an approach which enables effective location of an acoustic sensor using an ultrasound transducer, for example before extracting information regarding the surrounding environment. It would therefore be desirable to make use of the capabilities of electronic beamforming control to provide an improved location approach.
SUMMARY OF THE INVENTION
The invention is defined by the claims.
According to examples in accordance with an aspect of the invention, there is provided a location device for determining the location of an acoustic sensor, comprising:
an ultrasound transducer array arranged to transmit a plurality of ultrasound beams and receive corresponding reflected echo signals;
a controller arrangement, comprising:
a transmit controller for controlling the transmitted signals of the transducer array to provide a transmit beam comprising a transmission from each transducer of the array;
a receive controller for analyzing the received reflected signals, wherein the controller arrangement is adapted to implement a location process which comprises, for each of the plurality of transmitted ultrasound beams, performing a frequency analysis to identify if there is a signal reflected from the acoustic sensor and to identify a location area, and to derive a progressively more accurate final location within said location area from the plurality of frequency analyses.
By performing a set of analyses, for each of a plurality of transmit beams, more accurate location information may be obtained. The device makes use of an adaptive transmit beam pattern to precisely locate an acoustic sensor device.
In this way, the device is able to reduce the uncertainty and improve the workflow in locating the sensor device. The transmit beams can be steered and swept electronically without the transducer being mechanically moved.
The final location is obtained in a manner which progressively increases accuracy. By this is meant that location information is first obtained with a relatively low precision (i.e. identifying only a location area), and then a different imaging approach is conducted using that coarse location area information to obtain a higher precision location.
The scanning approaches may be of different types (such as unfocused and focused) or they may be of the same type but with different scanning parameters (such as different density of scan lines). Overall, the aim is to enable a high precision location to be identified in a way which reduces the amount of time and/or image processing required to reach the desired location accuracy.
In one set of examples, the controller arrangement is adapted to implement a location process which comprises: providing a first, non-focused, transmit beam and obtaining a first location area;
providing at least one further, focused, transmit beam for a smaller region of interest within the first location area and obtaining at least one more accurate location.
In this process, the accuracy of the location is increased in iterative steps, making use of an adaptive transmit pattern. The pattern starts as a non-focused beam such as a broad beam pattern and narrows to a focused beam based on the receive beams that have components of the resonant signals from the acoustic sensor. The broad beam pattern provides a coarse location (i.e. a location area) and the spatial resolution and bandwidth of the focused beam allows a more precise location to be determined.
The controller arrangement may be adapted to implement a location process which comprises iteratively providing multiple transmit beams for successively smaller regions of interest with successively smaller depths of field and obtaining successively more accurate locations. As the depth of field is reduced, the sensor location is closer and closer to the focal point of the transmit beam.
In a second set of examples, the controller arrangement is adapted to implement a location process which comprises scanning a first plurality of focused transmit beams across a region of interest to provide focusing positions with a first spacing (first beam density), and scanning at least a second plurality of focused transmit beams across the region of interest to provide more closely spaced focusing positions (more densely packed scanning lines). The larger first spacing corresponds to a lower precision of the first location area, whereas the more closely spaced focusing positions correspond to a more precise location. Acquiring each scanning line takes time; this embodiment therefore permits the first location area to be defined quickly with coarse precision, and a smaller region of interest within the first location area to be scanned with more densely packed scanning beams (lines), thereby obtaining a more accurate location of the acoustic sensor without investing additional time.
In this way, beams that carry signals specific to the acoustic sensor device are identified from a lower resolution sweep, and from this information, a more accurate location may be found from a higher resolution sweep.
The controller arrangement may be further adapted to identify an orientation of the acoustic sensor. This can be achieved for a membrane sensor, since the membrane resonance results in a normally directed echo signal. Based on knowledge of the transmit beam direction and the associated received echo signal both the location of the acoustic sensor and its membrane orientation may be derived.
The controller arrangement is thus for example adapted to identify an orientation of the acoustic sensor by determining a transmission angle and emission (reflection) angle for which the received echo (or signal corresponding to said reflection angle) is strongest.
The device may comprise an output for directing the user to move the ultrasound transducer array to a location which gives the strongest location signal characteristic to the acoustic sensor reflection. This may be directly above the sensor, but for an angled sensor the transducer array location may be offset from the sensor so that the transmit beam is directed normally to the membrane and the receive beam is received from the same normal direction.
In this way, the sensitivity of the location process is improved, based on transmit and receive beamforming to maximize the acoustic pressure reaching the sensor as well as maximizing the beamsum data provided by the receive beamforming process.
The transducer array preferably comprises a 2D array of transducer elements, for 3D localization.
The invention also provides a location system comprising:
an acoustic sensor; and
a location device as defined above.
The acoustic sensor for example comprises a membrane having a resonant frequency within a reception frequency range of the acoustic transducer array, for generating an echo at the resonant frequency.
The invention also provides a method of locating an acoustic sensor, comprising:
controlling the transmission of ultrasound signals of a transducer array to provide a transmit beam comprising a transmission from each transducer of the array; and analyzing received reflected signals,
wherein the method comprises, for each of a plurality of transmit beams, performing a frequency analysis to identify if there is a signal reflected from the acoustic sensor and to determine a location area, and to derive a progressively more accurate final location within the location area from the plurality of frequency analyses.
This approach enables an accurate location to be obtained in a short time, based on combining multiple imaging processes. In one set of examples, the method comprises:
providing a first, unfocused, transmit beam and obtaining a first location area;
providing at least one further, focused, transmit beam for a smaller region of interest within the first location area and obtaining at least one more accurate location.
In another set of examples, the method comprises:
scanning a first plurality of focused transmit beams across a region of interest to provide focusing positions with a first spacing; and
scanning at least a second plurality of focused transmit beams across the region of interest to provide more closely spaced focusing positions.
An orientation of the acoustic sensor may also be obtained by determining a transmission angle and corresponding reflection angle for which the received echo is strongest, and directing the user to move the ultrasound transducer array to a location which gives the strongest location signal.
The invention may be implemented at least in part in computer software.
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
Examples of the invention will now be described in detail with reference to the accompanying drawings, in which:
Figure 1 shows a known ultrasound imaging system;
Figure 2 shows a first example of a location process using the system of Figure 1;
Figure 3 shows a second example of a location process using the system of Figure 1; and
Figure 4 shows a third example of a location process using the system of Figure 1.
DETAILED DESCRIPTION OF THE EMBODIMENTS
The invention will be described with reference to the Figures.
It should be understood that the detailed description and specific examples, while indicating exemplary embodiments of the apparatus, systems and methods, are intended for purposes of illustration only and are not intended to limit the scope of the invention. These and other features, aspects, and advantages of the apparatus, systems and methods of the present invention will become better understood from the following description, appended claims, and accompanying drawings. It should be understood that the Figures are merely schematic and are not drawn to scale. It should also be understood that the same reference numerals are used throughout the Figures to indicate the same or similar parts.
The invention provides a location device for determining the location of an acoustic sensor. A location process makes use of a plurality of transmit beams (wherein a beam is defined as a transmission from all transducers of an ultrasound array), with a frequency analysis to identify if there is a signal reflected from the acoustic sensor. A location is obtained from the plurality of frequency analyses, by obtaining a location with progressively increasing resolution and hence accuracy.
The invention makes use of adaptive beamforming.
Transmit beamforming involves providing delays in the transmitted ultrasound signals from a transducer array in order to create an interference pattern in which the majority of the signal energy propagates in one angular direction. Adaptive transmit beamforming enables different interference patterns to be created, including plane wave propagation, or a focused beam directed to a specific point, and at a particular depth from the ultrasound transducer array.
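A minimal sketch of how such transmit delay profiles might be computed, assuming a linear array with uniform pitch and a fixed speed of sound (all parameter values hypothetical):

```python
import numpy as np

C = 1540.0  # m/s, assumed speed of sound

def plane_wave_delays(element_x, steer_angle_rad):
    """Delays that tilt the emitted wavefront to the given steering angle."""
    delays = element_x * np.sin(steer_angle_rad) / C
    return delays - delays.min()  # shift so the earliest-firing element is at t = 0

def focused_delays(element_x, focus_x, focus_z):
    """Delays that make all element contributions arrive at the focal point together."""
    path = np.sqrt((element_x - focus_x) ** 2 + focus_z ** 2)
    return (path.max() - path) / C  # farthest element fires first

x = (np.arange(128) - 63.5) * 0.3e-3  # hypothetical 128-element array, 0.3 mm pitch
print(plane_wave_delays(x, np.deg2rad(10))[:5])
print(focused_delays(x, 0.0, 40e-3)[:5])
```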
Receive beamforming involves adjusting the amplitude and delays of the received signal on each element in order to measure reception from a chosen angular direction. Thus, to build up an image, receive beamforming is applied for the transducer array in respect of each point in turn to derive the signal intensity received from that point.
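A minimal sketch of the per-point delay-and-sum operation, assuming a linear array, a known sampling rate and a plane-wave transmit at normal incidence (all names and values hypothetical):

```python
import numpy as np

C = 1540.0   # m/s, assumed speed of sound
FS = 40e6    # Hz, assumed RF sampling rate

def delay_and_sum(rf_data, element_x, point_x, point_z):
    """Beamsum value for a single image point from per-element RF traces.

    rf_data   : (n_elements, n_samples) received RF data
    element_x : lateral position of each element (m)
    point_x, point_z : coordinates of the image point (m)
    """
    # One-way receive path from the image point back to each element.
    rx_path = np.sqrt((element_x - point_x) ** 2 + point_z ** 2)
    # Assume a plane-wave transmit at normal incidence: transmit path = point depth.
    total_time = (point_z + rx_path) / C
    sample_idx = np.round(total_time * FS).astype(int)
    sample_idx = np.clip(sample_idx, 0, rf_data.shape[1] - 1)
    return rf_data[np.arange(rf_data.shape[0]), sample_idx].sum()

rf = np.random.randn(128, 2048)
x = (np.arange(128) - 63.5) * 0.3e-3
print(delay_and_sum(rf, x, 0.0, 30e-3))
```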
An image is formed by combining multiple transmit scan lines, where one scan line is a transmitted and received narrow beam. By combining the received echo data for the set of lines the ultrasound image is created.
For the purposes of this document, a "transmit beam" is meant to indicate the acoustic pressure field that is emitted from a set of transducer elements. The transmit beam can use all transducer elements or a sub-set of elements, depending on the design (imaging depth, resolution, etc.). The shape of the transmit beam can also vary, for example it may have a focus or no focus (e.g. a divergent beam or a plane wave beam).
The general architecture of an ultrasound imaging system will first be described, with reference to Figure 1 which shows an ultrasonic diagnostic imaging system 2 with an array transducer probe 4 in block diagram form. The array transducer probe 4 comprises an array 6 of transducer cells 8.
Traditionally, piezoelectric materials have been used for ultrasonic transducers. Examples are lead zirconate titanate (PZT) and polyvinylidene difluoride (PVDF) materials, with PZT being particularly popular as the material of choice. Single crystal piezoelectric materials are used to achieve high piezoelectric and electro-mechanical coupling constants for high performance transducers.
Recent developments have led to the prospect that medical ultrasound transducers can be batch manufactured by semiconductor processes. Desirably these processes should be the same ones used to produce the application specific integrated circuits (ASICs) needed by an ultrasound probe such as a CMOS process, particularly for 3D ultrasound. These developments have produced micro machined ultrasonic transducers or MUTs, the preferred form being the capacitive MUT (CMUT). CMUT transducers are tiny diaphragm-like devices with electrodes that convert the sound vibration of a received ultrasound signal into a modulated capacitance.
CMUT transducers in particular are able to function over a broad bandwidth, enable high resolution and high sensitivity imaging, and produce a large pressure output so that a large depth of field of acoustic signals can be received at ultrasonic frequencies.
Figure 1 shows a transducer array 6 of CMUT cells 8 as discussed above for transmitting ultrasonic waves and receiving echo information. The transducer array 6 of the system 2 may generally be a one- or a two-dimensional array of transducer elements capable of scanning in a 2D plane or in three dimensions for 3D imaging.
The transducer array 6 is coupled to a micro-beamformer 12 which controls transmission and reception of signals by the CMUT array cells. Micro-beamformers are capable of at least partial beam forming of the signals received by groups or "patches" of transducer elements for instance as described in US patents US 5,997,479 (Savord et al), US 6,013,032 (Savord), and US 6,623,432 (Powers et al.)
The micro-beamformer 12 is coupled by the probe cable, e.g. coaxial wire, to a transmit/receive (T/R) switch 16 which switches between transmission and reception modes and protects the main beamformer 20 from high energy transmit signals when a micro-beamformer is not present or used and the transducer array 6 is operated directly by the main system beamformer 20. The transmission of ultrasonic beams from the transducer array 6 under control of the micro-beamformer 12 is directed by a transducer controller 18 coupled to the micro-beamformer by the T/R switch 16 and the main system beamformer 20, which receives input from the user's operation of the user interface or control panel 38. One of the functions controlled by the transducer controller 18 is the direction in which beams are steered and focused. Beams may be steered straight ahead from (orthogonal to) the transducer array 6, or at different angles for a wider field of view.
The transducer controller 18 may be coupled to control a voltage source 45 for the transducer array. For instance, the voltage source 45 sets DC and AC bias voltage(s) that are applied to the CMUT cells of a CMUT array 6, e.g. to generate the ultrasonic RF pulses in transmission mode.
The partially beamformed signals produced by the micro-beamformer 12 are forwarded to the main beamformer 20 where partially beamformed signals from individual patches of transducer elements are combined into a fully beamformed signal. For example, the main beamformer 20 may have 128 channels, each of which receives a partially beamformed signal from a patch of dozens or hundreds of CMUT transducer cells 8. In this way the signals received by thousands of transducer elements of a transducer array 410 can contribute efficiently to a single beamformed signal.
The beamformed signals are coupled to a signal processor 22. The signal processor 22 can process the received echo signals in various ways, such as bandpass filtering, decimation, I and Q component separation, and harmonic signal separation which acts to separate linear and nonlinear signals so as to enable the identification of nonlinear (higher harmonics of the fundamental frequency) echo signals returned from tissue and microbubbles.
The signal processor 22 optionally may perform additional signal enhancement such as speckle reduction, signal compounding, and noise elimination. The bandpass filter in the signal processor 22 may be a tracking filter, with its passband sliding from a higher frequency band to a lower frequency band as echo signals are received from increasing depths, thereby rejecting the noise at higher frequencies from greater depths where these frequencies are devoid of anatomical information.
The processed signals are coupled to a B-mode processor 26 and optionally to a Doppler processor 28. The B-mode processor 26 employs detection of an amplitude of the received ultrasound signal for the imaging of structures in the body such as the tissue of organs and vessels in the body. B-mode images of structure of the body may be formed in either the harmonic image mode or the fundamental image mode or a combination of both, for instance as described in US Patents US 6,283,919 (Roundhill et al.) and US 6,458,083 (Jago et al.). The Doppler processor 28, if present, processes temporally distinct signals from tissue movement and blood flow for the detection of the motion of substances, such as the flow of blood cells in the image field. The Doppler processor typically includes a wall filter with parameters which may be set to pass and/or reject echoes returned from selected types of materials in the body. For instance, the wall filter can be set to have a passband characteristic which passes signals of relatively low amplitude from higher velocity materials while rejecting relatively strong signals from lower or zero velocity material.
This passband characteristic will pass signals from flowing blood while rejecting signals from nearby stationary or slowly moving objects such as the wall of the heart. An inverse characteristic would pass signals from moving tissue of the heart while rejecting blood flow signals for what is referred to as tissue Doppler imaging, detecting and depicting the motion of tissue. The Doppler processor receives and processes a sequence of temporally discrete echo signals from different points in an image field, the sequence of echoes from a particular point referred to as an ensemble. An ensemble of echoes received in rapid succession over a relatively short interval can be used to estimate the Doppler shift frequency of flowing blood, with the correspondence of the Doppler frequency to velocity indicating the blood flow velocity. An ensemble of echoes received over a longer period of time is used to estimate the velocity of slower flowing blood or slowly moving tissue.
The structural and motion signals produced by the B-mode (and Doppler) processor(s) are coupled to a scan converter 32 and a multiplanar reformatter 44. The scan converter 32 arranges the echo signals in the spatial relationship from which they were received in a desired image format. For instance, the scan converter may arrange the echo signal into a two dimensional (2D) sector-shaped format, or a pyramidal three dimensional (3D) image.
The scan converter can overlay a B-mode structural image with colors corresponding to motion at points in the image field with their Doppler-estimated velocities to produce a color Doppler image which depicts the motion of tissue and blood flow in the image field. The multiplanar reformatter 44 will convert echoes which are received from points in a common plane in a volumetric region of the body into an ultrasonic image of that plane, for instance as described in US Patent US 6,443,896 (Detmer). A volume renderer 42 converts the echo signals of a 3D data set into a projected 3D image as viewed from a given reference point as described in US Pat. 6,530,885 (Entrekin et al.)
The 2D or 3D images are coupled from the scan converter 32, multiplanar reformatter 44, and volume renderer 42 to an image processor 30 for further enhancement, buffering and temporary storage for display on an image display 40. In addition to being used for imaging, the blood flow values produced by the Doppler processor 28 and tissue structure information produced by the B-mode processor 26 are coupled to a quantification processor 34. The quantification processor produces measures of different flow conditions such as the volume rate of blood flow as well as structural measurements such as the sizes of organs and gestational age. The quantification processor may receive input from the user control panel 38, such as the point in the anatomy of an image where a measurement is to be made.
Output data from the quantification processor is coupled to a graphics processor 36 for the reproduction of measurement graphics and values with the image on the display 40. The graphics processor 36 can also generate graphic overlays for display with the ultrasound images. These graphic overlays can contain standard identifying information such as patient name, date and time of the image, imaging parameters, and the like. For these purposes the graphics processor receives input from the user interface 38, such as patient name.
The user interface is also coupled to the transmit controller 18 to control the generation of ultrasound signals from the transducer array 6 and hence the images produced by the transducer array and the ultrasound system. The user interface is also coupled to the multiplanar reformatter 44 for selection and control of the planes of multiple multiplanar reformatted (MPR) images which may be used to perform quantified measures in the image field of the MPR images.
As will be understood by the skilled person, the above embodiment of an ultrasonic diagnostic imaging system is intended to give a non-limiting example of such an ultrasonic diagnostic imaging system. The skilled person will immediately realize that several variations in the architecture of the ultrasonic diagnostic imaging system are feasible without departing from the teachings of the present invention. For instance, as also indicated in the above embodiment, the micro-beamformer 12 may be omitted, the ultrasound probe 4 may not have 3D imaging capabilities and so on. Other variations will be apparent to the skilled person.
Figure 2 is used to show a first example of a location process.
Figure 2A shows the transducer array 6 generating a plane wave transmit beam 50. The implanted acoustic sensor device is shown as 52. The acoustic sensor device comprises a membrane, having a resonant frequency within the band pass frequency band of the receiver of the system 2, for example within 1 to 5MHz.
The emitted ultrasound excites the membrane of the acoustic sensor device, which then generates an echo signal at its resonant frequency. The resonance frequency of the acoustic sensor will depend on the surrounding environment, such as the pressure. The acoustic signals emitted from the sensors will have frequency components of the incident signals transmitted from transducer 6 and of the resonant frequency of the sensor 52, and shifted frequency components compared to the signals transmitted from the ultrasound transducer array 6. While the components of the resonant frequency of sensors do not necessarily need to be in the frequency bandwidth covered by the ultrasound array, the shifted frequency components will be. By detecting the presence of shifted frequency components in each receive beam, the location of the acoustic sensor device can be identified. The frequency components of the received signals are typically analyzed using the Fourier Transform. Furthermore, by calculating how much the emitted frequency from the sensor is shifted, the resonant frequency can be determined. As previously mentioned, this resonant frequency carries information about the surrounding pressure.
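A minimal sketch of this frequency analysis, assuming beamformed receive data at a known sampling rate, a known transmit band, and a purely hypothetical linear calibration between resonant frequency and ambient pressure:

```python
import numpy as np

FS = 40e6  # Hz, assumed sampling rate of the beamformed receive signal

def detect_resonance(beamsum, tx_band=(2e6, 4e6)):
    """Return (has_sensor_signal, resonant_frequency) for one receive beam.

    A dominant spectral peak outside the transmitted band is taken as a
    shifted component originating from the resonating sensor membrane.
    """
    spectrum = np.abs(np.fft.rfft(beamsum))
    freqs = np.fft.rfftfreq(len(beamsum), d=1.0 / FS)
    outside = (freqs < tx_band[0]) | (freqs > tx_band[1])
    peak = np.argmax(spectrum * outside)          # strongest out-of-band component
    # Hypothetical detection criterion: out-of-band peak comparable to overall maximum.
    has_signal = spectrum[peak] > 0.2 * spectrum.max()
    return has_signal, freqs[peak]

def pressure_from_resonance(f_res, f0=4.5e6, df_dp=1e3):
    """Hypothetical linear calibration: df_dp is the shift in Hz per mmHg around f0."""
    return (f_res - f0) / df_dp

t = np.arange(2048) / FS
sig = np.sin(2 * np.pi * 4.6e6 * t)               # toy beamformed trace from the sensor
found, f_res = detect_resonance(sig)
print(found, f_res, pressure_from_resonance(f_res))
```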
The acoustic sensor responds to an incident acoustic wave by generating a resonant echo signal. By way of example, the sensor may be of the type disclosed in US 2013/0060139. The sensor may be used for pressure monitoring, based on changes in the frequency response of the sensor to local pressure variations.
This invention is concerned in particular with locating the sensor. The obtaining and processing of a sensor reading may be performed in known manner and as described above.
Among the received beams, there is a set for which the resonant frequency of the acoustic sensor is detected based on frequency analysis (i.e. Fourier Transform analysis). Those receive beams are indicated by region 54 within an image 56 representing the spatially received beams. The image 56 represents elevation versus azimuth.
Thus, the received beams of Figure 2A are obtained based on providing a first, plane wave, transmit beam, and from this the region 54 defines a first obtained location for the ultrasound sensor. This first obtained location is a general location area, of lower precision than is required. For the plane wave beam, delays may be applied to steer the plane-waves to different angles or to adjust how much the beams are diverging. For the unfocused (e.g. plane wave) beam imaging, the emitted acoustic signals (beams) cover larger areas. For each of these transmit beams, a large field of view can be generated (with lower quality compared to focused-beam imaging). However, by coherently summing these large individually reconstructed fields of view (FOVs), a final image can be generated with image quality comparable to a transmit-focused beam. The advantages of unfocused beam imaging include faster frame rate (fewer transmits required to reconstruct the whole image) but at the cost of increased system complexity because more parallel-line processing is required to reconstruct the larger field of view.
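A minimal sketch of the coherent summation mentioned above, assuming that a beamformed (pre-envelope) image has already been reconstructed for each plane-wave transmit angle:

```python
import numpy as np

def coherent_compound(per_angle_images):
    """Coherently sum individually reconstructed plane-wave images.

    per_angle_images : (n_angles, nz, nx) beamformed RF/analytic data, one image
    per plane-wave transmit angle. Summing before envelope detection is what
    recovers an image quality comparable to transmit focusing.
    """
    compounded = np.sum(per_angle_images, axis=0)
    return np.abs(compounded)  # envelope for display

imgs = np.random.randn(7, 256, 128)  # hypothetical: 7 steering angles
print(coherent_compound(imgs).shape)
```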
The transmit and receive elements used for one transmit beam can include multiple rows and columns of the 2D transducer array. The signals received are stored and then used for beamforming.
The received echo signals from the sensor will be received at multiple elements of the transducer. These signals arrive at the elements at different times, based on the travelling paths. Therefore, during subsequent receive beamforming, the location of the source signals can be identified. The receive beamforming takes place for every point in the field of view and comprises a delay-and-sum operation for the employed set of transducer elements. The received signals may have different amplitudes and frequencies, from which the location of the sensor can be identified but also information about the environment outside the sensor (i.e. making use of the pressure sensing functionality of the implanted acoustic sensor).
After the plane wave transmit beam has been processed, a focused beam 58 is formed as shown in Figure 2B. This focuses to a point behind the coarse sensor location as determined by the plane wave imaging. The focusing is achieved by transmit beamforming.
The focused-beam imaging is for example based on a scan line approach whereby the imaging is performed as a sequence of scan lines, each of which has a single focus. The received echo information is used to build up an image progressively as a set of scan lines.
This approach provides a first, focused, transmit beam for a smaller region of interest. From this, a more accurate location within the location area found in Figure 2A is found, as shown as pane 60.
This process may be repeated iteratively so that a second focused beam 62 is provided as shown in Figure 2C resulting in a more focused beam around the region 60. This yields improved location as shown in pane 64, wherein the region 54 again represents the location of the receive beams that contain the resonant signals from the sensor. Thus, in this way, the transmit beam starts as a broad beam in-plane pattern and progressively narrows down to a focused beam based on the receive beams that have the components of resonance signals from the acoustic sensor. This adaptive transmit pattern provides a quick way to locate the sensor. While the broad beam provides a coarse location of the sensor at the beginning of the location process, the greater spatial resolution and smaller beam width of the focusing beams allow more precise location.
The approach of Figure 2 may also be used to detect movement of the acoustic sensor, by comparing the received sensor signal location from one transmit beam with the received sensor signal location from previous transmit beams. If there is no overlap between them, it indicates that the transducer has moved to a different location. The circles 54 show the possible locations of the sensor in the different images. As the transmit beam pattern changes (un-focused beams become closer and closer to focused beams), the circles 54 should become smaller giving a more precise location, and hence they should overlap (or be contained within) previous location circles. If there is no such overlap as the imaging process proceeds, it means the relative position between the external transducer and the internal sensor has changed, so the sequence should then be restarted with an un-focused transmit beam.
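A minimal sketch of this overlap test, treating each estimated location region 54 as a circle with a centre and radius (all values hypothetical):

```python
import math

def regions_overlap(prev, new):
    """True if the new (smaller) location circle overlaps the previous one.

    Each region is (centre_x, centre_y, radius). No overlap suggests the
    transducer/sensor geometry has changed and the sequence should restart
    with an un-focused transmit beam.
    """
    (x1, y1, r1), (x2, y2, r2) = prev, new
    return math.hypot(x2 - x1, y2 - y1) <= (r1 + r2)

previous_region = (10.0, 12.0, 6.0)   # coarse location from a plane-wave transmit
current_region = (11.0, 11.5, 2.0)    # finer location from a focused transmit
if not regions_overlap(previous_region, current_region):
    print("Restart location sequence with an un-focused transmit beam")
```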
An alternative example is to sweep a transmit focus beam throughout the region of interest and identify the beams that carry signals specific to the acoustic sensor device. A first sweep is at a lower resolution, for example with a transmit beam for every Nth scan line (e.g. every fifth scan line). From the scan lines for which an echo is received from the sensor, a higher resolution image may be obtained, for example based on a sub-set of adjacent scan lines (e.g. scan line numbers 10 to 15). There may then be a progressive increase in resolution if multiple scans are carried out. There may be exactly two imaging processes, but three imaging processes at progressively higher precision are also possible. In this context, an imaging process is to be understood as a number of transmit sequences to localize the sensor. For example, a first sequence may involve transmitting every 5th scan line for the whole 3D volume. A second sequence may then involve transmitting every 3rd scan line within a smaller region 60, and a third sequence then involves transmitting every scan line for the final sequence with the smallest region 54.
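A minimal sketch of generating such coarse-to-fine scan-line sequences from the lines on which the sensor signature was detected; the steps (every 5th, every 3rd, then every line) follow the example above, everything else is hypothetical:

```python
def next_sequence(detected_lines, step, margin=2):
    """Scan-line indices for the next, denser sweep around the detected lines."""
    lo, hi = min(detected_lines) - margin, max(detected_lines) + margin
    return list(range(max(lo, 0), hi + 1, step))

# Sequence 1: every 5th line over the whole volume (e.g. lines 0..99).
seq1 = list(range(0, 100, 5))
# Suppose the sensor signature was found on lines 10 and 15 of sequence 1.
seq2 = next_sequence([10, 15], step=3)   # every 3rd line, smaller region
seq3 = next_sequence([12, 14], step=1)   # every line, smallest region
print(seq1[:6], seq2, seq3)
```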
The acoustic sensor device will emit pressure waves which are stronger in a direction normal to the sensor membrane, and they will be set into resonance more strongly by an incident ultrasound wave which is directed normally to the membrane orientation. Thus, to receive a strongest location signal, the position of the ultrasound transducer may be selected taking into account an orientation and/or position of the sensor.
For a sensor which remains relatively stationary, it may be desirable to reposition the ultrasound transducer array to obtain the best signal.
Figure 3 shows a beam directed to a sensor 52 which is off-center with respect to the ultrasound transducer array 6. Figure 3A shows that the focused beam 60 is directed laterally, so that it does not provide optimal excitation of the sensor (assuming the sensor membrane plane is parallel to the plane of the transducer array). The echo reflected from the sensor is shown as 62.
The known directional angle of the focused beam is used to derive an improved position of the transducer array. The user may then be instructed to move the ultrasound transducer array to a better position shown in Figure 3B.
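As a hedged numerical illustration only (the depth, angle and the simple geometric relation used are assumptions for the example, not part of the disclosure), the lateral repositioning suggested by the known steering angle could be estimated as follows.

import math

def suggested_lateral_shift(sensor_depth_mm, steering_angle_deg):
    # Approximate distance to move the array so the sensor lies on its central axis.
    return sensor_depth_mm * math.tan(math.radians(steering_angle_deg))

print(suggested_lateral_shift(40.0, 15.0))    # ~10.7 mm shift toward the sensor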
An indicator 64 for example provides a measure of the resonant signal strength so that the user can be directed to move the transducer array to obtain the best location signal.
The sensor may not have its membrane parallel to the transducer array plane.
Figure 4 shows a transmit beam directed to a sensor 52 which is centered on the ultrasound transducer array 6 but whose membrane is not parallel to the array plane. Figure 4A shows that the reflected echo signal 62 is directed laterally, so the excitation of the resonance is not optimized.
The known directional angle of the received beam is used to derive an improved position of the transducer array. The user may then be instructed to move the ultrasound transducer array to the better position shown in Figure 4B. The transmit beam is then directed perpendicular to the membrane to provide resonant excitation, resulting in maximum intensity of the reflected echo 62 in the direction of the array 6.
An indicator 64 again for example provides a measure of the resonant signal strength so that the user can be directed to move the transducer array to obtain the best location signal.
By receiving echo signals normal to the sensor membrane, the signal-to-noise ratio for the analysis is improved. By providing the transmit beam normal to the membrane surface, the resonance excitation is improved.
There may be a loss of signal strength when the relative position of the transducer and the sensor varies, for example if the transducer position is fixed over a relatively long period. The signal processing may compare signal strengths derived from several adjacent transmit beams and select the beam that results in the highest signal strength from the sensor. This comparison can be performed intermittently throughout the monitoring period, and the selected beam can be updated and used as the reference for position and signal strength.
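A simple sketch of this beam re-selection step is given below, assuming a hypothetical measure_sensor_strength helper that returns the resonant signal strength obtained for a given transmit beam.

import numpy as np

def select_reference_beam(beam_indices, measure_sensor_strength):
    # Compare several adjacent transmit beams and keep the one that yields the
    # strongest signal from the sensor as the new reference beam.
    strengths = np.array([measure_sensor_strength(i) for i in beam_indices])
    best = int(np.argmax(strengths))
    return beam_indices[best], float(strengths[best])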
The examples above are based on a single external transducer array. However, multiple transducer arrays may be used, each behaving in a similar way, in order to cover a larger area.
The imaging capability of an ultrasound system is described above. However, for the purposes of location, there is no need to provide an ultrasound image to the end user. Thus, the invention in its most basic form may simply provide location information, for example for the purposes of then performing a sensor measurement, and there may be no need for an actual ultrasound image to be produced. Of course, in many cases, the location function goes hand in hand with an imaging function, so that the identified location is presented in combination with an image of the surrounding area.
Frequency analysis is performed on the received signals. This may for example be performed by the processor 18. The frequency analysis serves, for example, to identify the Doppler shift of the received signals relative to the transmitted beams. This general method of identifying a resonant frequency of a resonator, such as an ultrasonic sensor, is for example described in US 2004/0211260.
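Purely as an illustration of such a frequency analysis (and not a reproduction of the method of US 2004/0211260), a received line could be tested for a resonant component by comparing the spectral energy in a narrow band around the sensor's known resonant frequency with the background level; the sampling rate, band width and threshold below are assumed values.

import numpy as np

def contains_resonance(echo, fs_hz, f_res_hz, half_band_hz=50e3, threshold=5.0):
    # echo: 1-D received RF line; fs_hz: sampling rate; f_res_hz: expected resonance
    spectrum = np.abs(np.fft.rfft(echo))
    freqs = np.fft.rfftfreq(len(echo), d=1.0 / fs_hz)
    in_band = (freqs > f_res_hz - half_band_hz) & (freqs < f_res_hz + half_band_hz)
    band_level = spectrum[in_band].mean()        # energy near the resonance
    background = spectrum[~in_band].mean()       # energy elsewhere in the spectrum
    return band_level > threshold * background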
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.

CLAIMS:
1. A location device for determining the location of an acoustic sensor, comprising:
an ultrasound transducer array (6) arranged to transmit a plurality of ultrasound beams and receive corresponding reflected echo signals;
a controller arrangement (18), said arrangement comprising:
a transmit controller for controlling transmitted signals of the transducer array to provide a transmit beam comprising a transmission from each transducer of the array;
a receive controller for analyzing the received reflected signals, wherein the controller arrangement is adapted to implement a location process which comprises, for each of the plurality of transmitted ultrasound beams, performing a frequency analysis of the received reflected signals to identify if there is a signal reflected from the acoustic sensor and to identify a location area of said acoustic sensor, and to derive a progressively more accurate final location within said location area from the plurality of frequency analyses.
2. The device as claimed in claim 1, wherein the controller arrangement (18) is adapted to implement a location process which comprises:
providing a first, non-focused, transmit beam (50) and obtaining a first location area;
providing at least one further, focused, transmit beam (58) for a smaller region of interest within the first location area and obtaining at least one more accurate location of the acoustic sensor.
3. The device as claimed in claim 2, wherein the controller arrangement (18) is adapted to implement a location process which comprises iteratively providing multiple transmit beams for successively smaller regions of interest with successively smaller depths of field and obtaining successively more accurate location areas of the acoustic sensor.
4. The device as claimed in claim 1, wherein the controller arrangement (18) is adapted to implement a location process which comprises:
scanning a first plurality of focused transmit beams across a region of interest to provide focusing positions with a first spacing; and
scanning at least a second plurality of focused transmit beams across the region of interest to provide more closely spaced focusing positions.
5. The device as claimed in any preceding claim, wherein the controller arrangement (18) is further adapted to identify an orientation of the acoustic sensor.
6. The device as claimed in claim 5, wherein the controller arrangement (18) is adapted to identify the orientation of the acoustic sensor by determining a transmission angle of the transmitted beam and corresponding reflection angle for which the received signal reflected from the acoustic sensor is strongest.
7. The device as claimed in claim 5 or 6, comprising an output for directing a user to move the ultrasound transducer array to a location which gives the strongest signal reflected from the acoustic sensor.
8. The device as claimed in any preceding claim, wherein the transducer array (6) comprises a 2D array of transducer elements (8).
9. A location system comprising:
an acoustic sensor (52); and
a location device as claimed in any preceding claim.
10. The system as claimed in claim 9, wherein the acoustic sensor (52) comprises a membrane having a resonant frequency within a reception frequency range of the ultrasound transducer array, for generating an echo at the resonant frequency.
11. A method of locating an acoustic sensor, comprising:
controlling the transmission of ultrasound signals of a transducer array to provide a transmit beam comprising a transmission from each transducer of the array; and analyzing received reflected signals, wherein the method comprises, for each of a plurality of transmit beams, performing a frequency analysis of received reflected signals to identify if there is a signal reflected from the acoustic sensor and to determine a location area of said acoustic sensor, and to derive a progressively more accurate final location within the location area from the plurality of frequency analyses.
12. The method as claimed in claim 11, wherein the method comprises:
providing a first, unfocused, transmit beam and obtaining a first location area; providing at least one further, focused, transmit beam for a smaller region of interest within the first location area and obtaining at least one more accurate location.
13. The method as claimed in claim 12, wherein the method comprises:
scanning a first plurality of focused transmit beams across a region of interest to provide focusing positions with a first spacing; and
scanning at least a second plurality of focused transmit beams across the region of interest to provide more closely spaced focusing positions.
14. The method as claimed in claim 12 or 13, comprising identifying an orientation of the acoustic sensor by determining a transmission angle and corresponding reflection angle for which the received echo is strongest, and directing the user to move the ultrasound transducer array to a location which gives the strongest location signal.
15. A computer program comprising computer program code means which is adapted, when said program is run on a computer, to implement the method of any one of claims 11 to 14.
Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5997479A (en) 1998-05-28 1999-12-07 Hewlett-Packard Company Phased array acoustic systems with intra-group processors
US6013032A (en) 1998-03-13 2000-01-11 Hewlett-Packard Company Beamforming methods and apparatus for three-dimensional ultrasound imaging using two-dimensional transducer array
US6283919B1 (en) 1996-11-26 2001-09-04 Atl Ultrasound Ultrasonic diagnostic imaging with blended tissue harmonic signals
US6443896B1 (en) 2000-08-17 2002-09-03 Koninklijke Philips Electronics N.V. Method for creating multiplanar ultrasonic images of a three dimensional object
US6458083B1 (en) 1996-11-26 2002-10-01 Koninklijke Philips Electronics N.V. Ultrasonic harmonic imaging with adaptive image formation
US6530885B1 (en) 2000-03-17 2003-03-11 Atl Ultrasound, Inc. Spatially compounded three dimensional ultrasonic images
US6623432B2 (en) 2000-08-24 2003-09-23 Koninklijke Philips Electronics N.V. Ultrasonic diagnostic imaging transducer with hexagonal patches
US6685645B1 (en) * 2001-10-20 2004-02-03 Zonare Medical Systems, Inc. Broad-beam imaging
US20040211260A1 (en) 2003-04-28 2004-10-28 Doron Girmonsky Methods and devices for determining the resonance frequency of passive mechanical resonators
WO2004107963A2 (en) * 2003-06-03 2004-12-16 Allez Physionix Limited Non-invasive determination of intracranial pressure via acoustic transducers
US20060085049A1 (en) * 2004-10-20 2006-04-20 Nervonix, Inc. Active electrode, bio-impedance based, tissue discrimination system and methods of use
US20080255452A1 (en) * 2004-09-29 2008-10-16 Koninklijke Philips Electronics, N.V. Methods and Apparatus For Performing Enhanced Ultrasound Diagnostic Breast Imaging
WO2010033875A1 (en) * 2008-09-19 2010-03-25 Physiosonics, Inc. Acoustic palpation using non-invasive ultrasound techniques to identify and localize tissue eliciting biological responses
US20120179046A1 (en) * 2011-01-07 2012-07-12 General Electric Company Abdominal Sonar System and Apparatus
US20130060139A1 (en) 2011-09-01 2013-03-07 Microtech Medical Technologies Ltd. Method of detecting portal and/or hepatic pressure and a portal hypertension monitoring system
US20130296701A1 (en) * 2011-11-02 2013-11-07 Seno Medical Instruments, Inc. Playback mode in an optoacoustic imaging system
