GB2557913A - Ultrasonic imaging device - Google Patents

Ultrasonic imaging device

Info

Publication number
GB2557913A
GB2557913A GB1621423.1A GB201621423A GB2557913A GB 2557913 A GB2557913 A GB 2557913A GB 201621423 A GB201621423 A GB 201621423A GB 2557913 A GB2557913 A GB 2557913A
Authority
GB
United Kingdom
Prior art keywords
phase
imaging device
ultrasonic imaging
analogue
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1621423.1A
Other versions
GB201621423D0 (en)
Inventor
Peyton Graham
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ip2ipo Innovations Ltd
Original Assignee
Imperial Innovations Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Imperial Innovations Ltd filed Critical Imperial Innovations Ltd
Priority to GB1621423.1A priority Critical patent/GB2557913A/en
Publication of GB201621423D0 publication Critical patent/GB201621423D0/en
Priority to PCT/GB2017/053761 priority patent/WO2018109490A1/en
Publication of GB2557913A publication Critical patent/GB2557913A/en
Withdrawn legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/07 Endoradiosondes
    • A61B 5/073 Intestinal transmitters
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/12 Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • A61B 8/42 Details of probe positioning or probe attachment to the patient
    • A61B 8/4209 Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames
    • A61B 8/4227 Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames, characterised by straps, belts, cuffs or braces
    • A61B 8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4444 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B 8/4472 Wireless probes
    • A61B 8/4483 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer
    • A61B 8/4488 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer, the transducer being a phased array
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5207 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 15/89 Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S 15/8906 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S 15/8909 Short-range imaging systems using pulse-echo techniques using a static transducer configuration
    • G01S 15/8915 Short-range imaging systems using pulse-echo techniques using a static transducer configuration using a transducer array
    • G01S 15/8927 Short-range imaging systems using pulse-echo techniques using a transducer array using simultaneously or sequentially two or more subarrays or subapertures
    • G01S 15/8995 Combining images from different aspect angles, e.g. spatial compounding
    • G01S 15/8997 Short-range imaging systems using pulse-echo techniques using synthetic aperture techniques
    • G01S 7/00 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/003 Transmission of data between radar, sonar or lidar systems and remote stations
    • G01S 7/52 Details of systems according to group G01S 15/00
    • G01S 7/52017 Details of systems according to group G01S 15/00 particularly adapted to short-range imaging
    • G01S 7/52023 Details of receivers
    • G01S 7/52025 Details of receivers for pulse systems
    • G01S 7/52026 Extracting wanted echo signals
    • G01S 7/52028 Extracting wanted echo signals using digital techniques
    • G01S 7/52079 Constructional features
    • G01S 7/5208 Constructional features with integration of processing functions inside probe or scanhead

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Acoustics & Sound (AREA)
  • General Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Gynecology & Obstetrics (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

An ultrasonic imaging device is disclosed comprising an ultrasonic transducer array 11, a multiplexer 15 comprising a plurality of inputs and at least one output, each input coupled to a respective transducer element of the array, an analogue front-end 23 comprising at least one channel, each channel coupled to a respective output of the multiplexer and each channel including a respective analogue demodulator 44, 45 arranged to extract in-phase and quadrature components of a signal from the transducer and to provide demodulated in-phase and quadrature signal components 50, 51, at least two analogue-to-digital converters 60, 61 configured to receive signals comprising or obtained from the in-phase and quadrature signal components and to provide digitised demodulated in-phase and quadrature signal components, and a wireless interface 32 configured to transmit signals comprising or obtained from the digitised demodulated in-phase and quadrature signal components. The device may include first and second low-pass filters for the respective in-phase and quadrature signals. The analogue-to-digital converters may sample at a non-uniform rate. An independent claim is also included for a signal processor device that receives parametric or filtered quasi in-phase and quadrature signals and reconstructs them to perform synthetic aperture beamforming.

Description

(54) Title of the Invention: Ultrasonic imaging device
Abstract Title: Ultrasonic imaging device comprising analogue signal processing, digital conversion and wireless transmission.
(57) An ultrasonic imaging device is disclosed comprising an ultrasonic transducer array 11, a multiplexer 15 comprising a plurality of inputs and at least one output, each input coupled to a respective transducer element of the array, an analogue front-end 23 comprising at least one channel, each channel coupled to a respective output of the multiplexer and each channel including a respective analogue demodulator 44, 45 arranged to extract in-phase and quadrature components of a signal from the transducer and to provide demodulated in-phase and quadrature signal components 50, 51, at least two analogue-to-digital converters 60/61 configured to receive signals comprising or obtained from the in-phase and quadrature signal components and to provide digitised demodulated in-phase and quadrature signal components and a wireless interface 32 configured to transmit signals comprising or obtained from the digitised demodulated in-phase and quadrature signal components. The device may include first and second low pass filters for the respective in-phase and quadrature signals. The analogue-to-digital converters may sample at a non-uniform rate.
An independent claim is also included for a signal processor device that receives parametric or filtered quasi in-phase and quadrature signals and reconstructs them to perform synthetic aperture beamforming.
[Drawings: Figures 1 to 10, published as images GB2557913A_D0001 to GB2557913A_D0022. Recoverable captions and labels: receive synthetic aperture processing (where M = 1) producing a higher-resolution image; first 2D-mode image obtained using quadrature beamforming carried out with 8 transmit elements (F# = 2.5); second 2D-mode image obtained using quadrature beamforming carried out with 48 transmit elements (F# = 2.5); third 2D-mode image obtained using beamforming carried out in the RF domain with 48 transmit elements (F# = 2.5); compressive SAB beamforming carried out with 48 transmit elements for K = 10, K = 20 and K = 40; lateral axis [mm].]
Ultrasonic imaging device
Field of the Invention
The present invention relates to an ultrasonic imaging device and to an ultrasonic imaging system which includes an ultrasonic imaging device and a processing system.
Background
Point of care ultrasonography has been the focus of extensive research over the past few decades and various single-chip, miniaturized and wearable ultrasonic imaging systems have been proposed.
For example, US 2014/288428 A1 describes single-chip ultrasonic imaging in which on-chip signal processing is used to reduce data bandwidth. US 2012/101386 A1 describes an array of acoustic transducers wrapped around the circumference of a capsule for endoscopy, sending generated echo image signals to receiver devices attached to or worn on the body.
WO 2015/138643 A1 discloses a wearable ultrasound system comprising an ultrasound probe, a proximal wearable component electrically interconnected with the ultrasound probe, adapted to be wearable on the hand, wrist, or arm of a user, and including at least one user interface mechanism. US 2012/0065479 A1 describes a wearable patch which comprises an ultrasound sensor array.
As ultrasonic imaging systems become smaller and adapted for remote use, for instance in the form of a capsule to be swallowed or a patch to be worn by a patient, they will be faced with additional challenges such as limits on power consumption and transmission data rates.
Summary
According to a first aspect of the present invention there is provided an ultrasonic imaging device comprising an ultrasonic transducer array comprising a plurality of transducer elements, a multiplexer comprising a plurality of inputs and at least one output, each input coupled to a respective transducer element, an analogue front-end comprising at least one channel, each channel coupled to a respective output of the multiplexer and each channel including a respective analogue demodulator arranged to extract in-phase and quadrature components of a signal from the transducer and to provide demodulated in-phase and quadrature signal components, at least two analogue-to-digital converters configured to receive signals comprising or obtained from the in-phase and quadrature signal components and to provide digitised demodulated in-phase and quadrature signal components and a wireless interface configured to transmit signals comprising or obtained from the digitised demodulated in-phase and quadrature signal components.
This can be helpful particularly in applications employing frame rates less than 30 Hz, for example, in a range of 2 to 4 Hz, and in applications with severe power, size and data bandwidth constraints.
The analogue front-end may include the at least two analogue-to-digital converters.
Each channel of the analogue front-end may include first and second low-pass filters arranged to filter the in-phase and quadrature signal components respectively before the in-phase and quadrature signal components are digitised by the analogue-to-digital converters.
The analogue-to-digital converters may sample at a uniform rate. Thus, the low-pass filters may provide, to the wireless interface, filtered digitised demodulated in-phase and quadrature signal components (herein also referred to as "quasi digitised demodulated in-phase and quadrature signals") which are based on the digitised demodulated in-phase and quadrature signal components.
The analogue-to-digital converters may sample at a non-uniform rate. Thus, the analogue-to-digital converters may provide filtered digitised demodulated in-phase and quadrature signal components to the wireless interface without the need for low-pass filters.
The first and second low-pass filters may have respective bandwidths which are less than the Nyquist cut-off frequency. The analogue-to-digital converters may be configured to supply the digitised demodulated in-phase and quadrature signal components to the wireless interface, and wherein the wireless interface is configured to transmit the digitised demodulated in-phase and quadrature signal components.
The ultrasonic imaging device may further comprise a processor arranged, for each channel, to receive the digitised demodulated in-phase and quadrature signal components and to perform synthetic aperture beamforming based on the digitised demodulated in-phase and quadrature signal components.
The processor may be configured to apply a time delay to the digitised demodulated in-phase and quadrature signal components. The processor may be arranged to carry out interpolation and to provide interpolated in-phase and quadrature signal values. The time delay may be applied during interpolation, i.e. as part of interpolation. The time delay may be applied after interpolation. The processor may be arranged to carry out phase rotation on the interpolated in-phase and quadrature signal values to produce first and second RF signal values. The processor may be arranged to carry out summation of the first and second RF signals to a given memory location corresponding to a given pixel. The processor may be arranged to repeat interpolation, phase rotation and summation for a series of digitised demodulated in-phase and quadrature signal components.
The ultrasonic imaging device may further comprise memory (such as Flash memory or SRAM) for storing a two-dimensional image or a three-dimensional image comprising pixel values received from the processor.
The processor may be an ASIC, FPGA or other type of monolithic integrated circuit.
The monolithic integrated circuit may include the memory (i.e. be on-chip memory). The memory may comprise a separate integrated circuit (i.e. be off-chip memory).
The wireless interface may be configured to transmit a two-dimensional image or three dimensional image.
The ultrasonic imaging device may further comprise an excitation circuit comprising a pulser, a transmit beamformer and a processor configured to control the beamformer. The processor of the excitation circuit may be configured to perform synthetic transmit aperture beamforming.
The same processor may be used for synthetic transmit aperture beamforming and synthetic receive aperture beamforming.
The processor(s) may comprise an ASIC or an FPGA. A monolithic integrated circuit may provide the analogue front-end and the processor(s). In other words, the imaging device may be implemented in a single chip.
The analogue front-end may comprise one channel.
The ultrasonic imaging device may comprise a housing which contains the ultrasonic transducer array, the multiplexer, the analogue front-end, the at least two analogue-to-digital converters and the wireless interface, and wherein the housing has a volume less than 5 cm3 and preferably less than 2 cm3.
The ultrasonic imaging device may be adapted to be a capsule for swallowing by a human subject or a non-human animal subject or for passing through a passage of a non-animal subject.
The ultrasonic imaging device may be adapted to be a patch for applying to a surface of a subject.
The ultrasonic imaging device may be adapted to be a hand-held wand for scanning over a surface of a subject.
According to a second aspect of the present invention there is provided a device comprising a network interface and a processor coupled to the network interface (for example, a wired or wireless network interface), wherein the processor is configured to receive parametric or filtered quasi in-phase and quadrature signals, to reconstruct in-phase and quadrature signals and to perform synthetic aperture beamforming.
According to a third aspect of the present invention there is provided an ultrasonic imaging system comprising an ultrasonic imaging device according to the first aspect of the present invention and a processing device comprising a network interface (for example, a wired or wireless network interface), a processor, storage and a display, wherein the ultrasonic imaging device and the processing device are in wireless communication.
Brief Description of the Drawings
Certain embodiments of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 is a schematic block diagram of an ultrasonic imaging system;
Figure 2 schematically illustrates synthetic aperture beamforming;
Figure 3 is a schematic block diagram of an imaging device having a first receive signal processing arrangement employing synthetic aperture beamforming on demodulated I/Q signals;
Figure 4 is a finite state machine for a synthetic aperture beamforming process;
Figure 5 shows first, second and third two-dimensional mode ultrasound images;
Figure 6 is a schematic block diagram of an imaging device having a second receive signal processing arrangement employing compressive synthetic aperture beamforming on demodulated I/Q signals and a back-end processing device;
Figure 7 shows fourth, fifth and sixth two-dimensional mode ultrasound images;
Figure 8 is a schematic view of an ultrasonic imaging device in the form of a capsule; Figure 9 is a schematic view of an ultrasonic imaging device in the form of a wearable patch; and
Figure 10 is a schematic view of an ultrasonic imaging device in the form of a hand-held wand.
Detailed Description of Certain Embodiments
In the following description, like parts are denoted by like reference numerals.
Ultrasonic imaging system 1
Referring to Figure 1, a system 1 for ultrasonically imaging a sample 2 in real time is shown. The system 1 includes an ultrasonic imaging device 3 and a processing system 4. The imaging device 3 is capable of capturing ultrasound signals of the sample 2, processing the ultrasound signals in real-time, in particular demodulating RF signals in the analogue domain and optionally carrying out synthetic aperture beamforming to form two-dimensional images, and transmitting processed signals to the processing system 4. The processing system 4 may carry out further processing of the signals, such as signal reconstruction and baseband beamforming, and display images.
The ultrasonic imaging device 3 includes an ultrasonic transducer array 11 comprising an array of N transducer elements 12 for generating ultrasound waves 13 and detecting reflected ultrasound waves 14, where N is greater than 1 and may be, for example, equal to 32, 64 or 128. The imaging device 3 includes a multiplexer/demultiplexer 15 and transmit/receive switches 16 which couple the transducer array 11 to excitation and detection circuitry 17, 18.
The excitation circuitry 17 generates excitation pulses 19 for the transducer array 11 and includes a pulser 20 and a transmit beamformer 21. The detection circuitry 18 processes signals 22 received from the transducer array 11 and includes an analogue front-end 23 for demodulating the received signals 22 in the analogue domain into in-phase and quadrature signals (herein also referred to as "demodulated I/Q signals" or "I/Q signals") and generating digitised I/Q signals 29, 30. As will be explained in more detail hereinafter, demodulating the received signals 22 in the analogue domain can help to reduce digital processing overhead, which can lead to reduced power consumption and reduced bandwidth needed for transmission to the processing device 4.
The ultrasonic imaging device 3 includes a digital processor 28. The digital processor 28, among other things, controls transmission beamforming. The digital processor 28 can also perform other functions, such as compression (e.g. log compression).
Optionally, the digital processor 28 may carry out synthetic aperture beamforming of the digitised I/Q signals received from the analogue front-end 23 to produce two-dimensional images. However, as will also be explained hereinafter, the ultrasonic imaging device 3 need not carry out synthetic aperture beamforming. Instead, the ultrasonic imaging device 3 may simply demodulate the signals, employ a low-pass filter having a bandwidth which is reduced below the Nyquist cut-off frequency and transmit low-rate samples to the processing system 4, which carries out signal reconstruction and baseband beamforming.
The ultrasonic imaging device 3 may include memory 31, for example in the form of static random access memory (SRAM), for storing processed signals 29,30, for example in the form of 2D image pixel values, before transmission to the processing device 4. The ultrasonic imaging device 3 includes memory 31 when image processing is carried out in the ultrasonic imaging device 3. However, if image processing is carried out in the processing device 4, then the memory 31 may be omitted.
The ultrasonic imaging device 3 includes a wireless interface 32 for transmitting wireless signals to the processing system 4. The wireless interface 32 may take the form of a MICS-band transceiver having a suitably high bit rate (for example, greater than 800 kbps), an ISM-band transceiver, a WiFi transceiver, a Bluetooth (RTM) transceiver, a ZigBee (RTM) transceiver or another form of suitable wireless network transceiver.
The ultrasonic imaging device 3 includes a battery (not shown) and may include an energy-harvesting device (not shown). As will be explained in more detail later, the ultrasonic imaging device 3 is capable of operating remotely from the processing system 4.
Referring still to Figure 1, the processing system 4 includes a wireless interface 33, a digital processor 34, non-volatile memory or storage 35 (for example, Flash memory or a solid-state disk drive) and a display 36 for displaying two-dimensional images 39. The processing system 4 takes the form of a computer system having wireless connectivity and may be a portable or handheld computing device having wireless connectivity, such as a laptop computer, tablet computer or even a smartphone.
Transmission beamforming
Referring still to Figure 1, transmit beamforming employs synthetic aperture beamforming whereby an aperture is synthetically formed by multiplexing a group of transducer elements 12 over the transducer array 11. A single transducer element 12 can be excited at each step in the beamforming, although multiple transducer elements 12 may be excited (with or without aperture apodization) to improve the signal-to-noise ratio. A phase delay, τn, is applied to each pulse 19 exciting a respective transducer element 12 so as to form a parabolic defocusing lens. The delay, τn, for the nth transducer element 12 is set to (1/v)·(xn²/2zd), where xn is the distance of the nth element 12 from the sub-aperture centre, zd is the distance of a defocal point from the sub-aperture and v is the velocity of sound which, for body tissue, is 1,540 m/s.
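As an illustration of the delay calculation above, the following Python sketch computes the per-element delays of a parabolic defocusing lens. It is a minimal sketch only; the element pitch and defocal distance used in the example are assumed values and are not taken from this document.

import numpy as np

def defocus_delays(n_elements, pitch, z_d, c=1540.0):
    """Per-element transmit delays (seconds) forming a parabolic defocusing lens.

    tau_n = (1/c) * x_n**2 / (2 * z_d), where x_n is the distance of element n
    from the sub-aperture centre and z_d is the distance of the defocal point.
    """
    x = (np.arange(n_elements) - (n_elements - 1) / 2.0) * pitch
    return (x ** 2) / (2.0 * z_d * c)

# Example (assumed values): 8-element sub-aperture, 300 um pitch,
# defocal point 5 mm from the sub-aperture.
delays = defocus_delays(8, 300e-6, 5e-3)
print(delays * 1e9)  # delays in nanoseconds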
The digital processor 28 controls the digital beamformer 21 which produces delayed excitation pulses 37. These pulses are amplified by the pulser 20 to an appropriate voltage level, for example between 15 and 50 V, depending on the type of transducer. For example, piezoelectric micro-machined ultrasound transducers may be excited using CMOS-level voltages, i.e. 3.3 V to 5 V. The excitation signal is a unipolar pulse with a duration of half the carrier period. Other transducer types, such as bulk piezoceramic, capacitive micro-machined ultrasound transducers and the like, may require higher voltage excitation signals, i.e. > 5 V.
External high-voltage switches (not shown) in the transmit/receive switches 16 may be used to protect the analogue front-end 23 during transmission. These switches are controlled using the digital processor 28. During transmission, the switches are open, thereby isolating high-voltage pulses from the analogue front-end 23. During reception, the switches are closed, thereby allowing reflected signals through to the analogue front-end 23.
Synthetic aperture beamforming
As mentioned hereinbefore, the ultrasonic imaging device 3 can perform synthetic aperture beamforming on digitised demodulated I/Q signals. Thus, the ultrasonic imaging device 3 may employ both receive and transmit aperture beamforming with I/Q demodulation. This can be used to reduce hardware complexity and power consumption.
Details regarding synthetic receive aperture beamforming protocol can be found in K.
L. Gammelmark and J. A. Jensen: “Multielement synthetic transmit aperture imaging using temporal encoding”, IEEE Transactions on Medical Imaging, volume 22, pages 552-563 (2003) which is incorporated herein by reference. Details regarding synthetic transmit aperture beamforming can be found in H. Azhari, Basics of Biomedical Ultrasound for Engineers, 1st ed. Wiley-IEEE Press, 2010 which is incorporated herein by reference.
In synthetic receive aperture (SRA) beamforming, only one element or a group of elements 12 is used in receive, resulting in a small active receive aperture. Different elements 12 are multiplexed to synthesise a larger aperture. In synthetic transmit aperture (STA) imaging, only a single element 12 is used for transmission. This creates a cylindrical wavefront 13 which covers the whole region of interest. The echo 14 is received by all elements 12 and processing is done in parallel to form a low-resolution image. A second transmission yields a second image and so forth. After Nt transducer elements have transmitted, the low-resolution images are summed and a high-resolution image is created.
The ultrasonic imaging device 3 can combine synthetic receive aperture and synthetic transmit aperture by serialising formation of both transmit and receive apertures so as to maximise the signal-to-noise ratio and reduce hardware complexity.
Referring to Figure 2, an element 12 or a group of elements 12 forming a parabolic defocusing lens is excited to ensonify a region of interest (step SA). Transmission is carried out n = N/M times for all receive elements 12 (step SB) and M receive channels are required to process the reflection signal (where N > M and M ≥ 1). For example, Figure 2 shows the beamforming process for M = 1 receive channel, which is the simplest case requiring the least hardware complexity. The result is a series of n low-resolution images 37₁, 37₂, ..., 37n which are combined to form a higher-resolution image 38 (step SC). Spatial compounding is then used to increase the signal-to-noise ratio. This process is repeated for Nt different transmit positions, such that the final image is an average of the higher-resolution images. This approach can help to reduce hardware complexity and cost and means that only one channel is required, if necessary.
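Purely as an illustrative sketch of this serialised scheme, the nested loops below show how the n = N/M low-resolution images per transmit position are summed and how the Nt higher-resolution images are averaged (spatial compounding). The helpers acquire() and beamform_low_res() are hypothetical stand-ins for the hardware front-end and the per-group beamformer; they are not described in this document.

import numpy as np

def synthetic_aperture_frame(acquire, beamform_low_res, n_elements, m_channels,
                             n_transmit_positions, image_shape):
    """Combined synthetic receive/transmit aperture imaging loop (cf. Figure 2)."""
    n_groups = n_elements // m_channels                  # n = N / M receive groups
    frame = np.zeros(image_shape)
    for tx in range(n_transmit_positions):               # Nt transmit positions
        high_res = np.zeros(image_shape)
        for g in range(n_groups):                        # multiplexed receive groups
            echoes = acquire(tx, g)                      # fire, record M channels
            high_res += beamform_low_res(echoes, tx, g)  # sum low-resolution images
        frame += high_res
    return frame / n_transmit_positions                  # spatial compounding (average)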
In conventional systems, beamforming is carried out in the RF domain. By demodulating data first, however, the bandwidth and sampling rate may be decreased, leading to a substantial saving in power, as will now be described in more detail.
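As a rough illustration of that saving (not a calculation given in this document), the per-channel data rate before and after analogue quadrature demodulation can be compared using the 10 MHz RF sampling rate and 1.25 MHz I/Q bandwidth quoted later in this description; the 12-bit sample width is an assumption.

# Rough per-channel data-rate comparison; the 12-bit sample width is assumed.
bits_per_sample = 12
rf_rate = 10e6 * bits_per_sample         # RF-domain sampling at 10 MHz
iq_rate = 2 * 2.5e6 * bits_per_sample    # I and Q, each Nyquist-sampled at 2.5 MHz
print(rf_rate / 1e6, iq_rate / 1e6)      # 120.0 vs 60.0 Mbit/s before any beamforming
# On-device beamforming reduces the transmitted data further, since only the
# final pixel frames (e.g. 32 x 350 pixels at a few hertz) need to be sent.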
Overview
A transducer can be considered to produce a bandpass signal R(t) which may be expressed as:

R(t) = A(t)·cos(ωc·t + φ) (1)

where A(t) is the amplitude envelope, ωc the carrier frequency in radians per second, φ the phase and t is time. Expansion of R(t) yields:

R(t) = AI(t)·cos(ωc·t) + AQ(t)·sin(ωc·t) (2)

where AI(t) = A(t)·cos φ and AQ(t) = A(t)·sin φ are the in-phase and quadrature components respectively. These components may be obtained by mixing the signal with a reference signal in the analogue domain and filtering the result. Since AI(t) and AQ(t) are baseband signals, they may be sampled at a lower rate. This reduces the computational burden on the beamforming processor. After sampling, the next step is to appropriately phase-rotate the I/Q data for focusing.
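A behavioural Python sketch of this demodulation step is given below: the RF signal is mixed with orthogonal references at the carrier frequency, low-pass filtered and then sampled at a reduced rate. The digital Butterworth filter merely stands in for the analogue image-rejection filter, the test pulse is synthetic, and the numerical values are illustrative assumptions rather than device parameters.

import numpy as np
from scipy.signal import butter, filtfilt

def iq_demodulate(rf, fs, fc, bw, decimate=4):
    """Extract baseband I/Q components from a sampled RF A-line.

    rf : RF samples, fs : sampling rate, fc : carrier (transducer centre)
    frequency, bw : one-sided I/Q bandwidth of the low-pass (image-rejection)
    filter. Returns I and Q sampled at fs / decimate.
    """
    t = np.arange(len(rf)) / fs
    i_mixed = rf * np.cos(2 * np.pi * fc * t)   # in-phase mixer
    q_mixed = rf * np.sin(2 * np.pi * fc * t)   # quadrature mixer (90 degrees shifted)
    b, a = butter(6, bw / (fs / 2))             # sixth-order Butterworth low-pass
    i_bb = filtfilt(b, a, i_mixed)[::decimate]  # filter, then take low-rate samples
    q_bb = filtfilt(b, a, q_mixed)[::decimate]
    return i_bb, q_bb

# Example: a 2.5 MHz Gaussian pulse sampled at 10 MHz, demodulated to baseband.
fs, fc = 10e6, 2.5e6
t = np.arange(0, 40e-6, 1 / fs)
rf = np.exp(-((t - 20e-6) ** 2) / (2 * (1e-6) ** 2)) * np.cos(2 * np.pi * fc * t)
i_bb, q_bb = iq_demodulate(rf, fs, fc, 1.25e6)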
According to the synthetic aperture focusing method, for a given pixel location rp at depth index k, the required time delay instance tp(i,j) at which to take the signal value for summation is calculated by dividing the propagation distance by the speed of sound, c, in the medium, namely:

tp(i,j) = (|rp − ri| + |rp − rj|) / c (3)

where rp is the imaging point, ri is the location of the ith transmitting element and rj is the location of the jth receiving element. A corresponding discretised delay index Ip(i,j) may then be calculated. If this delay is applied directly to the I/Q data, then critical frequency-dependent phase errors distort the final image. Furthermore, uniform samples do not fall on the exact sampling points required for focusing at all pixels. Therefore, an interpolation factor of K is applied. In particular, if Ns sample points are obtained, then there are as many as K × Ns index locations between 1 and Ip(i,j)max. The index value is read from a lookup table that is calculated a priori, based on the locations of each pixel rp and transmitting element, i, or receiving element, j.
For each index location Ip(i,j), the I or Q data are then interpolated on-the-fly using linear, quadratic or another suitable form of interpolation.
The next step is to phase-rotate by re-modulating or upconverting the I/Q sample points back to RF by mixing the interpolated result with new discrete reference signals:

Iref[n] = cos[ωc·n] (4)

Qref[n] = sin[ωc·n] (5)

where ωc is the carrier frequency in rad/s and n is the discretised time index. Again, Iref[n] and Qref[n] are calculated a priori. The interpolated I and Q values are multiplied by the reference signals at n = Ip and then summed to yield the RF amplitude:

R[n] = AI[n]·cos[ωc·n] + AQ[n]·sin[ωc·n] (6)

= A[n]·cos φ·cos[ωc·n] + A[n]·sin φ·sin[ωc·n] (7)

= A[n]·cos[ωc·n + φ] (8)
This value is then added to the pixel location, and the process is repeated for all i, j and n values, resulting in a low-resolution image. These low-resolution images are summed or averaged to obtain a higher-resolution image, which may then be transmitted via a wireless transmission link to an external post-processor, i.e. processor 34 (Figure 1). The final focused signal y(rp) is:

y(rp) = Σi=1..Nt Σj=1..N a(Ip(i,j))·R(Ip(i,j)) (9)

where a(Ip(i,j)) is the apodization (weighting) function, R(Ip(i,j)) is the phase-rotated I/Q sum at Ip(i,j), N is the number of transducer elements and Nt is the number of transmissions.
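A minimal per-pixel sketch of this focusing procedure (delay look-up, linear interpolation of the I/Q samples, phase rotation with precomputed references, apodized summation) is given below. The arrays delay_lut, cos_ref, sin_ref and apodization are hypothetical stand-ins for the a-priori tables described in the text, and indices are assumed to stay in range.

import numpy as np

def beamform_pixel(i_data, q_data, delay_lut, cos_ref, sin_ref, apodization,
                   pixel, K=4):
    """Quadrature synthetic-aperture focusing for a single pixel.

    i_data[j], q_data[j]    : demodulated I/Q samples for receive position j
    delay_lut[pixel, i, j]  : a-priori delay indices Ip(i,j) on the K-times grid
    cos_ref, sin_ref        : precomputed carrier cosine/sine at each index
    apodization[pixel, i, j]: dynamic apodization weights a(Ip(i,j))
    """
    n_tx, n_rx = delay_lut.shape[1:]
    acc = 0.0
    for i in range(n_tx):
        for j in range(n_rx):
            idx = delay_lut[pixel, i, j]
            n0, frac = divmod(int(idx), K)                           # integer sample and fraction
            w = frac / K
            i_val = (1 - w) * i_data[j][n0] + w * i_data[j][n0 + 1]  # linear interpolation
            q_val = (1 - w) * q_data[j][n0] + w * q_data[j][n0 + 1]
            rf = i_val * cos_ref[idx] + q_val * sin_ref[idx]         # phase rotation to RF
            acc += apodization[pixel, i, j] * rf                     # apodized summation
    return acc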
The process lends itself to an iterative, pipelined approach that may easily be implemented in a hardware description language (HDL). Calculations for parallel groups of pixels may be pipelined during the reflection period, and the only memory required is for the image frame (which is updated dynamically), a single delay matrix, an array of sine/cosine values and dynamic apodization constants. The frame rate is a function of the number of transmissions, imax, the lateral pixel resolution/size of the receive aperture, jmax, the axial pixel resolution, kmax, the number of pixels calculated in parallel per pipeline interval, Np, and the system clock frequency, fclk:

FR = (Np·fclk) / (2·imax·jmax·kmax) (10)

A factor of two in the denominator is introduced to account for pipelining send and receive operations in hardware over two clock cycles.
The frame rate/image quality can be increased at the expense of frequency/area/power consumption. For instance, more transmit positions imax implies better spatial compounding and signal-to-noise ratio and, therefore, better image quality. Likewise, a larger kmax implies a better axial pixel resolution. However, increasing imax or kmax leads to a lower frame rate if the clock frequency fclk remains constant, because transmit/receive operations are multiplexed, for example, through one or a few channels.
Dynamic apodization is also applied to keep the F-number (f#) constant as a function of imaging depth. The F-number is defined as the ratio of the imaging depth, z, to the aperture size, a. The synthetic aperture is dynamically grown as a function of the imaging depth in order to keep f# constant. The number of lines, l, to consider in a window for focusing to a depth z is calculated using the following expression:

l(z) = zk / (f#·Δx) (11)

where zk is the distance between the aperture and the sample and Δx is the inter-element spacing. This equation is used to derive a set of a priori constants that are stored in memory to allow for real-time dynamic apodization.
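A small sketch of this dynamic apodization window follows; the F-number and the 300 µm element pitch used in the example are assumed values for illustration only.

import numpy as np

def aperture_lines(depth, f_number, pitch, n_elements):
    """Number of receive lines l(z) = z / (f# * dx) used when focusing to a depth,
    clipped to the physical aperture so the F-number stays constant with depth."""
    l = int(np.round(depth / (f_number * pitch)))
    return min(max(l, 1), n_elements)

# e.g. f# = 2.5 and 300 um pitch: the window grows from 13 lines at 10 mm depth
# to 40 lines at 30 mm depth.
for z in (10e-3, 20e-3, 30e-3):
    print(z, aperture_lines(z, 2.5, 300e-6, 48))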
Synthetic aperture beamforming arrangement
Referring to Figure 3, a first arrangement of imaging device 3 is shown. For clarity, the excitation circuitry 17 is not shown in Figure 3.
The receive circuitry 18 includes an analogue front-end 23 comprising M analogue front-end channels 23₁, ..., 23M and the digital processor 28 comprising M digital processing channels 28₁, ..., 28M, where N > M and M ≥ 1, and N is the number of transducer elements.
Analogue front-end
The, or each, analogue front-end channel 23i (where i = 1, ..., M and M ≥ 1) includes a preamplifier 41, a signal splitter 43, first and second active or passive mixers 44, 45, an oscillator (not shown) and a 90° phase shifter (not shown) which generate first and second carrier signals 48, 49 for the first and second mixers 44, 45 respectively, first and second programmable gain amplifiers 52, 53, first and second low-pass filters 56, 57, and first and second analogue-to-digital converters 60, 61. The carrier signals 48, 49 may take the form of square, sine or other shape of wave which are orthogonal (i.e. phase-shifted by 90°) with respect to each other.
The preamplifier 41 may take the form of a fully differential preamplifier which functions as a low-noise amplifier. The preamplifier 41 performs time-gain compensation to account for exponential tissue attenuation. The gain is increased exponentially over time by controlling the gain of a variable gain amplifier (VGA). The VGA sweeps the gain from, for example, 20 dB to 35 dB over the reflection period, thereby shifting the noise floor to an appropriate level.
After the signal 22 has been amplified, the amplified signal 42 is down-converted using the mixers 44, 45 into I and Q components 50, 51. The signals are processed along separate, matched channels. The signals 50, 51 are then amplified again using the programmable gain amplifiers 52, 53. The required gain may be selected by switching between resistor combinations (not shown) in the amplifier's feedback loop (not shown). Lastly, image rejection is carried out by means of the low-pass filters 56, 57, which may each take the form of a sixth-order Butterworth low-pass filter.
After analogue-to-digital conversion, the discretised I/Q signals 29, 30 are processed by the digital beamformer 28.
The analogue front-end 23 may operate with a transducer centre frequency of 2.5 MHz, an I/Q bandwidth of 1.25 MHz (Nyquist sampling frequency of 2.5 MHz), a gain of 46 ± 6 dB, an input-referred dynamic range at 1 kHz (THD < 1%) of 61 dB, an input-referred noise floor of 7.5 µV and a CMRR of 82 dB. These parameters can, however, be varied.
These parameters are variable depending on the resolution requirements of the application. For low-frequency imaging applications, such as the devices 81 and 91 shown in Figures 8 and 9, the centre frequency may be in the range of 2-20 MHz. The Nyquist sampling frequency depends on the bandwidth of the transducer and hence the I/Q signal bandwidth. The required receiver gain is a function of the power of the transmission pulse exciting the transducer, which affects the pressure of the ultrasound wave. The receiver gain is typically in the range of 10-100 dB. The noise floor and the total harmonic distortion of the receiver affect the dynamic range of the device. With time-gain control, the dynamic range may be, for example, between 50 and 100 dB.
Beam-forming digital processor
Conventional synthetic aperture algorithms tend to be computationally intensive and typically require a large memory capacity. Thus, a serialised or pipelined approach is used herein to process data dynamically, in real-time.
The digital processor 28 carries out beamforming including dynamically focusing and apodizing data in order to form a 2D image. The digital processor 28 may take the form of an application-specific integrated circuit (ASIC) or field-programmable gate array (FPGA).
Referring still to Figure 3, the, or each, digital processing channel 28i (where i = 1, ..., M and M ≥ 1) includes an interpolation module 62, a phase rotation module 63 and a summing module 64. The interpolation module 62 includes, or has access to, memory 65, for example in the form of read-only memory, which holds a look-up table 66.
The modules 62, 63, 64 may be implemented in hardware.
Referring also to Figure 4, after initialisation, a first transmission is carried out and the I/Q signals 29, 30 are sampled and read into memory 31. At the end of a reflection period, calculations begin and then continue during a following reflection period. Calculations on parallel groups of pixels are pipelined over multiple clock cycles. The number of parallel operations that may be carried out depends on the logic capacity of the device and this, in turn, determines the maximum frame rate and image size. Delay values, tp(i,j), are read from the look-up table 66 stored in read-only memory 65 and used in the interpolation process hereinbefore described. Dynamic apodization is also applied by first reading values of l(z) calculated a priori and stored in read-only memory 65. These values are used to dynamically apodize the receive aperture in real time. Finally, after the pixel value is calculated, it is added to a global 2D image array stored in random access memory 31. At the end of one iterative cycle, a high-resolution frame 39 is formed and the process enters a write state in which the image is transmitted to a back-end processor (not shown), for example, for log compression.
If the beamformer is implemented in an ASIC, it can be synthesized in Cadence (RTM) using, for example, an AMS 0.18 µm CMOS process. A clock frequency of 25 MHz can be used, with Nt = 8 (i.e. eight transducers) and a pixel resolution of 32 x 350. For these parameters, the estimated maximum frame rate is 4 Hz, the estimated beamformer size is 1.55 mm x 1.55 mm and the estimated power consumption is 15 mW. These parameters and characteristics can, however, be varied.
Simulation results
Raw ultrasound data are captured using a Verasonics (RTM) Vantage 256 system and a Philips (RTM) P4-1 phased array (central frequency at 2.9 MHz) with 96 active elements. The imaged medium takes the form of a phantom containing 8 × 3 cross-sectional wires.
The synthetic aperture beamforming process is simulated in MATLAB (RTM) by mixing the RF signals, which are sampled at 10 MHz, with 2.5 MHz sine and cosine references and filtering the result with a 4th-order Butterworth low-pass filter having a cut-off frequency of 1.25 MHz. The digital beamforming process hereinbefore described is implemented in Verilog (RTM) and simulated in an HDL simulation environment in the form of ModelSim (RTM).
Referring to Figure 5, first, second and third two-dimensional (2D)-mode (or "B-mode") ultrasound images are shown. The first 2D-mode image is obtained using quadrature beamforming carried out with 8 transmit elements. The second 2D-mode image is obtained using quadrature beamforming carried out with 48 transmit elements. The third 2D-mode image is a comparative example which is obtained using beamforming carried out in the RF domain.
The signal-to-noise ratio of the third 2D-mode ultrasound image (i.e. the image obtained using beamforming carried out in the RF domain) is practically identical to that of the quadrature image, for the same number of transmissions and F-number. Decreasing the number of transmissions (i.e. decreasing Nt) leads to a reduction in the signal-to-noise ratio due to larger side lobes (not shown) and increased speckle noise. Furthermore, the signal-to-noise ratio and the lateral/axial resolution decrease as a function of depth. Nevertheless, satisfactory images can be obtained using only 8 transmit elements.
Compressive synthetic aperture imaging within an FRI framework
Overview
A sampling paradigm for certain classes of parametric signals can be used. Parametric signals with K parameters may be sampled and reconstructed using only 2K samples. These signals have a finite rate of innovation (FRI). A sampling scheme can be applied to periodic and finite streams of FRI signals, such as Dirac impulses, non-uniform splines and piecewise polynomials. An appropriate sampling kernel, such as a sinc, a Gaussian, a sum of sincs, etc., can be applied to extract a set of Fourier coefficients which are then used to obtain an annihilating filter. The locations and amplitudes of the pulses are finally determined. A brief review can be found in M. Vetterli, P. Marziliano, and T. Blu: "Sampling signals with finite rate of innovation," IEEE Transactions on Signal Processing, volume 50, pages 1417-1428 (2002).
Consider an FRI signal x(t), for example an ultrasound A-mode signal, comprising a finite stream of pulses with pulse shape p(t), amplitudes {ck} and time locations {tk}, k = 0, ..., K−1:

x(t) = Σk=0..K−1 ck·p(t − tk) (12)
The sample values are obtained by filtering the signal with a sampling kernel. A sinc kernel is defined as hB(t) = B·sinc(B·t), with bandwidth B = 1/T. The convolution product is:

yn = ⟨hB(t − nT), x(t)⟩, n = 0, ..., N−1 (13)
This is equivalent to expanding each sample yn, with B = 1/T, as a sum of shifted sinc terms in the unknown amplitudes ck and locations tk (equations (14) to (16), reproduced as images in the original document).

Since the signal has K degrees of freedom, N > 2K samples are required to sufficiently recover the signal. The reconstruction method requires two systems of linear equations, namely one for the locations of the Gaussian pulses involving a matrix V and one for the weights of the pulses involving a matrix A. Define a Lagrange polynomial Lk(u) = P(u)/(u − tk/T) of degree K−1, where P(u) = Πk=0..K−1(u − tk/T). Multiplying both sides of equation 16 by P(n) yields an expression in terms of the interpolating polynomials:

(−1)^(n+1)·P(n)·yn = Σk=0..K−1 ck·B·(sin(π·tk/T)/π)·Lk(n) (17)

Y = A·c (18)
To find the K locations tk, i.e. the time delays of the pulses, an annihilating equation is derived to find the roots of P(u). Since the right-hand side of equation 17 is a polynomial of degree K−1 in the variable n, if K finite differences are applied then the left-hand side becomes zero, i.e. Δ^K((−1)^n·P(n)·yn) = 0, n = K, ..., N−1. Letting P(u) = Σk pk·u^k leads to an annihilating filter equation equal to:

Σk=0..K pk·Δ^K((−1)^n·n^k·yn) = 0 (19)

V·p = 0 (20)

where V is an (N−K) × (K+1) matrix. The system admits a solution when Rank(V) ≤ K and N > 2K. Thus, equation 19 may be used to find the K+1 unknowns pk, which leads to the K locations tk, as these are the roots of P(u). Once the locations have been determined, the weights of the Gaussian pulses ck may be found by solving the system in equation 18 for n = 0, ..., K−1. The system has a unique solution if Rank(A) = K, where A ∈ R^(K×K) is defined by equation 17. A more detailed discussion of the annihilating filter method is provided in M. Vetterli et al., ibid. Theoretically, the result does not depend on the sampling period T. However, V may be poorly conditioned if T is not chosen appropriately. As the simulation results show hereinafter, oversampling yields an increase in the signal-to-noise ratio of the reconstructed result.
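For illustration, a minimal numerical sketch of annihilating-filter recovery is given below. It follows the classical formulation (annihilating a sequence of Fourier coefficients of a Dirac stream, as in Vetterli et al.) rather than the finite-difference variant of equations (17) to (20), so it is an assumption-laden sketch of the underlying idea, not an implementation of the method described here; the test values are arbitrary.

import numpy as np

def fri_recover(x_hat, K, tau):
    """Recover K pulse locations/amplitudes from >= 2K Fourier coefficients
    x_hat[m] = sum_k c_k * exp(-2j*pi*m*t_k/tau) using an annihilating filter."""
    M = len(x_hat)
    # Convolution (Toeplitz) system A @ h = 0 for the length-(K+1) annihilating filter h.
    A = np.array([[x_hat[m - l] for l in range(K + 1)] for m in range(K, M)])
    _, _, vh = np.linalg.svd(A)
    h = vh[-1].conj()                       # null-space vector = annihilating filter
    u = np.roots(h)                         # roots encode the pulse locations
    t = np.mod(-np.angle(u) * tau / (2 * np.pi), tau)
    # Amplitudes from a Vandermonde least-squares fit.
    V = np.exp(-2j * np.pi * np.outer(np.arange(M), t) / tau)
    c = np.linalg.lstsq(V, x_hat, rcond=None)[0]
    order = np.argsort(t)
    return t[order], c[order]

# Two pulses at 3 us and 9 us in a 20 us window, amplitudes 1.0 and 0.5.
tau = 20e-6
t_true, c_true = np.array([3e-6, 9e-6]), np.array([1.0, 0.5])
m = np.arange(8)                            # 8 >= 2K coefficients
x_hat = np.exp(-2j * np.pi * np.outer(m, t_true) / tau) @ c_true
print(fri_recover(x_hat, K=2, tau=tau))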
The sinc kernel hereinbefore described has infinite time support and is non-causal. In the frequency domain, it is represented by an ideal low-pass filter with an infinite roll-off. Practically, the sinc kernel may be approximated in hardware by means of a high-order analogue low-pass filter. Simulations demonstrate the performance of multiple filter types and orders.
Compressive synthetic aperture beamforming arrangement
Referring to Figure 6, a second arrangement of the imaging device 3 is shown. For clarity, the excitation circuitry 17 is not shown in Figure 6.
The receive circuitry 18 includes an analogue front-end 23 which includes a preamplifier 41, a signal splitter 43, first and second passive mixers 44, 45, an oscillator (not shown) and a 90° phase shifter (not shown) which generate first and second carrier signals 48, 49 for the first and second mixers 44, 45 respectively, first and second programmable gain amplifiers 52, 53, first and second low-pass filters 56, 57 and first and second analogue-to-digital converters 60, 61. The carrier signals 48, 49 may take the form of square, sine or other shape of wave which are orthogonal (i.e. phase-shifted by 90°) with respect to each other.
The analogue front-end 23 amplifies and demodulates the RF signal 22 into I and Q components. This is achieved by mixing the RF waveform 22 with reference signals 48, 49 centred at the carrier frequency. The assumption is made that both the I and Q signals satisfy the FRI criterion, namely that they both have finite rates of innovation. The signals are then filtered and band-limited below the original I/Q bandwidth. This is carried out in the analogue domain in order to reduce the sampling frequency and thus the data bandwidth. This leads to a significant power saving, as the power budget is dominated by the power consumption of the ADC and the wireless transceiver. By compressing the signal in the analogue domain, the computational burden is shifted to the digital back end, which carries out reconstruction of the I/Q signals and finally baseband beamforming.
Simulation results
Raw ultrasound data are captured using a Verasonics (RTM) Vantage 256 system and a Philips (RTM) P4-1 phased array (central frequency at 2.9 MHz) with 96 active elements. The imaged medium takes the form of a phantom containing 8 × 3 cross-sectional wires.
The compressive synthetic aperture beamforming process is used to produce a full two-dimensional mode image from the RF dataset hereinbefore described. First, the RF dataset was filtered using a fourth-order low-pass filter kernel with K values of 10, 20 and 40. Given F = 4 and τ = 140.8 µs, the corresponding low-rate sampling frequencies are 575 kHz, 1.14 MHz and 2.28 MHz respectively. The I and Q components are reconstructed using the process hereinbefore described. Finally, synthetic aperture beamforming is carried out using the quadrature method hereinbefore described.
Referring to Figure 7, fourth, fifth and sixth two-dimensional (2D)-mode (or "B-mode") ultrasound images are shown.
Increasing K improves the reconstruction accuracy and, thus, the lateral resolution and image quality. As hereinbefore described, increasing K eventually pushes the low-rate sampling frequency above the Nyquist quadrature sampling frequency. Ideally, K should be as small as possible to minimise the sampling rate, and thus the power consumption of the transmission link.
Devices
Referring to Figure 8, an endoscopy capsule 81 is shown.
The capsule 81 comprises a generally cylindrically-shaped sealed case 82 having rounded ends and having dimensions and a shape which allow the capsule to be swallowed by a subject, such as a patient. The ultrasonic imaging device 3 is housed within the case 82 such that the transducer array 11 (Figure 1) is suitably positioned for imaging surrounding tissue, for example in an annular section (not shown) around the perimeter of the case. The case 82 has an outer diameter, d, and a length, l. The case 82 may have dimensions which allow the capsule to be swallowed. For example, d may be 1 mm and l may be about 2 cm. The capsule 81 may be swallowed by a patient and, while it passes through the gastrointestinal tract (not shown), it can perform ultrasound imaging and transmit signals to the processing device 4 (Figure 1). The capsule 81 may be used in veterinary applications, in other words, the subject may be a non-human animal. A capsule-like device may be used in non-medical or non-veterinary applications, such as inspection of pipes.
Referring to Figure 9, a wearable patch 91 is shown.
The patch 91 may comprise a flexible substrate 92, and the ultrasonic imaging device 3 is housed within a pocket (not shown) formed in part by the flexible substrate 92 or is attached to the flexible substrate 92. A face (not shown) of the substrate 92 may be coated with an adhesive (not shown) which allows the patch 91 to be attached to a surface 93 of a subject, for example, the skin of a patient. The transducer array 11 (Figure 1) is suitably positioned for imaging through the surface 93. The ultrasonic imaging device 3 has lateral dimensions, w, which allow the patch to be worn. For example, w may be between 1 and 4 cm.
Referring to Figure 10, a wand 101 is shown.
The wand 101 comprises a cylindrical or bar-like case 102. The ultrasonic imaging device 3 is housed within the case 102, for example, occupying the length 103 of the cylindrical case 102. The case 102 has a longitudinal axis 104. The transducer array 11 (Figure 1) is suitably positioned, for example at a distal end (or "tip") of the case 102, emitting ultrasound waves axially (i.e. along the axis 104) from the tip towards a surface 105. The transducer array 11 may be either a 1D or 2D array providing 2D or 3D imaging respectively. The wand 101 may be scanned over the surface 105 of a subject, such as a patient. Thus, the wand 101 can be used by medical workers (such as doctors, nurses or paramedics) as a convenient point-of-care diagnostic device. The wand 101 may have dimensions similar to those of a pen, for example, a diameter, 2r, of about 1 cm.
Modifications
It will be appreciated that various modifications may be made to the embodiments hereinbefore described. Such modifications may involve equivalent and other features which are already known in the design, manufacture and use of ultrasonic imaging devices and component parts thereof and which may be used instead of, or in addition to, features already described herein. Features of one embodiment may be replaced or supplemented by features of another embodiment.
The ultrasonic imaging device 3 may transmit the signals directly to the processing device 4. However, the ultrasonic imaging device 3 may instead transmit the signals via, for example, an antenna (not shown) which is located remotely from the processing device 4, via a wireless repeater (not shown), via a wired link (not shown) and/or via a wireless link (not shown) such as a wireless local area network.
Although claims have been formulated in this application to particular combinations of features, it should be understood that the scope of the disclosure of the present invention also includes any novel features or any novel combination of features disclosed herein either explicitly or implicitly or any generalization thereof, whether or not it relates to the same invention as presently claimed in any claim and whether or not it mitigates any or all of the same technical problems as does the present invention. The applicants hereby give notice that new claims may be formulated to such features and/or combinations of such features during the prosecution of the present application or of any further application derived therefrom.

Claims (22)

  1. An ultrasonic imaging device comprising:
    an ultrasonic transducer array (11) comprising a plurality of transducer elements (12);
    a multiplexer (15) comprising a plurality of inputs and at least one output, each input coupled to a respective transducer element;
    an analogue front-end (23) comprising at least one channel, each channel coupled to a respective output of the multiplexer and each channel including a respective analogue demodulator (44, 45) arranged to extract in-phase and quadrature components of a signal from the transducer and to provide demodulated in-phase and quadrature signal components (50, 51);
    at least two analogue-to-digital converters (60, 61) configured to receive signals comprising or obtained from the in-phase and quadrature signal components and to provide digitised demodulated in-phase and quadrature signal components; and
    a wireless interface (32) configured to transmit signals comprising or obtained from the digitised demodulated in-phase and quadrature signal components.
  2. An ultrasonic imaging device according to claim 1, wherein each channel of the analogue front-end (23) includes first and second low-pass filters (56, 57) arranged to filter the in-phase and quadrature signal components respectively before the in-phase and quadrature signal components are digitised by the analogue-to-digital converters (60, 61).
  3. An ultrasonic imaging device according to claim 2, wherein the first and second low-pass filters (56, 57) have respective bandwidths which are less than the Nyquist cut-off frequency of a transducer element (12).
  4. An ultrasonic imaging device according to claim 1, wherein the at least two analogue-to-digital converters (60, 61) are configured to sample at a non-uniform rate.
  5. An ultrasonic imaging device according to any one of claims 1 to 4, wherein the analogue-to-digital converters are configured to supply digitised demodulated in-phase and quadrature signal components and wherein the wireless interface (32) is configured to transmit the digitised demodulated in-phase and quadrature signal components.
  6. An ultrasonic imaging device according to any one of claims 1 to 4, further comprising:
    a processor (28) arranged, for each channel, to receive the digitised demodulated in-phase and quadrature signal components and to perform synthetic aperture beamforming based on the digitised demodulated in-phase and quadrature signal components.
  7. An ultrasonic imaging device according to claim 6, wherein the processor (28) is arranged to carry out interpolation and to provide interpolated in-phase and quadrature signal values.
  8. An ultrasonic imaging device according to claim 6 or 7, wherein the processor (28) is configured to apply a time delay to the interpolated in-phase and quadrature signal values.
  9. An ultrasonic imaging device according to claim 7 or 8, wherein the processor (28) is arranged to carry out phase rotation of the interpolated in-phase and quadrature signal values to produce first and second RF signal values.
  10. An ultrasonic imaging device according to claim 9, wherein the processor (28) is arranged to carry out summation of the first and second RF signal values to a given memory location corresponding to a given pixel.
  11. An ultrasonic imaging device according to claim 10, wherein the processor (28) is arranged to repeat interpolation, phase rotation and summation for a series of digitised demodulated in-phase and quadrature signal components.
  12. An ultrasonic imaging device according to any one of claims 6 to 11, further comprising:
    memory (28) for storing a two-dimensional image or three-dimensional image comprising pixel values received from the processor.
  13. An ultrasonic imaging device according to any one of claims 6 to 12, wherein the wireless interface (32) is configured to transmit the two- or three-dimensional image.
  14. An ultrasonic imaging device according to any one of claims 1 to 13, further comprising:
    an excitation circuit (17) comprising a pulser (20), a transmit beamformer (21) and a processor configured to control the beamformer.
  15. An ultrasonic imaging device according to claim 14, wherein the processor of the excitation circuit is configured to perform synthetic aperture beamforming.
  16. An ultrasonic imaging device according to any one of claims 1 to 15, wherein the analogue front-end (23) comprises one channel.
  17. An ultrasonic imaging device according to any one of claims 1 to 16, comprising:
    a housing which contains the ultrasonic transducer array (11), the multiplexer (15), the analogue front-end (23), the at least two analogue-to-digital converters (60, 61) and the wireless interface (32), and wherein the housing has a volume less than 2 cm³.
  18. An ultrasonic imaging device according to any one of claims 1 to 17, which is adapted to be a capsule (81) for swallowing by a human subject or a non-human animal subject or for passing through a passage of a non-animal subject.
  19. An ultrasonic imaging device according to any one of claims 1 to 17, which is adapted to be a patch (91) for applying to a surface (93) of a subject.
  20. An ultrasonic imaging device according to any one of claims 1 to 17, which is adapted to be a hand-held wand for scanning over a surface (93) of a subject.
  21. A processing device comprising:
    a network interface (33); and
    a processor (34) coupled to the network interface;
    wherein the processor is configured to receive parametric or filtered quasi in-phase and quadrature signals, to reconstruct in-phase and quadrature signals and to perform synthetic aperture beamforming.
  22. An ultrasonic imaging system (1) comprising:
    an ultrasonic imaging device (3) according to any one of claims 1 to 20; and
    a processing device (4) comprising a network interface (33), a processor (34), storage (35) and a display (36),
    wherein the ultrasonic imaging device and the processing device are in wireless communication.
GB1621423.1A 2016-12-16 2016-12-16 Ultrasonic imaging device Withdrawn GB2557913A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1621423.1A GB2557913A (en) 2016-12-16 2016-12-16 Ultrasonic imaging device
PCT/GB2017/053761 WO2018109490A1 (en) 2016-12-16 2017-12-15 Ultrasonic imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1621423.1A GB2557913A (en) 2016-12-16 2016-12-16 Ultrasonic imaging device

Publications (2)

Publication Number Publication Date
GB201621423D0 GB201621423D0 (en) 2017-02-01
GB2557913A true GB2557913A (en) 2018-07-04

Family

ID=58284397

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1621423.1A Withdrawn GB2557913A (en) 2016-12-16 2016-12-16 Ultrasonic imaging device

Country Status (2)

Country Link
GB (1) GB2557913A (en)
WO (1) WO2018109490A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019158363A1 (en) * 2018-02-16 2019-08-22 Koninklijke Philips N.V. Digital ultrasound cable and associated devices, systems, and methods
EP3598950A1 (en) * 2018-07-24 2020-01-29 Koninklijke Philips N.V. Ultrasound controller unit and method
CN110833432B (en) * 2018-08-15 2023-04-07 深南电路股份有限公司 Ultrasonic simulation front-end device and ultrasonic imaging equipment
CN109700480B (en) * 2018-12-28 2021-12-10 深圳开立生物医疗科技股份有限公司 Ultrasonic imaging system, performance adaptation method thereof and data processing device
WO2020206075A1 (en) * 2019-04-05 2020-10-08 Butterfly Network, Inc. Wireless ultrasound device and related apparatus and methods
CN111248940B (en) * 2020-03-31 2022-06-07 京东方科技集团股份有限公司 Driving method of ultrasonic imaging system, and storage medium
EP4267316A1 (en) * 2020-12-22 2023-11-01 Geegah LLC Ghz cmos ultrasonic imager pixel architecture
US11559285B2 (en) * 2021-02-17 2023-01-24 Vortex Imaging Ltd. Reflection ultrasound tomographic imaging using full-waveform inversion

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004064619A2 (en) * 2003-01-14 2004-08-05 University Of Virginia Patent Foundation Ultrasound imaging beam-former apparatus and method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11508461A (en) * 1995-06-29 1999-07-27 テラテク・コーポレーシヨン Portable ultrasonic imaging system
JPH1057374A (en) * 1996-06-11 1998-03-03 Olympus Optical Co Ltd Ultrasonograph
US6142946A (en) * 1998-11-20 2000-11-07 Atl Ultrasound, Inc. Ultrasonic diagnostic imaging system with cordless scanheads
CN201847706U (en) * 2010-11-11 2011-06-01 深圳市信步科技有限公司 Demodulator circuit of Doppler diagnostic apparatus
US9314225B2 (en) * 2012-02-27 2016-04-19 General Electric Company Method and apparatus for performing ultrasound imaging
EP3218705B1 (en) * 2014-11-14 2024-05-01 URSUS Medical Designs LLC Ultrasound beamforming system and method based on aram array


Also Published As

Publication number Publication date
GB201621423D0 (en) 2017-02-01
WO2018109490A1 (en) 2018-06-21

Similar Documents

Publication Publication Date Title
GB2557913A (en) Ultrasonic imaging device
US10371804B2 (en) Ultrasound signal processing circuitry and related apparatus and methods
EP3132281B1 (en) Ultrasonic imaging compression methods and apparatus
JP6924193B2 (en) Ultrasonic signal processing circuits and related equipment and methods
JP5174010B2 (en) Method and transducer array with integrated beaming
TW381226B (en) Portable ultrasound imaging system
Kang et al. A system-on-chip solution for point-of-care ultrasound imaging systems: Architecture and ASIC implementation
EP3232937A1 (en) Ultrasound system for high-speed and high resolution imaging applications
US20160228092A1 (en) Portable ultrasound apparatus and control method for the same
WO2011013329A1 (en) Ultrasonograph
Peyton et al. Quadrature synthetic aperture beamforming front-end for miniaturized ultrasound imaging
US9218802B2 (en) Ultrasonic probe and ultrasonic diagnostic apparatus
Angiolini et al. 1024-Channel 3D ultrasound digital beamformer in a single 5W FPGA
Kim et al. Hybrid volume beamforming for 3-D ultrasound imaging using 2-D CMUT arrays
JP5606661B2 (en) Ultrasonic diagnostic equipment
Chatar et al. Analysis of existing designs for fpga-based ultrasound imaging systems
JP2013063159A (en) Ultrasonograph and ultrasonic image generation method
JP2013063157A (en) Ultrasound diagnostic apparatus and ultrasound image generating method
Peyton et al. Comparison of synthetic aperture architectures for miniaturised ultrasound imaging front-ends
JP2010115356A (en) Ultrasonic probe and ultrasonic diagnostic apparatus
Kim et al. A smart-phone based portable ultrasound imaging system for point-of-care applications
US11808897B2 (en) Methods and apparatuses for azimuthal summing of ultrasound data
Peng et al. Design of diagnostic ultrasound signal processing system based on programmable computing architecture
Agarwal et al. P2B-17 Single-Chip Solution for Ultrasound Imaging Systems: Initial Results
JP2010213771A (en) Ultrasonic probe and ultrasonograph

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)