TW201927247A - Tablet ultrasound system - Google Patents

Tablet ultrasound system

Info

Publication number
TW201927247A
Authority
TW
Taiwan
Prior art keywords
touch screen
screen display
touch
image
ultrasound
Prior art date
Application number
TW108112049A
Other languages
Chinese (zh)
Other versions
TWI710356B (en)
Inventor
Alice M. Chiang (艾利斯 M 江)
William M. Wong (威廉 M 汪)
Noah Berger (諾哈 柏格)
Original Assignee
美商德拉工業公司
Priority date
Filing date
Publication date
Priority claimed from US 14/037,106 (US9877699B2)
Application filed by 美商德拉工業公司
Publication of TW201927247A
Application granted
Publication of TWI710356B

Classifications

    • A61B8/467 Ultrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient, characterised by special input means
    • A61B8/4405 Constructional features: device being mounted on a trolley
    • A61B8/4411 Constructional features: device being modular
    • A61B8/4427 Constructional features: device being portable or laptop-like
    • A61B8/4433 Constructional features: involving a docking unit
    • A61B8/4438 Means for identifying the diagnostic device, e.g. barcodes
    • A61B8/462 Displaying means characterised by constructional features of the display
    • A61B8/463 Displaying multiple images or images and diagnostic data on one display
    • A61B8/465 Displaying means adapted to display user selection data, e.g. icons or menus
    • A61B8/565 Data transmission or power supply: data transmission via a network
    • B06B1/0611 Piezoelectric/electrostrictive transducers using multiple elements in a pile
    • B06B1/0622 Piezoelectric/electrostrictive transducers using multiple elements on one surface
    • G01S15/8925 Pulse-echo imaging using a static transducer array in a two-dimensional configuration, i.e. matrix or orthogonal linear arrays
    • G01S15/8979 Combined Doppler and pulse-echo imaging systems
    • G01S7/52066 Time-position or time-motion displays
    • G01S7/52071 Multicolour displays; using colour coding; optimising colour or information content in displays, e.g. parametric imaging
    • G01S7/52074 Composite displays, e.g. split-screen displays; combination of multiple images or of images and alphanumeric tabular information
    • G01S7/5208 Constructional features with integration of processing functions inside probe or scanhead
    • G01S7/52082 Constructional features involving a modular construction, e.g. a computer with short range imaging equipment
    • G01S7/52084 Constructional features related to particular user interfaces
    • G06F3/04883 Touch-screen or digitiser interaction for inputting data by handwriting, e.g. gesture or text
    • G10K11/346 Electrical beam-steering circuits using phase variation
    • G10K11/348 Electrical beam-steering circuits using amplitude variation
    • G01S15/8927 Pulse-echo imaging using simultaneously or sequentially two or more subarrays or subapertures
    • G01S15/8993 Three dimensional imaging systems
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen
    • H01L2224/32245 Layer connector disposition: body and item stacked, the item being metallic
    • H01L2224/48091 Wire connector loop shape: arched
    • H01L2224/48247 Wire connector connecting the wire to a bond pad of the item
    • H01L2924/181 Encapsulation

Abstract

Exemplary embodiments provide systems and methods for portable medical ultrasound imaging. Preferred embodiments utilize a tablet touchscreen display operative to control imaging and display operations without the need for traditional keyboards or controls. Certain embodiments provide an ultrasound imaging system in which the scan head includes a beamformer circuit that performs far-field subarray beamforming, or includes a sparse-array selecting circuit that actuates selected elements. Exemplary embodiments also provide an ultrasound engine circuit board including one or more multi-chip modules, and a portable medical ultrasound imaging system including an ultrasound engine circuit board with one or more multi-chip modules. Exemplary embodiments also provide methods for using a hierarchical two-stage or three-stage beamforming system with which three-dimensional ultrasound images can be generated in real time.

Description

Tablet ultrasound system

Medical ultrasound imaging has become an industry standard for many medical imaging applications. In recent years, there has been increasing demand for medical ultrasound imaging equipment that is portable, allowing medical personnel to easily transport the equipment to and from hospital and/or field locations, and that is more user-friendly for medical personnel who may have a range of skill levels.
Conventional medical ultrasound imaging equipment usually includes at least an ultrasound probe/transducer, a keyboard and/or knobs, a computer, and a display. In a typical operating mode, the ultrasound probe/transducer generates ultrasonic waves that can penetrate tissue to different depths based on frequency level, and receives the ultrasonic waves reflected back from the tissue. In addition, medical personnel can enter system inputs to the computer via the keyboard and/or knobs and view ultrasound images of the tissue structure on the display.
However, conventional medical ultrasound imaging devices employing such keyboards and/or knobs can be large and therefore may not be suitable for portable use in hospital and/or field locations. In addition, because these keyboards and/or knobs often have uneven surfaces, they may be difficult to keep clean in hospital and/or field environments, where maintaining a sterile field can be critical to patient health. Some conventional medical ultrasound imaging devices have incorporated touchscreen technology to provide part of the user input interface. However, conventional devices employing this touchscreen technology generally provide only limited touchscreen functionality in combination with a traditional keyboard and/or knobs, and therefore may be not only difficult to keep clean but also complicated to use.

In accordance with the present invention, systems and methods for medical ultrasound imaging are disclosed. The presently disclosed systems and methods employ a medical ultrasound imaging device that includes a handheld housing in a tablet form factor and a touch screen display disposed on a front panel of the housing. The touch screen display includes a multi-touch touchscreen that can recognize and distinguish one or more single-point, multi-point, and/or simultaneous touches on a surface of the touch screen display, thereby allowing gestures, ranging from simple single-point gestures to complex multi-point moving gestures, to be used as user input to the medical ultrasound imaging device.
According to one aspect, an exemplary medical ultrasound imaging system includes a housing having a front panel and a rear panel rigidly mounted to each other in parallel planes, a touch screen display, a computer having at least one processor and at least one memory, an ultrasound beamforming system, and a battery. The housing of the medical ultrasound imaging device is implemented in a tablet form factor. The touch screen display is disposed on the front panel of the housing and includes a multi-touch LCD touchscreen that can recognize and distinguish one or more single-point, multi-point, and/or simultaneous touches or gestures on a surface of the touch screen display. The computer, the ultrasound beamforming system or engine, and the battery are operatively disposed within the housing. The medical ultrasound imaging device can employ a FireWire connection operatively connecting the computer and the ultrasound engine within the housing, and a probe connector with a probe attach/detach lever that facilitates connection of at least one ultrasound probe/transducer. In addition, the exemplary medical ultrasound imaging system includes an I/O port connector and a DC power input.
In an exemplary mode of operation, medical personnel can employ simple single-point gestures and/or more complex multi-point gestures as user input to the multi-touch LCD touchscreen for controlling operating modes and/or functions of the exemplary medical ultrasound imaging device. These single-point/multi-point gestures can correspond to single-touch and/or multi-touch events mapped to one or more predetermined operations executable by the computer and/or the ultrasound engine. Medical personnel can make these gestures with various finger, palm, and/or stylus motions on the surface of the touch screen display. The multi-touch LCD touchscreen receives the gestures as user input and provides them to the computer, which uses the processor to execute program instructions stored in the memory to carry out the predetermined operations associated with the gestures, at least at some times in conjunction with the ultrasound engine. Such single-point/multi-point gestures on the surface of the touch screen display can include, but are not limited to, a tap gesture, a pinch gesture, a flick gesture, a rotate gesture, a double-tap gesture, a spread gesture, a drag gesture, a press gesture, a press-and-drag gesture, and a palm gesture. In contrast to existing ultrasound systems that rely on many control features operated by mechanical switches, keyboard elements, or a touchpad-trackball interface, preferred embodiments of the present invention employ a single on/off switch; all other operations have been implemented using touchscreen controls. In addition, preferred embodiments employ a capacitive touch screen display that is sufficiently sensitive to detect touch gestures actuated by a user's bare fingers as well as by a user's gloved fingers. Medical personnel typically must wear sterilized plastic gloves during medical procedures. It is therefore highly desirable to provide a portable ultrasound device that can be used by a gloved hand; previously, however, this requirement has precluded the use of touchscreen display control functions in ultrasound systems for many applications requiring sterile precautions. Preferred embodiments of the present invention provide for control of all ultrasound imaging operations by gloved personnel using programmed touch gestures on the touch screen display.
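The gesture-to-operation mapping described above can be illustrated with a small dispatch-table sketch. The gesture and operation names below are illustrative assumptions; the patent describes the mapping only at the conceptual level:

```python
# Minimal sketch: mapping recognized touch gestures to predetermined
# imaging operations. Names are hypothetical, not from the patent.
GESTURE_ACTIONS = {
    "tap": "select_control",
    "double_tap": "toggle_full_screen",
    "pinch": "zoom_out",
    "spread": "zoom_in",
    "flick_up": "increase_depth",
    "flick_down": "decrease_depth",
    "press": "open_magnifier_window",
    "press_and_drag": "trace_feature",
    "palm": "freeze_image",
}

def dispatch(gesture: str) -> str:
    """Return the operation mapped to a recognized gesture, or 'ignore'."""
    return GESTURE_ACTIONS.get(gesture, "ignore")
```

For example, `dispatch("flick_up")` would select the depth-increase operation, while an unrecognized gesture produces no operation.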
According to an exemplary aspect, at least one flick gesture can be employed to control the tissue penetration depth of the ultrasound waves generated by the ultrasound probe/transducer. For example, a single flick gesture in the "up" direction on the touch screen display surface can increase the penetration depth by one (1) centimeter or any other suitable amount, and a single flick gesture in the "down" direction on the touch screen display surface can decrease the penetration depth by one (1) centimeter or any other suitable amount. In addition, a drag gesture in the "up" or "down" direction on the touch screen display surface can increase or decrease the penetration depth by multiples of one (1) centimeter or any other suitable amount. Additional operating modes and/or functions controlled by specific single-point/multi-point gestures on the touch screen display surface can include, but are not limited to, freeze/store operations, 2D-mode operations, gain control, color control, split-screen control, PW imaging control, cine/time-series image clip scrolling control, zoom and pan control, full-screen control, Doppler and 2D beam-steering control, and/or body-marker control. At least some operating modes and/or functions of the exemplary medical ultrasound imaging device can be controlled by one or more touch controls implemented on the touch screen display, wherein beamforming parameters can be reset by moving touch gestures. Medical personnel can provide, as user input, one or more specific single-point/multi-point gestures for designating at least a selected subset of the touch controls to be implemented on the touch screen display as required and/or desired. When some or more virtual buttons or icons are available, a larger number of touch controls enables greater functionality when operating in full-screen mode.
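The flick/drag depth control described above can be sketched as a small handler. The 1 cm step matches the example in the text; the depth limits are illustrative assumptions:

```python
# Sketch of flick/drag depth control: a flick changes depth by one step,
# a drag by a multiple of steps. Depth limits are assumed, not from the patent.
MIN_DEPTH_CM = 2.0
MAX_DEPTH_CM = 24.0
STEP_CM = 1.0

def adjust_depth(depth_cm: float, gesture: str, drag_steps: int = 1) -> float:
    """Return the new imaging depth after an 'up'/'down' flick or drag."""
    if gesture == "flick_up":
        depth_cm += STEP_CM
    elif gesture == "flick_down":
        depth_cm -= STEP_CM
    elif gesture in ("drag_up", "drag_down"):
        sign = 1 if gesture == "drag_up" else -1
        depth_cm += sign * STEP_CM * drag_steps
    # clamp to the transducer's usable depth range
    return max(MIN_DEPTH_CM, min(MAX_DEPTH_CM, depth_cm))
```

A drag over three step increments thus changes the depth by 3 cm, while a flick at the range limit leaves the depth clamped.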
According to another exemplary aspect, a press gesture can be employed within a region of the touch screen display, and in response to the press gesture, a virtual window can be provided on the touch screen display for displaying at least a magnified portion of an ultrasound image displayed on the touch screen display. According to yet another exemplary aspect, a press-and-drag gesture can be employed within the region of the touch screen display, and in response to the press-and-drag gesture, a predetermined feature of the ultrasound image can be traced. In addition, a tap gesture can be employed within the region of the touch screen display, substantially simultaneously with a portion of the press-and-drag gesture, and in response to the tap gesture, the tracing of the predetermined feature of the ultrasound image can be completed. These operations can operate in different regions of a single display format, such that, for example, a moving gesture within a region of interest in the image can perform a different function than the same gesture performed within the image but outside the region of interest.
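The region-dependent behavior described above can be sketched as a hit test: the same gesture resolves to different functions inside and outside a region of interest. The coordinates and function names are illustrative assumptions:

```python
# Sketch of region-dependent gesture handling: the same drag gesture traces
# a feature inside the region of interest (ROI) but pans the image outside it.
# ROI layout and function names are hypothetical.

def gesture_function(x: float, y: float,
                     roi: tuple[float, float, float, float],
                     gesture: str) -> str:
    """roi = (x0, y0, x1, y1) in screen coordinates."""
    x0, y0, x1, y1 = roi
    inside = x0 <= x <= x1 and y0 <= y <= y1
    if gesture == "drag":
        return "trace_feature" if inside else "pan_image"
    return "default"
```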
By providing a medical ultrasound imaging device with a multi-touch touchscreen, medical personnel can control the device using simple single-point gestures and/or more complex multi-point gestures without needing a traditional keyboard or knobs. Because the multi-touch touchscreen eliminates the need for a traditional keyboard or knobs, such a medical ultrasound imaging device is easier to keep clean in hospital and/or field environments and provides an intuitive, user-friendly interface while offering full-featured operation. In addition, by providing this medical ultrasound imaging device in a tablet form factor, medical personnel can easily transport the device between hospital and/or field locations.
The system is operable to communicate with external and remote devices via a wireless communication network, such as a 3G or 4G wireless cellular network. The system can thereby provide voice and data transfer, including via a wireless public access network used for mobile device communication.
Certain exemplary embodiments provide a multi-chip module for an ultrasound engine of a portable medical ultrasound imaging system, in which a transmit/receive (TR) chip, a preamplifier/time-gain compensation (TGC) chip, and a beamformer chip are assembled in a vertically stacked configuration. The transmit circuitry provides high-voltage electrical drive pulses to the transducer elements to generate a transmit beam. Because the transmit chip operates at voltages greater than 80 V, a CMOS process with a 1-micron design rule has been used for the transmit chip, while a sub-micron design rule has been used for the low-voltage (less than 5 V) receive circuitry.
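The TGC stage named above offsets depth-dependent attenuation by increasing receive gain with echo return depth (i.e., with time). A minimal numeric sketch, assuming an illustrative round-trip attenuation of 1 dB/cm; the patent describes the TGC chip only at the block level:

```python
# Sketch of time-gain compensation (TGC): echoes from deeper tissue are
# amplified more to offset round-trip attenuation. The attenuation figure
# is an illustrative assumption.

def tgc_gain_db(depth_cm: float, atten_db_per_cm: float = 1.0) -> float:
    """Gain to offset round-trip (factor of 2) attenuation at a given depth."""
    return 2.0 * depth_cm * atten_db_per_cm

def apply_tgc(sample: float, depth_cm: float) -> float:
    """Scale a received sample by the linear equivalent of the TGC gain."""
    gain_linear = 10.0 ** (tgc_gain_db(depth_cm) / 20.0)
    return sample * gain_linear
```

At 5 cm depth this sketch applies 10 dB of gain, i.e., a linear factor of about 3.16.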
Preferred embodiments of the present invention utilize a sub-micron process to provide integrated circuits having sub-circuits that operate at a plurality of voltages (e.g., 2.5 V, 5 V, and 60 V or higher). According to certain preferred embodiments of the present invention, these features can be used in conjunction with a biplane transducer probe.
Thus, a single IC chip that incorporates the high-voltage transmit, low-voltage amplifier/TGC, and low-voltage beamforming circuitry can be utilized. Using a 0.25-micron design rule, this mixed-signal circuit can accommodate the beamforming of 32 transducer channels in a chip area of less than 0.7 x 0.7 (0.49) cm². Thus, 128 channels can be processed using four 32-channel chips in a total circuit board area of less than 1.5 x 1.5 (2.25) cm².
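The area and channel figures quoted above can be checked with a quick arithmetic sketch (values taken directly from the paragraph):

```python
# Arithmetic check of the quoted figures: four 32-channel mixed-signal chips,
# each under 0.7 cm x 0.7 cm, handle 128 channels, and four such chip
# footprints fit within a 1.5 cm x 1.5 cm board area.
CHIP_SIDE_CM = 0.7
CHIPS = 4
CHANNELS_PER_CHIP = 32

chip_area_cm2 = CHIP_SIDE_CM ** 2              # about 0.49 cm^2 per chip
total_channels = CHIPS * CHANNELS_PER_CHIP     # 128 channels
total_chip_area_cm2 = CHIPS * chip_area_cm2    # about 1.96 cm^2 < 2.25 cm^2
```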
As used herein, the term "multi-chip module" refers to an electronic package in which multiple integrated circuits (ICs) are packaged with a unifying substrate, facilitating their use as a single component (i.e., as a higher-processing-capacity IC packaged in a much smaller volume). Each IC can comprise a circuit fabricated in a thinned semiconductor wafer. Exemplary embodiments also provide an ultrasound engine comprising one or more such multi-chip modules, and a portable medical ultrasound imaging system comprising an ultrasound engine circuit board with one or more multi-chip modules. Exemplary embodiments also provide methods for fabricating and assembling multi-chip modules as taught herein. Vertically stacking the TR chip, preamplifier/TGC chip, and beamformer chip on a circuit board minimizes the package size (e.g., length and width) and the footprint the chips occupy on the circuit board.
The TR chip, preamplifier/TGC chip, and beamformer chip in a multi-chip module can each include multiple channels (e.g., from 8 channels per chip to 64 channels per chip). In certain embodiments, the high-voltage TR chip, the preamplifier/TGC chip, and the sample-interpolate receive beamformer chip can each include 8, 16, 32, or 64 channels. In a preferred embodiment, each circuit in a two-layer beamformer module has 32 beamformer receive channels to provide a 64-channel receive beamformer. A second 64-channel two-layer module can be used to form a 128-channel handheld tablet ultrasound device having a total thickness of less than 2 cm. A transmit multi-chip beamformer having the same or similar channel density in each layer can also be used.
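The modular channel arithmetic above (two 32-channel circuits per two-layer module, two modules per system) can be sketched in a few lines; the helper name is illustrative:

```python
# Sketch of the modular channel counts described above: each two-layer
# beamformer module holds two 32-channel circuits (64 receive channels),
# and two such modules provide a 128-channel system.
CHANNELS_PER_CIRCUIT = 32
CIRCUITS_PER_MODULE = 2

def receive_channels(n_modules: int) -> int:
    """Total receive channels provided by n stacked two-layer modules."""
    return n_modules * CIRCUITS_PER_MODULE * CHANNELS_PER_CIRCUIT
```

One module yields the 64-channel receive beamformer of the preferred embodiment; adding the second module yields the 128-channel handheld device.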
Exemplary numbers of chips vertically integrated in a multi-chip module can include, but are not limited to, two, three, four, five, six, seven, eight, and the like. In one embodiment of an ultrasound device, a single multi-chip module is provided on a circuit board of an ultrasound engine that performs ultrasound-specific operations. In other embodiments, a plurality of multi-chip modules are provided on a circuit board of an ultrasound engine. The plurality of multi-chip modules can be vertically stacked on top of one another on the circuit board of the ultrasound engine to further minimize the package size and footprint of the circuit board.
提供一或多個多晶片模組於一超聲波引擎之一電路板上達成一高通道數,同時最小化整體封裝尺寸及佔據面積。例如,可使用多晶片模組在約10 cm x約10 cm之例示性平面尺寸內組裝一個128通道超聲波引擎電路板,此係相對於習知超聲波電路之大得多的空間要求之一顯著改良。在一些實施例中,包含一或多個多晶片模組之一超聲波引擎之一單一電路板可具有16個至128個通道。在某些實施例中,包含一或多個多晶片模組之一超聲波引擎之一單一電路板可具有16個、32個、64個、128個或192個通道及類似者。
According to the present invention, systems and methods for medical ultrasound imaging are disclosed. The presently disclosed systems and methods use medical ultrasound imaging equipment that includes a handheld housing in a tablet computer form factor and a touch screen display disposed on a front panel of the housing. The touch screen display includes a multi-touch touch screen that can recognize and distinguish one or more single-point, multi-point, and/or simultaneous touches on a surface of the touch screen display, thereby allowing gestures, ranging from simple single-point gestures to complex multi-point moving gestures, to be used as user input to the medical ultrasound imaging equipment.
According to one aspect, an exemplary medical ultrasound imaging system includes a housing having a front panel and a rear panel rigidly mounted to each other in parallel planes, a touch screen display, a computer having at least one processor and at least one memory, an ultrasound beamforming system, and a battery. The housing of the medical ultrasound imaging equipment is implemented in a tablet computer form factor. The touch screen display is disposed on the front panel of the housing and includes a multi-touch LCD touch screen that can recognize and distinguish one or more single-point, multi-point, and/or simultaneous touches or gestures on a surface of the touch screen display. The computer, the ultrasound beamforming system or engine, and the battery are operatively housed within the housing. The medical ultrasound imaging equipment can use a Firewire connection operatively connected between the computer and the ultrasound engine within the housing, and a probe connector having a probe attach/detach lever that facilitates the connection of at least one ultrasound probe/transducer. In addition, the exemplary medical ultrasound imaging system includes I/O port connectors and a DC power input.
In an exemplary mode of operation, medical personnel may use simple single-point gestures and/or more complex multi-point gestures as user inputs to the multi-touch LCD touch screen for controlling the operating modes and/or functions of the exemplary medical ultrasound imaging equipment. These single-point/multi-point gestures may correspond to single-touch and/or multi-touch events mapped to one or more predetermined operations that can be performed by the computer and/or the ultrasound engine. Medical personnel can perform these single-point/multi-point gestures with various finger, palm, and/or stylus movements on the surface of the touch screen display. The multi-touch LCD touch screen receives the single-point/multi-point gestures as user inputs and provides these user inputs to the computer, which uses the processor to execute program instructions stored in the memory to carry out the predetermined operations associated with the single-point/multi-point gestures, at least some of which are performed in conjunction with the ultrasound engine. These single-point/multi-point gestures on the surface of the touch screen display can include, but are not limited to: a tap gesture, a pinch gesture, a flick gesture, a rotate gesture, a double-tap gesture, a spread gesture, a drag gesture, a press gesture, a press-and-drag gesture, and a palm gesture. Compared to existing ultrasound systems that rely on many control features operated by mechanical switches, keyboard components, or touch-pad trackball interfaces, the preferred embodiment of the present invention uses a single on/off switch; all other actions are performed using touch screen controls. In addition, the preferred embodiment uses a capacitive touch screen display that is sensitive enough to detect touch gestures actuated both by the user's bare fingers and by the user's gloved fingers. Medical personnel must generally wear sterile plastic gloves during medical procedures.
Therefore, it is highly desirable to provide a portable ultrasound device that can be operated by a gloved hand; the lack of this capability has previously prevented the use of touch screen display controls in ultrasound systems for many applications requiring aseptic precautions. A preferred embodiment of the present invention provides that all ultrasound imaging operations can be controlled by a gloved user employing programmed touch gestures on the touch screen display.
According to an exemplary aspect, at least one flick gesture can be used to control the tissue penetration depth of the ultrasound generated by the ultrasound probe/transducer. For example, a single flick gesture in the "up" direction on the touch screen display surface can increase the penetration depth by one (1) centimeter or any other suitable amount, and a single flick gesture in the "down" direction on the touch screen display surface can decrease the penetration depth by one (1) centimeter or any other suitable amount. In addition, a drag gesture in the "up" or "down" direction on the touch screen display surface can increase or decrease the penetration depth by multiples of one (1) centimeter or any other suitable amount. Additional operating modes and/or functions controlled by specific single-point/multi-point gestures on the touch screen display surface may include, but are not limited to: freeze/store operations, two-dimensional mode operations, gain control, color control, split-screen control, PW imaging control, cine/time-series image clip scrolling control, zoom and pan control, full-screen control, Doppler and two-dimensional beam steering control, and/or body marking control. At least some operating modes and/or functions of the exemplary medical ultrasound imaging equipment can be controlled by one or more touch controls implemented on the touch screen display, wherein beamforming parameters can be reset by moving touch gestures. Medical personnel may provide one or more specific single-point/multi-point gestures as user inputs for designating at least a selected subset of the touch controls to be implemented on the touch screen display as required and/or desired. With some or more virtual buttons or icons available, a larger number of touch controls enables greater functionality when operating in full-screen mode.
According to another exemplary aspect, a press gesture may be employed within an area of the touch screen display, and in response to the press gesture, a virtual window may be provided on the touch screen display for displaying at least a magnified portion of an ultrasound image displayed on the touch screen display. According to yet another exemplary aspect, a press-and-drag gesture can be employed within the area of the touch screen display, and in response to the press-and-drag gesture, a predetermined feature of the ultrasound image can be traced. In addition, a tap gesture can be employed within the area of the touch screen display, substantially simultaneously with a portion of the press-and-drag gesture, and in response to the tap gesture, the tracing of the predetermined feature of the ultrasound image can be completed. These operations can operate in different areas of a single display format, such that, for example, a moving gesture within a region of interest within an image can perform a function different from that of the same gesture performed within the image but outside the region of interest.
By providing a medical ultrasound imaging device with a multi-touch touch screen, medical personnel can control the device using simple single-point gestures and/or more complex multi-point gestures without the need for a traditional keyboard or knobs. Because the multi-touch touch screen eliminates the need for a traditional keyboard or knobs, such a medical ultrasound imaging device is easier to keep clean in hospital and/or field environments, and provides an intuitive, user-friendly interface while providing full-function operation. In addition, by providing the medical ultrasound imaging device in a tablet form factor, medical personnel can easily transport the device between hospital and/or field locations.
The system is operable to communicate with external and remote devices via a wireless communication network, such as a 3G or 4G wireless cellular network. The system can therefore provide voice and data transfer (including via a wireless public access network for mobile device communication).
Certain exemplary embodiments provide a multi-chip module for an ultrasound engine of a portable medical ultrasound imaging system, in which a transmit/receive (TR) chip, a preamplifier/time-gain compensation (TGC) chip, and a beamformer chip are assembled into a vertically stacked configuration. The transmit circuitry provides high-voltage electrical drive pulses to the transducer elements to generate a transmit beam. With the transmit chip operating at voltages greater than 80 V, a CMOS process with a 1-micron design rule has been used for the transmit chip, and a sub-micron design rule has been used for the low-voltage receive circuitry (less than 5 V).
A preferred embodiment of the present invention utilizes a sub-micron process to provide integrated circuits having sub-circuits operating at a plurality of voltages (e.g., 2.5 V, 5 V, and 60 V or higher). According to certain preferred embodiments of the present invention, these features can be used in conjunction with a biplane transducer probe.
Therefore, a single IC chip that incorporates the high-voltage transmit, low-voltage amplifier/TGC, and low-voltage beamforming circuits into one chip can be utilized. Using a 0.25-micron design rule, this mixed-signal circuit can accommodate the beamforming for 32 transducer channels in a chip area of less than 0.7 x 0.7 cm (0.49 cm²). Therefore, 128 channels can be processed using four 32-channel chips in a total circuit board area of less than 1.5 x 1.5 cm (2.25 cm²).
The term "multi-chip module" as used herein refers to an electronic package in which multiple integrated circuits (ICs) are packaged with a unifying substrate, facilitating their use as a single component, i.e., as a higher-processing-capacity IC packaged in a much smaller volume. Each IC can include a circuit fabricated in a thinned semiconductor wafer. Exemplary embodiments also provide an ultrasound engine including one or more such multi-chip modules, and a portable medical ultrasound imaging system including an ultrasound engine circuit board having one or more multi-chip modules. Exemplary embodiments also provide methods for fabricating and assembling multi-chip modules as taught herein. Vertically stacking the TR chip, the preamplifier/TGC chip, and the beamformer chip on a circuit board minimizes the package size (e.g., length and width) and the footprint the chips occupy on the circuit board.
The TR chip, preamplifier/TGC chip, and beamformer chip in a multi-chip module may each include multiple channels (e.g., 8 channels per chip to 64 channels per chip). In some embodiments, the high-voltage TR chip, the preamplifier/TGC chip, and the sample-interpolate receive beamformer chip may each include 8, 16, 32, or 64 channels. In a preferred embodiment, each circuit in a two-layer beamformer module has 32 beamformer receive channels to provide a 64-channel receive beamformer. A second 64-channel two-layer module can be used to form a 128-channel handheld tablet ultrasound device with a total thickness of less than 2 cm. A transmit multi-chip beamformer having the same or similar channel density in each layer can also be used.
Exemplary numbers of chips vertically integrated in a multi-chip module may include, but are not limited to: two, three, four, five, six, seven, eight, and the like. In one embodiment of an ultrasound device, a single multi-chip module is provided on a circuit board of an ultrasound engine that performs ultrasound-specific operations. In other embodiments, a plurality of multi-chip modules are provided on a circuit board of an ultrasound engine. The plurality of multi-chip modules can be stacked vertically on top of one another on the circuit board of the ultrasound engine to further minimize the package size and footprint of the circuit board.
Providing one or more multi-chip modules on a circuit board of an ultrasound engine achieves a high channel count while minimizing the overall package size and footprint. For example, multi-chip modules can be used to assemble a 128-channel ultrasound engine circuit board within exemplary planar dimensions of about 10 cm x about 10 cm, a significant improvement over the much larger space requirements of conventional ultrasound circuits. In some embodiments, a single circuit board of an ultrasound engine including one or more multi-chip modules can have 16 to 128 channels. In certain embodiments, a single circuit board of an ultrasound engine including one or more multi-chip modules can have 16, 32, 64, 128, or 192 channels, and the like.

[ Cross-Reference to Related Applications ]
This application is a continuation-in-part of U.S. Application No. 14/037,106, filed September 25, 2013, the entire contents of which are incorporated herein by reference.
The present invention discloses systems and methods for medical ultrasound imaging. The presently disclosed systems and methods employ medical ultrasound imaging equipment that includes a housing in a tablet computer form factor and a touch screen display disposed on a front panel of the housing. The touch screen display includes a multi-touch touch screen that can recognize and distinguish one or more single-point, multi-point, and/or simultaneous touches on a surface of the touch screen display, thereby allowing gestures, ranging from simple single-point gestures to complex multi-point moving gestures, to be used as user input to the medical ultrasound imaging equipment. Further details regarding tablet ultrasound systems and their operation are described in U.S. Application No. 10/997,062, filed November 11, 2004, U.S. Application No. 10/386,360, filed March 11, 2003, and U.S. Patent No. 6,969,352, the entire contents of which patents and applications are incorporated herein by reference.
FIG. 1 depicts an illustrative embodiment of exemplary medical ultrasound imaging equipment 100 in accordance with the present invention. As shown in FIG. 1, the medical ultrasound imaging equipment 100 includes a housing 102, a touch screen display 104, a computer having at least one processor and at least one memory implemented on a computer motherboard 106, an ultrasound engine 108, and a battery 110. For example, the housing 102 can be implemented in a tablet computer form factor or any other suitable form factor. The housing 102 has a front panel 101 and a rear panel 103. The touch screen display 104 is disposed on the front panel 101 of the housing 102, and includes a multi-touch LCD touch screen that can recognize and distinguish one or more multiple and/or simultaneous touches on a surface 105 of the touch screen display 104. The computer motherboard 106, the ultrasound engine 108, and the battery 110 are operatively disposed within the housing 102. The medical ultrasound imaging equipment 100 further includes a Firewire connection 112 (see also FIG. 2A) operatively connected between the computer motherboard 106 and the ultrasound engine 108 within the housing 102, and a probe connector 114 having a probe attach/detach lever 115 that facilitates the connection of at least one ultrasound probe/transducer (see also FIGS. 2A and 2B). In certain preferred embodiments, the transducer probe housing can include circuit components including a transducer array, transmit and receive circuitry, and beamformer and beamformer control circuits. In addition, the medical ultrasound imaging equipment 100 has one or more I/O port connectors 116 (see FIG. 2A), which can include, but are not limited to: one or more USB connectors, one or more SD cards, one or more network ports, one or more mini display ports, and a DC power input.
In an exemplary mode of operation, medical personnel (also referred to herein as a "user" or "users") can employ simple single-point gestures and/or more complex multi-point gestures as user inputs to the multi-touch LCD touch screen of the touch screen display 104 for controlling one or more operating modes and/or functions of the medical ultrasound imaging equipment 100. A gesture is defined herein as a movement, a tap, or a position of at least one finger, a stylus, and/or a palm on the surface 105 of the touch screen display 104. For example, such single-point/multi-point gestures can include static or dynamic gestures, continuous or segmented gestures, and/or any other suitable gestures. A single-point gesture is defined herein as a gesture that can be performed using a single touch contact point on the touch screen display 104 by a single finger, a stylus, or a palm. A multi-point gesture is defined herein as a gesture that can be performed using multiple touch contact points on the touch screen display 104 by multiple fingers, or by any suitable combination of at least one finger, a stylus, and a palm. A static gesture is defined herein as a gesture that does not involve movement of at least one finger, a stylus, or a palm on the surface 105 of the touch screen display 104. A dynamic gesture is defined herein as a gesture that involves movement of at least one finger, a stylus, or a palm, such as movement caused by dragging one or more fingers across the surface 105 of the touch screen display 104. A continuous gesture is defined herein as a gesture that can be performed in a single movement or tap of at least one finger, a stylus, or a palm on the surface 105 of the touch screen display 104. A segmented gesture is defined herein as a gesture that can be performed in multiple movements or taps of at least one finger, a stylus, or a palm on the surface 105 of the touch screen display 104.
Such single-point/multi-point gestures performed on the surface 105 of the touch screen display 104 can correspond to single-touch or multi-touch events that are mapped to one or more predetermined operations that can be performed by the computer and/or the ultrasound engine 108. The user can make these single-point/multi-point gestures by various single-finger, multi-finger, stylus, and/or palm motions on the surface 105 of the touch screen display 104. The multi-touch LCD touch screen receives the single-point/multi-point gestures as user inputs, and provides the user inputs to the processor, which executes program instructions stored in the memory to carry out the predetermined operations associated with the single-point/multi-point gestures, at least some of which are performed in conjunction with the ultrasound engine 108. As shown in FIG. 3A, such single-point/multi-point gestures on the surface 105 of the touch screen display 104 can include, but are not limited to: a tap gesture 302, a pinch gesture 304, flick gestures 306, 314, rotate gestures 308, 316, a double-tap gesture 310, a spread gesture 312, a drag gesture 318, a press gesture 320, a press-and-drag gesture 322, and/or a palm gesture 324. For example, these single-point/multi-point gestures can be stored in at least one gesture library in the memory implemented on the computer motherboard 106. Computer programs operable to control system operation can be stored on a computer-readable medium and, as desired, implemented using a touch processor connected to an image processor and a control processor connected to the system beamformer. Thus, the beamformer delays associated with transmission and reception can be adjusted in response to both static touch gestures and moving touch gestures.
In accordance with the illustrative embodiment of FIG. 1, a user of the medical ultrasound imaging equipment 100 can employ at least one flick gesture 306 or 314 to control the tissue penetration depth of the ultrasound generated by the ultrasound probe/transducer. For example, a dynamic, continuous flick gesture 306 or 314 in the "up" direction, or any other suitable direction, on the surface 105 of the touch screen display 104 can increase the penetration depth by one (1) centimeter or any other suitable amount. Further, a dynamic, continuous flick gesture 306 or 314 in the "down" direction, or any other suitable direction, on the surface 105 of the touch screen display 104 can decrease the penetration depth by one (1) centimeter or any other suitable amount. Moreover, a dynamic, continuous drag gesture 318 in the "up" or "down" direction, or any other suitable direction, on the surface 105 of the touch screen display 104 can increase or decrease the penetration depth by multiple centimeters or any other suitable amount.
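The depth-control behavior described above can be sketched as a small handler. This is a minimal illustration only: the function and parameter names, the 1 cm step, and the depth limits are assumptions for the sketch, not the equipment's actual interface.

```python
# Illustrative depth-control handler: flicks step by 1 cm, drags by whole
# centimeters; the 2-24 cm clamp range is an assumed, illustrative limit.
MIN_DEPTH_CM, MAX_DEPTH_CM = 2.0, 24.0

def adjust_depth(depth_cm, gesture, direction, drag_cm=0.0):
    """Return the new penetration depth after a touch gesture.

    gesture:   "flick" (single 1 cm step) or "drag" (multi-centimeter change)
    direction: "up" increases depth, "down" decreases it
    drag_cm:   magnitude of a drag, in centimeters
    """
    if gesture == "flick":
        step = 1.0
    elif gesture == "drag":
        step = float(int(drag_cm))  # drags move by multiples of 1 cm
    else:
        return depth_cm             # unrecognized gestures leave depth unchanged
    if direction == "down":
        step = -step
    return min(MAX_DEPTH_CM, max(MIN_DEPTH_CM, depth_cm + step))

print(adjust_depth(10.0, "flick", "up"))        # 11.0
print(adjust_depth(10.0, "drag", "down", 3.0))  # 7.0
```

Clamping the result keeps a run of repeated flicks from driving the depth outside the range the probe can image.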
Additional operating modes and/or functions controlled by specific single-point/multi-point gestures on the surface 105 of the touch screen display 104 can include, but are not limited to: freeze/store operations, 2D mode operations, gain control, color control, split-screen control, PW imaging control, cine/time-series image clip scrolling control, zoom and pan control, full-screen control, Doppler and 2D beam steering control, and/or body marking control. At least some of the operating modes and/or functions of the medical ultrasound imaging equipment 100 can be controlled by one or more touch controls implemented on the touch screen display 104. Further, the user can provide one or more specific single-point/multi-point gestures as user inputs for designating at least a selected subset of the touch controls to be implemented on the touch screen display 104, as required and/or desired. A plurality of preset scan parameters, displayed as icons or selectable from a menu, are associated with each imaging mode such that scan parameters are automatically selected for that mode.
A processing sequence is shown in FIG. 3B in which ultrasound beamforming and imaging operations are controlled 340 in response to touch gestures entered on a touch screen. Various static and moving touch gestures have been programmed into the system such that the data processor is operable to control beamforming and image processing operations within the tablet device 342. A user can select 344 a first display operation having a first plurality of touch gestures associated therewith. Using a static or moving gesture, the user can perform one of a plurality of gestures operable to control the imaging operation, and can specifically select one of a plurality of gestures that can adjust the beamforming parameters 346 used to generate the image data associated with the first display operation. The displayed image is updated and displayed 348 in response to the updated beamforming procedure. The user can further elect to perform a different gesture having a different velocity characteristic (direction or speed, or both) to adjust 350 a second characteristic of the first ultrasound display operation. The displayed image is then updated 352 based on the second gesture, which can modify image processing parameters or beamforming parameters. Examples of this processing are described in further detail herein, in which changes in the velocity and direction of different gestures can be associated with distinct imaging parameters of a selected display operation.
Ultrasound images of blood flow or tissue motion, whether color flow or spectral Doppler, are essentially obtained from measurements of movement. In an ultrasound scanner, a series of pulses is transmitted to detect the movement of blood. Echoes from stationary targets are the same from pulse to pulse. Echoes from moving scatterers exhibit slight differences in the time it takes for the signal to return to the scanner.
As can be seen from FIGS. 3C through 3H, there must be motion in the direction of the beam; if the flow is perpendicular to the beam, no relative motion is received from pulse to pulse, and no flow is detected. These differences can be measured as a direct time difference or, more commonly, in terms of a phase shift from which the "Doppler frequency" is obtained. The differences are then processed to produce either a color flow display or a Doppler sonogram. In FIGS. 3C-3D, the flow direction is perpendicular to the beam direction, and the pulsed-wave spectral Doppler measures no flow. In FIGS. 3G-3H, when the ultrasound beam is steered to an angle better aligned with the flow, a weak flow is shown in the color flow map, and the pulsed-wave Doppler also measures flow. In FIG. 3H, when the ultrasound beam is steered to an angle still better aligned with the direction of the moving blood flow, the color flow map is stronger; moreover, when the PWD correction angle is placed in alignment with the flow, the PWD measures a strong flow.
In this tablet ultrasound system, an ROI (region of interest) is also used to define the direction of a moving gesture responsive to the ultrasound transmit beam. A liver image having a branch of renal blood flow in color flow mode is shown in FIG. 3I; because the ROI extends straight down from the transducer, the flow direction is almost normal to the ultrasound beam, and very weak renal flow is detected. Here, color flow mode is used to image a renal blood flow in the liver. As can be seen, the beam is nearly normal to the flow and a very weak flow is detected. A flick gesture with the finger outside the ROI is used to steer the beam. As can be seen in FIG. 3J, by resetting the beamforming parameters to steer the ROI such that the beam direction is better aligned with the flow direction, a stronger flow within the ROI is detected. In FIG. 3J, a flick gesture with the finger outside the ROI is used to steer the ultrasound beam into a direction better aligned with the flow direction. A stronger flow within the ROI can be seen. A horizontal movement gesture with the finger inside the ROI moves the ROI box into a position overlying the entire kidney region; that is, the horizontal movement allows a translational movement of the ROI box such that the box overlies the entire target region.
FIG. 3K demonstrates a horizontal movement gesture. With the finger inside the ROI, the finger can move the ROI box anywhere within the image plane. In the above embodiments, it is easy to distinguish that a "flick" gesture with a finger outside an "ROI" box is intended to steer a beam, and that a "drag-and-move," i.e., horizontal movement, gesture with a finger inside the "ROI" is intended to move the ROI box. However, there are applications in which there is no ROI to serve as a reference region, and it is apparent that it would then be difficult to distinguish a "flick" from a "horizontal movement" gesture. In this case, the touch screen program needs to track the initial velocity or acceleration of the finger to determine whether the gesture is a "flick" gesture or a "drag-and-move" gesture. Accordingly, the touch engine that receives data from the touch screen sensor device is programmed to discriminate between velocity thresholds indicative of different gestures. Thus, the times, velocities, and directions associated with different moving gestures can have preset thresholds. Two- and three-finger static and moving gestures can have separate thresholds to distinguish these control operations. Note that preset displayed icons or virtual buttons can have distinct static pressure or duration thresholds. When operating in full-screen mode, the touch screen processor, which preferably operates on the system central processing unit that also performs other imaging operations such as scan conversion, turns off the static icons.
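The velocity-threshold discrimination described above can be sketched as a small classifier over the first touch samples. This is a hedged illustration, not the product's touch engine: the 1500 px/s threshold, sample format, and function name are assumptions made for the sketch.

```python
# Illustrative flick-vs-drag discrimination from the finger's initial speed,
# as discussed for the no-ROI case. The threshold value is an assumption.
FLICK_SPEED_PX_S = 1500.0

def classify(points):
    """points: list of (t_seconds, x_px, y_px) samples from the touch sensor.

    A high initial speed indicates a "flick" (e.g., beam steering); a slow
    start indicates a "drag" (e.g., translating a box); too few samples, a tap.
    """
    if len(points) < 2:
        return "tap"
    (t0, x0, y0), (t1, x1, y1) = points[0], points[1]
    dt = t1 - t0
    if dt <= 0:
        return "tap"
    speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
    return "flick" if speed >= FLICK_SPEED_PX_S else "drag"

print(classify([(0.00, 100, 100), (0.01, 140, 100)]))  # fast start -> flick
print(classify([(0.00, 100, 100), (0.10, 110, 100)]))  # slow start -> drag
```

A real engine would also look at acceleration and total duration, as the text notes, and would keep separate thresholds for two- and three-finger gestures.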
FIGS. 4A-4C depict exemplary subsets 402, 404, 406 of the touch controls that can be implemented on the touch screen display 104 by a user of the medical ultrasound imaging equipment 100. It should be noted that any other suitable subset(s) of touch controls can be implemented on the touch screen display 104 as required and/or desired. As shown in FIG. 4A, the subset 402 includes a touch control 408 for performing two-dimensional (2D) mode operations, a touch control 410 for performing gain control operations, a touch control 412 for performing color control operations, and a touch control 414 for performing image/clip freeze/store operations. For example, a user can employ the press gesture 320 to actuate the touch control 408, returning the medical ultrasound imaging equipment 100 to 2D mode. Further, the user can employ the press gesture 320 against one side of the touch control 410 to decrease a gain level, and employ the press gesture 320 against the other side of the touch control 410 to increase the gain level. Moreover, the user can employ the drag gesture 318 on the touch control 412 to identify a range of densities on a 2D image using a predetermined color code. In addition, the user can employ the press gesture 320 to actuate the touch control 414 to freeze/store a still image or to acquire a cine image clip.
As shown in FIG. 4B, the subset 404 includes a touch control 416 for performing split-screen control operations, a touch control 418 for performing PW imaging control operations, a touch control 420 for performing Doppler and 2D beam steering control operations, and a touch control 422 for performing annotation operations. For example, a user can employ the press gesture 320 against the touch control 416 to allow the user to switch between opposite sides of the split touch screen display 104 by alternately employing the tap gesture 302 on each side of the split screen. Further, the user can employ the press gesture 320 to actuate the touch control 418 and enter PW mode, which allows (1) the user to control angle correction, (2) movement (e.g., "up" or "down") of a baseline that can be displayed on the touch screen display 104 by employing the press-and-drag gesture 322, and/or (3) increasing or decreasing of a scale by employing the tap gesture 302 on a scale bar that can be displayed on the touch screen display 104. Moreover, the user can employ the press gesture 320 against one side of the touch control 420 to perform 2D beam steering to the "left," or any other suitable direction, in increments of five (5) or any other suitable increment, and employ the press gesture 320 against the other side of the touch control 420 to perform 2D beam steering to the "right," or any other suitable direction, in increments of five (5) or any other suitable increment. In addition, the user can employ the tap gesture 302 on the touch control 422 to allow the user to enter annotation information via a pop-up keyboard that can be displayed on the touch screen display 104.
As shown in FIG. 4C, the subset 406 includes a touch control 424 for performing dynamic range operations, a touch control 426 for performing Teravision™ software operations, a touch control 428 for performing map operations, and a touch control 430 for performing needle guide operations. For example, a user can employ the press gesture 320 and/or the press-and-drag gesture 322 against the touch control 424 to control or set the dynamic range. Further, the user can employ the tap gesture 302 on the touch control 426 to select a desired level of the Teravision™ software to be executed from memory by the processor on the computer motherboard 106. Moreover, the user can employ the tap gesture 302 on the touch control 428 to perform a desired map operation. In addition, the user can employ the press gesture 320 against the touch control 430 to perform a desired needle guide operation.
In accordance with the present invention, single-point/multi-point gestures can be employed on the surface 105 of the touch screen display 104 of the medical ultrasound imaging equipment 100 (see FIG. 1) to perform various measurements and/or tracings of objects, such as organs, tissues, etc., displayed as ultrasound images on the touch screen display 104. The user can perform such measurements and/or tracings of an object directly on an original ultrasound image of the displayed object, on a magnified version of the ultrasound image of the displayed object, and/or on a magnified portion of the ultrasound image within a virtual window 506 (see FIGS. 5C and 5D) on the touch screen display 104.
FIGS. 5A and 5B depict an original ultrasound image of an exemplary object, namely, a liver 502 having a cystic lesion 504, displayed on the touch screen display 104 of the medical ultrasound imaging equipment 100 (see FIG. 1). It should be noted that such an ultrasound image can be generated by the medical ultrasound imaging equipment 100 in response to ultrasound, generated by an ultrasound probe/transducer operatively connected to the equipment 100, penetrating the liver tissue. The measurements and/or tracings of the liver 502 having the cystic lesion 504 can be performed directly on the original ultrasound image displayed on the touch screen display 104 (see FIGS. 5A and 5B), or on a magnified version of the ultrasound image. For example, the user can obtain such a magnified version of the ultrasound image using a spread gesture, performed by placing two (2) fingers on the surface 105 of the touch screen display 104 and spreading them apart to magnify the original ultrasound image (e.g., see the spread gesture 312 of FIG. 3). The measurements and/or tracings of the liver 502 and the cystic lesion 504 can also be performed on a magnified portion of the ultrasound image within the virtual window 506 (see FIGS. 5C and 5D) on the touch screen display 104.
For example, using his or her finger (e.g., see a finger 508 of FIGS. 5A-5D), the user can obtain the virtual window 506 by employing a press gesture (e.g., see the press gesture 320 of FIG. 3) against the surface 105 of the touch screen display 104 near a region of interest, such as the region corresponding to the cystic lesion 504 (see FIG. 5B). In response to the press gesture, the virtual window 506 (see FIGS. 5C and 5D) is displayed on the touch screen display 104, possibly at least partially superimposed on the original ultrasound image, thereby providing the user with a view of a magnified portion of the liver 502 in the vicinity of the cystic lesion 504. For example, the virtual window 506 of FIG. 5C can provide a view of a magnified portion of the ultrasound image of the cystic lesion 504 that is overlaid by the finger 508 pressed against the surface 105 of the touch screen display 104. To reposition the magnified cystic lesion 504 within the virtual window 506, the user can employ a press-and-drag gesture (e.g., see the press-and-drag gesture 322 of FIG. 3) against the surface 105 of the touch screen display 104 (see FIG. 5D), thereby moving the image of the cystic lesion 504 to a desired position within the virtual window 506. In one embodiment, the medical ultrasound imaging equipment 100 can be configured to allow the user to select a magnification level within the virtual window 506 of 2x, 4x, or any other suitable multiple of the original ultrasound image. The user can remove the virtual window 506 from the touch screen display 104 by lifting his or her finger (e.g., see the finger 508 of FIGS. 5A-5D) from the surface 105 of the touch screen display 104.
FIG. 6A depicts an ultrasound image of another exemplary object, namely, an apical four (4) chamber view of a heart 602, displayed on the touch screen display 104 of the medical ultrasound imaging equipment 100 (see FIG. 1). It should be noted that such an ultrasound image can be generated by the medical ultrasound imaging equipment 100 in response to ultrasound, generated by an ultrasound probe/transducer operatively connected to the equipment 100, penetrating the heart tissue. The measurements and/or tracings of the heart 602 can be performed directly on the original ultrasound image displayed on the touch screen display 104 (see FIGS. 6A-6E), or on a magnified version of the ultrasound image. For example, using his or her fingers (e.g., see fingers 610, 612 of FIGS. 6B-6E), the user can perform a manual tracing of an endocardial border 604 (see FIG. 6B) of a left ventricle 606 (see FIGS. 6B-6E) of the heart 602 by employing one or more multi-finger gestures on the surface 105 of the touch screen display 104. In one embodiment, using his or her fingers (e.g., see the fingers 610, 612 of FIGS. 6B-6E), the user can obtain a cursor 607 (see FIG. 6B) by employing a double-tap gesture on the surface 105 of the touch screen display 104 (e.g., see the double-tap gesture 310 of FIG. 3A), and can move the cursor 607 by employing a drag gesture (e.g., see the drag gesture 318 of FIG. 3A) using a finger, such as the finger 610, thereby moving the cursor 607 to a desired location on the touch screen display 104. The systems and methods described herein can be used for quantitative measurement of heart wall motion and, specifically, for measurement of ventricular dyssynchrony, as described in detail in U.S. Application No. 10/817,316, filed April 2, 2004, the entire contents of which are incorporated herein by reference.
Once the cursor 607 is at the desired location on the touch screen display 104, as determined by the location of the finger 610, the user can fix the cursor 607 at that location by employing a tap gesture (e.g., see the tap gesture 302; see FIG. 3) using another finger, such as the finger 612. To perform a manual tracing of the endocardial border 604 (see FIG. 6B), the user can employ a press-and-drag gesture using the finger 610 (e.g., see the press-and-drag gesture 322 of FIG. 3), as depicted in FIGS. 6C and 6D. Such a manual tracing of the endocardial border 604 can be highlighted on the touch screen display 104 in any suitable manner, such as by a dashed line 608 (see FIGS. 6C-6E). The manual tracing of the endocardial border 604 can continue until the finger 610 arrives at any suitable location on the touch screen display 104, or until the finger 610 returns to the location of the cursor 607, as depicted in FIG. 6E. Once the finger 610 is at the location of the cursor 607, or any other suitable location, the user can complete the manual tracing operation by employing a tap gesture (e.g., see the tap gesture 302; see FIG. 3) using the finger 612. It should be noted that such a manual tracing operation can be employed to trace any other suitable feature(s) and/or waveform(s), such as a pulsed-wave Doppler (PWD) waveform. In one embodiment, the medical ultrasound imaging equipment 100 can be configured to perform any suitable calculation(s) and/or measurement(s) related to such feature(s) and/or waveform(s), based at least in part on the manual tracing(s) of the respective feature(s)/waveform(s).
As described above, the user can perform measurements and/or tracings of a displayed object on a magnified portion of an original ultrasound image within a virtual window on the touch screen display 104. FIGS. 7A-7C depict an original ultrasound image of an exemplary object, namely, a liver 702 having a cystic lesion 704, displayed on the touch screen display 104 of the medical ultrasound imaging equipment 100 (see FIG. 1). FIGS. 7A-7C further depict a virtual window 706 that provides a view of a magnified portion of the ultrasound image of the cystic lesion 704 that is overlaid by one of the user's fingers, such as a finger 710, pressed against the surface 105 of the touch screen display 104. Using his or her fingers (e.g., see fingers 710, 712 of FIGS. 7A-7C), the user can perform a size measurement of the cystic lesion 704 within the virtual window 706 by employing one or more multi-finger gestures on the surface 105 of the touch screen display 104.
For example, using his or her fingers (e.g., see the fingers 710, 712 of FIGS. 7A-7C), the user can obtain a first cursor 707 (see FIGS. 7B, 7C) by employing a double-tap gesture on the surface 105 (e.g., see the double-tap gesture 310 of FIG. 3), and can move the first cursor 707 by employing a drag gesture (e.g., see the drag gesture 318 of FIG. 3) using a finger, such as the finger 710, thereby moving the first cursor 707 to a desired location. Once the first cursor 707 is at the desired location, as determined by the location of the finger 710, the user can fix the first cursor 707 at that location by employing a tap gesture (e.g., see the tap gesture 302; see FIG. 3) using another finger, such as the finger 712. Similarly, the user can obtain a second cursor 709 (see FIG. 7C) by employing a double-tap gesture on the surface 105 (e.g., see the double-tap gesture 310 of FIG. 3), and can move the second cursor 709 by employing a drag gesture (e.g., see the drag gesture 318 of FIG. 3) using the finger 710, thereby moving the second cursor 709 to a desired location. Once the second cursor 709 is at the desired location, as determined by the location of the finger 710, the user can fix the second cursor 709 at that location by employing a tap gesture (e.g., see the tap gesture 302; see FIG. 3) using the finger 712. In one embodiment, the medical ultrasound imaging equipment 100 can be configured to perform any suitable size calculation(s) and/or measurement(s) related to the cystic lesion 704, based at least in part on the locations of the first cursor 707 and the second cursor 709.
FIGS. 8A-8C depict an original ultrasound image of an exemplary object, namely, a liver 802 having a cystic lesion 804, displayed on the touch screen display 104 of the medical ultrasound imaging equipment 100 (see FIG. 1). FIGS. 8A-8C further depict a virtual window 806 that provides a view of a magnified portion of the ultrasound image of the cystic lesion 804 that is overlaid by one of the user's fingers, such as a finger 810, pressed against the surface 105 of the touch screen display 104. Using his or her fingers (e.g., see fingers 810, 812 of FIGS. 8A-8C), the user can perform a caliper measurement of the cystic lesion 804 within the virtual window 806 by employing one or more multi-finger gestures on the surface 105 of the touch screen display 104.
For example, using his or her fingers (e.g., see the fingers 810, 812 of FIGS. 8A-8C), the user can obtain a first cursor 807 (see FIGS. 8B, 8C) by employing a double-tap gesture on the surface 105 (e.g., see the double-tap gesture 310 of FIG. 3), and can move the cursor 807 by employing a drag gesture (e.g., see the drag gesture 318 of FIG. 3) using a finger, such as the finger 810, thereby moving the cursor 807 to a desired location. Once the cursor 807 is at the desired location, as determined by the location of the finger 810, the user can fix the cursor 807 at that location by employing a tap gesture (e.g., see the tap gesture 302; see FIG. 3) using another finger, such as the finger 812. Next, the user can employ a press-and-drag gesture (e.g., see the press-and-drag gesture 322 of FIG. 3) to obtain a connecting line 811 (see FIGS. 8B, 8C) and to extend the connecting line 811 from the first cursor 807 across the cystic lesion 804 to a desired location on the other side of the cystic lesion 804. Once the connecting line 811 extends across the cystic lesion 804 to the desired location on its other side, the user can employ a tap gesture (e.g., see the tap gesture 302; see FIG. 3) using the finger 812 to obtain a second cursor 809 (see FIG. 8C) and fix it at the desired location. In one embodiment, the medical ultrasound imaging equipment 100 can be configured to perform any suitable caliper calculation(s) and/or measurement(s) related to the cystic lesion 804, based at least in part on the connecting line 811 extending between the locations of the first cursor 807 and the second cursor 809.
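Once the two cursors are fixed, the caliper measurement reduces to the length of the connecting line scaled from display pixels to physical units. The sketch below is illustrative only; the function name and the 0.1 mm-per-pixel scale are assumptions, since the actual pixel pitch depends on the active depth and zoom settings.

```python
# Illustrative caliper computation between two fixed cursors, with an assumed
# pixel-to-millimeter scale factor.
import math

def caliper_mm(cursor_a, cursor_b, mm_per_px=0.1):
    """cursor_a, cursor_b: (x, y) pixel positions of the two fixed cursors."""
    dx = cursor_b[0] - cursor_a[0]
    dy = cursor_b[1] - cursor_a[1]
    return math.hypot(dx, dy) * mm_per_px

print(caliper_mm((120, 200), (180, 280)))  # 10.0 (mm)
```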
FIG. 9A shows a system 140 in which a transducer housing 150 having an array of transducer elements 152 can be attached to the housing 102 at the connector 114. Each probe 150 can have a probe identification circuit 154 that uniquely identifies the attached probe. When the user inserts a different probe having a different array, the system identifies the operating parameters of that probe. Note that preferred embodiments can include a display 104 having a touch sensor 107, which can be connected to a touch processor 109 that analyzes the touch screen data from the sensor 107 and transmits commands both to the image processing operation (1124 as shown in FIG. 11) and to a beamformer control processor (1116 as shown in FIG. 11). In a preferred embodiment, the touch processor can include a computer-readable medium storing instructions for operating an ultrasound touch screen engine operable to control the display and imaging operations described herein.
FIG. 9B shows a software flow diagram 900 of a typical transducer management module 902 within the ultrasound application. When a TRANSDUCER ATTACH event 904 is detected, the transducer management software module 902 first reads the transducer type ID 906 and hardware version information from the IDENTIFICATION segment. This information is used to retrieve the specific transducer profile data set 908 from the hard disk and load it into the application's memory. The software then reads the adjustment data 910 from the FACTORY segment and applies the adjustments to the profile data just loaded into memory 912. The software module then sends a transducer attach message 914 to the main ultrasound application, which uses the loaded transducer profile. After confirmation 916, an ultrasound imaging sequence is executed and the USAGE segment is updated 918. The transducer management software module then waits for a TRANSDUCER DETACH event 920 or the passage of 5 minutes. If a transducer detach event is detected 921, a message 924 is sent and confirmed 926, the transducer profile data set is removed 928 from memory, and the module returns to waiting for another transducer attach event. If a 5-minute period expires without a transducer detach event being detected, the software module increments a cumulative usage counter in the USAGE segment 922 and waits for another 5-minute period or a transducer detach event. The cumulative usage is recorded in memory for maintenance and replacement records.
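The attach/usage/detach loop of FIG. 9B can be restated schematically as an event loop. This is a simplified sketch under stated assumptions: the event tuples, profile dictionary, and "tick" stand-in for the 5-minute timer are all hypothetical, not the module's real interfaces.

```python
# Schematic restatement of the FIG. 9B flow: attach loads the profile and
# applies factory adjustments; each elapsed 5-minute "tick" increments the
# cumulative usage counter; detach removes the profile from memory.
def manage_transducer(events, profiles):
    """events: sequence of ("attach", type_id) / ("tick",) / ("detach",) items.

    Returns the cumulative usage count recorded for maintenance records."""
    loaded = None
    usage = 0
    for event in events:
        if event[0] == "attach":
            loaded = dict(profiles[event[1]])            # read ID, load profile
            loaded.update(loaded.pop("factory_adjustments", {}))
        elif event[0] == "tick" and loaded is not None:
            usage += 1                                   # 5-minute period elapsed
        elif event[0] == "detach":
            loaded = None                                # remove profile from memory
    return usage

profiles = {"L10-5": {"center_mhz": 7.5, "factory_adjustments": {"gain_db": 2}}}
events = [("attach", "L10-5"), ("tick",), ("tick",), ("detach",)]
print(manage_transducer(events, profiles))  # 2
```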
There are many types of ultrasound transducers. They differ in geometry, number of elements, and frequency response. For example, a linear array with a center frequency of 10 MHz to 15 MHz is better suited for breast imaging, and a curved array with a center frequency of 3 MHz to 5 MHz is better suited for abdominal imaging.
It is often necessary to use different types of transducers for the same or different ultrasound scanning sessions. For an ultrasound system with only one transducer connection, the operator must change the transducer before starting a new scanning session.
In some applications, it is necessary to switch between different types of transducers during an ultrasound scanning session. In this case, it is convenient to have multiple transducers connected to the same ultrasound system, so that the operator can quickly switch among the connected transducers by tapping a button on the operator console, without physically detaching and re-attaching transducers, which takes longer. Preferred embodiments of the present invention can include a multiplexer within the tablet housing that can select among a plurality of probe connector ports within the tablet housing, or alternatively, the tablet housing can be connected to an external multiplexer that can be mounted on a cart as described herein.
FIG. 9C is a perspective view of an exemplary needle-sensing positioning system using ultrasound transducers without any active electronics in the sensor assembly. The sensor transducers can include passive ultrasound transducer elements. These elements can be used in a manner similar to a typical transducer probe utilizing the ultrasound engine electronics. The system 958 includes the addition of ultrasound transducer elements 960 to a needle guide 962, which is represented in FIG. 9C but can be of any suitable form factor. The ultrasound transducer elements 960 and the needle guide 962 can be mounted to an ultrasound transducer probe acoustic grip or an ultrasound imaging probe assembly 970 using a needle guide mounting bracket 966. The needle, having a disk (ultrasound reflector disk 964) mounted on its exposed end, is reflective to ultrasound.
The ultrasound transducer elements 960 on the needle guide 962 can be connected to the ultrasound engine. The connection can be made through a separate cable to a dedicated probe connector on the engine, similar to a shared pencil-style CW probe connector. In an alternative embodiment, a small short cable can plug into a larger imaging transducer probe grip, or a split cable can connect to the same probe connector at the engine. In another alternative embodiment, the connection can be made via an electrical connector between the imaging probe grip and the needle guide, without a cable between them. In a further alternative embodiment, the ultrasound transducer elements on the needle guide can be connected to the ultrasound engine by enclosing the needle guide and the transducer elements in the same mechanical enclosure as the imaging probe grip.
FIG. 9D is a perspective view of a needle guide 962 positioned together with the transducer elements 960 and the ultrasound reflector disk 964. The position of the reflector disk 964 is located by transmitting ultrasound 972 from the transducer elements 960 on the needle guide 962. The ultrasound 972 travels through the air toward the reflector disk 964 and is reflected by the reflector disk 964. The reflected ultrasound 974 arrives at the transducer elements 960 on the needle guide 962. The distance 976 between the reflector disk 964 and the transducer elements 960 is calculated from the elapsed time and the speed of sound in air.
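The distance calculation described above is a round-trip time-of-flight computation: the pulse travels to the reflector disk and back, so the one-way distance is half the elapsed time multiplied by the speed of sound in air. The sketch below is illustrative; the ~343 m/s value assumes room-temperature air.

```python
# Round-trip time-of-flight: distance 976 between disk 964 and elements 960.
SPEED_OF_SOUND_AIR_M_S = 343.0  # assumed room-temperature value

def reflector_distance_mm(elapsed_s):
    """One-way distance in mm from the round-trip echo time in seconds."""
    return 0.5 * elapsed_s * SPEED_OF_SOUND_AIR_M_S * 1000.0

# A ~0.29 ms round trip corresponds to a reflector about 50 mm away.
print(round(reflector_distance_mm(291.5e-6), 1))  # 50.0
```

In practice the speed of sound in air varies with temperature, so a calibrated or temperature-compensated value would be used.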
FIG. 9E is a perspective view of an alternative embodiment of an exemplary needle-sensing positioning system using ultrasound transducers without any active electronics in the sensor assembly. The sensor transducers can include passive ultrasound transducer elements. These elements can be used in a manner similar to a typical transducer probe utilizing the ultrasound engine electronics.
The system 986 includes the needle guide 962, which can be mounted to a needle guide mounting bracket 966 that can be coupled to an ultrasound imaging probe assembly 982 for imaging the patient's body, or can be of an alternative suitable form factor. The ultrasound reflector disk 964 can be mounted at the exposed end of the needle 956. In this embodiment, a linear ultrasound acoustic array 978 is mounted parallel to the direction of movement of the needle 956. The linear ultrasound acoustic array 978 includes an ultrasound transducer array 980 positioned parallel to the needle 956. In this embodiment, an ultrasound imaging probe assembly 982 is positioned for imaging the patient's body. The ultrasound imaging probe assembly 982 for imaging the patient's body is configured with an ultrasound transducer array 984.
In this embodiment, the position of the ultrasound reflector disk 964 can be detected by using the ultrasound transducer array 980 coupled to the ultrasound imaging probe assembly 978 used for imaging. The position of the reflector disk 964 is located by transmitting ultrasound 972 from the transducer elements 980 on the ultrasound imaging probe assembly 978 used for imaging. The ultrasound 972 travels through the air toward the reflector disk 964 and is reflected by the reflector disk 964. The reflected ultrasound 974 arrives at the transducer elements 980 on the ultrasound imaging probe assembly 978 used for imaging. The distance 976 between the reflector disk 964 and the transducer elements 980 is calculated from the elapsed time and the speed of sound in air. In an alternative embodiment, an alternating algorithm can be used to sequentially scan the polarity of the elements in the transducer array and analyze the reflections produced by each transducer array element. In an alternative embodiment, a plurality of scans can occur before an ultrasound image is formed.
FIG. 9F illustrates a system 140 similar to that shown in FIG. 9A and configured to receive a subscriber identity module (SIM) card for wireless communication. In this particular embodiment, communication circuitry 118 is connected to the computing circuitry 106, and a SIM card port 119 is configured to receive a SIM card 120 and connect the SIM card 120 to the communication circuitry 118 via a number of conductive contacts. In some embodiments, the ultrasound device can be configured with a SIM card port 119 capable of receiving a standard SIM card, mini-SIM card, micro-SIM card, nano-SIM card, embedded SIM card, or other similar wireless identification/authorization card or circuit. The system incorporates a SIM card interface circuit 118, such as a SIM card interface circuit available from NXP Semiconductors N.V. of Eindhoven, The Netherlands, which can include electromagnetic interference (EMI) filtering and electrostatic discharge (ESD) protection features. The identity card incorporates an identification circuit, typically an integrated circuit embedded in a plastic card or substrate, that includes a memory device storing the international mobile subscriber identity (IMSI) and a key for identifying and authenticating the subscriber to a mobile wireless network, such as a 3G or 4G communication network.
FIG. 10A illustrates an exemplary method for monitoring the synchronization of a heart in accordance with an exemplary embodiment. In the method, a reference template is loaded into memory and used to guide a user in identifying an imaging plane (per step 930). Next, a user identifies a desired imaging plane (per step 932). An apical 4-chamber view of the heart is typically used; however, other views can be used without departing from the spirit of the invention.
At times, identifying the endocardial border can be difficult, and when such difficulties are encountered, tissue Doppler imaging of the same view can be employed (per step 934). A reference template for identifying the septal and lateral free walls is provided (per step 936). Standard tissue Doppler imaging (TDI) with a preset velocity scale of, for example, ±30 cm/sec can then be used (per step 938).
Next, a reference of the desired triplex image can be provided (per step 940). B-mode or TDI can be used to guide the range gates (per step 942); B-mode can be used to guide the range gate (per step 944), or TDI can be used to guide the range gate (per step 946). Using TDI or B-mode to guide the range gates also allows the use of a direction-correction angle to allow spectral Doppler to display the radial mean velocity of the septal wall. A first pulsed-wave spectral Doppler is then used to measure the septal wall mean velocity using duplex or triplex mode (per step 948). The software for processing the data and computing dyssynchrony can utilize a location (e.g., a center point) to automatically set an angle between the dated locations on a heart wall to help simplify the setting of parameters.
A duplex image or TDI is also used to guide a second range gate position (per step 950), and a direction-correction angle can be used if needed. After step 950, the mean velocities of the septal wall and the lateral free wall are tracked by the system. Then, the time integration 952 of the spectral Doppler mean velocities at the regions of interest (e.g., the septal wall and the left ventricular free wall) provides the displacements of the septal wall and the left free wall, respectively.
The above method steps can be utilized in conjunction with high-pass filtering means (analog or digital), known in the related art, for removing any baseline interference present in the collected signals. Furthermore, the disclosed method employs multiple simultaneous PW spectral Doppler lines for tracking the movement of the interventricular septum and the left ventricular free wall. In addition, a multi-gate structure can be employed along each spectral line, thereby allowing quantitative measurement of regional wall motion. Averaging over multiple gates can allow measurement of global wall movement.
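The time-integration step 952 above can be sketched numerically: integrating a sampled mean-velocity trace yields wall displacement. Trapezoidal integration is one straightforward choice here, assumed for illustration; the sample values and sampling interval below are invented for the sketch.

```python
# Illustrative time integration of a spectral Doppler mean-velocity trace
# (cm/s) into wall displacement (cm), using the trapezoidal rule.
def displacement_cm(velocities_cm_s, dt_s):
    """Trapezoidal time integral of a uniformly sampled velocity trace."""
    total = 0.0
    for v0, v1 in zip(velocities_cm_s, velocities_cm_s[1:]):
        total += 0.5 * (v0 + v1) * dt_s
    return total

septal_velocity = [0.0, 2.0, 4.0, 2.0, 0.0]    # cm/s, illustrative half-cycle
print(displacement_cm(septal_velocity, 0.05))  # 0.4
```

Computing the same integral for the septal and lateral free wall traces and comparing the two displacement curves over the cardiac cycle gives the dyssynchrony measure the method is after.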
FIG. 10B is a detailed schematic block diagram of an exemplary embodiment of a system 1000 having an integrated ultrasound probe 1040 that can be connected to any personal computer (PC) 1010 through an interface unit 1020. The ultrasound probe 1040 is configured to transmit ultrasound and receive ultrasound reflected from one or more image targets 1064. The transducer 1040 can be coupled to the interface unit 1020 using one or more cables 1066, 1068. The interface unit 1020 can be positioned between the integrated ultrasound probe 1040 and the host computer 1010. The two-stage beamforming system 1040 and 1020 can be connected to any PC through a USB connection 1022, 1012.
The ultrasound probe 1040 can include sub-arrays/apertures 1052 composed of adjacent elements having an aperture smaller than that of the entire array. The returned echoes are received by the 1D transducer array 1062 and transmitted to the controller 1044. The controller initiates the formation of a coarse beam by transmitting signals to the memories 1058, 1046. The memories 1058, 1046 transmit a signal to a transmit driver 1 1050 and a transmit driver m 1054. The transmit driver 1 1050 and the transmit driver m 1054 then send signals to multiplexer 1 1048 and multiplexer m 1056, respectively. The signal is transmitted to sub-array beamformer 1 1052 and sub-array beamformer n 1060.
The output of each coarse beamforming operation can be further processed through a second-stage beamforming in the interface unit 1020 to convert the beamformed output into a digital representation. The coarse beamforming operations can be coherently summed to form a fine beam output for the array. Signals can be transmitted from the sub-array beamformer 1 1052 and the sub-array beamformer n 1060 of the ultrasound probe 1040 to A/D converters 1030 and 1028 in the interface unit 1020. Within the interface unit 1020 are the A/D converters 1028, 1030 for converting the first-stage beamforming output into a digital representation. The digital conversions can be received from the A/D converters 1030, 1028 by a custom application-specific integrated circuit (ASIC), such as a field-programmable gate array (FPGA) 1026, to complete the second-stage beamforming. The FPGA digital beamforming 1026 can transmit information to the system controller 1024. The system controller can transmit information to a memory 1032, which can send a signal back to the FPGA digital beamforming 1026. Alternatively, the system controller 1024 can transmit information to the custom USB3 chipset 1022. The USB3 chipset 1022 can then transmit information to a DC-DC converter 1034. In turn, the DC-DC converter 1034 can transmit power from the interface unit 1020 to the ultrasound probe 1040. Within the ultrasound probe 1040, a power supply 1042 can receive the power signal and interface with the transmit driver 1 1050 to provide power to the front-end integrated probe.
The custom or USB3 chipset 1022 of the interface unit 1020 can be used to provide a communication link between the interface unit 1020 and the host computer 1010. The custom or USB3 chipset 1022 transmits a signal to the custom or USB3 chipset 1012 of the host computer 1010. The custom or USB3 chipset 1012 then interfaces with the microprocessor 1014. The microprocessor 1014 can then display information or send information to a device 1075.
In an alternative embodiment, a narrowband beamformer can be used. For example, an individual analog phase shifter is applied to each of the received echoes. The phase-shifted outputs within each sub-array are then summed to form a coarse beam. A/D converters can be used to digitize each of the coarse beams; a digital beamformer is then used to form the fine beams.
In another embodiment, a 64-element linear array can be formed using eight neighboring elements to form one coarse beam output. This configuration can utilize eight output analog cables connecting the output of the integrated probe to the interface unit. The coarse beams can be sent through the cables to corresponding A/D converters located in the interface unit. Digital delays are used to form a fine beam output. Eight A/D converters may be needed to form the digital representation.
In another embodiment, a 128-element array can be formed using sixteen sub-array beamforming circuits. Each circuit can form, from a neighboring eight-element array, one coarse beam provided in the first-stage output to the interface unit. This configuration can utilize sixteen output analog cables connecting the output of the integrated probe to the interface unit for digitizing the outputs. A PC microprocessor or a DSP can be used to perform down-conversion, base-banding, scan conversion, and post-image-processing functions. The microprocessor or the DSP can also be used to perform all Doppler processing functions.
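The two-stage scheme described above can be sketched numerically: a 64-element array is split into eight 8-element sub-arrays, each delay-and-sum reduced to one coarse beam in the probe, and the interface unit then applies fine delays and sums the coarse beams. This is a minimal sketch; integer sample delays and the function names are simplifying assumptions, not the hardware's actual delay scheme.

```python
# Minimal numeric sketch of two-stage (coarse + fine) delay-and-sum beamforming.
import numpy as np

def coarse_beams(channels, coarse_delays, sub=8):
    """channels: (64, n_samples) echo data; coarse_delays: per-element delays.

    Returns (8, n_samples): one first-stage coarse beam per 8-element sub-array."""
    beams = []
    for g in range(channels.shape[0] // sub):
        acc = np.zeros(channels.shape[1])
        for e in range(sub):
            i = g * sub + e
            acc += np.roll(channels[i], coarse_delays[i])  # first-stage delay
        beams.append(acc)
    return np.stack(beams)

def fine_beam(beams, fine_delays):
    """Second-stage: fine-delay and sum the coarse beams into one output line."""
    return sum(np.roll(b, d) for b, d in zip(beams, fine_delays))

rng = np.random.default_rng(0)
echoes = rng.standard_normal((64, 256))
coarse = coarse_beams(echoes, np.zeros(64, dtype=int))
out = fine_beam(coarse, np.zeros(8, dtype=int))
print(out.shape)  # (256,)
```

With all delays zero the output equals the plain sum over the 64 channels, which is a convenient sanity check; nonzero delay profiles steer and focus the beam.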
FIG. 10C is a detailed schematic block diagram of an exemplary embodiment of a system 1080 having an integrated ultrasound probe 1040 with first-stage sub-array beamforming circuits, with the second-stage beamforming circuits integrated inside the host computer 1082. The back-end computer with the second-stage beamforming circuits can be a PDA, tablet, or mobile device housing. The ultrasound probe 1040 is configured to transmit ultrasound and receive ultrasound reflected from one or more image targets 1064. The transducer 1040 is coupled to the host computer 1082 using one or more cables 1066, 1068. Note that the A/D circuit elements can also be placed in the transducer probe housing.
The ultrasound probe 1040 includes sub-arrays/apertures 1052 composed of adjacent elements having an aperture smaller than that of the entire array. The returned echoes are received by the 1D transducer array 1062 and transmitted to the controller 1044. The controller initiates the formation of a coarse beam by transmitting signals to the memories 1058, 1046. The memories 1058, 1046 transmit a signal to a transmit driver 1 1050 and a transmit driver m 1054. The transmit driver 1 1050 and the transmit driver m 1054 then send signals to multiplexer 1 1048 and multiplexer m 1056, respectively. The signal is transmitted to sub-array beamformer 1 1052 and sub-array beamformer n 1060.
The output of each coarse beamforming operation then passes through a second-stage beamforming in the host computer 1082 to convert the beamformed output into a digital representation. The coarse beamforming operations can be coherently summed to form a fine beam output for the array. Signals are transmitted from the sub-array beamformer 1 1052 and the sub-array beamformer n 1060 of the ultrasound probe 1040 to A/D converters 1030 and 1028 in the host computer 1082. Within the host computer 1082 are the A/D converters 1028, 1030 for converting the first-stage beamforming output into a digital representation. The digital conversions can be received from the A/D converters 1030, 1028 by a custom ASIC, such as an FPGA 1026, to complete the second-stage beamforming. The FPGA digital beamforming 1026 transmits information to the system controller 1024. The system controller transmits information to a memory 1032, which can send a signal back to the FPGA digital beamforming 1026. Alternatively, the system controller 1024 can transmit information to the custom USB3 chipset 1022. The USB3 chipset 1022 can then transmit information to a DC-DC converter 1034. In turn, the DC-DC converter 1034 can transmit power to the ultrasound probe 1040. Within the ultrasound probe 1040, a power supply 1042 can receive the power signal and interface with the transmit driver 1 1050 to provide power to the front-end integrated probe. The power supply can include a battery enabling wireless operation of the transducer assembly. A wireless transceiver can be integrated in the controller circuit or in a separate communication circuit to enable wireless transfer of image data and control signals.
The custom or USB3 chipset 1022 of the host computer 1082 can be used to provide a communication link with the custom or USB3 chipset 1012 to transmit a signal to the microprocessor 1014. The microprocessor 1014 can then display information or send information to a device 1075.
FIG. 11 is a detailed schematic block diagram of an exemplary embodiment of the ultrasound engine 108 (i.e., the front-end ultrasound-specific circuitry) and an exemplary embodiment of the computer motherboard 106 (i.e., the host computer) of the ultrasound device depicted in FIGS. 1 and 2A. Components of the ultrasound engine 108 and/or the computer motherboard 106 can be implemented in application-specific integrated circuits (ASICs). Exemplary ASICs have a high channel count and, in some exemplary embodiments, can package 32 or more channels per chip. One of ordinary skill in the art will recognize that the ultrasound engine 108 and the computer motherboard 106 can include more or fewer modules than those shown. For example, the ultrasound engine 108 and the computer motherboard 106 can include the modules shown in FIG. 17.
A transducer array 152 is configured to transmit ultrasound to one or more image targets 1102 and to receive ultrasound reflected from the one or more image targets 1102. The transducer array 152 is coupled to the ultrasound engine 108 using one or more cables 1104.
The ultrasound engine 108 includes a high-voltage transmit/receive (TR) module 1106 for applying drive signals to the transducer array 152 and for receiving the return echo signals from the transducer array 152. The ultrasound engine 108 includes a preamp/TGC module 1108 for amplifying the return echo signals and applying a suitable time-gain compensation (TGC) function to the signals. The ultrasound engine 108 includes a sampled-data beamformer 1110 that applies the delay coefficients used in each channel after the return echo signals have been amplified and processed by the preamp/TGC module 1108.
In some exemplary embodiments, the high-voltage TR module 1106, the preamp/TGC module 1108, and the sample-interpolate receive beamformer 1110 can each be a silicon chip having 8 to 64 channels per chip, although exemplary embodiments are not limited to this range. In certain embodiments, the high-voltage TR module 1106, the preamp/TGC module 1108, and the sample-interpolate receive beamformer 1110 can each be a silicon chip having 8, 16, 32, or 64 channels, and the like. As depicted in FIG. 11, an exemplary TR module 1106, an exemplary preamp/TGC module 1108, and an exemplary beamformer 1110 can each take the form of a silicon chip comprising 32 channels.
The ultrasound engine 108 includes a first-in, first-out (FIFO) buffer module 1112 that is used to buffer the processed data output by the beamformer 1110. The ultrasound engine 108 also includes a memory 1114 for storing program instructions and data, and a system controller 1116 for controlling the operations of the ultrasound engine modules.
The ultrasound engine 108 interfaces with the computer motherboard 106 via a communication link 112, which can follow a standard high-speed communication protocol, such as the Firewire (IEEE 1394 standard serial interface) protocol or a fast (e.g., 200-400 megabits/second or faster) Universal Serial Bus (USB 2.0, USB 3.0) protocol. The standard communication link to the computer motherboard operates at 400 megabits/second or higher, preferably at 800 megabits/second or higher. Alternatively, the link 112 can be a wireless connection, such as an infrared (IR) link. The ultrasound engine 108 includes a communication chipset 1118 (e.g., a Firewire chipset) that establishes and maintains the communication link 112.
Similarly, the computer motherboard 106 also includes a communication chipset 1120 (e.g., a Firewire chipset) that establishes and maintains the communication link 112. The computer motherboard 106 includes a core computer-readable memory 1122 for storing data and/or computer-executable instructions for performing ultrasound imaging operations. The memory 1122 forms the main memory of the computer and, in an exemplary embodiment, can store about 4 GB of DDR3 memory. The computer motherboard 106 also includes a microprocessor 1124 for executing the computer-executable instructions stored on the core computer-readable memory 1122 for performing ultrasound imaging processing operations. An exemplary microprocessor 1124 can be an off-the-shelf commercial computer processor, such as an Intel Core i5 processor. Another exemplary microprocessor 1124 can be a digital signal processor (DSP)-based processor, such as one or more DaVinci™ processors from Texas Instruments. The computer motherboard 106 also includes a display controller 1126 for controlling a display device that can be used to display ultrasound data, scans, and maps.
Exemplary operations performed by the microprocessor 1124 include, but are not limited to: down-conversion (for generating I, Q samples from the received ultrasound data), scan conversion (for converting the ultrasound data into a display format of a display device), Doppler processing (for determining and/or imaging movement and/or flow information from the ultrasound data), color flow processing (for generating, using autocorrelation in one embodiment, a color-coded map of Doppler shifts superimposed on a B-mode ultrasound image), power Doppler processing (for determining power Doppler data and/or generating a power Doppler map), spectral Doppler processing (for determining spectral Doppler data and/or generating a spectral Doppler map), and post-signal processing. These operations are described in further detail in WO 03/079038 A2, filed March 11, 2003, entitled "Ultrasound Probe with Integrated Electronics," the entire contents of which are expressly incorporated herein by reference.
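The down-conversion operation listed above can be sketched as mixing the received RF line with a complex exponential at the transmit center frequency and low-pass filtering to obtain I/Q samples. This is a hedged sketch, not the product implementation: the sampling rate, center frequency, and the crude moving-average filter are all assumed for illustration.

```python
# Illustrative I/Q down-conversion: mix the RF line to baseband, then
# low-pass filter (a simple moving average here) to keep the I/Q envelope.
import numpy as np

def to_iq(rf, fs_hz, f0_hz, taps=16):
    n = np.arange(rf.size)
    mixed = rf * np.exp(-2j * np.pi * f0_hz * n / fs_hz)  # shift f0 to baseband
    kernel = np.ones(taps) / taps                          # crude low-pass
    return np.convolve(mixed, kernel, mode="same")

fs, f0 = 40e6, 5e6                       # assumed: 40 MHz sampling, 5 MHz center
t = np.arange(1024) / fs
rf = np.cos(2 * np.pi * f0 * t)          # ideal echo at the center frequency
iq = to_iq(rf, fs, f0)
print(np.round(np.abs(iq[100:900]).mean(), 2))  # 0.5 (baseband amplitude)
```

A real pipeline would use a properly designed filter and decimation, but the structure, mix then low-pass, is the essence of generating the I and Q samples that the Doppler and B-mode stages consume.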
為達成一較小及較輕的可攜式超聲波器件,超聲波引擎108包含提供該超聲波引擎108之一電路板之整體封裝尺寸及佔據面積之減小。為此目的,例示性實施例提供最小化整體封裝尺寸及佔據面積同時提供一高通道數之一小且輕的可攜式超聲波器件。在一些實施例中,一例示性超聲波引擎之一高通道數電路板可包含一或多個多晶片模組,其中各晶片提供多個通道(例如,32個通道)。如本文中使用之術語「多晶片模組」係指一電子封裝,其中將多個積體電路(IC)封裝至一統一基板中,從而促進其等用作為一單一組件(即,作為一較大IC)。一多晶片模組可用於一例示性電路板中以啟用整合於一高密度互連(HDI)基板上之兩個或兩個以上主動IC組件以減小整體封裝尺寸。在一例示性實施例中,可藉由垂直堆疊一超聲波引擎之一傳輸/接收(TR)矽晶片、一放大器矽晶片及一波束成形器矽晶片而組裝一多晶片模組。超聲波引擎之一單一電路板可包含此等多晶片模組之一或多者以提供一高通道數,同時最小化該電路板之整體封裝尺寸及佔據面積。
圖12描繪包含組裝成一垂直堆疊組態之一多晶片模組之一電路板1200之一部分之一示意性側視圖。主動電子積體電路組件之兩個或兩個以上層係垂直整合於一單一電路中。IC層定向於在一垂直堆疊組態中實質上彼此平行而延伸之間隔平面中。在圖12中,電路板包含用於支撐多晶片模組之一HDI基板1202。包含(例如)一第一波束成形器器件之一第一積體電路晶片1204使用任何合適耦合機構(例如,環氧樹脂施用及固化)耦合至基板1202。一第一間隔層1206使用(例如)環氧樹脂施用及固化耦合至該第一積體電路晶片1204中與基板1202相對之表面。具有(例如)一第二波束成形器器件之一第二積體電路晶片1208使用(例如)環氧樹脂施用及固化耦合至第一間隔層1206中與第一積體電路晶片1204相對之表面。提供用於積體電路晶片之間之機械及/或電連接之一金屬框架1210。一例示性金屬框架1210可呈一引線框之形式。第一積體電路晶片1204可使用配線1212耦合至該金屬框架1210。第二積體電路晶片1208可使用配線1214耦合至相同金屬框架1210。提供一封裝1216以囊封多晶片模組總成且將多個積體電路晶片維持於相對於彼此實質上平行之配置中。
如圖12中所繪示,第一積體電路晶片1204、第一間隔層1206及第二積體電路晶片1208之垂直三維堆疊提供電路板上之高密度功能性,同時最小化整體封裝尺寸及佔據面積(相較於並不採用一垂直堆疊之多晶片模組之一超聲波引擎電路板)。一般技術者將認知,一例示性多晶片模組並不限於兩個堆疊的積體電路晶片。垂直整合於一多晶片模組中之晶片之例示性數目可包含(但不限於):兩個、三個、四個、五個、六個、七個、八個及類似者。
在一超聲波引擎電路板之一實施例中，提供如圖12中所繪示之一單一多晶片模組。在其他實施例中，可提供如圖12中所繪示之複數個多晶片模組。在一例示性實施例中，複數個多晶片模組(例如，兩個多晶片模組)可在一超聲波引擎之一電路板上垂直堆疊於彼此之頂部上以進一步最小化該電路板之封裝尺寸及佔據面積。
除了需要減小佔據面積之外,亦需要降低多晶片模組中之整體封裝高度。例示性實施例可採用薄化至次數百微米之晶圓以減小多晶片模組中之封裝高度。
任何合適技術可用於組裝一多晶片模組於一基板上。例示性組裝技術包含(但不限於):積層MCM (MCM-L),其中基板係一多層積層印刷電路板;沈積MCM (MCM-D),其中多晶片模組係使用薄膜技術沈積於基底基板上;及陶瓷基板MCM (MCM-C),其中若干導電層係沈積於一陶瓷基板上及嵌入於玻璃層(其中在高溫(HTCC)或低溫(LTCC)下共燒該等層)中。
圖13係用於製造包含組裝成一垂直堆疊組態之一多晶片模組之一電路板之一例示性方法之一流程圖。在步驟1302中,製造或提供一HDI基板。在步驟1304中,提供一金屬框架(例如,引線框)。在步驟1306中,使用(例如)環氧樹脂施用及固化將一第一IC層耦合或接合至基板。第一IC層經導線接合至金屬框架。在步驟1308中,使用(例如)環氧樹脂施用及固化將一間隔層耦合至第一IC層,使得該等層垂直堆疊且實質上彼此平行而延伸。在步驟1310中,使用(例如)環氧樹脂施用及固化將一第二IC層耦合至間隔層,使得全部該等層垂直堆疊且實質上彼此平行而延伸。第二IC層經導線接合至金屬框架。在步驟1312中,一封裝係用於囊封多晶片模組總成。
一多晶片模組中之例示性晶片層可使用任何合適技術耦合至彼此。例如,在圖12中所繪示之實施例中,可在晶片層之間提供間隔層以間隔分離該等晶片層。鈍化矽層、晶粒附著膏層及/或晶粒附著膜層可用作間隔層。可用於製造一多晶片模組之例示性間隔件技術係進一步描述於(2008年5月27日至30日)在美國的弗羅里達州舉行的第58次電子組件及技術會議(Electronic Components and Technology Conference) (ECTC2008)之Toh CH等人之「Die Attach Adhesives for 3D Same-Sized Dies Stacked Packages」,第1538至1543頁中,該案之全部內容以引用的方式明確併入本文中。
對晶粒附著(DA)膏或膜之重要需求係對於鄰近晶粒之鈍化材料之極佳黏著性。又,對於一大晶粒應用需要一均勻接合鏈厚度(BLT)。此外,在高溫及低吸濕性下之高凝聚強度對於可靠性係較佳。
圖14A至圖14C係可根據例示性實施例使用之包含垂直堆疊晶粒之例示性多晶片模組之示意性側視圖。周邊及中心墊導線接合(WB)封裝兩者皆經繪示且可用於導線接合一多晶片模組中之例示性晶片層。圖14A係包含四個垂直堆疊晶粒之一多晶片模組之一示意性側視圖,其中該等晶粒藉由具有一2合1切割晶粒附著膜(D-DAF)之一鈍化矽層彼此間隔分離。圖14B係包含四個垂直堆疊晶粒之一多晶片模組之一示意性側視圖,其中該等晶粒藉由作為晶粒至晶粒間隔件之基於DA膜之黏著劑彼此間隔分離。圖14C係包含四個垂直堆疊晶粒之一多晶片模組之一示意性側視圖,其中該等晶粒藉由作為晶粒至晶粒間隔件之基於DA膏或膜之黏著劑間隔分離。在一些例示性實施例中該等基於DA膏或膜之黏著劑可具有導線穿透能力。在圖14C之例示性多晶片模組中,膜包線(FOW)係用於容許長導線接合及中心接合墊堆疊之晶粒封裝。FOW採用具有容許相同或類似尺寸之導線接合晶粒在無鈍化矽層之情況下直接堆疊於彼此之頂部上之導線穿透能力之一晶粒附著膜。此解決使相同或類似尺寸之晶粒直接堆疊於彼此之頂部上之問題,此另外提出一挑戰,此係因為不存在間隙或沒有足夠間隙用於較低晶粒之接合導線。
圖14B及圖14C中所繪示之DA材料較佳維持幾乎不具有空隙之一接合線厚度(BLT)及透過組裝程序排出。在組裝之後,夾置於晶粒之間之DA材料維持對晶粒之極佳黏著性。按需要定製DA材料之材料性質以在無塊狀裂解之情況下維持用於高溫可靠性加壓之高凝聚強度。按需要定製DA材料之性質以亦最小化或較佳消除可引起封裝可靠性失效(例如,爆開,藉此由於來自封裝中之水分之壓力積聚而發生介面或塊狀裂解)之水分累積。
圖15係使用(a)具有一2合1切割晶粒附著膜(D-DAF)之鈍化矽層、(b)DA膏、(c)厚DA膜及(d)採用具有容許相同或類似尺寸之導線接合晶粒在無鈍化矽間隔件之情況下直接堆疊於彼此之頂部上之導線穿透能力之一晶粒附著膜之膜包線(FOW)之晶粒至晶粒堆疊之特定例示性方法之一流程圖。各方法執行晶圓之背面研磨以減小晶圓厚度以達成積體電路之堆疊及高密度封裝。鋸割該等晶圓以分離個別晶粒。一第一晶粒係使用(例如)一烘箱中之環氧樹脂施用及固化接合至一多晶片模組之一基板。導線接合係用於將該第一晶粒耦合至一金屬框架。
在方法(A)中,使用一切割晶粒附著膜(D-DAF)以一堆疊方式將一第一鈍化矽層接合至第一晶粒。使用D-DAF以一堆疊方式將一第二晶粒接合至該第一鈍化矽層。導線接合係用於將該第二晶粒耦合至金屬框架。使用D-DAF以一堆疊方式將一第二鈍化矽層接合至該第二晶粒。使用D-DAF以一堆疊方式將一第三晶粒接合至第二鈍化矽層。導線接合係用於將該第三晶粒耦合至金屬框架。使用DAF以一堆疊方式將一第三鈍化矽層接合至該第三晶粒。使用D-DAF以一堆疊方式將一第四晶粒接合至該第三鈍化層。導線接合係用於將該第四晶粒耦合至金屬框架。
在方法(B)中,對於多薄晶粒堆疊應用重複晶粒附著(DA)膏施配及固化。將DA膏施配於一第一晶粒上,且一第二晶粒經提供於該DA膏上及經固化至該第一晶粒。導線接合係用於將該第二晶粒耦合至金屬框架。將DA膏施配於該第二晶粒上,且一第三晶粒經提供於該DA膏上及經固化至該第二晶粒。導線接合係用於將該第三晶粒耦合至金屬框架。將DA膏施配於該第三晶粒上,且一第四晶粒經提供於該DA膏上及經固化至該第三晶粒。導線接合係用於將該第四晶粒耦合至金屬框架。
在方法(C)中,切割及按壓晶粒附著膜(DAF)至一底部晶粒且接著將一頂部晶粒放置及熱壓縮於該DAF上。例如,將一DAF按壓至第一晶粒及將一第二晶粒熱壓縮至該DAF上。導線接合係用於將該第二晶粒耦合至金屬框架。類似地,將一DAF按壓至該第二晶粒及將一第三晶粒熱壓縮至該DAF上。導線接合係用於將該第三晶粒耦合至金屬框架。將一DAF按壓至該第三晶粒及將一第四晶粒熱壓縮至該DAF上。導線接合係用於將該第四晶粒耦合至金屬框架。
在方法(D)中,膜包線(FOW)採用具有容許相同或類似尺寸之導線接合晶粒在無鈍化矽層之情況下直接堆疊於彼此之頂部上之導線穿透能力之一晶粒附著膜。以一堆疊方式將一第二晶粒接合及固化至第一晶粒。膜包線接合係用於將該第二晶粒耦合至金屬框架。以一堆疊方式將一第三晶粒接合及固化至該第一晶粒。膜包線接合係用於將該第三晶粒耦合至金屬框架。以一堆疊方式將一第四晶粒接合及固化至該第一晶粒。膜包線接合係用於將該第四晶粒耦合至金屬框架。
在完成上述步驟之後,在各方法(a)至(d)中,執行晶圓成型及後成型固化(PMC)。隨後,執行捲珠安裝及單粒化。
於(2008年5月27日至30日)在美國的弗羅里達州舉行的第58次電子組件及技術會議(Electronic Components and Technology Conference) (ECTC2008)之Toh CH等人之「Die Attach Adhesives for 3D Same-Sized Dies Stacked Packages」,第1538至1543頁中提供關於以上描述之晶粒附著技術之進一步細節,該案之全部內容以引用的方式明確併入本文中。
圖16係包含以一垂直堆疊組態垂直整合於一基板1614上之一TR晶片1602、一放大器晶片1604及一波束成形器晶片1606之一多晶片模組1600之一示意性側視圖。圖12至圖15中所繪示之任何合適技術皆可用於製造多晶片模組。一般技術者將認知,在其他實施例中堆疊晶片之特定順序可為不同的。提供第一間隔層1608及第二間隔層1610以間隔分離晶片1602、1604、1606。各晶片耦合至一金屬框架(例如,一引線框) 1612。在某些例示性實施例中,熱傳遞及散熱機構可提供於多晶片模組中以在無塊狀裂解之情況下維持高溫可靠性加壓。參考圖12及圖14描述圖16之其他組件。
在此例示性實施例中,各多晶片模組可處置對於較大數目個通道(例如,32個通道)之完全傳輸、接收、TGC放大及波束成形操作。藉由將三個矽晶片垂直整合於一單一多晶片模組中,進一步減小印刷電路板所需之空間及佔據面積。複數個多晶片模組可提供於一單一超聲波引擎電路板上以進一步增加通道之數目同時最小化封裝尺寸及佔據面積。例如,一128通道超聲波引擎電路板108可製造於約10 cm x約10 cm之例示性平面尺寸內,此係習知超聲波電路之空間要求之一顯著改良。在較佳實施例中,包含一或多個多晶片模組之一超聲波引擎之一單一電路板可具有16個通道至128個通道。在某些實施例中,包含一或多個多晶片模組之一超聲波引擎之一單一電路板可具有16個、32個、64個、128個通道及類似者。
圖17係超聲波引擎108 (即,前端超聲波特定電路)之一例示性實施例及提供作為一單板完整超聲波系統之電腦主機板106 (即,主機電腦)之一例示性實施例之一詳細示意性方塊圖。如圖17中所繪示之一例示性單板超聲波系統可具有約25 cm x約18 cm之例示性平面尺寸,但其他尺寸亦係可行的。圖17之單板完整超聲波系統可實施於圖1、圖2A、圖2B及圖9A中所繪示之超聲波器件中,且可用於執行圖3至圖8、圖9B及圖10中所描繪之操作。
超聲波引擎108包含促進至少一超聲波探測頭/傳感器之連接之一探測頭連接器114。在超聲波引擎108中,可垂直堆疊一TR模組、一放大器模組及一波束成形器模組以形成如圖16中所展示之一多晶片模組,藉此最小化該超聲波引擎108之整體封裝尺寸及佔據面積。該超聲波引擎108可包含一第一多晶片模組1710及一第二多晶片模組1712,各模組包含垂直整合成如圖16中所展示之一堆疊組態之一TR晶片、一超聲波脈衝發生器及接收器、包含一時間增益控制放大器之一放大器晶片及一樣本資料波束成形器晶片。可使第一多晶片模組1710及第二多晶片模組1712垂直堆疊於彼此之頂部上以進一步最小化電路板上所需之區域。替代性地,該等第一多晶片模組1710及第二多晶片模組1712可水平安置於電路板上。在一例示性實施例中,TR晶片、放大器晶片及波束成形器晶片各為一32通道晶片,且各多晶片模組1710、1712具有32個通道。一般技術者將認知,例示性超聲波引擎108可包含(但不限於)一個、兩個、三個、四個、五個、六個、七個、八個多晶片模組。注意,在一較佳實施例中,可用傳感器殼體中之一第一波束成形器及平板電腦殼體中之一第二波束成形器組態系統。
ASIC及多晶片模組組態使一128通道完整超聲波系統能夠實施於呈一平板電腦格式之大小之一小單板上。一例示性128通道超聲波引擎108 (例如)可容納於約10 cm x約10 cm之例示性平面尺寸內，此係習知超聲波電路之空間要求之一顯著改良。一例示性128通道超聲波引擎108亦可容納於約100 cm²之一例示性面積內。
超聲波引擎108亦包含用於產生時序時脈以使用傳感器陣列執行一超聲波掃描之一時脈產生複雜可程式化邏輯器件(CPLD) 1714。超聲波引擎108包含用於將自傳感器陣列接收之類比超聲波信號轉換成數位RF形成之波束之一類比轉數位轉換器(ADC) 1716。超聲波引擎108亦包含用於管理接收延遲設定檔及產生傳輸波形之一或多個延遲設定檔及波形產生器場可程式化閘極陣列(FPGA) 1718。超聲波引擎108包含用於儲存用於超聲波掃描之延遲設定檔之一記憶體1720。一例示性記憶體1720可為一單一DDR3記憶體晶片。超聲波引擎108包含經組態以管理超聲波掃描序列、傳輸/接收時序、儲存設定檔至記憶體1720及自該記憶體1720取出設定檔以及經由一高速串列介面112緩衝及移動數位RF資料串流至電腦主機板106之一掃描序列控制場可程式化閘極陣列(FPGA) 1722。該高速串列介面112可包含介於電腦主機板106與超聲波引擎108之間的FireWire或其他串列或並列匯流排介面。超聲波引擎108包含建置及維持通信鏈路112之一通信晶片組1118 (例如，一FireWire晶片組)。
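作為說明，以下以Python示意一接收延遲設定檔如何依陣列幾何計算：聚焦時，各元件至焦點之路徑長度不同，路徑最短之元件須等待最久，使全部回波對齊。此為一簡化草圖；假設一線性陣列聚焦於中軸上，其中之函式名稱、元件間距及聲速常數皆為假設值，並非FPGA 1718之實際實施方案。

```python
import math

SPEED_OF_SOUND = 1540.0  # 軟組織中之概略聲速(公尺/秒), 假設值

def receive_delay_profile(n_channels, pitch, focus_depth, c=SPEED_OF_SOUND):
    """計算聚焦於陣列中軸上深度focus_depth(公尺)處之各通道接收延遲(秒)。"""
    center = (n_channels - 1) / 2.0
    # 各元件至焦點之路徑長度
    paths = [math.hypot((i - center) * pitch, focus_depth)
             for i in range(n_channels)]
    longest = max(paths)
    # 路徑最短之元件(陣列中央)獲得最大延遲, 邊緣元件延遲為零
    return [(longest - p) / c for p in paths]
```

所得設定檔左右對稱，且陣列中央通道之延遲最大，對應波束成形器對各通道回波之對齊。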
提供一電力模組1724以供應電力至超聲波引擎108、管理一電池充電環境及執行電力管理操作。該電力模組1724可產生用於超聲波電路之經調節、低雜訊電力且可產生用於TR模組中之超聲波傳輸脈衝發生器之高電壓。
電腦主機板106包含用於儲存資料及/或電腦可執行指令(該等電腦可執行指令用於執行超聲波成像操作)之一核心電腦可讀記憶體1122。該記憶體1122形成電腦之主記憶體且在一例示性實施例中可儲存約4GB之DDR3記憶體。該記憶體1122可包含用於儲存一作業系統、電腦可執行指令、程式及影像資料之一固態硬碟機(SSD)。一例示性SSD可具有約128GB之一容量。
電腦主機板106亦包含用於執行儲存於核心電腦可讀記憶體1122上之電腦可執行指令以執行超聲波成像處理操作之一微處理器1124。例示性操作包含(但不限於):降頻轉換、掃描轉換、多普勒處理、彩色血流處理、能量多普勒處理、頻譜多普勒處理及後信號處理。一例示性微處理器1124可為一現有商業電腦處理器(諸如一Intel Core i5處理器)。另一例示性微處理器1124可為一基於數位信號處理器(DSP)之處理器(諸如來自德州儀器之DaVinci™處理器)。
電腦主機板106包含一輸入/輸出(I/O)及圖形晶片組1704，該輸入/輸出(I/O)及圖形晶片組1704包含經組態以控制I/O及圖形周邊設備(諸如USB埠、視訊顯示埠及類似者)之一共處理器。電腦主機板106包含經組態以提供一無線網路連接之一無線網路配接器1702。一例示性配接器1702支援802.11g及802.11n標準。電腦主機板106包含經組態以介接該電腦主機板106至顯示器104之一顯示控制器1126。電腦主機板106包含經組態以提供該電腦主機板106與超聲波引擎108之間的一快速資料通信之一通信晶片組1120 (例如，一FireWire晶片組或介面)。一例示性通信晶片組1120可為一IEEE 1394b 800 Mbit/sec介面。可替代性地提供其他串列或並列介面1706，諸如USB3、Thunderbolt、PCIe及類似者。提供一電力模組1708以供應電力至電腦主機板106、管理一電池充電環境及執行電力管理操作。
一例示性電腦主機板106可容納於約12 cm x約10 cm之例示性平面尺寸內。一例示性電腦主機板106可容納於約120 cm²之一例示性面積內。
圖18係根據例示性實施例提供之一例示性可攜式超聲波系統100之一透視圖。該系統100包含在如圖18中所繪示之一平板電腦外觀尺寸中但可在任何其他合適外觀尺寸中之一殼體102。一例示性殼體102可具有低於2 cm且較佳在0.5 cm與1.5 cm之間之一厚度。該殼體102之一前面板包含一多點觸控式LCD觸控螢幕顯示器104，該多點觸控式LCD觸控螢幕顯示器104經組態以辨識及區別在該觸控螢幕顯示器104之一表面上之一或多個多點及/或同時觸控。可使用一使用者之手指、一使用者之手或一選用觸控筆1802之一或多者觸控該顯示器104之表面。殼體102包含一或多個I/O埠連接器116 (該一或多個I/O埠連接器116可包含(但不限於):一或多個USB連接器、一或多個SD卡、一或多個網路小型顯示埠)及一DC電力輸入。圖18中之殼體102之實施例亦可組態於具有150 mm x 100 mm x 15 mm (225,000 mm³之一體積)或更小之尺寸之一手掌承載之外觀尺寸內。殼體102可具有小於200 g之一重量。視需要，傳感器陣列與顯示器殼體之間的纜線敷設可包含如本文中所描述之介面電路1020。該介面電路1020可包含(例如)在自平板電腦懸掛之一莢狀物(pod)中之波束成形電路及/或A/D電路。分離連接器1025、1027可用於將懸掛莢狀物連接至傳感器探測頭纜線。該連接器1027可包含如本文中所描述之探測頭識別電路。單元102可包含一相機、一麥克風及一揚聲器以及用於語音及資料通信之無線電話電路以及可用於控制如本文中所描述之超聲波成像操作之語音啟動之軟體。
殼體102包含促進至少一超聲波探測頭/傳感器150之連接之一探測頭連接器114或耦合至該探測頭連接器114。該超聲波探測頭150包含一傳感器殼體,該傳感器殼體包含一或多個傳感器陣列152。該超聲波探測頭150可使用沿著一可撓性纜線1806提供之一殼體連接器1804耦合至探測頭連接器114。一般技術者將認知,超聲波探測頭150可使用任何其他合適機構(例如,包含用於執行超聲波特定操作(如波束成形)之電路之一介面殼體)耦合至殼體102。超聲波系統之其他例示性實施例係進一步詳細描述於2003年3月11日申請之命名為「Ultrasound Probe with Integrated Electronics」之WO 03/079038 A2中,該案之全部內容以引用的方式明確併入本文中。較佳實施例可採用介於手持式傳感器探測頭150與顯示器殼體之間的一無線連接。波束成形器電子器件可併入探測頭殼體150中以提供如本文中所描述之一個1D或2D傳感器陣列中之子陣列之波束成形。顯示器殼體可經定大小以固持於使用者之手之手掌中且可包含至公共存取網路(諸如網際網路)之無線網路連接性。
圖19繪示呈現於圖18之可攜式超聲波系統100之觸控螢幕顯示器104上之一主圖形使用者介面(GUI) 1900之一例示性視圖。當啟動超聲波系統100時可顯示該主GUI 1900。為協助一使用者巡覽該主GUI 1900,該GUI可視為包含四個例示性工作區域:一功能表列1902、一影像顯示視窗1904、一影像控制列1906及一工具列1908。額外GUI組件可提供於主GUI 1900上以(例如)使一使用者能夠關閉該GUI/或該GUI中之視窗、調整該GUI/或該GUI中之視窗的大小及退出該GUI及/或該GUI中之視窗。
功能表列1902使一使用者能夠選擇用於顯示於影像顯示視窗1904中之超聲波資料、影像及/或視訊。該功能表列1902可包含(例如)用於在一患者資料夾目錄及一影像資料夾目錄中選擇一或多個檔案之GUI組件。影像顯示視窗1904顯示超聲波資料、影像及/或視訊且可視需要提供患者資訊。工具列1908提供與一影像或視訊顯示器相關聯之功能性,包含(但不限於):用於保存當前影像及/或視訊至一檔案之一保存按鈕、保存最大可容許數目個先前圖框(如一電影回放(Cine loop))之一保存回放按鈕、用於列印當前影像之一列印按鈕、用於凍結一影像之一凍結影像按鈕、用於控制一電影回放之重播之態樣之一重播工具列及類似者。可提供於主GUI 1900中之例示性GUI功能性係進一步詳細描述於2003年3月11日申請之命名為「Ultrasound Probe with Integrated Electronics」之WO 03/079038 A2中,該案之全部內容以引用的方式明確併入本文中。
影像控制列1906包含可藉由憑藉一使用者直接對顯示器104之表面施加之觸控及觸控手勢而操作之觸控控制項。例示性觸控控制項可包含(但不限於):一2D觸控控制項408、一增益觸控控制項410、一色彩觸控控制項412、一儲存觸控控制項414、一分割觸控控制項416、一PW成像觸控控制項418、一波束導向觸控控制項420、一註釋觸控控制項422、一動態範圍操作觸控控制項424、一Teravision™觸控控制項426、一映圖操作觸控控制項428及一針導引觸控控制項430。結合圖4a至圖4c進一步詳細描述此等例示性觸控控制項。
圖20A描繪根據本發明之一實施例之以一平板電腦外觀尺寸實施之例示性醫療超聲波成像設備2000之一闡釋性實施例。該平板電腦可具有12.5” x 1.25” x 8.75”或31.7 cm x 3.175 cm x 22.22 cm之尺寸但其亦可在具有小於2500 cm³之一體積及小於8 lbs之一重量之任何其他合適外觀尺寸中。如圖20A中所展示，該醫療超聲波成像設備2000包含一殼體2030、一觸控螢幕顯示器2010，其中可顯示超聲波影像2010及超聲波資料2040且超聲波控制項2020經組態以藉由一觸控螢幕顯示器2010加以控制。該殼體2030可具有一前面板2060及一後面板2070。該觸控螢幕顯示器2010形成該前面板2060且包含可辨識及區別使用者在該觸控螢幕顯示器2010上之一或多個多點及/或同時觸控之一多點觸控式LCD觸控螢幕。該觸控螢幕顯示器2010可具有一電容性多點觸控及AVAH LCD螢幕。例如，電容性多點觸控及AVAH LCD螢幕可使一使用者能夠在不損耗解析度之情況下從多個角度觀看影像。在另一實施例中，使用者可利用一觸控筆以將資料輸入於觸控螢幕上。平板電腦可包含一整合式可折疊支架，該整合式可折疊支架允許一使用者自與該平板電腦外觀尺寸共形之一儲存位置旋轉該支架使得該器件可平躺於後面板上，或替代性地，使用者可旋轉該支架以使該平板電腦能夠以相對於一支撐表面所成之複數個傾斜角度之一者站立於一直立位置處。
電容性觸控螢幕模組包括經塗佈有一透明導體(諸如銦錫氧化物)之一絕緣體(例如,玻璃)。製程可包含在玻璃、x感測器膜、y感測器膜及一液晶材料之間之一接合處理程序。平板電腦經組態以容許一使用者在佩戴一乾燥手套或一濕手套時執行多點觸控式手勢(諸如捏合及張開)。螢幕之表面記錄與該螢幕接觸之電導體。該接觸使螢幕靜電場畸變,從而導致電容之可量測變化。接著,一處理器解譯該靜電場之變化。藉由使用「內置式(in-cell)」技術減少層及產生觸控螢幕來實現增加回應位準。「內置式(in-cell)」技術藉由將電容器放置於顯示器內來減少層。應用「內置式(in-cell)」技術減小使用者之手指與觸控螢幕目標之間的可視距離,藉此產生與經顯示內容之一更具指向性之接觸及使點選手勢能夠具有一回應增加。
圖20B描繪根據本發明之一實施例之以一平板電腦外觀尺寸實施且經組態以接納一無線SIM卡之例示性醫療超聲波成像設備2000之一闡釋性實施例。在此特定實施例中,該超聲波成像設備/器件2000包含經組態以接納一SIM卡2084且將該SIM卡電路連接至該器件內之無線通信電路之一SIM卡埠2080。該SIM卡埠2080在此實施例中包含在內部之將SIM卡2084之ID電路連接至器件2000之電路之金屬接觸件。在此特定實例中,一SIM卡盤2082經組態以接納SIM卡2084且將其連接至SIM卡埠2080。在一些實施例中,SIM卡埠2080及/或SIM卡盤2082可經組態以接納一標準SIM卡、小型SIM卡、微型SIM卡、奈米SIM卡或其他類似無線識別/授權卡或電路。
圖21繪示根據本發明之一實施例之用於一模組化超聲波成像系統之一較佳推車系統。該推車系統2100使用包含接納平板電腦之一銜接機架之一基底總成2122。推車組態2100經組態以將包含一觸控螢幕顯示器2102之平板電腦2104銜接至一推車2108,該推車2108可包含一完整操作者控制台2124。在將平板電腦2104銜接至推車支架2108之後,系統形成繞系統之一完整特徵轉動。繞系統之該完整特徵轉動可包含一可調整高度器件2106、一凝膠固持器2110及一儲存箱(bin) 2114、複數個輪子2126、一熱探測頭固持器2120及操作者控制台2124。控制器件可包含在操作者控制台2124上之一鍵盤2112,該鍵盤2112亦可具有經增加之其他周邊設備(諸如一印表機或一視訊介面或其他控制器件)。
圖22繪示根據本發明之一實施例之用於具有一模組化超聲波成像系統之實施例中之一較佳推車系統。可使用耦合至一水平支撐部件之一垂直支撐部件2212組態該推車系統2200。具有用於輔助器件附接2214之一位置之一輔助器件連接器2218可經組態以連接至該垂直支撐部件2212。一3埠探測頭MUX連接器件2216亦可經組態以連接至平板電腦。一儲存箱2224可經組態以藉由一儲存箱附接機構2222附接至垂直支撐部件2212。推車系統亦可包含經組態以附接至垂直支撐部件之一繩管理系統2226。推車總成2200包含安裝於一基底2228上之支撐樑2212，該支撐樑2212具有輪子2232及對平板電腦之擴展操作提供電力之一電池2230。該總成亦可包含使用高度調整器件2226安裝之一配件固持器2224。固持器2210、2218可安裝於樑2212或控制台面板2214上。多埠探測頭多工器器件2216連接至平板電腦以提供使用者可使用經顯示之虛擬切換器依序選擇之若干傳感器探測頭之同時連接。對經顯示影像之一移動觸控手勢(諸如一個三手指撥動)或觸控一經顯示之虛擬按鈕或圖標可在經連接之探測頭之間切換。
圖23A繪示根據本發明之一實施例之用於一模組化超聲波成像系統之較佳推車安裝座系統。配置2300描繪耦合至銜接站2304之平板電腦2302。該銜接站2304係固定至附接機構2306。該附接機構2306可包含容許使用者顯示器傾斜至一使用者所要位置中之一鉸接部件2308。該附接機構2306附接至垂直部件2312。如本文中所描述之一平板電腦2302可安裝於基底銜接單元2304上,該基底銜接單元2304安裝至樑2212之頂部上之一安裝座總成2306上。基底單元2304包含托架2310、將系統2302連接至電池2230及多工器器件2216之電連接器2305及一埠2307。
圖23B繪示根據本發明之一實施例之用於經組態以接納一無線SIM卡之一模組化超聲波成像系統之一推車安裝座系統。在此特定實施例中,銜接站2304包含經組態以接納一SIM卡2084及將該SIM卡電路連接至定位於該銜接站2304或平板電腦2302內之無線通信電路之一SIM卡埠2080。在此特定實例中,一SIM卡2084可直接插入於SIM卡埠2080中,而在其他實例中一SIM卡盤(諸如圖20B中所展示之該SIM卡盤)可用於將SIM卡2084連接至SIM卡埠2080內之金屬接觸件。在一些實施例中,SIM卡埠2080及/或SIM卡盤2082可經組態以接納一標準SIM卡、小型SIM卡、微型SIM卡、奈米SIM卡或其他類似無線識別/授權卡或電路。
圖24繪示根據本發明之一實施例之用於一模組化超聲波成像系統之較佳推車系統2400,其中使用連接器2404將平板電腦2402連接於安裝總成2406上。配置2400描繪在不具有銜接元件2304之情況下經由附接機構2404耦合至垂直支撐部件2408之平板電腦2402。附接機構2404可包含用於顯示器調整之一鉸接部件2406。
圖25A及圖25B繪示一多功能銜接站系統2500。圖25A繪示銜接站2502及具有配接至該銜接站2502之一基底總成2506之平板電腦2504。該平板電腦2504及該銜接站2502可經電連接。平板電腦2504可藉由接合釋放機構2508而自該銜接站2502釋放。銜接站2502可含有用於連接一傳感器探測頭2510之一傳感器埠2512。銜接站2502可含有3個USB 3.0埠、一LAN埠、一耳機插座及用於充電之一電力連接器。圖25B繪示根據本發明之較佳實施例之平板電腦2504及具有一支架之銜接站2502之一側視圖。該銜接站可包含一可調整支架/握把2526。該可調整支架/握把2526可針對多個觀看角度而傾斜。可出於運輸目的而向上翻轉該可調整支架/握把2526。該側視圖亦繪示一傳感器埠2512及一傳感器探測頭連接器2510。
參考圖26A,整合式探測頭系統2600包含前端探測頭2602、主機電腦2604及一可攜式資訊器件(諸如一個人數位助理(PDA) 2606)。該PDA 2606 (諸如一掌上電腦(Palm Pilot)器件或其他手持式運算器件)係一遠端顯示器及/或記錄器件2606。在所展示之實施例中,藉由通信鏈路2608 (其係一有線鏈路)將前端探測頭2602連接至主機電腦2604。藉由一通信鏈路或介面2610 (其係一無線鏈路2610)將主機電腦2604 (一運算器件)連接至PDA 2606。
因在所描述之實施例中之整合式超聲波探測頭系統2600具有一基於Windows®之主機電腦2604,所以該系統可利用可用於Windows®作業系統之軟體之廣泛選擇。一潛在有用之應用係電連接超聲波系統,以容許醫師使用該系統發送及接收訊息、診斷影像、指令、報告或甚至遠端控制前端探測頭2602。
透過通信鏈路或介面2608及2610之連接可透過一乙太網路為有線的或透過一無線通信鏈路(諸如但不限於:IEEE 802.11a、IEEE 802.11b、超鏈接或家用射頻(HomeRF))為無線的。圖26A展示用於通信鏈路2608之一有線鏈路及用於通信鏈路2610之一無線鏈路。應認知,可使用其他有線實施例或協定。
無線通信鏈路2610可使用各種不同協定(諸如一RF鏈路),可使用一專用協定(諸如IEEE 1394協定堆疊或藍芽系統協定堆疊)之全部或部分來實施該等不同協定。IEEE 1394係用於高頻寬應用(諸如超聲波成像資料之高品質數位視訊編輯)之一較佳介面。藍芽協定使用電路及封包切換之一組合。可保留用於同步封包之時槽。藍芽可支援一非同步資料通道(高達三個同時同步通道),或同時支援非同步資料及同步語音之一通道。各同步通道支援在各方向上之一64 kb/s同步(語音)通道。非同步通道可支援最大723.2 kb/s不對稱或433.9 kb/s對稱。
藍芽系統由一無線電單元、一鏈路控制單元及用於鏈路管理及主機終端介面功能之一支援單元組成。鏈路控制器實行基頻帶協定及其他低階鏈路常式。
藍芽系統提供一點對點之連接(僅涉及兩個藍芽單元)或一單點對多點之連接。在該單點對多點之連接中,在若干藍芽單元之間共用通道。共用相同通道之兩個或兩個以上單元形成一微微網(piconet)。一藍芽單元作為該微微網之主單元,而其他單元作為從單元。高達七個從單元可在一微微網中為主動的。
藍芽鏈路控制器具有兩個主要狀態:備用(STANDBY)及連接(CONNECTION),此外,存在七個子狀態:傳呼、傳呼掃描、查詢、查詢掃描、主單元回應、從單元回應及查詢回應。該等子狀態係用於增加新的從單元至一微微網之臨時狀態。
亦可使用(但不限於)家用射頻(Home RF)或IEEE 802.11無線LAN規格實施鏈路。對於關於IEEE 802.11無線LAN規格之更多資訊，參見以引用的方式併入本文中之用於無線LAN之IEEE標準。IEEE標準可在全球資訊網(World Wide Web)之環球資源定位器(URL) www.ieee.org處找到。例如，支援IEEE標準802.11b之硬體提供兩個個人電腦之間在2 Mbps及11 Mbps下之一通信鏈路。對信號之傳輸及接收分配之頻帶係約2.4 GHz。相比而言，IEEE標準802.11a提供54 Mbps通信。對於此標準之頻率分配係大約5 GHz。最近，商家(諸如Proxim)已製造使用一專屬資料加倍晶片組技術以達成108 Mbps通信之PC卡及存取點(基地台)。提供資料加倍之晶片(AR5000)係由Atheros Communications公司製造。如同任何無線電系統，兩個電腦之間維持之實際資料速率係與傳輸器與接收器之間的實體距離有關。
無線鏈路2610亦可呈現其他形式(諸如，如由紅外線資料協會(IrDA)定義之一紅外線通信鏈路)。取決於所要之通信類型(即，藍芽、紅外線等)，主機電腦2604及遠端顯示器及/或記錄器件2606各具有所要通信埠。
圖26B將探測頭2602與主機電腦2604之間的通信鏈路2608展示為一無線鏈路。主機電腦2604與PDA 2606之間的通信鏈路2610經展示為一有線鏈路。
圖26C之整合式探測頭系統2600具有用於介於探測頭2602與主機電腦2604之間之通信鏈路2608及介於主機電腦2604與PDA 2606之間之通信鏈路2610兩者之無線鏈路。應認知,有線鏈路及無線鏈路可兩者一起使用或替代性地可在一系統2600中純粹為有線鏈路或無線鏈路。
圖27之整合式探測頭系統2600之遠端顯示器及/或記錄器件2606係一遠端運算系統2612。該遠端運算系統2612除了具有遠端顯示及/或記錄能力之外亦可遠端控制探測頭2602。通信鏈路2610經展示為一無線鏈路。探測頭2602與主機電腦2604之間之通信鏈路2608經展示為一有線鏈路。
一遠端控制系統之一實例包含使用一隨身電腦(wearable computer) (諸如由Xybernaut公司製造之一隨身電腦)、一對高速、無線PC卡(諸如由Proxim提供之PC卡)及超聲波程式及探測頭2602。一可攜式經網路連線之超聲波系統可經組態為重量小於2.5磅。使用類似於Microsoft® NetMeeting之一程式,可建置一遠端PC與隨身電腦之間之一即時連接。遠端主機可監測與隨身電腦之全部互動,包含即時超聲波成像(在高達每秒約4個圖框之顯示速率下)。NetMeeting亦可用於「控制」隨身電腦及即時管理來自遠端個人電腦之超聲波會話。此外,可以108 Mbps將存檔至隨身電腦上之硬碟之影像及反覆可執行軟體指令傳送至主機電腦。藉由此技術,可以匹敵一硬接線之每秒100百萬位元(100 Mbps)之區域網路(LAN)之速率執行即時超聲波診斷且將超聲波診斷中繼至一遠端視野。
圖28繪示具有用於將複數個遠端器件2606連接至主機電腦2604之一集線器2802之一整合式探測頭系統2800。自該集線器2802至遠端器件之通信鏈路2804經展示為無線鏈路及有線鏈路兩者。應認知,可使用一完全有線網路(諸如一LAN或乙太網路)。替代地,使用在電腦(遠端器件) 2606之各者中之一無線收發器及埠,可易於建置一無線網路/通信系統。使用最近出現之高速無線標準(諸如IEEE 802.11a),遠端機器與本端機器之間的通信可匹敵一有線、100 Mbps區域網路(LAN)之通信。另一替代例係使用一藍芽系統形成一微微網。
對組合之音訊-視覺及電腦資料之使用之增加導致對於多媒體網路連線能力之更大需要且開始出現包含於本發明之較佳實施例中之解決方案。多媒體網路連線之標準化正在進行中,且IEEE 1394係作為能夠介接於許多音訊-視覺(AV)電腦及其他數位消費性電子器件且提供高達400 Mbps之傳輸頻寬之重要競爭者而出現。
較佳實施例使用IEEE 1394技術,該IEEE 1394技術使用一無線解決方案以用於經由IEEE 802.11 (用於在企業環境中以及愈來愈多地在家庭中之無線資料傳輸之新興標準)之1394協定之傳輸。在一較佳實施例中,IEEE 1394係實施為在802.11無線電硬體及乙太網路協定之頂部上之一協定配接層(PAL),從而帶來此等重要技術之一會聚。此協定配接層使PC能夠作為一無線1394器件運行。工程設計目標係使實際遞送之IEEE 1394頻寬足以用於一單一高清晰度(high-definition) MPEG2視訊串流(或多個標準清晰度MPEG2視訊串流)自一設施中之一室至另一室之傳輸。
本發明之較佳實施例包含使用Wi-LAN之寬頻正交頻分多工(W-OFDM)技術之在2.4 GHz下之IEEE 1394之無線傳輸之使用。此發展建置W-OFDM (最具頻寬效率之無線傳輸技術)作為能夠提供家用多媒體網路連線所需之資料速率之技術之一者。
無線IEEE 1394系統包含一MPEG-2資料串流產生器,該MPEG-2資料串流產生器將一多運輸串流饋送至諸如藉由Philips Semiconductors公司提供之一機上盒(STB)中。該STB將此信號轉換至一IEEE 1394資料串流且將該IEEE 1394資料串流施加至諸如由Wi-LAN.TM.提供之W-OFDM無線電系統。接著,無線電傳輸器將IEEE 1394資料串流經由空氣發送至(例如)主機電腦中之對應W-OFDM接收器。在接收側,解調變IEEE 1394信號且將其發送至兩個STB,該兩個STB在兩個分離TV監測器上顯示不同MPEG-2資料串流之內容。使用IEEE 1394作為網路之有線部分之介面最佳化整個系統以用於傳輸等時資訊(語音、實況視訊)及提供與設施中之多媒體器件之一理想介面。W-OFDM技術係本質上不受多路徑效應之影響。如同全部調變方案,OFDM編碼一射頻(RF)信號內之資料。無線電通信常常由於出現雜訊、天電干擾(stray)及經反射信號而受到阻礙。藉由在不同頻率上同時發送高速信號,OFDM技術提供健全通信。具備OFDM能力之系統對雜訊及多路徑高度容忍,以使廣域及家用多點涵蓋變得可能。此外,因為此等系統在使用頻寬方面非常有效率,所以更多的高速通道在一頻帶內為可能的。W-OFDM係藉由使用一寬頻帶而容許比習知OFDM大很多之輸送量之OFDM之一具成本效益之變型。W-OFDM進一步處理信號以最大化範圍。對習知OFDM之此等改良導致急劇增加之傳輸速率。
OFDM技術正變得愈來愈可見，此係因為美國及歐洲標準化委員會正選取其作為能夠提供可靠無線高資料速率連接之唯一技術。歐洲地面數位視訊廣播使用OFDM且IEEE 802.11工作組最近在其提出之6 Mbps至54 Mbps無線LAN標準中選定OFDM。歐洲電信標準協會正考量W-OFDM用於ETSI BRAN標準。關於Wi-LAN™之詳細資訊可在網站http://www.wi-lan.com/上找到。Philips Semiconductors為總部設在荷蘭(Netherland)之埃因霍溫(Eindhoven)之Royal Philips Electronics之一部門。可藉由在http://www.semiconductors.philips.com/存取其主頁而獲得關於Philips Semiconductors之額外資訊。
此外，亦可在較佳實施例中使用能夠在透過內壁高達7米及在視線下高達12米之傳輸範圍下達到每秒400百萬位元(400 Mbps)之基於IEEE 1394高速串列匯流排之NEC公司之無線傳輸技術。藉由幅移鍵控(ASK)調變方案及一低成本收發器之開發，此實施例使用60 GHz毫米波長傳輸，此並不需要任何種類之授權。此實施例併入NEC之PD72880 400 Mbps遠距離傳輸實體層器件中之一回波偵測功能以防止信號反射之影響(此係IEEE 1394經由一無線連接之穩定操作之一顯著障礙)。
無線IEEE 1394可在將PC橋接至經互連之IEEE 1394器件之群集(其等可在設施中之另一室中)時發揮重要作用。三個例示性應用為自一PC源送(source)視訊或音訊串流、提供網際網路內容及連接性至一IEEE 1394群集,及提供命令、控制及組態能力至該群集。在第一實施例中,PC可提供資料至一設施中之另一室中之某位。在第二實施例中,PC可提供供1394致能器件存取網際網路之一渠道。在第三實施例中,PC發揮精心安排1394群集中之活動及在該群集內且經由該電橋路由資料(儘管實際資料並不流經PC)之作用。
圖29係展示對藉由一較佳實施例超聲波成像系統及相關聯架構2902產生之影像之無線存取之佈建之一圖式。成像系統2906匯出患者資訊及影像至對應資料夾2908中之檔案。可執行軟體指令具有實施上文所描述之超聲波成像方法所需之全部功能性。
無線代理器2910用於偵測患者目錄及影像檔案及開啟一埠以使無線用戶端獲得至其之連接。在建置一連接2914之後,其將患者清單及對應影像發送回至用戶端。例如,無線代理器2910可包含資料介面電路,該資料介面電路可包含一第一埠(諸如一RF介面埠)。
駐留於一手持式設備側上之無線觀看器2912可建置至無線代理器2910之連接及擷取患者及影像資訊。在使用者選擇患者及影像之後,其起始自無線代理器之檔案傳輸。在接收一影像之後,觀看器2912顯示此影像連同患者資訊。該影像經儲存於手持式設備上以供將來使用。手持式設備使用者可觀看在先前會話中擷取之影像或可要求新的影像傳輸。
圖33係繪示根據本發明之一例示性實施例之一可攜式資訊器件(諸如一個人數位助理(PDA)或任何運算器件)之一方塊圖。鏈路介面或資料介面電路3310繪示(但不限於)用於建置至另一器件之一無線鏈路之一鏈路介面。該無線鏈路較佳係藉由IEEE 1394通信規格定義之一RF鏈路。然而,該無線鏈路可呈現其他形式(諸如,如藉由紅外線資料協會(IrDA)定義之紅外線通信鏈路)。PDA包含能夠執行一RF堆疊3350之一處理器3360,該RF堆疊3350透過匯流排3308與一資料介面電路3310通信。該處理器3360亦透過匯流排3308連接至使用者介面電路3370、資料儲存器3306及記憶體3304。
資料介面電路3310包含一埠(諸如RF介面埠)。RF鏈路介面可包含一第一連接3312,該第一連接3312包含用於將信號轉換成射頻輸出且用於接受射頻輸入之射頻(RF)電路3314。該RF電路3314可經由建置通信埠1026之一收發器發送及接收RF資料通信。藉由RF電路3314接收之RF通信信號係經轉換成電信號且經由匯流排3308中繼至處理器3360中之RF堆疊3350。可藉由(但不限於)IEEE 1394規格實施膝上型個人電腦(PC) (主機電腦)與PDA之間的無線電介面3314、3316及鏈路。
類似地,PC主機電腦具有能夠通信至遠端定位之影像觀看器之一RF堆疊及電路。在一較佳實施例中,遠端影像觀看器可用於監測及/或控制超聲波成像操作(而不僅僅顯示所得成像資料)。
當前市場提供與無線連接性有關之許多不同選項。在一較佳實施例中,使用展頻技術無線LAN。在無線LAN解決方案中,最先進之係802.11b標準。許多製造者提供802.11b順應式設備。與選定手持式設備之相容性係無線連接性選項之一指定類別中之主要準則。
手持式設備市場亦提供各種手持式器件。為成像目的,具有高品質螢幕及足夠處理能力來顯示一影像係非常重要的。考量此等因素,在一較佳實施例中,使用一Compaq iPAQ,特定言之,使用一Compaq iPAQ 3870。使用與手持式設備相容之一無線PC卡(諸如Compaq之無線PC卡WL110及對應無線存取點)。
圖30繪示與一較佳實施例中之個人電腦或一替代實施例中之探測頭通信之影像觀看器3020。該影像觀看器具有容許使用者與根據本發明之較佳實施例之超聲波成像系統電腦或探測頭介接之使用者介面按鈕3022、3024、3026、3028。在一較佳實施例中,一通信介面(諸如按鈕3022)容許使用者起始與超聲波成像應用程式之一連接。類似地,按鈕3024係用於終止與超聲波成像應用程式之一經建置之連接。一按鈕3026作為用於提供可選擇之一患者清單及對應影像之一選擇按鈕。本端或遠端儲存此等影像。若經選定,則將可遠端儲存之影像傳輸至觀看器。該選定影像經顯示於觀看器3030上。
額外通信介面按鈕(諸如按鈕3028)作為一選項按鈕,該選項按鈕可(但不限於)容許改變組態參數(諸如一網際網路協定(IP)位址)。
圖31係繪示包含四個主要軟體組件之一較佳實施例超聲波影像收集及散發系統3140之一圖式。該系統之主硬體元件係超聲波探測頭3142a至3142n。與膝上型電腦3144a至3144n通信之探測頭容許產生超聲波影像及相關患者資訊且將影像及資訊遞交至一影像/患者資訊散發伺服器3146。該散發伺服器利用一SQL資料庫伺服器3148以儲存及擷取影像及相關患者資訊。該SQL伺服器提供分散式資料庫管理。多個工作站可操縱儲存於伺服器上之資料,且該伺服器協調操作及執行資源密集型計算。
可在兩個不同實施例中實施影像觀看軟體或可執行指令。在一第一實施例中,如圖30中所描述之影像觀看器之一完全固定版本可駐留於配備有高頻寬網路連接之一工作站或膝上型電腦上。在一第二實施例中,影像觀看器之一輕重量版本可駐留於配備有IEEE 802.11b及/或IEEE 802.11a順應式網路卡之一小的手持式掌上個人電腦(PocketPC) 3020。該掌上個人電腦影像觀看器僅實施容許基本影像觀看操作之有限功能性。無線網路協定3150 (諸如IEEE 802.11)可用於傳輸資訊至與一醫院網路通信之一手持式設備或其他運算器件3152。
此較佳實施例描述覆蓋醫院廣泛影像收集及擷取需要之超聲波成像系統。其亦提供對非影像患者相關資訊之瞬時存取。為提供醫院間資訊交換,影像散發伺服器具有維持跨廣域網路之彼此連接性之能力。
在另一較佳實施例中,探測頭3262可直接使用一無線通信鏈路3266與一遠端運算器件(諸如一PDA 3264)通信,如圖32之系統3260中所展示。該通信鏈路可使用IEEE 1394協定。探測頭及PDA3302兩者皆具有參考圖33所描述之一RF堆疊及電路以使用無線協定通信。探測頭包含一傳感器陣列、波束成形電路、傳輸/接收模組、一系統控制器及數位通信控制電路。於PDA中提供超聲波影像資料之後處理(包含掃描轉換)。
圖34係繪示根據本發明之一較佳實施例之整合超聲波系統3442之一成像及遠距醫療系統之一示意圖3440。該系統之較佳實施例輸出即時RF數位資料或前端資料。
如將理解,本文中所描述之各種無線連接(諸如超聲波診療所3442(例如,用於超聲波捕獲)、社區診療所3446(例如,用於普通遠距離醫療捕獲)、放射學診療所3448(例如,用於膠片數位化捕獲)及心臟診療所3450(例如,用於超聲波心動圖顯象捕獲)之間之連接3444;以及圖26A至圖29及圖31至圖32中所展示之連接)可包含使用任何數目個通信協定之3G、4G、GSM、CDMA、CDMA2000、W-CDMA或任何其他合適無線連接。在一些情況中,超聲波成像器件可經組態以回應於經由超聲波成像螢幕之觸控敏感使用者介面(UI)執行之來自使用者之一命令起始與一遠端電腦或其他電子器件之一無線連接。例如,在執行一超聲波程序期間、之前或之後,一使用者可起始與一醫院或醫生之一無線連接以傳輸超聲波成像資料。可在產生資料時即時傳輸該資料,或可傳輸已經產生之資料。亦可經由相同無線網路起始一音訊及/或視訊連接使得超聲波成像器件之使用者可在執行程序時與一醫院或醫生聯絡。可經由提供於超聲波成像螢幕上之觸控螢幕UI巡覽及/或選擇本文中所描述之選項。在一些實施例中,可使用(例如) JavaScript或一些其他基於瀏覽器之技術經由無線網路將觸控螢幕UI之全部或部分提供至使用者。該UI之部分可執行於器件上或遠端執行,且可實施各種器件端及/或伺服器端技術以提供本文中所描述之各種UI特徵。
圖35繪示根據本發明之一實施例之使用一模組化超聲波成像系統之一2D成像操作模式。平板電腦2504之觸控螢幕可顯示藉由二維傳感器探測頭使用一256數位波束成形器通道獲得之影像。二維影像視窗3502描繪一個二維影像掃描3504。可使用彈性頻率控制3506獲得二維影像,其中控制參數經表示於平板電腦上。
圖36繪示根據本發明之一實施例之使用一模組化超聲波成像系統之一運動操作模式。平板電腦之觸控螢幕顯示器3600可顯示藉由一運動操作模式獲得之影像。平板電腦之觸控螢幕顯示器3600可同時顯示二維模式成像3606及運動模式成像3608。平板電腦之觸控螢幕顯示器3600可顯示具有一個二維影像3606之一個二維影像視窗3604。使用圖形使用者介面顯示之彈性頻率控制項3506可用於將頻率自2 MHz調整至12 MHz。
圖37繪示根據本發明之一實施例之使用一模組化超聲波成像系統之一彩色多普勒操作模式。平板電腦之觸控螢幕顯示器3700顯示藉由彩色多普勒操作模式獲得之影像。一個二維影像視窗3706係用作基底顯示器。色彩編碼資訊3708係覆疊於二維影像3710上。自經傳輸信號之經接收回波導出紅血球之基於超聲波之成像。回波信號之主要特性係頻率及振幅。振幅取決於藉由超聲波波束取樣之在體積內之移動血液量。可使用顯示器調整一高圖框速率或高解析度以控制掃描之品質。較高頻率可藉由快速血流產生且可以較淺色彩顯示,而較低頻率係以較深色彩顯示。彈性頻率控制項3704及彩色多普勒掃描資訊3702可顯示於平板電腦顯示器3700上。
圖38繪示根據本發明之一實施例之使用一模組化超聲波成像系統之一脈衝波多普勒操作模式。平板電腦之觸控螢幕顯示器3800可顯示藉由脈衝波多普勒操作模式獲得之影像。脈衝波多普勒掃描產生用於分析血流在一小區域中沿著一所要超聲波游標(稱為樣本體積或樣本閘3812)之運動之一系列脈衝。平板電腦顯示器3800可描繪一個二維影像3802,其中覆疊樣本體積/樣本閘3812。平板電腦顯示器3800可使用一混合操作模式3806以描繪一個二維影像3802及一時間/多普勒頻移3810。若已知波束與血流之間之一適當角度,則可將該時間/多普勒頻移3810轉換成速度及血流。在該時間/多普勒頻移3810中之灰色陰影3808可表示信號之強度。頻譜信號之厚度可指示層狀血流或湍流。平板電腦顯示器3800可描繪可調整之頻率控制項3804。
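作為說明，以下以Python示意上述之換算：若已知波束與血流之間之角度，則可依標準多普勒方程式v = f_d·c / (2·f₀·cos θ)將頻移轉換成血流速度。其中之函式名稱與預設聲速常數皆為假設性範例，並非本文所述系統之實際實施方案。

```python
import math

def doppler_velocity(f_shift, f0, angle_deg, c=1540.0):
    """依多普勒方程式將頻移f_shift(Hz)轉換為血流速度(公尺/秒)。

    f0: 傳輸頻率(Hz); angle_deg: 波束與血流之夾角(度);
    c: 組織中之聲速(公尺/秒), 此處採用軟組織之概略值。
    """
    return f_shift * c / (2.0 * f0 * math.cos(math.radians(angle_deg)))
```

例如，在5 MHz傳輸頻率、零度夾角下，1 kHz之頻移對應約0.154公尺/秒之血流速度；夾角愈大，同一頻移對應之速度愈高。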
圖39繪示根據本發明之一實施例之使用一模組化超聲波成像系統之一個三重掃描操作模式。平板電腦顯示器3900可包含能夠單獨顯示或結合彩色多普勒或定向多普勒特徵顯示二維影像之一個二維視窗3902。平板電腦之觸控螢幕顯示器3900可顯示藉由彩色多普勒操作模式獲得之影像。一個二維影像視窗3902係用作基底顯示器。色彩編碼資訊3904係覆疊3906於二維影像3916上。可單獨使用或結合二維成像或彩色多普勒成像使用脈衝波多普勒特徵。平板電腦顯示器3900可包含藉由覆疊於二維影像3916上或經色碼覆疊3906 (單獨或組合地)之一樣本體積/樣本閘3908表示之一脈衝波多普勒掃描。平板電腦顯示器3900可描繪表示時間/多普勒頻移3912之一分割螢幕。若已知隔離波束與血流之間之一適當角度,則可將該時間/多普勒頻移3912轉換成速度及血流。在該時間/多普勒頻移3912中之灰色陰影3914可表示信號之強度。頻譜信號之厚度可指示層狀血流或湍流。平板電腦顯示器3900亦可描繪彈性頻率控制項3910。
圖40繪示根據本發明之一實施例之用於使用一模組化超聲波成像系統之一使用者操作模式之一GUI主螢幕介面4000。當啟動超聲波系統時可顯示用於一使用者操作模式之該螢幕介面4000。為協助一使用者巡覽GUI主螢幕4000,該主螢幕可視為包含三個例示性工作區域:一功能表列4004、一影像顯示視窗4002及一影像控制列4006。額外GUI組件可提供於主GUI主螢幕4000上以使一使用者能夠關閉該GUI主螢幕及/或該GUI主螢幕中之視窗、調整該GUI主螢幕及/或該GUI主螢幕中之視窗之大小及退出該GUI主螢幕及/或該GUI主螢幕中之視窗。
功能表列4004使使用者能夠選擇用於顯示於影像顯示視窗4002中之超聲波資料、影像及/或視訊。該功能表列可包含用於在一患者資料夾目錄及一影像資料夾目錄中選擇一或多個檔案之組件。
影像控制列4006包含可藉由憑藉使用者直接對顯示器之表面施加之觸控及觸控手勢操作之觸控控制項。例示性觸控控制項可包含(但不限於):一深度控制觸控控制項4008、一個二維增益觸控控制項4010、一全螢幕觸控控制項4012、一文字觸控控制項4014、一分割螢幕觸控控制項4016、一ENV觸控控制項4018、一CD觸控控制項4020、一PWD觸控控制項4022、一凍結觸控控制項4024、一儲存觸控控制項4026及一最佳化觸控控制項4028。
圖41繪示根據本發明之一實施例之用於使用一模組化超聲波成像系統之一使用者操作模式之一GUI功能表螢幕介面4100。當自功能表列4104觸發功能表選擇模式藉此起始超聲波系統之操作時,可顯示用於一使用者操作模式之螢幕介面4100。為協助一使用者巡覽GUI主螢幕4100,該主螢幕可視為包含三個例示性工作區域:一功能表列4104、一影像顯示視窗4102及一影像控制列4120。額外GUI組件可提供於主GUI功能表螢幕4100上以(例如)使一使用者能夠關閉該GUI功能表螢幕及/或該GUI功能表螢幕中之視窗、調整該GUI功能表螢幕及/或該GUI功能表螢幕中之視窗的大小及退出該GUI功能表螢幕及/或該GUI功能表螢幕中之視窗。
功能表列4104使使用者能夠選擇用於顯示於影像顯示視窗4102中之超聲波資料、影像及/或視訊。該功能表列4104可包含用於在一患者資料夾目錄及一影像資料夾目錄中選擇一或多個檔案之觸控控制組件。以一擴展格式4106描繪之功能表列可包含例示性觸控控制項,諸如,一患者觸控控制項4108、一預設觸控控制項4110、一檢視觸控控制項4112、一報告觸控控制項4114及一設定觸控控制項4116。
影像控制列4120包含可藉由憑藉一使用者直接對顯示器之表面施加之觸控及觸控手勢操作之觸控控制項。例示性觸控控制項可包含(但不限於):深度控制觸控控制項4122、一個二維增益觸控控制項4124、一全螢幕觸控控制項4126、一文字觸控控制項4128、一分割螢幕觸控控制項4130、一針視覺化ENV觸控控制項4132、一CD觸控控制項4134、一PWD觸控控制項4136、一凍結觸控控制項4138、一儲存觸控控制項4140及一最佳化觸控控制項4142。
圖42繪示根據本發明之一實施例之用於使用一模組化超聲波成像系統之一使用者操作模式之一GUI患者資料螢幕介面4200。當啟動超聲波系統時,當自功能表列4202觸發患者選擇模式時可顯示用於一使用者操作模式之螢幕介面4200。為協助一使用者巡覽GUI患者資料螢幕4200,該患者資料螢幕可視為包含五個例示性工作區域:一新的患者觸控螢幕控制項4204、一新的研究觸控螢幕控制項4206、一研究清單觸控螢幕控制項4208、一工作清單觸控螢幕控制項4210及一編輯觸控螢幕控制項4212。在各觸控螢幕控制項內,進一步資訊輸入欄位4214、4216係可用。例如,患者資訊區段4214及研究資訊區段4216可用於記錄資料。
在患者資料螢幕4200內,影像控制列4218包含可藉由憑藉使用者直接對顯示器之表面施加之觸控及觸控手勢操作之觸控控制項。例示性觸控控制項可包含(但不限於):接受研究觸控控制項4220、密切研究觸控控制項4222、列印觸控控制項4224、列印預覽觸控控制項4226、消除觸控控制項4228、一個二維觸控控制項4230、凍結觸控控制項4232及一儲存觸控控制項4234。
圖43繪示根據本發明之一實施例之用於使用一模組化超聲波成像系統之一使用者操作模式之一GUI患者資料螢幕介面4300 (諸如預設參數螢幕介面)。當啟動超聲波系統時,當自功能表列4302觸發預設選擇模式4304時可顯示用於一使用者操作模式之螢幕介面4300。
在預設螢幕4300內,影像控制列4308包含可藉由憑藉使用者直接對顯示器之表面施加之觸控及觸控手勢操作之觸控控制項。例示性觸控控制項可包含(但不限於):一保存設定觸控控制項4310、一刪除觸控控制項4312、CD觸控控制項4314、PWD觸控控制項4316、一凍結觸控控制項4318、一儲存觸控控制項4320及一最佳化觸控控制項4322。
圖44繪示根據本發明之一實施例之用於使用一模組化超聲波成像系統之一使用者操作模式之一GUI檢視螢幕介面4400。當啟動超聲波系統時,當自功能表列4402觸發預設擴展檢視4404時可顯示用於一使用者操作模式之螢幕介面4400。
在檢視螢幕4400內,影像控制列4416包含可藉由憑藉使用者直接對顯示器之表面施加之觸控及觸控手勢操作之觸控控制項。例示性觸控控制項可包含(但不限於):一縮圖設定觸控控制項4418、同步觸控控制項4420、選擇觸控控制項4422、一先前影像觸控控制項4424、一下一影像觸控控制項4426、一個二維影像觸控控制項4428、一暫停影像觸控控制項4430及一儲存影像觸控控制項4432。
一影像顯示視窗4406可容許使用者檢視呈複數個格式之影像。影像顯示視窗4406可容許一使用者觀看在組合或子集中之影像4408、4410、4412、4414或容許個別觀看任何影像4408、4410、4412、4414。影像顯示視窗4406可經組態以顯示待同時觀看之至多四個影像4408、4410、4412、4414。
圖45繪示根據本發明之一實施例之用於使用一模組化超聲波成像系統之一使用者操作模式之一GUI報告螢幕介面。當啟動超聲波系統時,當自功能表列4502觸發報告擴展檢視4504時可顯示用於一使用者操作模式之螢幕介面4500。顯示螢幕4506含有超聲波報告資訊4526。使用者可使用在超聲波報告4526內之工作單選擇以輸入備註、患者資訊及研究資訊。
在報告螢幕4500內,影像控制列4508包含可藉由憑藉使用者直接對顯示器之表面施加之觸控及觸控手勢操作之觸控控制項。例示性觸控控制項可包含(但不限於):一保存觸控控制項4510、一保存為觸控控制項4512、一列印觸控控制項4514、一列印預覽觸控控制項4516、一密切研究觸控控制項4518、一個二維影像觸控控制項4520、一凍結影像觸控控制項4522及一儲存影像觸控控制項4524。
圖46A繪示根據本發明之一實施例之用於使用一模組化超聲波成像系統之一使用者操作模式之一GUI設定螢幕介面。當啟動超聲波系統時,當自功能表列4602觸發報告擴展檢視4604時可顯示用於一使用者操作模式之螢幕介面4600。
在設定擴展螢幕4604內,設定控制列4644包含可藉由憑藉使用者直接對顯示器之表面施加之觸控及觸控手勢操作之觸控控制項。例示性觸控控制項可包含(但不限於):一通用觸控控制項4606、一顯示觸控控制項4608、一量測觸控控制項4610、註釋觸控控制項4612、一列印觸控控制項4614、一儲存/獲取觸控控制項4616、一DICOM觸控控制項4618、一匯出觸控控制項4620及一研究資訊影像觸控控制項4622。該等觸控控制項可含有容許使用者輸入組態資訊之一顯示螢幕。例如,通用觸控控制項4606含有一組態螢幕4624,其中使用者可輸入組態資訊。此外,通用觸控控制項4606含有容許使用者組態軟鍵銜接位置4626之一選擇。圖46B描繪具有一右側對準之軟鍵控制項4652。圖46B進一步繪示軟鍵控制箭頭4650之啟動將使鍵對準改變至相對側(在此情況中,左側對準)。圖46C描繪軟鍵控制項4662之左側對準,使用者可藉由使用軟鍵控制箭頭4660啟動一定向變化以將位置改變至右側對準。
在檢視螢幕4600內,影像控制列4628包含可藉由憑藉使用者直接對顯示器4664之表面施加之觸控及觸控手勢操作之觸控控制項。例示性觸控控制項可包含(但不限於):一縮圖設定觸控控制項4630、同步觸控控制項4632、選擇觸控控制項4634、一先前影像觸控控制項4636、一下一影像觸控控制項4638、一個二維影像觸控控制項4640及一暫停影像觸控控制項4642。
圖47繪示根據本發明之一實施例之用於使用一模組化超聲波成像系統之一使用者操作模式之一GUI設定螢幕介面。當啟動超聲波系統時,當自功能表列4702觸發報告擴展檢視4704時可顯示用於一使用者操作模式之螢幕介面4700。
在設定擴展螢幕4704內，設定控制列4744包含可藉由憑藉使用者直接對顯示器之表面施加之觸控及觸控手勢操作之觸控控制項。例示性觸控控制項可包含(但不限於)複數個圖標，諸如，一通用觸控控制項4706、一顯示觸控控制項4708、一量測觸控控制項4710、註釋觸控控制項4712、一列印觸控控制項4714、一儲存/獲取觸控控制項4716、一DICOM觸控控制項4718、一匯出觸控控制項4720及一研究資訊影像觸控控制項4722。該等觸控控制項可含有容許使用者輸入儲存/獲取資訊之一顯示螢幕。例如，儲存/獲取觸控控制項4716含有一組態螢幕4702，其中使用者可輸入組態資訊。使用者可致動容許使用者輸入字母數字字元於不同觸控啟動之欄位中之一虛擬鍵盤。此外，儲存/獲取觸控控制項4716含有容許使用者啟用追溯性獲取4704之一選擇。當使用者啟用儲存功能時，預設系統以儲存預期電影回放。若使用者啟用追溯性獲取，則儲存功能可追溯性地收集電影回放。
在設定螢幕4700內,影像控制列4728包含可藉由憑藉使用者直接對顯示器之表面施加之觸控及觸控手勢操作之觸控控制項。例示性觸控控制項可包含(但不限於):一縮圖設定觸控控制項4730、同步觸控控制項4732、選擇觸控控制項4734、一先前影像觸控控制項4736、一下一影像觸控控制項4738、一個二維影像觸控控制項4740及一暫停影像觸控控制項4742。
微型化PC致能之超聲波成像系統之一較佳實施例運行於一業界標準PC及Windows® 2000作業系統(OS)上。因此,已準備好對於遠距醫療解決方案為理想選擇同時具成本效益之網路。提供嵌入及因此與第三方應用程式整合之開放式架構支援。該較佳實施例包含一改良之應用程式設計介面(API)、共同介面、用於第三方應用程式(舉例而言,諸如(但不限於):輻射治療計畫、影像導引手術、整合式解決方案(例如,計算、三維及報告封裝))之匯出支援。該API提供供應用程式使用以起始與網路服務、主機通信程式、電話設備或程式至程式通信接觸之一組軟體中斷、調用及資料格式。基於軟體之特徵增強減少硬體因過時而作廢且提供有效升級。
此外,該較佳實施例包含晶片上系統積體電路(IC),該等晶片上系統積體電路(IC)運行於PC及具有一大通道數、大動態範圍、高影像品質、完整特徵集、廣泛診斷覆疊、最小供應鏈要求、針對簡單測試及高可靠性之簡化設計及非常低的維護成本。
如本文中先前所描述,較佳實施例包含一基於PC之設計,該設計係直觀的、具有一簡單圖形使用者介面、易於使用及訓練,其利用PC產業技術訣竅、健全電子器件、高品質顯示器及低製造成本。亦提供與其他應用程式之軟體控制通信之支援,該等應用程式係容許患者資料、掃描器影像、當前程序術語學(CPT)代碼管理之嵌入式應用,該當前程序術語學(CPT)代碼管理係一數字編碼系統,醫師藉由該數字編碼系統將其等程序及服務、醫師之計畫、結果評估報告全部記錄於一整合式PC上。對醫療保健之改革已施加壓力以降低成本,突顯對於首次看診/內場診斷、資料儲存及檢索解決方案之需要,該等解決方案在結合技術創新(舉例而言,諸如基於醫學數位成像及通信(DICOM)標準之資料儲存及檢索、寬頻及圖像存檔及通信系統(PACS))時,驅動患者記錄儲存及檢索及傳輸之變化、在用於超聲波資料獲取之較低成本/手持式器件方面之創新,其等全部實現本發明之較佳實施例。DICOM標準有助於醫療影像(舉例而言,諸如超聲波、磁共振影像(MRI)及電腦斷層(CT)掃描)之散發及觀看。寬頻係一廣域網路術語,該術語係指提供大於45 Mbps之頻寬之一傳輸設施。寬頻系統一般本質上為光纖。
本發明之一較佳實施例提供影像獲取及終端使用者應用程式(例如,輻射治療、手術、血管造影),全部應用程式執行於相同平台上。此透過一共同軟體介面提供低成本、人性化控制。超聲波系統具有用於高級使用者之可擴縮使用者介面且具有一基於直觀Windows®之PC介面。超聲波系統之一較佳實施例歸因於對於資料及影像之一站式(one-stop)影像捕獲、分析、儲存、檢索及傳輸能力之特徵亦提供一提高之診斷能力。藉由一128通道頻寬提供一高影像品質。除易於使用以外,超聲波系統亦提供在任何時間、任何部位及使用任何工具之患者存取。使用根據本發明之一較佳實施例之一10盎司探測頭提供所照護點(point of care)成像。資料儲存及檢索能力係基於DICOM標準且與現有第三方分析及患者記錄系統相容。根據一較佳實施例之超聲波系統亦使用(例如但不限於)電子郵件、LAN/WAN、DICOM及數位成像網路圖像存檔及通信系統(DINPAC)提供直接影像傳送能力。顯示經捕獲之影像之選擇包含(但不限於):一桌上型電腦、一膝上型電腦、隨身個人電腦及手持式器件(諸如個人數位助理)。
如前文所描述,本發明之超聲波系統係用於微創手術及機器人手術方法中,包含(但不限於):活體組織切片檢查程序、用於診斷及治療性血管造影之導管導入、胎兒成像、心臟成像、血管成像、內視鏡程序期間之成像、用於遠距醫療應用之成像及用於獸醫應用之成像、輻射治療及冷療法。該等實施例使用基於電腦之追蹤系統及CT及MR影像來精確定位目標區域之精確部位。超聲波系統之替代較佳實施例可在較低成本下及使用較小佔據面積器件以恰在程序之前、程序期間及緊接在程序之後提供影像。較佳實施例克服要求待推進操作室(procedure room)之一單獨超聲波設備及將來自超聲波之影像移動至追蹤位置及針對先前捕獲之CT及MR影像登記目標之器件之一方法之需要。超聲波系統之一較佳實施例提供一充分整合式解決方案,因為其可在與處理影像之任何第三方應用程式相同之平台上運行其超聲波應用程式。該系統包含一串流視訊介面(一第三方應用程式與該系統之超聲波應用程式之間的一介面)。此系統之一關鍵組件容許該兩個應用程式運行於相同電腦平台(使用相同作業系統(OS))上,舉例而言,諸如基於Windows®之平台,其他平台(諸如Linux)亦可使用且因此提供該兩個應用程式之一無縫整合。下文描述將影像自系統之超聲波應用程式移動至另一應用程式之軟體介面之細節。
較佳實施例包含控制及資料傳送方法,該等方法容許一第三方基於Windows®之應用程式藉由運行超聲波應用程式作為一後台任務、將控制命令發送至該超聲波應用程式及作為交換接收影像(資料)來控制(例如)一可攜式基於Windows®之超聲波系統。此外,實施例組態一可攜式超聲波基於Windows®之應用程式作為供應另一基於Windows®之應用程式(其作為一用戶端)之實況超聲波影像圖框之一伺服器。此用戶端應用程式接收此等超聲波影像圖框及對其等進行進一步處理。此外,一替代實施例組態可攜式超聲波基於Windows®之應用程式作為經由兩個通信機構(例如,藉由第三方(下文中可互換地稱為外部或一用戶端)使用以啟動及控制可攜式超聲波基於Windows®之應用程式之一組件物件模型(COM)自動化介面及遞送實況超聲波影像之一高速共用記憶體介面)與一第三方用戶端應用程式互動之一伺服器。
一較佳實施例包含及組態作為一可攜式基於Windows®之超聲波應用程式與另一第三方基於Windows®之應用程式之間的一串流視訊介面之一共用記憶體介面。此串流視訊介面經設計以即時提供超聲波影像至一第三方用戶端。
一較佳實施例容許第三方基於Windows®之應用程式控制來自可攜式超聲波基於Windows®之應用程式之影像通過相同PC平台內之共用記憶體介面之流率及實施此介面所需之記憶體之量。此等控制由設定影像緩衝器之數目、各緩衝器之大小及影像傳送之速率之方式組成。可針對零資料丟失設定此流率控制,從而確保將每個圖框自超聲波系統遞送至第三方基於Windows®之應用程式,或針對最低延時設定此流率控制,從而首先將藉由超聲波系統產生之最新圖框遞送至第三方基於Windows®之應用程式。
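作為說明，以下以Python示意上述兩種流率控制設定之差異：針對零資料丟失時，緩衝器滿則拒收新圖框以確保每一圖框皆被遞送；針對最低延時時，丟棄最舊圖框以優先遞送最新圖框。此為一示意性模型，其中之類別與方法名稱皆為假設，並非共用記憶體介面之實際實施方案。

```python
from collections import deque

class FrameBuffer:
    """影像圖框緩衝之示意性模型。

    zero_loss=True: 滿時拒收新圖框(零資料丟失模式)。
    zero_loss=False: 滿時丟棄最舊圖框(最低延時模式)。
    """
    def __init__(self, n_buffers, zero_loss=True):
        self.frames = deque()
        self.n_buffers = n_buffers
        self.zero_loss = zero_loss

    def submit(self, frame):
        if len(self.frames) >= self.n_buffers:
            if self.zero_loss:
                return False  # 產生端須稍後重試, 確保每一圖框皆被遞送
            self.frames.popleft()  # 丟棄最舊圖框, 讓位給最新圖框
        self.frames.append(frame)
        return True

    def retrieve(self):
        return self.frames.popleft() if self.frames else None
```

在零資料丟失模式下，緩衝器數目決定產生端可領先消費端多少圖框；在最低延時模式下，消費端永遠優先取得最接近當前時刻之圖框。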
一較佳實施例格式化超聲波影像圖框,使得在第三方基於Windows®之應用程式自共用記憶體介面擷取(由可攜式超聲波基於Windows®之應用程式產生之)影像時該第三方基於Windows®之應用程式可解譯探測頭、空間及時間資訊。在伺服器(即,可攜式超聲波應用程式)與用戶端應用程式(第三方基於Windows®之應用程式)之間傳遞的實際影像資料係具有8位元像素及一256項目色彩表之一Microsoft器件無關位元映射(DIB)。影像圖框亦含有提供以下額外資訊之一標頭,例如(但不限於):探測頭類型、探測頭序號、圖框序列號、圖框率、圖框時間戳記、圖框觸發時間戳記、影像寬度(以像素為單位)、影像高度(以像素為單位)、像素大小(在X及Y上)、像素原點(影像中之第一像素相對於傳感器頭之x、y定位)及方向(沿著或跨影像之各線之空間方向)。
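作為說明，以下以Python之struct模組示意一影像圖框標頭之打包與解譯。所示者僅為上文所列欄位之一假設性子集，其順序、型別與二進位布局皆為示意性範例，並非該介面之實際格式。

```python
import struct

# 假設性之圖框標頭布局(小端序, 非專利中之實際二進位格式):
# 探測頭類型(u16)、圖框序列號(u32)、圖框率(f32)、
# 影像寬(u16)、影像高(u16)、像素大小X(f32)、像素大小Y(f32)
HEADER_FMT = "<HIfHHff"
HEADER_SIZE = struct.calcsize(HEADER_FMT)

def pack_header(probe_type, seq, fps, width, height, px, py):
    """將標頭欄位打包為固定長度之位元組串, 置於DIB像素資料之前。"""
    return struct.pack(HEADER_FMT, probe_type, seq, fps, width, height, px, py)

def unpack_header(buf):
    """自圖框緩衝區之開頭解譯標頭欄位。"""
    return struct.unpack(HEADER_FMT, buf[:HEADER_SIZE])
```

用戶端應用程式可先解譯此標頭以取得空間及時間資訊，再依影像寬、高讀取其後之8位元像素資料。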
此外,較佳實施例透過使用ActiveX控制項控制用於在一基於Windows®之可攜式超聲波系統與一第三方基於Windows®之系統之間傳送超聲波影像之共用記憶體介面。基於Windows®之可攜式超聲波應用程式含有一ActiveX控制項,該ActiveX控制項將一圖框傳送至共用記憶體中及發送出一Windows®事件(其包含對於剛寫入之圖框之一指標)至第三方基於Windows®之應用程式。此第三方應用程式具有接收此事件並從共用記憶體取出影像圖框之一類似ActiveX控制項。
圖形使用者介面包含一或多個控制程式,該等控制程式之各者係較佳一自含型(例如)用戶端指令碼。該等控制程式經獨立組態以用於(除其他功能外)在使用者介面中產生基於圖形或文字之使用者控制項,用於在如藉由使用者控制項引導之使用者介面中產生一顯示區域,或用於顯示經處理之串流媒體。該等控制程式可實施為可在一媒體閘道容器環境中操作及可透過網頁控制之ActiveX控制項、Java applets或任何其他自含型及/或自執行應用程式或其等之部分。
超聲波內容可顯示於圖形使用者介面中之一圖框內。在一實施例中,程式產生一ActiveX控制項之一例項。ActiveX係指藉由華盛頓(Washington)之雷德蒙德(Redmond)之Microsoft®公司提供的一組物件導向程式設計技術及工具。ActiveX技術之核心部分係組件物件模型(COM)。根據ActiveX環境運行之一程式係稱為「組件」,可在網路中之任何地方運行之一自給自足程式(只要支援該程式)。此組件通常稱為一「ActiveX控制項」。因此,一ActiveX控制項係可藉由一電腦內或一網路之若干電腦中之許多應用程式重新使用之一組件程式物件,無關於以何種程式設計語言產生其。一ActiveX控制項在所謂的容器中運行,後者係利用COM程式介面之一應用程式。
使用一組件之一優點在於,其可藉由許多應用程式(其等稱為「組件容器」)重新使用。另一優點在於,可使用若干熟知語言或開發工具(包含C++、Visual Basic或PowerBuilder)之一者或使用指令碼工具(諸如VBScript)產生一ActiveX控制項。ActiveX控制項可經下載為(例如)較小可執行程式,或下載為用於網頁動畫之可自執行代碼。類似於ActiveX控制項且適用於用戶端指令碼之係applet。一applet通常為以Java.TM. (藉由加利福尼亞州之森尼維爾市(Sunnyvale)之SUN Microsystems公司發佈之一基於網路之物件導向程式設計語言)寫入之一自含型、自執行電腦程式。
可在用戶端系統處本端儲存及存取控制程式,或可自網路下載控制程式。通常藉由將一控制程式囊封於一或多個基於標示語言之檔案中來進行下載。控制程式亦可用於在若干作業系統環境之一者中運行之一應用程式通常所需之任何任務。Windows®、Linux及Macintosh係可在較佳實施例中使用之作業系統環境之實例。
超聲波成像系統之一較佳實施例具有針對影像串流能力之特定軟體架構。此超聲波成像系統係控制一較佳實施例之超聲波探測頭及容許獲得及顯示用於醫療目的之視覺影像之一應用程式。該成像系統具有其自身圖形使用者介面。此介面已達到特徵且經合宜地組織以提供與分離影像以及影像串流一起作用之最大彈性。一些可能醫療應用要求開發具有顯著不同特徵之圖形使用者介面。此涉及將成像系統整合至其他更複雜的醫療系統中。較佳實施例容許以一高度有效及便捷方式匯出成像資料以供原始設備製造者(OEM)直接存取成像資料。
藉由以下準則(諸如資料傳送效能)量測根據一較佳實施例之影像串流解決方案之品質。成像資料消耗大量記憶體及處理器電力。需要較大數目個分離影像圖框來產生實況醫療視訊患者檢查。最小化在將資料自產生視訊資料之一處理程序傳送至消耗視訊資料之一處理程序之一處理程序中之資料應對操作變得非常重要。第二準則包含業界標準成像格式。因為意欲藉由第三方公司開發消耗視訊成像資料之應用程式,所以可以業界標準格式表示資料。一第三準則係便利性。可藉由使用方便且並不需要額外學習之一程式設計介面呈現成像資料。
此外,準則包含可擴縮性及可擴展性。一串流資料架構可易於擴展以適應新的資料類型。其可提供用於以一個以上資料接收處理程序為目標之視訊串流之未來倍增之一基本架構。
較佳實施例之影像串流架構提供在兩個處理程序之間之資料傳輸之方法。該影像串流架構定義調節資料傳送處理程序之操作參數,及描述在處理程序之間傳送參數之機構。將操作參數自一第三方用戶端應用程式傳送至一較佳實施例之成像系統之方法之一者係藉由使用現有COM介面。
在一較佳實施例中,影像傳送架構集中使用物件導向程式設計方法論及Microsoft Windows®作業系統之處理間能力。該物件導向方法論提供容許滿足必要要求之一架構解決方案之一必要基礎。其亦對於使修改相對較簡單及反向相容之未來增強及擴展奠定根基。
視訊成像資料表示在不同資料元素之間具有相互干擾之複雜資料結構。其亦允許及常常需要相同資料元素之不同解譯。以下影像傳送架構之較佳實施例包含用於實體資料交換之一共用記憶體。例如,Windows®共用記憶體係在處理程序之間交換資料之一快速及經濟方式。此外,在某些實施例中共用記憶體可再劃分成具有一固定大小之分離區段。接著,各區段可處於一最小可控制單元處。此外,成像資料可抽象化為物件。可藉由一分離物件表示成像資料之各圖框。接著可將該等物件映射至共用記憶體之區段。
較佳實施例可包含一區段物件之鎖定-解鎖。所使用之程式設計API通知機制係一事件驅動機制。事件驅動機構係基於C++純虛擬函式之實施方案。
在一較佳實施例中,影像傳送架構由三個層組成:一應用程式設計介面(API)層、一程式設計介面實施方案及共用記憶體存取層以及一實體共用記憶體層。該應用程式設計介面層提供與一用戶端及伺服器端上之應用程式之兩個不同C++類別程式庫介面。屬於應用程式自身之指令之全部相關聯序列亦係此層之部分。應用程式衍生類別及其等實施方案係應用程式設計介面層之關鍵要素。作為成像資料提供者方之伺服器使用(例如)物件傳輸器類別及相關衍生及基底類別。作為成像資料消費者方之用戶端使用(例如)一Object Factory類別及相關衍生及基底類別。
程式設計介面實施方案層對應用程式提供兩個不同動態鏈接程式庫(DLL)實施類別。此層將與應用程式相關聯之類別之物件映射至存取共用記憶體實體系統物件之物件之一內部實施方案。此層容許隱藏來自應用程式之範疇之全部實施方案特定成員變量及函式。因此，應用程式設計介面層變得不淩亂、易於理解及使用。伺服器端應用程式可使用(例如) ObjectXmitter.DLL，而用戶端應用程式可使用(例如) ObjectFactory.DLL。
實體共用記憶體層表示實施共用記憶體功能性之作業系統物件。其亦描述共用記憶體之結構、其之分段及記憶體控制區塊。
關於共用記憶體之組織,因為共用記憶體意欲用於程序間通信,所以作業系統指定在其產生時之一獨特名稱。為管理共用記憶體,需要其他程序間通信(IPC)系統物件。其等亦需要具有獨特名稱。為簡化一獨特名稱產生處理程序,僅需要一個基礎名稱。全部其他名稱係藉由一實施方案代碼自該基礎名稱導出。因此,應用程式設計介面對於邏輯共用記憶體物件需要僅一基礎名稱之規格。可藉由應用程式之伺服器端及應用程式之用戶端兩者使用相同獨特名稱。
應用程式之伺服器端負責共用記憶體之產生。在一產生處理程序中,不僅必須指定共用記憶體之獨特名稱,而且必須指定其他組態參數。此等參數包含(但不限於):指定待分配之片段之數目之片段數、片段大小及操作旗標。在一較佳實施例中存在三個此等旗標。第一旗標指定片段提交及檢索順序。該順序可為後進先出(LIFO)、先進先出(FIFO)或最後進出(LIO)之一者。LIO係普通LIFO之以使得無論何時在一新圖框到達時若找到準備用於檢索但仍未經鎖定以用於檢索之圖框則將該等圖框擦除之一方式之一修改。第二旗標指定在當請求一新片段分配但不存在可用之片段時之一條件下之共用記憶體實施方案行為。通常此可在接收應用程式處理資料比提交應用程式要慢時發生。此旗標可容許刪除先前經分配之片段之一者。若其並不容許刪除先前經分配之片段之一者,則其將一異常情況報告回至應用程式。使用此旗標應用程式可自動選擇將資料重寫入一共用記憶體中或其可控制該資料重寫處理程序本身。可僅在第二旗標容許將片段重寫入一共用記憶體中時使用第三旗標。其指定如何選擇待重寫之一片段。藉由預設,共用記憶體實施方案刪除最年輕或最近提交之資料片段。替代性地,可選擇最舊片段以用於重寫處理程序。
在產生共用記憶體時,初始化其實體布局。因為作業系統並不容許在一實體共用記憶體中之位址計算,所以在共用記憶體內並不使用資料指標。可依據自虛擬原點(VO)之相對位移來實施共用記憶體控制區塊及片段內之全部定址。藉由自VO之位移零,分配共用記憶體標頭結構。其含有上文列出之全部參數。圖48係繪示實體共用記憶體4880之結構之一方塊圖。
緊接在分配共用記憶體標頭結構4882之後跟隨的係產生用於每一記憶體片段之標頭陣列4884。記憶體片段標頭含有該片段所佔據之大小、映射至該片段之物件類別之獨特標籤及該片段狀態。各片段可為以下四個狀態之一者:未使用狀態,其中片段可用於分配;經鎖定以進行寫入之狀態,其中片段經映射至一特定類別之一物件且當前經形成;經寫入狀態,其中片段經映射至一特定類別之一物件且可用於檢索;及經鎖定以進行讀取之狀態,其中片段經映射至一特定類別之一物件且當前處於關於資料檢索之一處理程序中。因為每一片段具有其自身狀態,所以應用程式有可能鎖定用於物件形成及物件檢索之一個以上片段。此容許系統在應用程式之伺服器端及用戶端兩者上具有彈性多執行緒式架構。此外,使一個以上片段處於一「寫入」狀態中之能力提供取消或最小化伺服器端及用戶端上之應用程式之效能差之一「緩衝」機制。
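上述四個片段狀態及其合法轉換可以如下C++示意表示。此僅為一說明性草圖：函式名稱皆為假定而非專利原始碼，實際實施方案係對共用記憶體中之片段標頭操作。

```cpp
enum class SegmentState { Unused, LockedForWrite, Written, LockedForRead };

// 分配：僅能自「未使用」狀態鎖定片段以進行寫入（形成物件）。
bool allocateSegment(SegmentState& s) {
    if (s != SegmentState::Unused) return false;
    s = SegmentState::LockedForWrite;
    return true;
}

// 提交：寫入完成後，片段變為「經寫入」狀態，可用於檢索。
bool submitSegment(SegmentState& s) {
    if (s != SegmentState::LockedForWrite) return false;
    s = SegmentState::Written;
    return true;
}

// 鎖定以讀取：用戶端開始自片段擷取資料。
bool lockForRead(SegmentState& s) {
    if (s != SegmentState::Written) return false;
    s = SegmentState::LockedForRead;
    return true;
}

// 釋放：資料處理完成後，片段回到「未使用」狀態以供重新分配。
bool releaseSegment(SegmentState& s) {
    if (s != SegmentState::LockedForRead) return false;
    s = SegmentState::Unused;
    return true;
}
```

因為每一片段獨立持有其狀態，伺服器端與用戶端可各自同時鎖定一個以上片段，此即本文所述多執行緒「緩衝」機制之基礎。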
一實體共用記憶體布局中之最後元件含有記憶體片段4888。除實體共用記憶體之外邏輯共用記憶體亦含有一實體系統互斥4886及系統事件4890。該實體互斥提供對實體共用記憶體之相互排斥存取。該實體事件具有一手動控制類型。其在片段之至少一者具有一「寫入」狀態時始終保持處於位準「高」。僅當不存在處於一「寫入」狀態之單一片段時其轉至位準「低」。此機制容許在用於執行緒之相同時間片段分配內未傳遞控制至一作業系統之情況下自共用記憶體擷取「寫入」物件。
在一較佳實施例中，物件傳輸程式設計介面由以下三個類別組成：即，AObjectXmitter、USFrame及BModeFrame。該AObjectXmitter類別容許起始指定所要操作參數之一物件傳送服務。一旦實例化該AObjectXmitter類別物件，即可產生USFrame及BModeFrame類別之經初始化物件。該USFrame類別建構器需要參考AObjectXmitter類別之一物件。在實例化USFrame物件之後必須完成之第一動作係建置該物件與共用記憶體中之片段之一者之關聯。函式Allocate( )將一物件映射至一未使用之共用記憶體片段且鎖定此片段以供當前物件使用。在映射一物件時可藉由一應用程式提供一位元映射大小。該經提供之大小僅表示位元映射資料所需之大小(並不包含物件之其他資料元素所需之記憶體大小)。
BModeFrame類別係自USFrame類別導出之一類別。其繼承該基底類別具有之全部方法及功能性。由BModeFrame類別提供之唯一額外功能性係容許提供與BMode操作特定有關之資訊之額外方法。
在實例化USFrame或BModeFrame類別物件及將其映射至共用記憶體片段之後,應用程式可填充該物件之全部所要資料元素。並不必要對每一資料元素提供一值。在將一物件映射至共用記憶體片段時,使用預設值初始化該物件之全部資料元素。在映射之後並未初始化之唯一資料元素係位元映射資料元素。當應用程式之伺服器端已提供全部所要資料元素時,其可藉由調用一方法(例如,Submit( ))而將物件交遞至該應用程式之用戶端。
可藉由隨後重新映射及重新提交來重新使用USFrame或BModeFrame物件。替代性地,當該物件適用於一應用程式時可刪除該物件且可產生一新的物件。因為物件實例化並不需要任何程序間通信機構,所以其與用於一普通變量之記憶體分配一樣簡單。
存在較佳實施例之架構之至少兩個優點。因為ObjectXmitter類別確實具有關於USFrame或BModeFrame類別之知識,所以引入類似的或直接或間接自USFrame類別導出之額外類別可為非常簡單的。此容許在無需對經開發以用於現有實施例之代碼或指令序列之任何修改之情況下產生物件傳輸程式設計介面的未來版本。此外,物件傳輸程式設計介面類別並不具有任何成員變量。此提供介面之另兩個益處。第一益處在於,此等類別係經COM物件介面定向且可直接用於COM物件介面規格及實施方案。第二益處在於,此等類別有效隱藏全部實施方案特定細節,從而使介面非常清楚、易於理解及使用。
藉由ObjectXmitter.DLL實施物件傳輸程式設計介面。對於由應用程式產生之每一物件,存在藉由駐留於該ObjectXmitter.DLL中之代碼產生之一鏡像實施方案物件。因為每一程式設計介面類別在實施方案中具有對應鏡像類別,所以促進修改且在當前將修改擴展至指定影像類型。此可藉由在實施方案DLL4910中產生對應鏡像類別來完成。實施方案物件負責處置共用記憶體及程式設計介面物件之映射。本發明之一實施例包含容許使用具有一用戶端應用程式之僅一通信通道實例化僅一ObjectXmitter類別物件之DLL。物件傳輸實施方案不僅傳輸物件資料而且提供描述經傳送之物件類型之額外資訊。
Object Factory程式設計介面由三個類別組成:AObjectFactory、USFrame及BModeFrame。該類別AObjectFactory含有三個純虛擬成員函式。此使此類別成為不能藉由一應用程式實例化之一抽象類別。必須從應用程式定義自AObjectFactory類別導出之其自身類別。並不需要定義自AObjectFactory類別導出之任何「特殊」類別。因為應用程式意欲處理將接收之影像,所以其將具有處理影像之一類別之機會非常高。一影像處理類別可很好地自AObjectFactory類別導出。
自一AObjectFactory類別導出之類別必須定義及實施僅純虛擬函式,舉例而言,諸如OnFrameOverrun( )、OnUSFrame( )及OnBModeFrame( )。例如,一經導出類別可如下定義:
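原始文件中此處之類別定義範例未隨文保留。以下為一假設性C++示意：僅AObjectFactory、OnFrameOverrun( )、OnUSFrame( )及OnBModeFrame( )之名稱取自本文，USFrame／BModeFrame之資料成員及ImageProcessor之名稱皆為說明用之假定。

```cpp
// 假定之圖框類別（實際類別見本文所述之USFrame/BModeFrame介面）
struct USFrame { unsigned char sequenceNum = 0; };
struct BModeFrame : USFrame { };

// 含三個純虛擬成員函式之抽象基底類別，不能直接實例化
class AObjectFactory {
public:
    virtual ~AObjectFactory() = default;
    virtual void OnFrameOverrun() = 0;                // 共用記憶體片段耗盡時調用
    virtual void OnUSFrame(USFrame* frame) = 0;       // 接收一般超聲波圖框
    virtual void OnBModeFrame(BModeFrame* frame) = 0; // 接收B模式圖框
};

// 自AObjectFactory導出之一影像處理類別（名稱為假定）
class ImageProcessor : public AObjectFactory {
public:
    int framesReceived = 0;
    int overruns = 0;
    void OnFrameOverrun() override { ++overruns; }
    void OnUSFrame(USFrame*) override { ++framesReceived; }
    void OnBModeFrame(BModeFrame* f) override { OnUSFrame(f); }
};
```

如本文所述，應用程式只須實施此三個純虛擬函式；圖框物件係由底層實施方案「給定」，處理完成後以Release( )歸還。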


在實例化一類別物件之後,可調用影像處理器基底類別成員函式Open( )。此函式提供匹配至由應用程式之伺服器端使用之共用記憶體名稱之一共用記憶體名稱。函式Open( )經由一指定共用記憶體將用戶端應用程式連接至伺服器應用程式。
在開啟共用記憶體之後之任何時刻,應用程式可預期對虛擬函式OnFrameOverrun( )、OnUSFrame( )及OnBModeFrame( )之一調用。OnUSFrame( )函式之每一調用攜載USFrame類別類型之一物件作為一自變數。OnBModeFrame( )函式之每一調用攜載BModeFrame類別類型之一物件作為一自變數。不需要使一應用程式實例化USFrame或BModeFrame類別之一物件。藉由對一AObjectFactory 類別之底層實施方案將USFrame及BModeFrame物件「給定」至一應用程式。
應用程式需要完成之唯一動作係處理經接收之圖框及釋放「給定」物件。應用程式並未嘗試刪除一圖框物件,此係因為刪除係藉由一底層實施方案進行。僅在應用程式完成全部資料處理或應用程式不再需要USFrame物件或經導出類別之物件時調用USFrame物件之成員函式Release( )。
一旦應用程式已接收一類別USFrame或BModeFrame之一物件,其即可擷取成像資料且適當地對其等進行處理。應用程式需要意識到,其確實在一分離執行緒中處理圖框物件資料且確保使用一執行緒安全程式設計技術寫入處理函式。因為純虛擬函式之任一者皆在藉由實施方案DLL產生之一分離執行緒內調用,所以在虛擬函式將控制返回至調用執行緒之前隨後調用皆不可能。此意謂只要應用程式還未將控制返回至實施方案產生之執行緒,應用程式就不能接收任何新的圖框。同時,應用程式之伺服器端可繼續提交額外圖框。此最終導致共用記憶體溢出(overflow)且阻止任何新的圖框傳輸。
在應用程式處理圖框資料時，其保持對應之共用記憶體資源經鎖定而無法隨後重新映射。應用程式未釋放之圖框愈多，可用於應用程式之伺服器端上之物件傳輸介面之共用記憶體片段愈少。若並未以一適當速率釋放圖框物件，則最終由用戶端應用程式鎖定共用記憶體之全部記憶體片段。在那時，影像傳輸應用程式停止發送新的圖框或重寫仍未藉由接收應用程式鎖定之圖框。若接收應用程式鎖定全部片段，則傳輸應用程式甚至無法選擇重寫現有圖框。
在由服務應用程式提出Frame Overrun時調用函式OnFrameOverrun( )。每當服務應用程式嘗試提交一新的圖框而不存在可供映射一物件之任何可用共用片段時，即提出此條件。可僅藉由應用程式之用戶端憑藉調用函式ResetFrameOverrun( )而清除此條件。若用戶端應用程式並未調用此函式，則Frame Overrun條件再次被提出且再次調用OnFrameOverrun( )純虛擬函式。
Object Factory介面具有上文在描述物件傳輸介面時概述之相同優點。除了此等優點之外，其亦實施最小化程式設計工作及最大化執行效能之一事件驅動程式設計方法。同時存在函式，舉例而言，諸如USFrames( )、BModeFrames( )、GetUSFrame( )及GetBModeFrame( )。此等函式可用於實施效率較低之「輪詢」程式設計方法。
藉由ObjectFactory.DLL4918實施Object Factory程式設計介面。此DLL自共用記憶體擷取一物件類別類型資訊以及物件相關資料。其產生由傳輸器使用之類型之一物件。Object Factory實施方案將新產生之物件映射至對應資料。Object Factory實施方案具有經由純虛擬函式事件觸發(fire)新產生及映射之物件之一分離執行緒。應用程式在整個處理期間「擁有」此物件且藉由調用Releaseo函式指示應用程式不再需要該物件。工廠實施方案本端釋放經分配以用於物件之資源以及共用記憶體資源。
在方塊圖圖49中以圖示方式表示上文所描述之處理流程4900。較佳實施例包含代碼維護之簡易及對於影像傳送機構之特徵增強。物件傳送介面4908及Object Factory介面4916以及其等之實施方案容許此等修改在相對較低開發成本下進行。關於物件修改,共用記憶體實施方案完全獨立於經傳送資料類型。因此,任何類型修改並不需要對控制共用記憶體之底層代碼進行任何改變。因為經傳送資料係囊封於一特定類型之類別內,所以修改傳送一物件所需要之唯一動作係修改定義此物件之對應類別。因為物件表示一類別導出樹,所以基底類別之任何修改引起經導出類別之每一物件之適當變化。物件類型之此等修改並不影響與經修改物件類別不相關之應用程式代碼。
可藉由自現有類別之一者導出一新類別來引入物件之新類型。一新導出之類別可自基底類別之適當層級導出。產生一新物件類型之一替代方式係藉由產生一新基底類別。此方法可在一新定義之類別顯著不同於現有類別時之情況中具有優點。
關於多個物件傳送通道,替代較佳實施例可支援一個以上AObjectXmitter類別物件及一個以上對應通信通道。其亦可依使得其容許在相對方向上傳輸物件之通信通道之一方式擴展。此容許應用程式將成像資料散發至一個以上用戶端應用程式。其可接受控制影像產生及探測頭操作之傳入通信。
此外，無線及遠端影像串流通道可適應於較佳實施例中。可實施一相同物件傳輸程式設計介面以並非經由共用記憶體而是經由高速無線通信網路(舉例而言，諸如IEEE 802.11a)來傳送影像。該相同物件傳輸程式設計介面亦可用於跨一有線乙太網路連接傳送影像。遠端及無線影像串流假定接受者運算系統可在效能上不同。此使接受者之器件之一模型之選擇成為成功實施方案的重要因素之一者。
因此,包含於較佳實施例中之經串流成像利用在低額外耗用(overhead)下提供高頻寬之一共用記憶體用戶端-伺服器架構。
一較佳實施例之超聲波成像系統軟體應用程式係由一用戶端應用程式4904用作實況超聲波影像圖框之一伺服器4902。藉由如上所述之兩個通信機制支援此用戶端-伺服器關係。用戶端應用程式使用一COM自動化介面來啟動及控制超聲波成像系統應用程式4906。一高速共用記憶體介面4912將具有探測頭識別、空間及時間資訊之實況超聲波影像自該應用程式遞送至用戶端應用程式。
共用記憶體實施方案之複雜性係囊封於一簡單ActiveX COM API (TTFrameReceiver)中以供用戶端應用程式使用。共用記憶體通信具有藉由用戶端應用程式指定之彈性參數。佇列順序、緩衝器數目、緩衝器大小及重寫許可皆藉由用戶端在開啟影像圖框串流時指定。佇列順序模式可指定為先進先出(FIFO)、後進先出(LIFO)及最後進出(LIO)。一般而言，當零資料損耗比最低延時更重要時，該FIFO模式係較佳的。LIO模式僅遞送最近影像圖框且在最低延時比資料損耗更重要時為較佳的。當最低延時及最小資料損耗同樣重要時可使用LIFO模式。然而，在LIFO模式中，可能並非總是以循序順序遞送圖框，且在接收圖框之後需要一更複雜之用戶端應用程式對其等進行排序。當全部共用記憶體緩衝器已滿時，重寫許可可經指定為並不容許重寫、重寫最舊圖框或重寫最新圖框。
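三種佇列順序模式之檢索行為可以如下C++草圖示意。此僅為說明性假定：以std::deque及整數代表圖框佇列，而實際實施方案係對共用記憶體片段操作。

```cpp
#include <deque>

enum class QueueOrder { FIFO, LIFO, LIO };

// 依指定之佇列順序模式自佇列檢索下一圖框（以整數代表圖框）
int retrieveNextFrame(std::deque<int>& q, QueueOrder order) {
    int frame;
    if (order == QueueOrder::FIFO) {     // 零資料損耗優先：取最舊圖框
        frame = q.front();
        q.pop_front();
    } else {                             // LIFO與LIO皆先取最新圖框
        frame = q.back();
        q.pop_back();
    }
    if (order == QueueOrder::LIO)        // LIO：擦除尚未鎖定之較舊圖框
        q.clear();
    return frame;
}
```

此草圖反映本文所述之取捨：FIFO保留全部圖框但延時最高；LIO只遞送最近圖框（最低延時）；LIFO兩者兼顧，但用戶端可能須自行對非循序到達之圖框排序。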
各影像圖框含有一單一超聲波影像、探測頭識別資訊、像素空間資訊及時間資訊。影像格式係具有8位元像素及一256項目色彩表之一標準Microsoft器件無關位元映射(DIB)。
TTFrameReceiver ActiveX控制項提供用於接收圖框之兩個方案。第一方案係事件驅動。當已接收一圖框時觸發一COM事件FrameReady。在該FrameReady事件之後,可使用介面之資料存取方法讀取影像及相關聯資料。在已複製影像及其他資料之後,用戶端藉由調用ReleaseFrame方法而釋放圖框。直至在釋放先前圖框之後下一個FrameReady事件才會發生。在另一實施例中,用戶端可使用WaitForFrame方法對下一可用圖框進行輪詢。
在一較佳實施例中,用戶端應用程式及伺服器應用程式兩者皆執行於相同電腦上。該電腦可運行(例如但不限於)Microsoft® Windows® 2000/XP作業系統。可使用Microsoft® Visual C++6.0及MFC開發用戶端應用程式(USAutoView)。可在(例如) Visual Studio 6.0中編譯原始碼。伺服器端COM自動化介面及TTFrameReceiver ActiveX控制項可與其他MS Windows®軟體開發環境及語言相容。
在本發明之一實施例中，伺服器端COM自動化介面(ProgID)之名稱係(例如)「Ultrasound.Document」且在第一次運行應用程式時將該介面登記於電腦上。調度介面可自一類型程式庫匯入至一用戶端應用程式中。
在一較佳實施例中，藉由增加不同方法(諸如void OpenFrameStream (BSTR* queueName、short numBuffers、long bufferSize、BSTR* queueOrder、short overwritePermission))來擴展自動化介面以支援圖框串流。此方法開啟伺服器端上之圖框串流傳輸器及與用戶端應用程式之共用記憶體介面。queueName係共用記憶體「檔案」之一獨特名稱，且係與在開啟接收器時所使用相同之名稱；numBuffers係共用記憶體佇列中之緩衝器之數目；bufferSize係共用記憶體佇列中之各緩衝器之以位元組為單位之大小，其中緩衝器大小係比可傳輸之最大影像大5120個位元組；queueOrder係「LIO」、「FIFO」或「LIFO」；overwritePermission對於並不容許重寫為0，對於重寫最舊圖框為1，對於重寫最新圖框為2。注意，必須在開啟TTFrameReceiver控制項之前調用OpenFrameStream。
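bufferSize參數之計算可以如下草圖示意。此為一假定性示意：依本文，各緩衝器須比可傳輸之最大影像大5120個位元組；此處假定影像為8位元像素之DIB（像素資料大小為寬×高位元組），而5120位元組之額外空間涵蓋DIB標頭、色彩表及圖框中介資料。

```cpp
// 依本文之規則計算OpenFrameStream所需之bufferSize（假定8位元像素DIB）
long requiredBufferSize(int xPixels, int yPixels) {
    long imageBytes = static_cast<long>(xPixels) * yPixels; // 8位元像素資料
    return imageBytes + 5120;                               // 本文規定之額外空間
}
```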
下一額外方法包含：void CloseFrameStream( )，其關閉伺服器端上之圖框串流傳輸器；void StartTransmitting( )，其告知伺服器端開始傳輸超聲波圖框；void StopTransmitting( )，其告知伺服器端停止傳輸超聲波圖框；及short GetFrameStreamStatus( )，其獲得圖框串流傳輸器之狀態。在開啟TTFrameReceiver之前檢查串流傳輸器已經開啟是重要的。該COM自動化介面係非阻斷的，因此在自用戶端應用程式調用OpenFrameStream之瞬間，該調用可能尚未實際完成。
在一較佳實施例中,TTFrameReceiver ActiveX控制項係與實況超聲波圖框串流之用戶端應用程式介面。圖框串流控制方法包含boolean Open(BSTR名稱),其開啟圖框串流接收器。直至在已開啟伺服器上之圖框串流傳輸器之後才能開啟該圖框串流接收器。圖框串流控制方法亦包含:boolean Close( ),其關閉圖框串流接收器;long WaitForFrame(long timeoutms),其等待一圖框準備好或直至逾時週期結束;及boolean ReleaseFrame( ),其釋放當前影像圖框。一旦已複製全部所要資料,就可釋放當前圖框。直至已釋放當前圖框才能接收下一圖框。其他資料存取函式之傳回值在釋放當前圖框之後並不有效直至下一FrameReady事件。
在一較佳實施例中之用於影像之資料存取方法包含long GetPtrBitmapInfo( )，其獲得對於含有影像之DIB之標頭(具有色彩表)之一指標。超聲波影像經儲存為一標準Microsoft器件無關位元映射(DIB)。BITMAPINFO及BITMAPINFOHEADER結構可視需要派用(cast)至經傳回之指標。用於BITMAPINFO結構之記憶體係分配於共用記憶體中且不可取消分配；代替性地，可調用ReleaseFrame( )以將記憶體傳回至共用記憶體機構。進一步方法包含long GetPtrBitmapBits( )，其獲得對影像像素之一指標。可視需要派用經傳回之指標以與Microsoft DIB API一起使用。用於位元映射像素之記憶體係分配於共用記憶體中且不可取消分配；代替性地，可調用ReleaseFrame( )以將記憶體返回至共用記憶體機構。
與探測頭識別有關之方法包含：short GetProbeType( )，其獲得所使用之經定義之超聲波探測頭類型；BSTR GetProbeName( )，其獲得經定義之探測頭名稱；及long GetProbeSN( )，其獲得所使用之探測頭之序號。
關於時間資訊,該方法包含short GetSequenceNum( ),其獲得當前圖框之序列號。該序列號係自一8位元計數器導出且因此每256個圖框重複。其對於判定圖框序列中之間隙及在使用LIFO緩衝器排序模式時重新排序經接收之圖框為有用的。此外,double GetRate( )在與序列號組合時獲得圖框率,對經接收圖框提供精確相對時序;BSTR GetTimestamp( ),其獲得當前圖框之一時間戳記,該時間戳記對當前圖框提供在同步化至外部事件時可能有用之一絕對時間。解析度約為毫秒。時間戳記可經平均化且結合速率及序列號一起使用以達成較高精確度。最後,關於時間資訊,該方法包含BSTR GetTriggerTimestamp( ),其獲得超聲波掃描之啟動之一時間戳記,其中在「凍結」影像時停止超聲波探測頭。當恢復實況成像時記錄觸發時間戳記。
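由於序列號來自每256個圖框即重複一次之8位元計數器，判定圖框序列中之間隙須以模256計算。以下為一說明性C++草圖（函式名稱為假定）：

```cpp
// 計算兩個經接收圖框之序列號間隙（模256）；結果為1表示無圖框遺失，
// 大於1表示其間遺失之圖框數加一。
int sequenceGap(int previous, int current) {
    return (current - previous + 256) % 256;
}
```

此種間隙偵測亦可結合GetRate( )之圖框率對經接收圖框建立精確之相對時序。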
較佳實施例中之空間資訊具有以下方法:short GetXPixels( ),其獲得以像素為單位之影像之寬度;short GetYPixels( ),其獲得以像素為單位之影像之高度;double GetXPixelSize( ),其獲得在x方向上之各像素之大小,(x方向經定義為水平的且平行於各影像線);及double GetYPixelSize( ),其獲得在y方向之各像素之大小。該y方向經定義為垂直的且垂直於各影像線。此外,double GetXOrigin( ),其獲得影像中之第一像素相對於傳感器頭之x定位;及double GetYOrigin( ),其獲得影像中之第一像素相對於傳感器頭之y定位。正y方向經定義為遠離至患者體中之傳感器頭。另一方法包含short GetXDirection( ),其獲得沿影像之各線之空間方向。正x方向經定義為遠離探測頭標記。Short GetYDirection( ),獲得跨影像之各線之空間方向。正y方向經定義為遠離至患者體中之傳感器頭。
影像中之任何像素相對於傳感器頭之空間位置可易於如下計算:
PX = OX+NX*SX*DX
PY = OY+NY*SY*DY
其中,
P =像素相對於傳感器頭之位置,
O =原點,
N =影像中之像素之索引值,
S =像素大小,
D =像素之方向。
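上列公式可直接轉寫為如下C++草圖，對x、y軸各自計算P = O + N×S×D。結構成員名稱為假定，對應本文所述GetXOrigin( )、GetXPixelSize( )、GetXDirection( )等方法之傳回值。

```cpp
// 空間資訊：O（原點）、S（像素大小）、D（方向，+1或-1）
struct SpatialInfo {
    double xOrigin, yOrigin;
    double xPixelSize, yPixelSize;
    int xDirection, yDirection;
};

// 像素相對於傳感器頭之x位置：PX = OX + NX*SX*DX
double pixelX(const SpatialInfo& s, int nx) {
    return s.xOrigin + nx * s.xPixelSize * s.xDirection;
}

// 像素相對於傳感器頭之y位置：PY = OY + NY*SY*DY
double pixelY(const SpatialInfo& s, int ny) {
    return s.yOrigin + ny * s.yPixelSize * s.yDirection;
}
```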
此外,當準備好一圖框且可讀取資料時,使用在一較佳實施例中之事件void FrameReady( )。處置器複製來自資料存取方法之資料且接著調用ReleaseFrame( )。建議,在處置器中避免任何種類之無定限處理(例如,調用訊息循環之函式)。此外,當伺服器不能發送一圖框或不得不在緩衝器中重寫一圖框(由於緩衝器已滿)時使用void FrameOverrun( )。此僅適用於FIFO及LIFO模式,此係因為LIO自動釋放舊緩衝器。此事件對於判定用戶端應用程式是否足夠快地讀取圖框及經分配之緩衝器之數目是否足以用於用戶端之延時為有用的。
在一較佳實施例中,USAutoView係使用戶端自動化及顯示實況超聲波影像圖框之一樣本用戶端應用程式。其具有證實啟動及停止伺服器端、隱藏及展示伺服器端、在展示影像上之圖形與並不展示影像上之圖形之間切換、凍結及恢復超聲波獲取、載入一預設檢查、改變經指定之患者大小、改變影像大小、空間資訊及反轉影像之功能。
圖50係根據本發明之一較佳實施例之用於一USAutoView UI之一圖形使用者介面4950之一視圖。USAutoView程式係具有三個ActiveX組件之一Windows®對話應用程式。TTFrameReceiver，其供應接收超聲波圖框之ActiveX介面；TTAutomate，其囊封伺服器端之自動化；及TTSimpleImageWnd，其係影像顯示視窗。CUSAutoViewDlg係主對話。其透過TTAutomate控制項管理伺服器端之自動化、透過TTFrameReceiver接收超聲波圖框及透過TTSimpleImageWnd顯示影像。CUSAutoViewDlg之OnStartUS( )方法調用啟動或停止來自伺服器端之自動化及資料傳輸所需之TTAutomate及TTFrameReceiver方法。
方法OnFrameReady( )處置來自TTFrameReceiver之FrameReady事件。其複製來自TTFrameReceiver之所要資料且接著使用TTFrameReceiver之ReleaseFrame( )方法釋放圖框。其避免了執行不確定處理之任何函式(諸如調用訊息循環之函式)。
TTAutomate係囊封伺服器端之自動化函式之一ActiveX控制項。伺服器端之原生COM自動化介面未阻斷且需要與GetStatusFlags一起等待以協調函式。TTAutomate將各函式包覆於所需等待循環中。該等等待循環容許處理Windows®訊息使得用戶端應用程式之使用者介面執行緒在等待時並未被阻斷。儘管TTAutomate中之自動化方法在完成函式之前不能返回,然在完成該函式之前仍處理其他Windows®訊息。建議,防止自訊息處置器至TTAutomate方法之多個併發調用,此係因為與伺服器端之協調一般不可重入。用於此控制之原始碼係包含於USAutoView工作區中。其可視需要經重新使用或修改。
TTSimpleImageWnd係對器件無關位元映射(DIB)提供一顯示視窗之一ActiveX控制項。顯示介面之兩個性質係long DIBitmapInfo及long DIBits。DIBitmapInfo對應於對含有用於DIB之BITMAPINFO結構之記憶體之一區塊之一指標。DIBits對應於對含有影像像素之記憶體之一區塊之一指標。為載入一新影像，將DIBitmapInfo設定為對DIB之位元映射資訊之指標。接著將DIBits設定為對位元映射位元之指標。當設定DIBits時，期望對於DIBitmapInfo設定之指標仍為有效且在內部複製位元映射資訊及位元映射位元兩者以顯示於螢幕上。將DIBitmapInfo及DIBits設定為零以清除影像。用於此控制之原始碼係包含於USAutoView工作區中。其可視需要經重新使用或修改。
本發明之較佳實施例包含複數個探測頭類型。例如,該等探測頭包含(但不限於):在2 MHz至4 MHz之間操作之一凸線性傳感器陣列、在2 MHz至4 MHz之間操作之一相控線性傳感器陣列、在4 MHz至8 MHz之間操作之一凸線性內腔傳感器陣列、在4 MHz至8 MHz之間操作之一線性傳感器陣列及在5 MHz至10 MHz之間操作之一線性傳感器陣列。
本發明之可攜式超聲波系統之較佳實施例在一檢查期間提供高解析度影像,諸如以下影像:B模式、M模式、彩色多普勒(CD)、脈衝波多普勒(PWD)、定向能量多普勒(DirPwr)及能量多普勒(PWR)。一旦安裝系統軟體,探測頭器件即連接至一桌上型電腦或膝上型電腦中。該探測頭可為連接至含有系統之波束成形硬體之一28 oz. 箱子之一業界標準傳感器。若將探測頭連接至一膝上型電腦,則一4接針FireWire纜線連接至定位於一內建式(built-in) MediaBay上之一IEEE 1394串列連接。然而,若將探測頭連接至一桌上型電腦,則該電腦可能並不配備有一MediaBay。吾人可使用一外部DC模組(EDCM)連接器連接探測頭。在連接探測頭之前,吾人需要確定Firewire連接於電腦之右側及左側兩者上。
在一實施例中,EDCM經設計以在一端處接受一6接針IEEE 1394 (亦稱為FireWire)纜線及在另一端處接受來自探測頭之一Lemo連接器。該EDCM接受自+10伏特至+40伏特之一輸入DC電壓。此外,在一實施例中,系統可使用IEEE 1394連接至一主機電腦。至EDCM之6接針IEEE 1394輸入可源自運行(例如)Windows® 2000作業系統之任何配備IEEE 1394之主機電腦。一外部IEEE 1394集線器對於提供所需DC電壓至EDCM而言亦可為必要的。在配備有IEEE 1394之一主機電腦中,存在IEEE 1394連接器之兩個類型之一者(一4接針或一6接針)。6接針連接器最常在使用內部PCI匯流排卡之基於PC之工作站中找到。通常,6接針連接器提供所需DC電壓至EDCM。一6接針公頭至6接針公頭之IEEE 1394纜線係用於將主機電腦連接至EDCM。
4接針連接器係在並不含有根據一較佳實施例之一MediaBay或提供一DC電壓輸出之膝上型電腦中找到。當使用此連接器類型時,一外部IEEE-1394集線器可用於對EDCM及探測頭供電。
當並非自主機電腦提供電力時,可在該主機電腦與EDCM之間使用一外部IEEE-1394集線器。集線器自一壁式插座導出其電力且使用符合IEC 60601-1電安全標準之一醫療級電源供應器連接。
為將集線器連接至主機電腦,需要一4接針公頭至6接針公頭或6接針公頭至6接針公頭IEEE纜線。將適當連接器(4接針或6接針)插入至主機電腦中及將6接針連接器插入至集線器中。接著,使用一6接針公頭至6接針公頭IEEE 1394纜線將集線器連接至EDCM。僅在主機電腦不能供應至少+10伏特至+40伏特直流電(DC)及10瓦特功率至EDCM時需要一IEEE 1394集線器。若主機電腦可供應足夠電壓及電力,則一6接針公頭至6接針公頭IEEE 1394纜線可用於將電腦直接連接至EDCM。
圖51繪示根據本發明之一較佳實施例之一圖形使用者介面之一主螢幕顯示器之一視圖。當使用者啟動根據本發明之系統時,主螢幕5170顯示。為幫助使用者巡覽,主螢幕可視為提供資訊以幫助吾人執行任務之四個分離工作區域。此等工作區域包含一功能表列5172、一影像顯示視窗5174、一影像控制列5176及一工具列5178至5186。
為調整視窗及區域大小,使用者可點選在視窗之右上方之小按鈕以關閉、調整大小及退出程式。一使用者介面或按鈕關閉視窗但留下程式繼續運行(最小化視窗)。一系統按鈕出現在螢幕之底部,在稱為工作列之區域中。藉由點選在工作列中之系統按鈕,視窗重新開啟。另一介面按鈕放大視窗以填充整個螢幕(稱為最大化),然而,當視窗處於其最大時,圖框率可降低。另一介面按鈕使視窗返回其在放大之前之大小。可藉由另一介面按鈕關閉系統程式。
使用者可增大或減小應用程式之各區域之寬度以滿足吾人之需要。例如，為使Explorer視窗更窄，將游標放置於區域之任一端處且藉由點選及拖曳獲得新的所要大小。吾人可重新定位各區域之大小及部位使得其等變為浮動視窗。為產生浮動視窗，使用者簡單地以滑鼠點選特定區域之雙邊緣邊界並拖曳，直至其看似一浮動視窗。為使該浮動視窗恢復回至原始形式，吾人在該視窗中點兩下。於圖52A至圖52C中描繪此等功能性，圖52A至圖52C係根據本發明之一較佳實施例之一圖形使用者介面5200、5208、5220中之視圖。
Explorer視窗對於經產生及保存之使用者產生影像之全部患者資料夾提供嵌套層級檔案目錄5202。資料夾目錄結構包含以下(但不限於):患者資料夾及一影像資料夾。患者資料夾目錄係患者資訊檔案連同任何相關聯影像儲存之處。影像資料夾目錄含有按日期及檢查類型之影像。在此目錄中之該等影像並不與一患者相關聯且在不具有患者資訊之情況下產生。圖53A至圖53B繪示根據本發明之一較佳實施例之患者資料夾5340及影像資料夾5350。在螢幕之頂部處之功能表列提供吾人可使用以執行基本任務之九個選項。為存取一功能表選項,簡單地點選該功能表名稱以顯示下拉式功能表選項。使用者亦可藉由使用其捷徑鍵組合來存取任何功能表。
影像顯示視窗提供兩個索引標籤(tab):影像顯示(Image Display)及患者資訊(Patient Information)。使用者在影像顯示索引標籤上點選以觀看超聲波影像。該影像係根據經定義之控制設定而顯示於視窗中。一旦保存該影像,則在使用者再次擷取其時,該影像之分類、日期及時間亦展示於影像顯示視窗中。患者資訊索引標籤係用於輸入稍後將儲存於一患者資料夾中之新的患者資訊。使用者可存取此索引標籤以亦對患者資訊進行修改及更新。
圖54A及圖54C繪示由兩個一維、多元件陣列組成之一XY雙平面探測頭。該等陣列可互相堆疊而構造,其中各陣列之一偏光軸在相同方向上對準。兩個陣列之仰角軸可彼此成一直角或彼此正交。例示性實施例可採用傳感器總成,舉例而言,諸如美國專利第7,066,887號(該案之全部內容以引用的方式併入本文中)中所描述之傳感器總成或法國之圖爾市塞德斯(Tours Cedex)之Vernon銷售之傳感器。藉由圖54A所繪示,藉由配置5400表示陣列定向。兩個陣列之偏光軸(5408, 5422)在z軸5406中指出。底部陣列之仰角軸係在y方向5402上指出,且頂部陣列之仰角軸在x方向5404上。
藉由圖54B進一步繪示,一個一維多元件陣列形成如配置5412中描繪之一影像。具有在一y方向5402上之一仰角軸5410之一個一維陣列在x軸5404、z軸5406平面上形成超聲波影像5414。具有在x方向5404上之仰角軸5410之一個一維陣列在y軸5402、z軸5406上形成超聲波影像5414。具有沿著一y軸5402之仰角軸5410及沿著一z軸5406之偏光軸5408之一個一維傳感器陣列將導致沿著x平面5404及z平面5406形成之一超聲波影像5414。藉由圖54C繪示之一替代實施例描繪具有一x軸5404上之一仰角軸5420及在z軸5406方向上之一偏光軸5422之一個一維傳感器陣列。在y平面5402及z平面5406上形成超聲波影像5424。
圖55繪示一雙平面影像形成xy探測頭之操作,其中陣列5512具有經施加以用於形成影像之一高電壓。高電壓驅動脈衝5506、5508、5510可施加至具有一y軸仰角之底部陣列5504。此施加可導致產生用於在XZ平面上形成經接收影像之傳輸脈衝,同時保持頂部陣列5502之元件處於一接地位準。此等探測頭致能使用比一全2D傳感器陣列更簡單之電子器件之一3D成像模式。如本文中所描述之一觸控螢幕啟動之使用者介面可採用螢幕圖標及手勢以致動3D成像操作。可藉由運行於平板電腦資料處理器上之軟體擴充此等成像操作,該平板電腦資料處理器將影像資料處理成3D超聲波影像。此影像處理軟體可採用此項技術中已知之平滑濾波及/或內插操作。波束導向亦可用於致能3D成像操作。一較佳實施例使用經配置以用於雙平面成像之複數個1D子陣列傳感器。
圖56繪示一雙平面影像形成xy探測頭之操作。圖56繪示一陣列5610,該陣列5610具有施加至其以用於形成影像之一高電壓。高電壓脈衝5602、5604、5606可施加至具有在x軸上之仰角之頂部陣列5612,從而產生用於在yz平面上形成經接收影像之傳輸脈衝,同時保持底部陣列5614之元件接地5608。此實施例亦可利用使用如本文中所描述之子陣列波束成形操作之正交1D傳感器陣列。
圖57繪示一雙平面影像形成xy探測頭之電路要求。接收波束成形要求係針對一雙平面探測頭描繪。進行至接收電子器件5702之一連接。接著，連接來自選擇底部陣列5704及選擇頂部陣列5708之元件以共用至接收電子器件5702通道之一連接。一個二至一多工器電路可整合於高電壓驅動器5706、5710、5712上。對於各傳輸波束，形成一接收波束。雙平面系統要求總計256個傳輸波束，對於該256個傳輸波束而言，128個傳輸波束係用於形成一XZ平面影像且另128個傳輸波束係用於形成一YZ平面影像。多重接收波束形成技術可用於改良圖框率。具有雙重接收波束能力之一超聲波系統可對各傳輸波束同時形成兩個接收波束。此時雙平面探測頭僅需要總計128個傳輸波束以用於形成兩個正交平面影像，其中64個傳輸波束係用於形成一XZ平面影像，而另64個傳輸波束用於形成YZ平面影像。類似地，對於具有一四倍或4次接收波束能力之一超聲波系統，探測頭需要64個傳輸波束來形成兩個正交平面影像。
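本段之傳輸波束數目關係可以如下草圖驗算：每一平面需要128個傳輸波束位置，而每一傳輸波束同時形成N個接收波束之多重接收波束能力將傳輸波束數除以N（函式名稱為說明用之假定）。

```cpp
// 每一平面所需之傳輸波束數（128個波束位置除以每次傳輸之接收波束數N）
int transmitBeamsPerPlane(int receiveBeamsPerTransmit) {
    return 128 / receiveBeamsPerTransmit;
}

// XZ與YZ兩個正交平面影像合計所需之傳輸波束數
int totalTransmitBeams(int receiveBeamsPerTransmit) {
    return 2 * transmitBeamsPerPlane(receiveBeamsPerTransmit);
}
```

N=1時合計256個傳輸波束、N=2（雙重接收波束）時128個、N=4時64個，與本文所述數目一致。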
圖58A至圖58B繪示用於同時雙平面評估之一應用程式。使用超聲心動圖顯象量測LV機械不同步之能力可有助於識別更可能受益於心臟再同步治療之患者。需要經量化之LV參數係Ts-(lateral-septal)、Ts-SD、Ts-peak等。該Ts-(lateral-septal)可在一2D心尖4腔室視圖回波影像上量測,而Ts-SD、Ts-peak (medial)、Ts-onset(medial)、Ts-peak(basal)、Ts-onset (basal)可在於二尖瓣及乳頭狀肌層級處分別具有6個片段(提供總計12個片段)之兩個分離胸骨旁短軸視圖上獲得。圖58A至圖58B描繪提供待同時觀看之心尖四腔室影像5804及心尖兩腔室影像5802之一xy探測頭。
圖59A至圖59B繪示射血分率探測頭量測技術。在兩個正交平面之視覺化確保獲得軸上視圖時，雙平面探測頭提供EF量測。自動邊界偵測演算法提供量化回波結果以選擇植入回應器及導引AV延遲參數設定。如圖59A中所描繪，XY探測頭自兩個正交平面獲取即時同時影像且影像5902、5904顯示於一分割螢幕上。一手動輪廓追蹤或自動邊界追蹤技術可用於追蹤在心臟收縮末期及心臟舒張末期兩者時之心內膜邊界(自其計算EF)。在心尖2CH視圖5902及心尖4CH視圖5904中之LV區域(分別為A1及A2)係在心臟舒張末期及心臟收縮末期予以量測。LVEDV (左心室舒張末期容積)及LVESV (左心室收縮末期容積)係使用以下公式計算：
例如，可採用標準雙平面面積-長度法：V = 8·A1·A2/(3·π·L)，其中A1及A2係兩個正交視圖中之LV面積且L係LV長軸長度。且射血分率係藉由EF = (LVEDV − LVESV)/LVEDV × 100%計算。
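以上量測可以如下C++草圖示意。須注意：原始文件在此處並未隨文保留確切之容積公式，以下採用標準雙平面面積-長度法作為假定，函式名稱亦為說明用之假定。

```cpp
// 雙平面面積-長度法估計LV容積（假定）：V = 8*A1*A2 / (3*pi*L)
// a1、a2：心尖二腔室與四腔室視圖中之LV面積；longAxis：LV長軸長度
double biplaneVolume(double a1, double a2, double longAxis) {
    const double kPi = 3.14159265358979323846;
    return 8.0 * a1 * a2 / (3.0 * kPi * longAxis);
}

// 射血分率（百分比）：EF = (LVEDV - LVESV) / LVEDV * 100
double ejectionFraction(double lvedv, double lvesv) {
    return (lvedv - lvesv) / lvedv * 100.0;
}
```

例如，LVEDV為120 mL且LVESV為48 mL時，EF約為60%。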
圖60繪示根據本發明之一實施例之用於無線傳送資料至一可攜式超聲波成像器件及自該可攜式超聲波成像器件無線傳送資料之一例示性方法。該方法可以選擇6001呈現對使用者可用之各種無線連接之一無線通信功能表選項開始。例如,一使用者可希望連接至一WiFi網路、一3G或4G蜂巢式網路或一些其他無線網路。該方法可以選擇6002一所要無線連接繼續進行。該方法可進一步包含選擇6003傳輸超聲波資料之一或多個目的地。在一些實施例中,此選擇可藉由使用觸控螢幕UI選擇一或多個醫院、醫生、診療所等來執行,類似於吾人自一電話聯絡人清單選擇一聯絡人之方式。該方法可進一步包含判定6004是否期望一音訊連接及/或視訊連接。在一些實施例中,除了在可攜式超聲波器件與一遠端醫院或診療所之間傳輸超聲波及其他醫療資料之外,使用者亦可經由相同無線網路建置一音訊及/或視訊連接。此功能容許超聲波成像器件之使用者(例如)在與一醫院或醫療專業人員直接音訊及/或視訊接觸時遠端執行及傳輸超聲波資料。在一實例中,起始與一醫院之一音訊及/或視訊通話可容許可攜式超聲波成像器件之使用者在執行一超聲波程序時自一醫生接收指導及/或建議。若期望一音訊及/或視訊通話,則該方法可進一步包含起始6005與所要目的地之音訊/視訊通話。
若並不期望任何音訊/視訊連接,或在起始一音訊/視訊通話之後,該方法可進一步包含判定6006是否意欲即時傳輸超聲波成像資料,或使用者是否僅希望傳輸已產生之超聲波資料。若期望一即時連接,則該方法可進一步包含開始6007超聲波掃描及即時傳輸6008超聲波資料至(若干)所要目的地。若並不需要即時超聲波資料傳輸,則該方法可以選擇6009(若干)所要檔案及傳輸6010該(或該等)選定檔案至(若干)所要目的地來繼續進行。
在一些實施例中,一使用者可藉由透過經由觸控敏感UI呈現之各種視窗、資料夾、子資料夾、功能表及/或子功能表巡覽來執行上文所描述之方法。可使用在一圖標上執行之一觸控螢幕手勢(藉由將一圖標自一部位拖放至另一部位、選擇或取消選擇一或多個複選框或執行任何其他充分獨特或可區別觸控螢幕命令)來執行用於選擇一功能表選項、目的地、檔案等之各種UI命令。在一些實施例中,本文中所描述之各種觸控螢幕命令可為使用者可組態的,而在其他實施例中其等係經硬編碼。如將理解,可以任何所要順序執行本文中所描述之方法之各種元件。例如,在一些實施例中,一使用者可在選擇6009待傳輸之(若干)檔案之前選擇6003一或多個目的地,而在其他實施例中,一使用者可在選擇6003所要目的地之前選擇6009一或多個檔案。類似地,可以各種序列或同時執行上文所描述之方法之其他元件,且除非另有說明,否則本文中描述之方法並不意欲限於任何特定序列。
應注意,本文中所描述之操作係純粹例示性的,且並不暗指任何特定順序。此外,在適當時可以任何序列使用該等操作,及/或可部分使用該等操作。在本文中出於繪示性目的提供例示性流程圖且該等例示性流程圖係方法之非限制性實例。一般技術者將認知,例示性方法可包含比例示性流程圖中所繪示之步驟更多或更少之步驟,且可以不同於所展示之一順序執行例示性流程圖中之該等步驟。
在描述例示性實施例時,為清楚起見使用特定術語學。為描述目的,各特定術語意欲至少包含以一類似方式操作以完成一類似目的之全部技術及功能等效物。此外,在其中一特定例示性實施例包含複數個系統元件或方法步驟之一些例項中,可用一單一元件或步驟來取代該等元件或步驟。同樣地,可用服務相同目的之複數個元件或步驟來取代一單一元件或步驟。此外,在本文中對於例示性實施例指定用於各種性質之參數之情況下,除非另有指定,否則可將該等參數上下調整達二十分之一、十分之一、五分之一、三分之一、二分之一等,或達其等之四捨五入近似值。
考慮到上文闡釋性實施例,應理解,此等實施例可採用涉及將資料傳送或儲存於電腦系統中之各種電腦實施操作。此等操作係需要實體操縱物理量之操作。通常,儘管並不一定,但此等量採取能夠經儲存、傳送、組合、比較及/或以其他方式操縱之電信號、磁信號及/或光學信號之形式。
此外,本文中所描述之形成闡釋性實施例之部分之操作之任一者係有用的機器操作。該等闡釋性實施例亦係關於用於執行此等操作之一器件或一裝置。該裝置可出於所需目的而特殊構造,或可併入藉由儲存於電腦中之一電腦程式選擇性啟動或組態之通用電腦器件。特定言之,採用耦合至一或多個電腦可讀媒體之一或多個處理器之各種通用機器可與根據本文中揭示之教示寫入之電腦程式一起使用,或構造執行所需操作之一更專用裝置可能為更方便的。
前面描述已指向本發明之特定闡釋性實施例。然而,將明白,可對所描述之實施例進行其他變動及修改而獲得其等相關聯優點之一些或全部。此外,本文中所描述之程序、處理程序及/或模組可實施於硬體、軟體(體現為具有程式指令之一電腦可讀媒體)、韌體或其等之一組合中。例如,可藉由執行來自一記憶體或其他儲存器件之程式指令之一處理器執行本文中所描述之功能之一或多者。
熟習此項技術者將理解,可在不脫離本文中所揭示之發明概念之情況下作出上文所描述之系統及方法之修改及變動。因此,本發明不應視為限制性的,除了如藉由隨附申請專利範圍之範疇及精神限制之外。
[ Cross-reference to related applications ]
This application is a continuation of US Application No. 14 / 037,106, filed on September 25, 2013, the entire contents of which are incorporated herein by reference.
The invention discloses a system and a method for medical ultrasound imaging. The presently disclosed systems and methods for medical ultrasound imaging use medical ultrasound imaging equipment. The medical ultrasound imaging equipment includes a housing in the form factor of a tablet computer and a touch screen display disposed on a front panel of the housing. The touch screen display includes a multi-touch touch screen that can recognize and distinguish one or more single-point, multi-point, and/or simultaneous touches, thereby allowing gestures (ranging from simple single-point gestures to complex multi-point movement gestures) to be used as user input to a medical ultrasound imaging device. Further details on the ultrasound system and operation of tablet computers are described in U.S. Application No. 10/997,062, filed on November 11, 2004, U.S. Application No. 10/386,360, filed on March 11, 2003, and U.S. Patent No. 6,969,352, the entire contents of these patents and applications being incorporated herein by reference.
FIG. 1 depicts one illustrative embodiment of an exemplary medical ultrasound imaging apparatus 100 according to the present invention. As shown in FIG. 1, the medical ultrasound imaging apparatus 100 includes a housing 102, a touch screen display 104, a computer (which has at least one processor and at least one memory) implemented on a computer motherboard 106, an ultrasound engine 108, and a battery 110. For example, the housing 102 can be implemented in a tablet computer form factor or any other suitable form factor. The housing 102 includes a front panel 101 and a rear panel 103. The touch screen display 104 is disposed on the front panel 101 of the housing 102, and includes a multi-touch LCD touch screen that can identify and distinguish one or more multi-point and/or simultaneous touches on a surface 105 of the touch screen display 104. The computer motherboard 106, the ultrasound engine 108, and the battery 110 are operatively disposed in the housing 102. The medical ultrasound imaging apparatus 100 further includes a FireWire connection 112 (see also FIG. 2A) operatively connected between the computer motherboard 106 and the ultrasound engine 108 in the housing 102, and a probe connector 114 having a probe attachment/detachment lever 115 for attaching at least one ultrasound probe/transducer (see also FIGS. 2A and 2B). In some preferred embodiments, the transducer head housing may include circuit components including a transducer array, transmission and reception circuits, and a beamformer and beamformer control circuit. In addition, the medical ultrasound imaging apparatus 100 has one or more I/O port connectors 116 (see FIG. 2A). The I/O port connectors 116 may include (but are not limited to): one or more USB connectors, one or more SD cards, one or more network ports, one or more mini display ports, and a DC power input.
In an exemplary operation mode, medical personnel (also referred to herein as a "user" or "users") can use simple single-point gestures and/or more complex multi-point gestures as user input to the multi-touch LCD touch screen of the touch screen display 104 to control one or more operation modes and/or functions of the medical ultrasound imaging apparatus 100. A gesture is defined herein as a movement, a tap, or a position of at least one finger, a stylus, and/or a palm on the surface 105 of the touch screen display 104. For example, such single/multi-point gestures may include static or dynamic gestures, continuous or segmented gestures, and/or any other suitable gestures. A single-point gesture is defined herein as a gesture that can be performed using a single touch contact point on the touch screen display 104 with a single finger, a stylus, or a palm. A multi-point gesture is defined herein as a gesture performed using multiple touch contact points on the touch screen display 104 by any suitable combination of multiple fingers or at least one finger, a stylus, and a palm. A static gesture is defined herein as a gesture that does not involve the movement of at least one finger, a stylus, or a palm on the surface 105 of the touch screen display 104. A dynamic gesture is defined herein as a gesture involving movement of at least one finger, a stylus, or a palm (such as movement caused by dragging one or more fingers across the surface 105 of the touch screen display 104). A continuous gesture is defined herein as a gesture that can be performed in a single movement or tap of at least one finger, a stylus, or a palm on the surface 105 of the touch screen display 104. A segmented gesture is defined herein as a gesture that can be performed in multiple movements or taps of at least one finger, a stylus, or a palm on the surface 105 of the touch screen display 104.
These single-point/multi-point gestures performed on the surface 105 of the touch screen display 104 may correspond to single-touch or multi-touch events that are mapped to one or more predetermined operations that can be performed by the computer and/or the ultrasound engine 108. The user can perform these single-point/multi-point gestures by various single-finger, multi-finger, stylus, and/or palm movements on the surface 105 of the touch screen display 104. The multi-touch LCD touch screen receives the single-point/multi-point gestures as user input, and provides these user inputs to the processor, which executes program instructions stored in memory to carry out the predetermined operations associated with the single-point/multi-point gestures, at least some of such operations being performed in conjunction with the ultrasound engine 108. As shown in FIG. 3A, these single-point/multi-point gestures on the surface 105 of the touch screen display 104 may include (but are not limited to): a tap selection gesture 302, a pinch gesture 304, a flick gesture 306, 314, a rotate gesture 308, 316, a double-tap gesture 310, a spread gesture 312, a drag gesture 318, a press gesture 320, a press and drag gesture 322, and/or a palm gesture 324. For example, these single-point/multi-point gestures can be stored in at least one gesture library in the memory implemented on the computer motherboard 106. A computer program operable to control the operation of the system may be stored on a computer-readable medium and may be implemented using a touch processor connected to an image processor and a control processor connected to a system beamformer as needed. Therefore, the beamformer delays associated with transmission and reception can be adjusted in response to both static touch gestures and moving touch gestures.
According to the illustrative embodiment of FIG. 1, a user of the medical ultrasound imaging apparatus 100 may use at least one flick gesture 306 or 314 to control the depth of tissue penetration of ultrasound waves generated by the ultrasound probe/transducer. For example, a dynamic, continuous flick gesture 306 or 314 in the "up" direction or any other suitable direction on the surface 105 of the touch screen display 104 may increase the penetration depth by one (1) cm or any other suitable amount. In addition, a dynamic, continuous flick gesture 306 or 314 in the "down" direction or any other suitable direction on the surface 105 of the touch screen display 104 can reduce the penetration depth by one (1) cm or any other suitable amount. In addition, a dynamic, continuous drag gesture 318 in the "up" or "down" direction or any other suitable direction on the surface 105 of the touch screen display 104 can increase or decrease the depth of penetration by multiple centimeters or any other suitable amount.
Additional operating modes and / or functions controlled by specific single / multi-point gestures on the surface 105 of the touch screen display 104 may include (but are not limited to): freeze / save operation, 2D mode operation, gain control, color Control, split-screen control, PW imaging control, movie / time-series video clip scroll control, zoom and pan control, full-screen control, Doppler and 2D beam steering control, and / or body marking control. At least some operation modes and / or functions of the medical ultrasound imaging apparatus 100 may be controlled by one or more touch control items implemented on the touch screen display 104. In addition, the user may provide one or more specific single / multi-point gestures as user inputs for specifying at least one selection of touch control items to be implemented on the touch screen display 104 as required and / or required. Subset. A plurality of preset scan parameters displayed as icons or selectable from a menu are associated with each imaging mode so that the scan parameters are automatically selected for the mode.
A processing sequence is shown in FIG. 3B, in which ultrasonic beamforming and imaging operations 340 are controlled in response to a touch gesture input on a touch screen. Various static and mobile touch gestures have been programmed into the system to enable the data processor to operate to control beamforming and image processing operations within the tablet device 342. A user may select 344 a first display operation having a first plurality of touch gestures associated with the first display operation. Using a static or moving gesture, the user can perform one of a plurality of gestures operable to control the imaging operation and can specifically select a plurality of beamforming parameters 346 that can be adjusted to generate image data associated with the first display operation One of the four gestures. The displayed image is updated and displayed 348 in response to the updated beamforming procedure. The user may further choose to perform a different gesture with a different speed characteristic (direction or speed or both) to adjust 350 a second characteristic of the first ultrasonic display operation. Then, the displayed image is updated 352 based on the second gesture, which may modify the imaging processing parameter or the beamforming parameter. An example of this process is described in further detail herein, in which changes in speed and direction of different gestures can be associated with distinct imaging parameters of a selected display operation.
Ultrasound images of blood flow or tissue movement (whether color blood flow or spectral Doppler) are basically obtained from measurements of movement. In an ultrasound scanner, a series of pulses are transmitted to detect blood movement. The echo from a fixed target is the same between pulses. The echo from the moving scatterer appears slightly different in the time it takes for the signal to return to the scanner.
As can be seen from FIGS. 3C to 3H, there must be movement in the direction of the beam; if the blood flow is perpendicular to the beam, no relative motion from pulse to pulse is received, and no blood flow is detected. These differences can be measured as a direct time difference or, more commonly, they can be measured in terms of the phase shift from which one of the "Doppler frequencies" is obtained. These differences are then processed to produce a color blood flow display or a Doppler echogram. In FIG. 3C to FIG. 3D, the blood flow direction is perpendicular to the beam direction, and no blood flow was measured by the pulse wave spectrum Doppler. In FIGS. 3G to 3H, when the ultrasound beam is directed to an angle that is better aligned to the blood flow, a weak blood flow is displayed in the color blood flow image, and in addition, the blood flow is measured by pulse wave Doppler. In FIG. 3H, when the ultrasound beam is directed to better alignment to an angle corresponding to a moving blood flow direction, the color blood flow image is stronger, and in addition, when the correction angle of the PWD is placed to align to the blood flow At this time, PWD measured a strong blood flow.
In this tablet ultrasound system, an ROI (region of interest) is also used to define the direction of a movement gesture in response to an ultrasound transmission beam. An image of the liver with a branch of renal blood flow in the color blood flow mode is shown in FIG. 3I. Because the ROI is straight down from the transducer and the blood flow direction is almost normal to the ultrasound beam, the renal blood flow is detected very weakly. Thus, when the color blood flow mode is used to image renal blood flow in the liver, the beam is almost normal to the blood flow and a very weak blood flow is detected. A finger gesture outside the ROI is used to steer the beam. As can be seen in FIG. 3J, by resetting the beamforming parameters to steer the ROI, the beam direction becomes more aligned to the blood flow direction, and a stronger blood flow within the ROI is detected. In FIG. 3J, a finger gesture outside the ROI is used to steer the ultrasound beam into a direction more aligned to the direction of blood flow. Stronger blood flow can be seen within the ROI. A gesture of horizontal movement of the finger within the ROI moves the ROI frame to a position that overlaps the entire kidney region; that is, the horizontal movement allows the ROI frame to move in translation so that the frame overlaps the entire target region.
Figure 3K demonstrates a horizontal movement gesture. When the finger is within the ROI, the finger can move the ROI frame to any place in the image plane. In the above embodiment, it is easy to distinguish that one of the fingers is in a "flick" gesture outside a "ROI" box, which is intended to be used to guide a beam and one of the fingers is in a "drag and The "Move, ie move horizontally" gesture is intended for moving the ROI box. However, there are applications in which there is no ROI as a reference area, it is obvious that it is difficult to distinguish between a "flick" or a "horizontal move" gesture. In this case, the touch screen program needs to track the initial speed or acceleration of the finger It is determined whether the gesture is a "flick" gesture or a "drag and move" gesture. Therefore, the touch engine receiving data from the touch screen sensor device is programmed to discriminate between speed thresholds indicating different gestures. Therefore, the time, rate, and direction associated with different mobile gestures may have preset thresholds. Two- and three-finger static and movement gestures can have separate thresholds to distinguish these control operations. Note that preset displayed icons or virtual buttons can have different static pressures or duration thresholds. When operating in full screen mode, the touch screen processor (which preferably operates on the central processing unit of the system performing other imaging operations such as scan conversion) turns off static icons.
FIGS. 4A-4C depict exemplary subsets of touch controls 402, 404, 406 that can be implemented on the touch screen display 104 by a user of the medical ultrasound imaging device 100. It should be noted that any other suitable subset(s) of touch controls may be implemented on the touch screen display 104 as needed and/or desired. As shown in FIG. 4A, the subset 402 includes a touch control item 408 for performing a two-dimensional (2D) mode operation, a touch control item 410 for performing a gain control operation, a touch control item 412 for performing a color control operation, and a touch control item 414 for performing an image/clip freeze/save operation. For example, a user may use the press gesture 320 to activate the touch control item 408, returning the medical ultrasound imaging apparatus 100 to the 2D mode. In addition, the user may use a press gesture 320 against one side of the touch control item 410 to decrease the gain level, and against the other side of the touch control item 410 to increase the gain level. The user can also use a drag gesture 318 on the touch control item 412 to identify a range of density on a 2D image using a predetermined color code. In addition, the user may use the press gesture 320 to activate the touch control item 414 to freeze/store a still image or obtain a movie image clip.
As shown in FIG. 4B, the subset 404 includes a touch control item 416 for performing a split-screen control operation, a touch control item 418 for performing a PW imaging control operation, a touch control item 420 for performing Doppler and 2D beam steering control operations, and a touch control item 422 for performing an annotation operation. For example, a user may use a press gesture 320 against the touch control item 416 to split the touch screen display 104, allowing the user to switch between the two sides by alternately using a tap gesture 302 on each side of the split screen. In addition, the user can use a press gesture 320 to activate the touch control item 418 and enter PW mode, which allows the user to (1) control the angle correction, (2) move a baseline displayed on the touch screen display 104 "up" or "down" by using a press and drag gesture 322, and/or (3) increase or decrease the scale by using a tap gesture 302 on a scale displayed on the touch screen display 104. In addition, the user can use a press gesture 320 against one side of the touch control item 420 to perform 2D beam steering to the "left" or any other suitable direction in five (5) increments or any other suitable increment, and against the other side of the touch control item 420 to perform 2D beam steering to the "right" or any other suitable direction in five (5) increments or any other suitable increment. In addition, the user may employ a tap gesture 302 on the touch control item 422 to input annotation information via a pop-up keyboard that can be displayed on the touch screen display 104.
As shown in FIG. 4C, the subset 406 includes a touch control item 424 for performing dynamic range operations, a touch control item 426 for performing Teravision™ software operations, a touch control item 428 for performing mapping operations, and a touch control item 430 for performing a needle guidance operation. For example, a user can control or set the dynamic range by using a press gesture 320 and/or a press and drag gesture 322 against the touch control item 424. In addition, the user may use a tap gesture 302 on the touch control item 426 to select a desired level of Teravision™ software to be executed by the processor from the memory on the computer motherboard 106. Moreover, the user can use the tap gesture 302 on the touch control item 428 to perform a desired mapping operation. In addition, the user can use the press gesture 320 against the touch control item 430 to perform a desired needle guidance operation.
According to the present invention, measurement and/or tracking of an object (such as an organ, tissue, etc.) displayed as an ultrasound image on the touch screen display 104 can be performed using single-point/multi-point gestures on the surface 105 of the touch screen display 104 of the medical ultrasound imaging apparatus 100 (see FIG. 1). The user can perform such measurement and/or tracking of the object directly on an original ultrasound image of the displayed object, on an enlarged version of the ultrasound image, and/or on an enlarged portion of the ultrasound image within a virtual window 506 on the touch screen display 104 (see FIGS. 5C and 5D).
FIGS. 5A and 5B depict an original ultrasound image of an exemplary object (i.e., a liver 502 with a cystic lesion 504) displayed on the touch screen display 104 of the medical ultrasound imaging apparatus 100 (see FIG. 1). It should be noted that this ultrasound image can be generated by the medical ultrasound imaging device 100 in response to ultrasound waves (generated by an ultrasonic probe/sensor operatively connected to the device 100) penetrating the liver tissue. The measurement and/or tracking of the liver 502 with the cystic lesion 504 can be performed directly on the original ultrasound image (see FIGS. 5A and 5B) displayed on the touch screen display 104, or on an enlarged version of the ultrasound image. For example, the user may obtain this enlarged version of the ultrasound image by placing two (2) fingers on the surface 105 of the touch screen display 104 and spreading them apart in an expansion gesture (see, for example, the expansion gesture 312 of FIG. 3). Such measurement and/or tracking of the liver 502 and the cystic lesion 504 can also be performed on an enlarged portion of the ultrasound image within the virtual window 506 (see FIGS. 5C and 5D) on the touch screen display 104.
For example, using his or her finger (see, for example, the finger 508 of FIGS. 5A-5D), the user can use a press gesture (see, for example, press gesture 320 of FIG. 3) against the surface 105 of the touch screen display 104 near the region of interest, such as the region corresponding to the cystic lesion 504 (see FIG. 5B), to obtain a virtual window 506. In response to the press gesture, the virtual window 506 (see FIGS. 5C and 5D) is displayed on the touch screen display 104 (at least partially superimposed on the original ultrasound image), thereby providing the user with a view of an enlarged portion of the liver 502 in the vicinity of the cystic lesion 504. For example, the virtual window 506 of FIG. 5C may provide a magnified view of the portion of the ultrasound image of the cystic lesion 504 that is overlapped by the finger 508 pressed against the surface 105 of the touch screen display 104. To reposition the enlarged cystic lesion 504 within the virtual window 506, the user can use a press and drag gesture against the surface 105 of the touch screen display 104 (see, for example, press and drag gesture 322 of FIG. 3; see FIG. 5D), thereby moving the image of the cystic lesion 504 to a desired position within the virtual window 506. In an embodiment, the medical ultrasound imaging apparatus 100 may be configured to allow a user to select a magnification level within the virtual window 506 that is 2 times, 4 times, or any other suitable multiple of the original ultrasound image. The user can remove the virtual window 506 from the touch screen display 104 by lifting his or her finger (see, for example, the finger 508 of FIGS. 5A-5D) from the surface 105 of the touch screen display 104.
FIG. 6A depicts an ultrasound image of another exemplary object (i.e., an apical four (4) chamber view of a heart 602) displayed on the touch screen display 104 of the medical ultrasound imaging apparatus 100 (see FIG. 1). It should be noted that this ultrasound image may be generated by the medical ultrasound imaging device 100 in response to ultrasound waves (generated by an ultrasonic probe/sensor operatively connected to the device 100) penetrating the heart tissue. The measurement and/or tracking of the heart 602 may be performed directly on the original ultrasound image (see FIGS. 6A to 6E) displayed on the touch screen display 104, or on an enlarged version of the ultrasound image. For example, using his or her fingers (see, for example, fingers 610, 612 of FIGS. 6B-6E), the user can perform manual tracking of the left ventricle 606 (see FIGS. 6B to 6E) and the endocardial border 604 (see FIG. 6B) of the heart by using one or more multi-finger gestures on the surface 105 of the touch screen display 104. In one embodiment, using his or her fingers (see, for example, fingers 610, 612 of FIGS. 6B to 6E), the user can use a two-point gesture on the surface 105 of the touch screen display 104 (for example, the two-point gesture 310 of FIG. 3A) to obtain a cursor 607 (see FIG. 6B), and can move the cursor 607 to a desired position on the touch screen display 104 by using a finger (such as finger 610) with a drag gesture (see, for example, the drag gesture 318 of FIG. 3A). The systems and methods described herein can be used for quantitative measurement of cardiac wall motion and specifically for measurement of ventricular asynchrony, as described in detail in US Application No. 10/817,316, filed April 2, 2004, the entire contents of which are incorporated herein by reference.
Once the cursor 607 is at a desired location on the touch screen display 104 (as determined by the location of the finger 610), the user can fix the cursor 607 at this position by using another finger (such as finger 612) with a single-click gesture (see, for example, click gesture 302 of FIG. 3). To perform manual tracking of the endocardial border 604 (see FIG. 6B), the user can use the finger 610 with a press and drag gesture (see, for example, press and drag gesture 322 of FIG. 3), as depicted in FIGS. 6C and 6D. This manual tracking of the endocardial border 604 may be highlighted on the touch screen display 104 in any suitable manner, such as by a dashed line 608 (see FIGS. 6C-6E). This manual tracking of the endocardial border 604 may continue until the finger 610 reaches any suitable location on the touch screen display 104, or until the finger 610 returns to the location of the cursor 607, as shown in FIG. 6E. Once the finger 610 is at the position of the cursor 607 or any other suitable location, the user can complete the manual tracking operation by using the finger 612 with a single-click gesture (see, for example, click gesture 302 of FIG. 3). It should be noted that this manual tracking operation may be employed to track any number of other suitable features and/or waveforms (such as a pulsed wave Doppler (PWD) waveform). In an embodiment, the medical ultrasound imaging device 100 may be configured to perform appropriate calculations and/or measurements of one or more of the features and/or waveforms based at least in part on the manual tracking of the respective feature(s)/waveform(s).
As described above, the user may perform measurement and/or tracking of a displayed object on an enlarged portion of an original ultrasound image within a virtual window on the touch screen display 104. FIGS. 7A-7C depict an original ultrasound image of an exemplary object (i.e., a liver 702 with a cystic lesion 704) displayed on the touch screen display 104 of the medical ultrasound imaging apparatus 100 (see FIG. 1). FIGS. 7A to 7C further depict a virtual window 706 that provides a magnified view of the portion of the ultrasound image of the cystic lesion 704 that is overlapped by one of the user's fingers (such as a finger 710) pressed against the surface 105 of the touch screen display 104. Using his or her fingers (see, for example, fingers 710, 712 of FIGS. 7A-7C), a user can measure the size of the cystic lesion 704 within the virtual window 706 by using one or more multi-finger gestures on the surface 105 of the touch screen display 104.
For example, using his or her fingers (see, for example, fingers 710, 712 of FIGS. 7A-7C), the user can use a two-point gesture on the surface 105 (for example, the two-point gesture 310 of FIG. 3) to obtain a first cursor 707 (see FIGS. 7B and 7C), and can move the first cursor 707 to a desired position by using a finger (such as finger 710) with a drag gesture (see, for example, the drag gesture 318 of FIG. 3). Once the first cursor 707 is at the desired location (as determined by the location of the finger 710), the user can fix the first cursor 707 at that position by using another finger (such as finger 712) with a single-click gesture (see, for example, click gesture 302 of FIG. 3). Similarly, the user can obtain a second cursor 709 (see FIG. 7C) by using a two-point gesture on the surface 105 (see, for example, the two-point gesture 310 of FIG. 3), and can move the second cursor 709 to a desired position by using the finger 710 with a drag gesture (see, for example, the drag gesture 318 of FIG. 3). Once the second cursor 709 is at the desired position (as determined by the position of the finger 710), the user can fix the second cursor 709 at that position by using the finger 712 with a single-click gesture (see, for example, click gesture 302 of FIG. 3). In an embodiment, the medical ultrasound imaging device 100 may be configured to perform calculations and/or measurements of any suitable size(s) related to the cystic lesion 704 based at least in part on the locations of the first cursor 707 and the second cursor 709.
FIGS. 8A to 8C depict an original ultrasound image of an exemplary object (i.e., a liver 802 with a cystic lesion 804) displayed on the touch screen display 104 of the medical ultrasound imaging apparatus 100 (see FIG. 1). FIGS. 8A to 8C further depict a virtual window 806 that provides a magnified view of the portion of the ultrasound image of the cystic lesion 804 that is overlapped by one of the user's fingers (such as a finger 810) pressed against the surface 105 of the touch screen display 104. Using his or her fingers (see, for example, fingers 810, 812 of FIGS. 8A-8C), the user can perform a caliper measurement of the cystic lesion 804 within the virtual window 806 by using one or more multi-finger gestures on the surface 105 of the touch screen display 104.
For example, using his or her fingers (see, for example, fingers 810, 812 of FIGS. 8A-8C), the user can use a two-point gesture on the surface 105 (for example, the two-point gesture 310 of FIG. 3) to obtain a first cursor 807 (see FIGS. 8B and 8C), and can move the cursor 807 to a desired position by using a finger (such as finger 810) with a drag gesture (see, for example, the drag gesture 318 of FIG. 3). Once the cursor 807 is at the desired location (as determined by the location of the finger 810), the user can fix the cursor 807 at that position by using another finger (such as finger 812) with a single-click gesture (see, for example, click gesture 302 of FIG. 3). The user can then use a press and drag gesture (see, for example, the press and drag gesture 322 of FIG. 3) to obtain a connecting line 811 (see FIGS. 8B and 8C) and extend the connecting line 811 from the first cursor 807 across the cystic lesion 804 to a desired location on the other side of the cystic lesion 804. Once the connecting line 811 extends across the cystic lesion 804 to the desired location on the other side of the cystic lesion 804, the user can use the finger 812 with a single-click gesture (see, for example, click gesture 302 of FIG. 3) to obtain a second cursor 809 (see FIG. 8C) and fix it at the desired location. In an embodiment, the medical ultrasound imaging device 100 may be configured to perform any suitable caliper calculation(s) and/or measurement(s) related to the cystic lesion 804 based at least in part on the connecting line 811 extending between the positions of the first cursor 807 and the second cursor 809.
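Once the two caliper cursors are fixed, the measurement along the connecting line reduces to a scaled Euclidean distance between the cursor pixel coordinates. The sketch below is illustrative only; the function name and the millimeters-per-pixel value are assumptions, not parameters from this disclosure.

```python
import math

def caliper_distance_mm(cursor_a, cursor_b, mm_per_pixel):
    """Physical distance in mm between two fixed cursor positions.

    cursor_a, cursor_b: (x, y) pixel coordinates of the two cursors
    (e.g., cursors 807 and 809), after any zoom has been accounted for.
    mm_per_pixel: pixel spacing of the scan-converted image.
    """
    dx = cursor_b[0] - cursor_a[0]
    dy = cursor_b[1] - cursor_a[1]
    return math.hypot(dx, dy) * mm_per_pixel

# e.g., cursors 300 pixels apart at an assumed 0.1 mm/pixel spacing
# correspond to a 30 mm lesion diameter.
```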
FIG. 9A shows a system 140 in which a sensor housing 150 having an array of sensor elements 152 may be attached to the housing 102 at a connector 114. Each probe 150 may have a probe identification circuit 154 that uniquely identifies the attached probe. When a user inserts a different probe with a different array, the system identifies the operating parameters of that probe. Note that the preferred embodiment may include a display 104 having a touch sensor 107 and a touch processor 109, which can be connected to analyze touch screen data from the sensor 107 and transmit commands both to an image processing operation (1124 shown in FIG. 11) and to a beamformer control processor (1116 shown in FIG. 11). In a preferred embodiment, the touch processor may include a computer-readable medium storing instructions for operating an ultrasonic touch screen engine that is operable to control the display and imaging operations described herein.
FIG. 9B shows a software flowchart 900 of a typical sensor management module 902 in an ultrasound application. When a TRANSDUCER ATTACH event 904 is detected, the sensor management software module 902 first reads the sensor type ID 906 and the hardware version information from the IDENTIFICATION segment. This information is used to fetch the specific sensor profile data set 908 from the hard disk and load it into the application's memory. The software then reads the adjustment data 910 from the FACTORY segment and applies the adjustments to the profile data just loaded into memory 912. The software module then sends a sensor attachment message 914 to the main ultrasound application, which uses the loaded sensor profile. After confirmation 916, an ultrasound imaging sequence is performed and the USAGE segment 918 is updated. The sensor management software module then waits for a sensor detach (TRANSDUCER DETACH) event 920 or for a 5-minute period to elapse. If a sensor detach event is detected 921, a message 924 is sent and confirmed 926, the sensor profile data set is removed 928 from memory, and the module returns to wait for another sensor attach event. If a 5-minute time period expires without a sensor detach event being detected, the software module increments a cumulative usage counter in the USAGE segment 922 and waits for another 5-minute time period or a sensor detach event. This cumulative usage is recorded in memory for maintenance and replacement records.
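The attach/detach flow of FIG. 9B can be sketched as a small state machine: load and adjust the profile on attach, count elapsed 5-minute periods while attached, and unload on detach. This is a minimal sketch under stated assumptions; the class and method names, and the dictionary-based profile store, are illustrative inventions and not from this disclosure.

```python
class TransducerManager:
    """Illustrative sketch of the FIG. 9B sensor management module."""

    def __init__(self, profile_store):
        self.profile_store = profile_store   # maps probe type ID -> profile dict
        self.active_profile = None
        self.usage_counter = 0               # cumulative 5-minute usage periods

    def on_attach(self, probe_id, factory_adjustments):
        # Fetch the probe-specific profile data set and apply the
        # FACTORY-segment adjustments to the copy loaded into memory.
        profile = dict(self.profile_store[probe_id])
        profile.update(factory_adjustments)
        self.active_profile = profile
        return profile

    def on_timer_expired(self):
        # Called when a 5-minute period elapses with no detach event:
        # increment the cumulative usage counter (USAGE segment).
        if self.active_profile is not None:
            self.usage_counter += 1

    def on_detach(self):
        # Remove the sensor profile data set from memory.
        self.active_profile = None
```

The usage counter persists across detach events, matching the maintenance and replacement record keeping described above.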
There are many types of ultrasonic sensors. They differ in geometry, number of elements, and frequency response. For example, a linear array with a center frequency of 10 MHz to 15 MHz is better suited for chest imaging, and a curved array with a center frequency of 3 MHz to 5 MHz is better suited for abdominal imaging.
Different types of sensors are often required for the same or different ultrasound scanning sessions. For ultrasound systems with only one sensor connection, the operator will change the sensors before starting a new scanning session.
In some applications, it is necessary to switch between different types of sensors during an ultrasound scanning session. In this case, it is more convenient to have multiple sensors connected to the same ultrasound system, so that the operator can quickly switch between the connected sensors by tapping a button on the operator console, without physically removing and reattaching a sensor (which takes considerable time). A preferred embodiment of the present invention may include a multiplexer in the tablet computer housing. The multiplexer may select between a plurality of probe connector ports in the tablet computer housing; alternatively, the tablet computer housing can be connected to an external multiplexer that can be mounted to a cart as described herein.
FIG. 9C is a perspective view of an exemplary needle-sensing positioning system using an ultrasonic sensor without any active electronics in the sensor assembly. The sensor may include a passive ultrasonic sensor element. These components can be used in a manner similar to that of a typical sensor probe driven by the ultrasound engine electronics. System 958 includes an ultrasonic sensor element 960 added to a needle guide 962, which is shown in FIG. 9C but can be of any suitable physical size. The ultrasonic sensor element 960 and the needle guide 962 may be mounted to an ultrasonic sensor probe handle or an ultrasonic imaging probe assembly 970 using a needle guide mounting bracket 966. A disk (ultrasonic reflector disk 964) mounted on the exposed end of the needle is reflective to ultrasonic waves.
The ultrasonic sensor element 960 on the needle guide 962 may be connected to an ultrasound engine. This connection can be made through a separate cable to a dedicated probe connector on the engine (similar to a common pencil-shaped CW probe connector). In an alternative embodiment, a small short cable can be plugged into the larger imaging sensor probe handle, or a split cable can be connected to the same probe connector at the engine. In another alternative embodiment, the connection may be made via an electrical connector between the imaging probe handle and the needle guide (with no cable between them). In a further alternative embodiment, the ultrasonic sensor element on the needle guide may be connected to the ultrasound engine by enclosing the needle guide and the sensor elements in the same mechanical enclosure as the imaging probe handle.
FIG. 9D is a perspective view of the needle guide 962 positioned with the sensor element 960 and the ultrasonic reflector disk 964. The position of the reflector disk 964 is located by transmitting ultrasonic waves 972 from the sensor element 960 on the needle guide 962. The ultrasonic wave 972 travels through the air toward the reflector disk 964 and is reflected by it. The reflected ultrasonic wave 974 returns to the sensor element 960 on the needle guide 962. The distance 976 between the reflector disk 964 and the sensor element 960 is calculated from the elapsed time and the speed of sound in air.
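The distance calculation above follows directly from the round-trip time of flight and the speed of sound in air. A minimal sketch, assuming a nominal room-temperature speed of sound of roughly 343 m/s; the function name is an illustrative assumption.

```python
# Nominal speed of sound in air at room temperature (m/s); a real system
# might correct this value for ambient temperature.
SPEED_OF_SOUND_AIR_M_S = 343.0

def reflector_distance_m(round_trip_seconds, c=SPEED_OF_SOUND_AIR_M_S):
    """One-way distance to the reflector disk.

    The wave travels out to the reflector and back, so the elapsed time
    covers twice the distance 976.
    """
    return c * round_trip_seconds / 2.0
```

For example, a 2 ms round trip corresponds to a one-way distance of 0.343 m.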
FIG. 9E is a perspective view of an alternative embodiment of an exemplary needle-sensing positioning system using an ultrasonic sensor without any active electronics in the sensor assembly. The sensor may include a passive ultrasonic sensor element. These components can be used in a manner similar to that of a typical sensor probe driven by the ultrasound engine electronics.
The system 986 includes a needle guide 962 that can be mounted to a needle guide mounting bracket 966, which can be coupled to an ultrasound imaging probe assembly 982 for imaging a patient's body or can be of any other suitable form factor. An ultrasonic reflector disk 964 can be mounted at the exposed end of the needle 956. In this embodiment, a linear ultrasonic array 978 is mounted parallel to the direction of motion of the needle 956. The linear ultrasonic array 978 includes an ultrasonic sensor array 980 positioned parallel to the needle 956. In this embodiment, the ultrasound imaging probe assembly 982, configured with an ultrasound sensor array 984, is positioned for imaging the patient's body.
In this embodiment, the position of the ultrasonic reflector disk 964 can be detected by using the ultrasonic sensor array 980 coupled to the ultrasonic imaging probe assembly 978. The position of the reflector disk 964 is located by transmitting ultrasonic waves 972 from a sensor element 980 on the ultrasonic imaging probe assembly 978. The ultrasonic wave 972 travels through the air toward the reflector disk 964 and is reflected by it. The reflected ultrasonic wave 974 returns to the sensor element 980 on the ultrasonic imaging probe assembly 978. The distance 976 between the reflector disk 964 and the sensor element 980 is calculated from the elapsed time and the speed of sound in air. In an alternative embodiment, an alternating algorithm may be used to sequentially scan the plurality of elements in the sensor array and analyze the reflections produced by each sensor array element. In an alternative embodiment, multiple scans may occur before an ultrasound image is formed.
FIG. 9F illustrates a system 140 similar to the system shown in FIG. 9A and configured to receive a subscriber identity module (SIM) card for wireless communication. In this particular embodiment, the communication circuit 118 is connected to the computing circuit 106, and a SIM card port 119 is configured to receive a SIM card 120 and connect the SIM card 120 to the communication circuit 118 via a plurality of conductive contacts. In some embodiments, the SIM card port 119 can be configured to accept a standard SIM card, mini SIM card, micro SIM card, nano SIM card, embedded SIM card, or other similar wireless identification/authorization card or circuit for use with the ultrasound device. The system incorporates a SIM card interface circuit 118 (such as a SIM card interface circuit available from NXP Semiconductors NV of Eindhoven, The Netherlands), which may include electromagnetic interference (EMI) filtering and electrostatic discharge (ESD) protection features. The identification card incorporates an identification circuit (usually an integrated circuit embedded in a plastic card or substrate). The identification circuit includes a memory device that stores an International Mobile Subscriber Identity (IMSI) and a key used to identify and authenticate a user on a mobile wireless communication network (such as a 3G or 4G network).
FIG. 10A illustrates an exemplary method for monitoring synchronization of a heart according to an exemplary embodiment. In this method, a reference template is loaded into memory and used to guide a user in identifying an imaging plane (step 930). Next, the user identifies a desired imaging plane (step 932). An apical 4-chamber view of the heart is commonly used; however, other views may be used without departing from the spirit of the invention.
Identifying the endocardial boundary can sometimes be difficult, and when these difficulties are encountered, tissue Doppler imaging of the same view can be used (per step 934). A reference template is provided for identifying the septum and the lateral free wall (per step 936). Then, standard tissue Doppler imaging (TDI) with a preset velocity level of, for example, ±30 cm/sec can be used (per step 938).
A reference for the desired triplex image can then be provided (step 940). B-mode or TDI can be used to guide the range gate (per step 942): the B-mode can be used to guide the range gate (per step 944) or TDI can be used to guide the range gate (per step 946). Using TDI or B-mode to guide the range gate also allows the use of a direction correction angle so that the spectral Doppler displays the radial average velocity of the septal wall. Next, a first pulsed wave spectral Doppler is used in the duplex or triplex mode to measure the average velocity of the septal wall (step 948). Software for processing data and calculating asynchrony can use a site (e.g., a central point) to automatically set an angle between designated sites on the heart wall to help simplify parameter setting.
A duplex image or TDI is also used to guide a second range gate position (per step 950), and a direction correction angle can be used if required. After step 950, the average velocity of the septal wall and the lateral free wall is tracked by the system. Next, the time integral 952 of the spectral Doppler average velocity at the regions of interest (e.g., the septal wall and the left ventricular free wall) provides the displacement of the septum and the left ventricular free wall, respectively.
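The displacement in step 952 is obtained by integrating the sampled average-velocity trace over time. A minimal sketch using trapezoidal integration, under stated assumptions: the function name, units, and sampling scheme are illustrative and not specified in this disclosure.

```python
def displacement_mm(times_s, velocities_mm_s):
    """Time integral of a sampled velocity trace.

    times_s: sample times in seconds (increasing).
    velocities_mm_s: spectral Doppler average velocity at each sample (mm/s).
    Returns the displacement in mm via trapezoidal integration.
    """
    total = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        total += 0.5 * (velocities_mm_s[i] + velocities_mm_s[i - 1]) * dt
    return total
```

Applied separately to the septal-wall gate and the free-wall gate, the two displacement traces could then be compared to quantify asynchrony.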
The above method steps may be utilized in conjunction with a high-pass filtering component (analog or digital) known in the related art for removing any baseline interference present in the collected signal. In addition, the disclosed method employs multiple simultaneous PW spectral Doppler lines for tracking the movement of the interventricular septum and the left ventricular free wall. Furthermore, a multi-gate structure can be used along each spectral line, allowing quantitative measurement of regional wall motion. Averaging multiple gates allows for the measurement of global wall movement.
FIG. 10B is a detailed schematic block diagram of an exemplary embodiment of a system 1000 having an integrated ultrasonic probe 1040 that can be connected to any personal computer (PC) 1010 through an interface unit 1020. The ultrasonic probe 1040 is configured to transmit ultrasonic waves and receive ultrasonic waves reflected from one or more image targets 1064. The sensor 1040 may be coupled to the interface unit 1020 using one or more cables 1066, 1068. The interface unit 1020 can be positioned between the integrated ultrasonic probe 1040 and the host computer 1010. The two-stage beamforming systems 1040 and 1020 can be connected to any PC via USB connections 1022 and 1012.
The ultrasonic probe 1040 may include a sub-array / aperture 1052 composed of adjacent elements having an aperture smaller than the aperture of the entire array. The returned echoes are received by the 1D sensor array 1062 and transmitted to the controller 1044. The controller starts to form a coarse beam by transmitting signals to the memories 1058, 1046. The memories 1058 and 1046 transmit a signal to a transmission driver 1 1050 and a transmission driver m 1054. Next, the transmission driver 1 1050 and the transmission driver m 1054 send signals to the multiplexer 1 1048 and the multiplexer m 1056, respectively. This signal is transmitted to the sub-array beamformer 1 1052 and the sub-array beamformer n 1060.
The output of each coarse beamforming operation may be further processed through second-stage beamforming in the interface unit 1020, which converts the beamforming output into a digital representation. The coarse beamforming outputs may be successively summed to form a fine beam output for the array. The signals can be transmitted from the sub-array beamformer 1 1052 and the sub-array beamformer n 1060 of the ultrasonic probe 1040 to the A/D converters 1030 and 1028 in the interface unit 1020. Within the interface unit 1020, the A/D converters 1028, 1030 convert the first-stage beamforming outputs into a digital representation. A custom application-specific integrated circuit (ASIC), such as a field-programmable gate array (FPGA) 1026, can receive the digitized outputs from the A/D converters 1030, 1028 to complete the second-stage beamforming. The FPGA digital beamformer 1026 can transmit information to the system controller 1024. The system controller can transmit information to a memory 1032, which can send a signal back to the FPGA digital beamformer 1026. Alternatively, the system controller 1024 can transmit information to the custom USB3 chipset 1022. The USB3 chipset 1022 can then transmit the information to a DC-DC converter 1034. The DC-DC converter 1034 can then transmit power from the interface unit 1020 to the ultrasonic probe 1040. In the ultrasonic probe 1040, a power supply 1042 can receive the power signals and interface with the transmission driver 1 1050 to provide power to the front-end integrated probe.
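Conceptually, the two-stage architecture applies delay-and-sum twice: each sub-array sums its coarsely delayed element signals into a coarse beam inside the probe, and the interface unit then applies fine delays and sums the digitized coarse beams. The sketch below is a simplified illustration under stated assumptions: integer-sample delays, no apodization, and invented function names; it is not the patented beamformer implementation.

```python
def subarray_coarse_beam(element_signals, coarse_delays):
    """First stage (in the probe): delay-and-sum within one sub-array.

    element_signals: list of equal-length sample lists, one per element.
    coarse_delays: integer sample delay applied to each element signal.
    """
    n = len(element_signals[0])
    beam = [0.0] * n
    for sig, d in zip(element_signals, coarse_delays):
        for i in range(n):
            if 0 <= i - d < n:
                beam[i] += sig[i - d]
    return beam

def fine_beam(coarse_beams, fine_delays):
    """Second stage (interface unit): delay-and-sum the digitized coarse beams.

    Structurally the same operation, applied to the coarse-beam outputs.
    """
    return subarray_coarse_beam(coarse_beams, fine_delays)
```

With delays chosen to align a point target's echoes, both stages sum coherently, so the echo amplitude grows with the number of summed channels while misaligned signals do not.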
The custom or USB3 chipset 1022 in the interface unit 1020 can be used to provide a communication link between the interface unit 1020 and the host computer 1010. The custom or USB3 chipset 1022 transmits a signal to the custom or USB3 chipset 1012 of the host computer 1010. The custom or USB3 chipset 1012 then interfaces with the microprocessor 1014. The microprocessor 1014 can then display information or send the information to a device 1075.
In an alternative embodiment, a narrow-band beamformer may be used. For example, an analog phase shifter is applied to each of the received echoes. The phase-shifted outputs in each sub-array are then summed to form a coarse beam. An A / D converter can be used to digitize each of these coarse beams; a digital beamformer is then used to form a fine beam.
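The two-stage coarse / fine beamforming described above can be illustrated with a simplified numerical sketch (for illustration only, not the circuit implementation; the element pitch, sampling rate, speed of sound, and integer-sample delays below are assumed simplifications of the analog / digital circuitry):

```python
import numpy as np

C = 1540.0      # assumed speed of sound in tissue, m/s
FS = 40e6       # assumed sampling rate, Hz
PITCH = 0.3e-3  # assumed element pitch, m

def element_delays(n_elements, focus_depth):
    """Geometric receive delays (in samples) for focusing at focus_depth."""
    x = (np.arange(n_elements) - (n_elements - 1) / 2) * PITCH
    path = np.sqrt(focus_depth**2 + x**2)      # element-to-focus distance
    return (path - focus_depth) / C * FS       # extra travel time in samples

def two_stage_beamform(rf, subarray_size, focus_depth):
    """First stage: delay-and-sum within each sub-array (coarse beams).
    Second stage: delay-and-sum across the coarse beams (fine beam)."""
    n_elements, n_samples = rf.shape
    delays = element_delays(n_elements, focus_depth)
    coarse = []
    for start in range(0, n_elements, subarray_size):
        sub = rf[start:start + subarray_size]
        d = delays[start:start + subarray_size]
        # first stage: align the elements within the sub-array to its own center
        rel = np.rint(d - d.mean()).astype(int)
        coarse.append(sum(np.roll(ch, -s) for ch, s in zip(sub, rel)))
    coarse = np.array(coarse)
    # second stage: align the coarse beams using the sub-array-center delays
    centers = delays.reshape(-1, subarray_size).mean(axis=1)
    rel = np.rint(centers - centers.mean()).astype(int)
    return sum(np.roll(b, -s) for b, s in zip(coarse, rel))
```

For a 64-element array with eight-element sub-arrays, only eight coarse-beam signals need to cross the probe cable, yet the final fine beam is formed from all 64 channels.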
In another embodiment, a 64-element linear array may use eight neighboring elements to form each coarse beam output. This configuration can utilize eight output analog cables that connect the output of the integrated probe to the interface unit. Each coarse beam can be sent through a cable to the corresponding A / D converter located in the interface unit. Digital delays are then used to form a fine beam output. Eight A / D converters may be required to form a digital representation.
In another embodiment, sixteen sub-array beamforming circuits can be used to form a 128-element array. Each circuit can form, from an adjacent eight-element sub-array, a coarse beam that is provided as a first-stage output to the interface unit. This configuration can utilize sixteen output analog cables that connect the output of the integrated probe to the interface unit, where the outputs are digitized. A PC microprocessor or a DSP can be used to perform down-conversion, basebanding, scan conversion, and post-image-processing functions. The microprocessor or the DSP can also be used to perform all Doppler processing functions.
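The channel-reduction arithmetic of these embodiments can be restated in a one-line helper (a sketch only, restating the counts given above):

```python
def cable_count(n_elements, subarray_size):
    """Analog cables (and A/D converters) needed when each sub-array of
    adjacent elements is summed into one coarse beam inside the probe."""
    assert n_elements % subarray_size == 0
    return n_elements // subarray_size

# 64-element array, eight-element sub-arrays -> 8 cables / A/D converters
# 128-element array, sixteen eight-element sub-array circuits -> 16 cables
counts = (cable_count(64, 8), cable_count(128, 8))
```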
FIG. 10C is a detailed schematic block diagram of one exemplary embodiment of a system 1080 in which the integrated ultrasonic probe 1040 contains a first-stage sub-array beamforming circuit and the second-stage beamforming circuit is integrated inside the host computer 1082. The back-end computer with the second-stage beamforming circuit may be a PDA, a tablet computer, or a mobile device housing. The ultrasonic probe 1040 is configured to transmit ultrasonic waves and receive ultrasonic waves reflected from one or more image targets 1064. The probe 1040 is coupled to the host computer 1082 using one or more cables 1066, 1068. Note that A / D circuit components can also be placed in the sensor head housing.
The ultrasonic probe 1040 includes a sub-array / aperture 1052 composed of adjacent elements and having an aperture smaller than the aperture of the entire array. The returned echoes are received by the 1D sensor array 1062 and transmitted to the controller 1044. The controller starts to form a coarse beam by transmitting signals to the memories 1058, 1046. The memories 1058 and 1046 transmit signals to a transmission driver 1 1050 and a transmission driver m 1054. Next, the transmission driver 1 1050 and the transmission driver m 1054 send signals to the multiplexer 1 1048 and the multiplexer m 1056, respectively. These signals are transmitted to the sub-array beamformer 1 1052 and the sub-array beamformer n 1060.
The output of each coarse beamforming operation is then passed through a second-stage beamforming in the host computer 1082, where the beamforming output is converted into a digital representation. These coarse beamforming outputs may be summed consecutively to form one fine beam output for the array. The signals are transmitted from the sub-array beamformer 1 1052 and the sub-array beamformer n 1060 of the ultrasonic probe 1040 to the A / D converters 1030 and 1028 in the host computer 1082. The A / D converters 1028 and 1030 in the host computer 1082 convert the first-stage beamforming output into a digital representation. A custom ASIC, such as an FPGA 1026, can receive the digitized outputs from the A / D converters 1030, 1028 to complete the second-stage beamforming. The FPGA digital beamformer 1026 transmits information to the system controller 1024. The system controller transmits information to a memory 1032, which can send a signal back to the FPGA digital beamformer 1026. Alternatively, the system controller 1024 can transmit information to the custom or USB3 chipset 1022. The USB3 chipset 1022 can then transmit the information to a DC-DC converter 1034. The DC-DC converter 1034 can then transmit power from the host computer 1082 to the ultrasonic probe 1040. In the ultrasonic probe 1040, a power supply 1042 can receive the power signals and interface with the transmission driver 1 1050 to provide power to the front-end integrated probe. The power supply may include a battery that enables wireless operation of the sensor assembly. A wireless transceiver can be integrated in the controller circuit or a separate communication circuit to enable wireless transmission of image data and control signals.
The custom or USB3 chipset 1022 can be used to provide a communication link to the custom or USB3 chipset 1012 of the host computer 1082, which transmits signals to the microprocessor 1014. The microprocessor 1014 can then display information or send the information to a device 1075.
FIG. 11 is a detailed schematic block diagram of an exemplary embodiment of an ultrasonic engine 108 (i.e., a front-end ultrasound-specific circuit) and an exemplary embodiment of a computer motherboard 106 (i.e., a host computer) of the ultrasonic device shown in FIGS. 1 and 2A. The components of the ultrasonic engine 108 and / or the computer motherboard 106 may be implemented in an application-specific integrated circuit (ASIC). Exemplary ASICs have a high channel count and, in some exemplary embodiments, may package 32 or more channels per chip. Those of ordinary skill will recognize that the ultrasound engine 108 and the computer motherboard 106 may include more or fewer modules than those shown. For example, the ultrasonic engine 108 and the computer motherboard 106 may include the modules shown in FIG. 17.
A sensor array 152 is configured to transmit and receive ultrasound waves to and from one or more imaging targets 1102. The sensor array 152 is coupled to the ultrasound engine 108 using one or more cables 1104.
The ultrasonic engine 108 includes a high-voltage transmission / reception (TR) module 1106 for applying a driving signal to the sensor array 152 and for receiving return echo signals from the sensor array 152. The ultrasonic engine 108 includes a preamplifier / TGC module 1108 for amplifying the return echo signals and applying a suitable time gain compensation (TGC) function to these signals. The ultrasonic engine 108 includes a sample-interpolation receive beamformer 1110 for applying delay coefficients to each channel of the return echo signals after they have been amplified and processed by the preamplifier / TGC module 1108.
In some exemplary embodiments, the high-voltage TR module 1106, the preamplifier / TGC module 1108, and the sample-interpolation receive beamformer 1110 may each be a silicon chip with 8 to 64 channels per chip, although exemplary embodiments are not limited to this range. In some embodiments, the high-voltage TR module 1106, the preamplifier / TGC module 1108, and the sample-interpolation receive beamformer 1110 may each be a single silicon chip having 8, 16, 32, 64 channels, or the like. As shown in FIG. 11, an exemplary TR module 1106, an exemplary preamplifier / TGC module 1108, and an exemplary beamformer 1110 may each take the form of a silicon chip including 32 channels.
The ultrasonic engine 108 includes a FIFO buffer module 1112. The FIFO buffer module 1112 is used to buffer the processed data output by the beamformer 1110. The ultrasonic engine 108 also includes a memory 1114 for storing program instructions and data, and a system controller 1116 for controlling operations of the ultrasonic engine module.
The ultrasound engine 108 interfaces with the computer motherboard 106 via a communication link 112, which can follow a standard high-speed communication protocol such as the FireWire (IEEE 1394 standard serial interface) protocol or the fast (e.g., 200 Mbit/s to 400 Mbit/s or faster) Universal Serial Bus (USB 2.0, USB 3.0) protocol. The standard communication link to the computer motherboard operates at 400 Mbit/s or higher, preferably at 800 Mbit/s or higher. Alternatively, the link 112 may be a wireless connection, such as an infrared (IR) link. The ultrasound engine 108 includes a communication chipset 1118 (e.g., a FireWire chipset) that establishes and maintains the communication link 112.
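The link-rate requirement stated above can be checked with a short calculation (an illustrative sketch; the 40 MHz sampling rate and 12-bit sample width are assumed values, not taken from this embodiment):

```python
def link_rate_mbps(fs_hz, bits_per_sample, n_streams=1):
    """Raw data rate, in megabits per second, of digitized beamformed data."""
    return fs_hz * bits_per_sample * n_streams / 1e6

# One fine-beam stream at an assumed 40 MHz and 12 bits/sample needs 480 Mbit/s,
# which exceeds USB 2.0-class rates but fits 800 Mbit/s FireWire or USB 3.0 links.
rate = link_rate_mbps(40e6, 12)
```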
Similarly, the computer motherboard 106 also includes a communication chipset 1120 (e.g., a FireWire chipset) that establishes and maintains the communication link 112. The computer motherboard 106 includes a core computer-readable memory 1122 for storing data and / or computer-executable instructions for performing ultrasound imaging operations. The memory 1122 forms the main memory of the computer and, in an exemplary embodiment, can comprise about 4 GB of DDR3 memory. The computer motherboard 106 also includes a microprocessor 1124 for executing the computer-executable instructions stored on the core computer-readable memory 1122 to perform the ultrasound image processing operations. An exemplary microprocessor 1124 may be a commercially available computer processor (such as an Intel Core i5 processor). Another exemplary microprocessor 1124 may be a digital signal processor (DSP)-based processor (such as one or more DaVinci™ processors from Texas Instruments). The computer motherboard 106 also includes a display controller 1126 for controlling a display device, which can be used to display ultrasound data, scans, and maps.
Exemplary operations performed by the microprocessor 1124 include (but are not limited to): down-conversion (for generating I and Q samples from received ultrasound data), scan conversion (for converting ultrasound data into the display format of the display device), Doppler processing (for determining and / or imaging movement and / or flow information from ultrasound data), color blood flow processing (for generating, in one embodiment using autocorrelation, a color-coded image of the Doppler frequency shift superimposed on a B-mode ultrasound image), energy Doppler processing (for determining energy Doppler data and / or generating an energy Doppler image), spectral Doppler processing (for determining spectral Doppler data and / or generating a spectral Doppler map), and post-signal processing. These operations are described in further detail in WO 03/079038 A2, entitled "Ultrasound Probe with Integrated Electronics", filed on March 11, 2003, the entire contents of which are expressly incorporated herein by reference.
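As an illustration of the down-conversion and autocorrelation (color flow) operations listed above, the following sketch is a simplified model only (the moving-average filter and the Kasai-style lag-one estimator are stand-ins chosen for illustration, not the patented implementation; the carrier frequency and sampling rate are assumed):

```python
import numpy as np

def downconvert(rf, fc, fs, ntaps=64):
    """Mix RF data to baseband and low-pass filter it to obtain I and Q samples."""
    n = np.arange(rf.size)
    mixed = rf * np.exp(-2j * np.pi * fc / fs * n)   # quadrature mixing
    # simple moving-average low-pass filter standing in for the baseband filter
    iq = np.convolve(mixed, np.ones(ntaps) / ntaps, mode="same")
    return iq.real, iq.imag                          # I and Q samples

def autocorr_velocity(iq_ensemble, fc, prf, c=1540.0):
    """Color-flow velocity estimate via lag-one autocorrelation of an I/Q ensemble."""
    r1 = np.mean(iq_ensemble[1:] * np.conj(iq_ensemble[:-1]))
    return c * prf * np.angle(r1) / (4 * np.pi * fc)  # axial velocity, m/s
```

The angle of the lag-one autocorrelation gives the mean Doppler shift per pulse interval, which maps to axial velocity through v = c·fd / (2·fc).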
In order to achieve a smaller and lighter portable ultrasonic device, the circuit board of the ultrasonic engine 108 provides a reduction in overall package size and footprint. To this end, the exemplary embodiments provide a small and light portable ultrasonic device that minimizes the overall package size and footprint while providing a high channel count. In some embodiments, a high-channel-count circuit board of an exemplary ultrasonic engine may include one or more multi-chip modules, where each chip provides multiple channels (e.g., 32 channels). The term "multi-chip module" as used herein refers to an electronic package in which multiple integrated circuits (ICs) are packaged onto a unifying substrate, thereby facilitating their use as a single component (i.e., as if they were a single larger IC). A multi-chip module can be used in an exemplary circuit board to enable two or more active IC components integrated on a high-density interconnect (HDI) substrate to reduce the overall package size. In an exemplary embodiment, a multi-chip module may be assembled by vertically stacking a transmission / reception (TR) silicon chip, an amplifier silicon chip, and a beamformer silicon chip of an ultrasonic engine. A single circuit board of an ultrasonic engine may include one or more of these multi-chip modules to provide a high channel count while minimizing the overall package size and footprint of the circuit board.
FIG. 12 depicts a schematic side view of a portion of a circuit board 1200 including a multi-chip module assembled in a vertically stacked configuration. Two or more layers of active electronic integrated circuit components are vertically integrated in a single circuit board. The IC layers are oriented in spaced planes that extend substantially parallel to each other in a vertically stacked configuration. In FIG. 12, the circuit board includes an HDI substrate 1202 for supporting the multi-chip module. A first integrated circuit chip 1204 containing, for example, a first beamformer device is coupled to the substrate 1202 using any suitable coupling mechanism (e.g., epoxy application and curing). A first spacer layer 1206 is coupled to the surface of the first integrated circuit chip 1204 opposite the substrate 1202 using, for example, epoxy application and curing. A second integrated circuit chip 1208 containing, for example, a second beamformer device is coupled to the surface of the first spacer layer 1206 opposite the first integrated circuit chip 1204 using, for example, epoxy application and curing. A metal frame 1210 is provided for mechanical and / or electrical connection among the integrated circuit chips. An exemplary metal frame 1210 may take the form of a lead frame. The first integrated circuit chip 1204 may be coupled to the metal frame 1210 using wire bonding 1212. The second integrated circuit chip 1208 may be coupled to the same metal frame 1210 using wire bonding 1214. A package 1216 is provided to encapsulate the multi-chip module assembly and maintain the plurality of integrated circuit chips in a substantially parallel configuration relative to each other.
As shown in FIG. 12, the vertical three-dimensional stacking of the first integrated circuit chip 1204, the first spacer layer 1206, and the second integrated circuit chip 1208 provides high-density functionality on the circuit board while minimizing the overall package size and footprint (compared to an ultrasonic engine circuit board that does not use a vertically stacked multi-chip module). Those of ordinary skill will recognize that an exemplary multi-chip module is not limited to two stacked integrated circuit chips. An exemplary number of chips vertically integrated in a multi-chip module may include (but is not limited to): two, three, four, five, six, seven, eight, and the like.
In one embodiment of an ultrasonic engine circuit board, a single multi-chip module is provided as shown in FIG. 12. In other embodiments, a plurality of the multi-chip modules shown in FIG. 12 may be provided. In an exemplary embodiment, a plurality of multi-chip modules (for example, two multi-chip modules) may be stacked vertically on top of each other on a circuit board of an ultrasonic engine to further minimize the packaging size and footprint of the circuit board.
In addition to reducing the footprint, the overall package height of the multi-chip module also needs to be reduced. Exemplary embodiments may use dies that are thinned to hundreds of microns to reduce the package height of multi-chip modules.
Any suitable technique can be used to assemble a multi-chip module on a substrate. Exemplary assembly techniques include (but are not limited to): laminated MCM (MCM-L), where the substrate is a multilayer laminated printed circuit board; deposited MCM (MCM-D), where the multi-chip module is deposited on a substrate using thin-film technology; and ceramic-substrate MCM (MCM-C), in which several conductive layers are deposited on a ceramic substrate and embedded in glass layers, with the layers co-fired at high temperature (HTCC) or low temperature (LTCC).
FIG. 13 is a flowchart of an exemplary method for manufacturing a circuit board including a multi-chip module assembled in a vertically stacked configuration. In step 1302, an HDI substrate is manufactured or provided. In step 1304, a metal frame (e.g., a lead frame) is provided. In step 1306, a first IC layer is coupled or bonded to the substrate using, for example, epoxy application and curing. The first IC layer is bonded to the metal frame via wire bonds. In step 1308, a spacer layer is coupled to the first IC layer using, for example, epoxy application and curing, such that the layers are stacked vertically and extend substantially parallel to each other. In step 1310, a second IC layer is coupled to the spacer layer using, for example, epoxy application and curing, such that all of the layers are stacked vertically and extend substantially parallel to each other. The second IC layer is bonded to the metal frame via wire bonds. In step 1312, a package is used to encapsulate the multi-chip module assembly.
Exemplary chip layers in a multi-chip module can be coupled to each other using any suitable technique. For example, in the embodiment shown in FIG. 12, a spacer layer may be provided between the chip layers to space them apart. A passivation silicon layer, a die attach paste layer, and / or a die attach film layer can be used as the spacer layer. Exemplary spacer technologies that can be used to make a multi-chip module are further described in "Die Attach Adhesives for 3D Same-Sized Dies Stacked Packages" by Toh CH et al., Proceedings of the 58th Electronic Components and Technology Conference (ECTC 2008), Florida, USA, May 27-30, 2008, pages 1538 to 1543, the entire contents of which are expressly incorporated herein by reference.
An important requirement for die attach (DA) pastes or films is excellent adhesion to the passivation materials of the adjacent dies. Also, for large-die applications, a uniform bond line thickness (BLT) is required. In addition, high cohesive strength at high temperature and low moisture absorption are preferred for reliability.
FIGS. 14A to 14C are schematic side views of exemplary multi-chip modules including vertically stacked dies that can be used according to exemplary embodiments. Both peripheral and center bond pad wire bond (WB) packages are illustrated and can be used to wire bond the exemplary chip layers in a multi-chip module. FIG. 14A is a schematic side view of a multi-chip module including four vertically stacked dies, where the dies are separated from each other by passivation silicon layers attached with a 2-in-1 dicing die attach film (D-DAF). FIG. 14B is a schematic side view of a multi-chip module including four vertically stacked dies, where the dies are separated from each other by a DA film-based adhesive acting as a die-to-die spacer. FIG. 14C is a schematic side view of a multi-chip module including four vertically stacked dies, where the dies are separated by a DA paste or film-based adhesive spacer acting as a die-to-die spacer. In some exemplary embodiments, the DA paste or film-based adhesive may have wire-penetration capability. In the exemplary multi-chip module of FIG. 14C, film-over-wire (FOW) is used, a die packaging technology that allows long wire bonding and center bond pad stacking. FOW uses a die attach film with wire-penetration capability that allows wire-bonded dies of the same or similar size to be stacked directly on top of each other without a passivation silicon layer. This solves the problem of stacking dies of the same or similar size directly on top of each other, which otherwise presents a challenge because there is no gap, or not enough gap, for the bonding wires of the lower dies.
The DA materials shown in FIGS. 14B and 14C preferably maintain a bond line thickness (BLT) with almost no voids or bleed-out through the assembly process. After assembly, the DA material sandwiched between the dies maintains excellent adhesion to the dies. The material properties of the DA materials are customized as needed to maintain high cohesive strength, so as to withstand high-temperature reliability stressing without bulk cracking. The properties of the DA material are also customized as needed to minimize, or better eliminate, moisture accumulation that can cause package reliability failures (for example, popcorning, whereby interfacial or bulk cracking occurs due to pressure build-up from moisture in the package).
FIG. 15 is a flowchart of exemplary methods of die-to-die stacking using (a) passivation silicon layers with a 2-in-1 dicing die attach film (D-DAF), (b) DA paste, (c) thick DA film, and (d) film-over-wire, in which wire-bonded dies of the same or similar dimensions are stacked directly on top of each other without passivation silicon spacers. Each method performs backside grinding of the wafer to reduce its thickness in order to achieve stacking of integrated circuits and high-density packaging. The wafers are sawed to separate the individual dies. A first die is attached and cured to a substrate of a multi-chip module using, for example, epoxy resin in an oven. Wire bonding is used to couple the first die to a metal frame.
In method (a), a dicing die attach film (D-DAF) is used to bond a first passivation silicon layer to the first die in a stacked manner. A second die is bonded to the first passivation silicon layer in a stacked manner using D-DAF. Wire bonding is used to couple the second die to the metal frame. A second passivation silicon layer is bonded to the second die in a stacked manner using D-DAF. A third die is bonded to the second passivation silicon layer in a stacked manner using D-DAF. Wire bonding is used to couple the third die to the metal frame. A third passivation silicon layer is bonded to the third die in a stacked manner using D-DAF. A fourth die is bonded to the third passivation silicon layer in a stacked manner using D-DAF. Wire bonding is used to couple the fourth die to the metal frame.
In method (b), die attach (DA) paste application and curing are repeated for multiple thin-die stacking. DA paste is dispensed on the first die, and a second die is placed on the DA paste and cured to the first die. Wire bonding is used to couple the second die to the metal frame. DA paste is dispensed on the second die, and a third die is placed on the DA paste and cured to the second die. Wire bonding is used to couple the third die to the metal frame. DA paste is dispensed on the third die, and a fourth die is placed on the DA paste and cured to the third die. Wire bonding is used to couple the fourth die to the metal frame.
In method (c), a die attach film (DAF) is cut and pressed onto a bottom die, and a top die is then placed and thermally compressed onto the DAF. For example, a DAF is pressed onto the first die and a second die is thermally compressed onto the DAF. Wire bonding is used to couple the second die to the metal frame. Similarly, a DAF is pressed onto the second die and a third die is thermally compressed onto the DAF. Wire bonding is used to couple the third die to the metal frame. A DAF is pressed onto the third die and a fourth die is thermally compressed onto the DAF. Wire bonding is used to couple the fourth die to the metal frame.
In method (d), film-over-wire (FOW) uses a die attach film having wire-penetration capability that allows wire-bonded dies of the same or similar size to be stacked directly on top of each other without a passivation silicon layer. A second die is bonded and cured to the first die in a stacked manner. Film-over-wire bonding is used to couple the second die to a metal frame. A third die is bonded and cured to the second die in a stacked manner. Film-over-wire bonding is used to couple the third die to the metal frame. A fourth die is bonded and cured to the third die in a stacked manner. Film-over-wire bonding is used to couple the fourth die to the metal frame.
After the above steps are completed in each of methods (a) to (d), wafer molding and post-mold curing (PMC) are performed. Subsequently, ball mounting and singulation are performed.
"Die Attach Adhesives for 3D Same-Sized Dies Stacked Packages" by Toh CH et al., Proceedings of the 58th Electronic Components and Technology Conference (ECTC 2008), Florida, USA, May 27-30, 2008, pages 1538 to 1543, provides further details on the die attach technologies described above, the entire contents of which are expressly incorporated herein by reference.
FIG. 16 is a schematic side view of a multi-chip module 1600 including a TR chip 1602, an amplifier chip 1604, and a beamformer chip 1606 vertically integrated on a substrate 1614 in a vertically stacked configuration. Any suitable technique shown in FIGS. 12 to 15 can be used to fabricate the multi-chip module. Those of ordinary skill will recognize that the particular order of stacking the chips may be different in other embodiments. A first spacer layer 1608 and a second spacer layer 1610 are provided to space apart the chips 1602, 1604, and 1606. Each chip is coupled to a metal frame (e.g., a lead frame) 1612. In some exemplary embodiments, a heat transfer and heat dissipation mechanism may be provided in the multi-chip module to withstand high-temperature reliability stressing without bulk cracking. The other components of FIG. 16 are described with reference to FIGS. 12 and 14.
In this exemplary embodiment, each multi-chip module can handle complete transmission, reception, TGC amplification, and beamforming operations for a large number of channels (e.g., 32 channels). By vertically integrating three silicon chips into a single multi-chip module, the space and footprint required on the printed circuit board are further reduced. Multiple multi-chip modules can be provided on a single ultrasonic engine circuit board to further increase the number of channels while minimizing package size and footprint. For example, a 128-channel ultrasonic engine circuit board 108 can be manufactured in an exemplary planar size of about 10 cm x about 10 cm, which is a significant improvement on the space requirements of conventional ultrasonic circuits. In a preferred embodiment, a single circuit board of an ultrasonic engine including one or more multi-chip modules may have 16 to 128 channels. In some embodiments, a single circuit board of an ultrasonic engine including one or more multi-chip modules may have 16, 32, 64, 128 channels, or the like.
FIG. 17 is a detailed illustration of an exemplary embodiment of an ultrasonic engine 108 (ie, a front-end ultrasonic specific circuit) and an exemplary embodiment of a computer motherboard 106 (ie, a host computer) provided as a single-board complete ultrasound system. Sex block diagram. One exemplary veneer ultrasound system as illustrated in FIG. 17 may have an exemplary planar size of about 25 cm x about 18 cm, but other sizes are also feasible. The single-board complete ultrasound system of FIG. 17 can be implemented in the ultrasound devices shown in FIGS. 1, 2A, 2B, and 9A, and can be used to execute the processes depicted in FIGS. 3 to 8, 9B, and 10 operating.
The ultrasonic engine 108 includes a probe connector 114 that facilitates the connection of at least one ultrasonic probe / sensor. In the ultrasonic engine 108, a TR module, an amplifier module, and a beamformer module can be vertically stacked to form a multi-chip module as shown in FIG. 16, thereby minimizing the overall package size and footprint of the ultrasonic engine 108. The ultrasonic engine 108 may include a first multi-chip module 1710 and a second multi-chip module 1712. Each module includes, vertically integrated in a stacked configuration as shown in FIG. 16, a TR chip including an ultrasonic pulse generator and receiver, an amplifier chip including a time gain control amplifier, and a sampled-data beamformer chip. The first multi-chip module 1710 and the second multi-chip module 1712 can be vertically stacked on top of each other to further minimize the required area on the circuit board. Alternatively, the first multi-chip module 1710 and the second multi-chip module 1712 may be disposed horizontally on the circuit board. In an exemplary embodiment, the TR chip, the amplifier chip, and the beamformer chip are each 32-channel chips, and each multi-chip module 1710, 1712 has 32 channels. Those of ordinary skill will recognize that the exemplary ultrasonic engine 108 may include, but is not limited to, one, two, three, four, five, six, seven, or eight multi-chip modules. Note that in a preferred embodiment, the system can be configured with a first beamformer in the sensor housing and a second beamformer in the tablet housing.
The ASIC and multi-chip module configurations enable a 128-channel complete ultrasound system to be implemented on a small single board the size of a tablet computer. An exemplary 128-channel ultrasonic engine 108, for example, can be accommodated in an exemplary planar size of about 10 cm x about 10 cm, which is a significant improvement on the space requirements of conventional ultrasonic circuits. An exemplary 128-channel ultrasonic engine 108 can also be accommodated in an exemplary area of about 100 cm2.
The ultrasound engine 108 also includes a clock generation complex programmable logic device (CPLD) 1714 for generating the timing clocks for performing an ultrasound scan using the sensor array. The ultrasonic engine 108 includes an analog-to-digital converter (ADC) 1716 for converting the analog ultrasonic signals received from the sensor array into digital RF beamformed data. The ultrasound engine 108 also includes a delay profile and waveform generator field programmable gate array (FPGA) 1718 for managing the receive delay profiles and generating the transmission waveforms. The ultrasound engine 108 includes a memory 1720 for storing the delay profiles for ultrasound scanning. An exemplary memory 1720 may be a single DDR3 memory chip. The ultrasound engine 108 includes a scan sequence control field programmable gate array (FPGA) 1722 configured to manage the ultrasound scan sequence, the transmission / reception timing, the storage of profiles to and from the memory 1720, and the buffering and movement of the digital RF data streams to the computer motherboard 106 via the high-speed serial interface 112. The high-speed serial interface 112 may include a FireWire or other serial or parallel bus interface between the computer motherboard 106 and the ultrasonic engine 108. The ultrasound engine 108 includes a communication chipset 1118 (e.g., a FireWire chipset) that establishes and maintains the communication link 112.
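The delay profiles stored in the memory 1720 and the transmission waveforms generated by the FPGA 1718 can be illustrated with a simplified precomputation sketch (the geometry, sampling rate, speed of sound, and square-wave drive below are assumed values for illustration, not this embodiment's stored data):

```python
import numpy as np

def receive_delay_profile(n_channels, pitch, focus_depth, fs, c=1540.0):
    """Per-channel receive delays, quantized to clock ticks, for one focal depth."""
    x = (np.arange(n_channels) - (n_channels - 1) / 2) * pitch  # element positions
    extra_path = np.sqrt(focus_depth**2 + x**2) - focus_depth   # extra travel distance
    ticks = np.rint(extra_path / c * fs).astype(np.int64)
    return ticks - ticks.min()                                  # non-negative offsets

def tx_waveform(fc, n_cycles, fs):
    """Bipolar square burst approximating a pulser drive at center frequency fc."""
    n = int(round(n_cycles * fs / fc))
    t = np.arange(n) / fs
    return np.sign(np.sin(2 * np.pi * fc * t))

# a small profile table for several focal depths, as might be stored in memory
table = {d: receive_delay_profile(128, 0.2e-3, d, 40e6) for d in (0.02, 0.04, 0.08)}
```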
A power module 1724 is provided to supply power to the ultrasonic engine 108, manage a battery charging environment, and perform power management operations. The power module 1724 can generate regulated, low-noise power for ultrasonic circuits and can generate high voltages for ultrasonic transmission pulse generators in TR modules.
The computer motherboard 106 includes a core computer-readable memory 1122 for storing data and / or computer-executable instructions for performing ultrasound imaging operations. The memory 1122 forms the main memory of the computer and, in an exemplary embodiment, can comprise about 4 GB of DDR3 memory. The memory 1122 may include a solid state drive (SSD) for storing an operating system, computer-executable instructions, programs, and image data. An exemplary SSD may have a capacity of about 128 GB.
The computer motherboard 106 also includes a microprocessor 1124 for executing computer-executable instructions stored on the core computer-readable memory 1122 to perform ultrasound image processing operations. Exemplary operations include, but are not limited to: down-conversion, scan conversion, Doppler processing, color blood flow processing, energy Doppler processing, spectral Doppler processing, and post-signal processing. An exemplary microprocessor 1124 may be a commercially available computer processor (such as an Intel Core i5 processor). Another exemplary microprocessor 1124 may be a digital signal processor (DSP)-based processor (such as a DaVinci™ processor from Texas Instruments).
The computer motherboard 106 includes an input / output (I / O) and graphics chipset 1704 configured to control I / O and graphics peripherals (such as USB ports, video display ports, and the like). The computer motherboard 106 includes a wireless network adapter 1702 configured to provide a wireless network connection. An exemplary adapter 1702 supports the 802.11g and 802.11n standards. The computer motherboard 106 includes a display controller 1126 configured to interface the computer motherboard 106 to the display 104. The computer motherboard 106 includes a communication chipset 1120 (e.g., a FireWire chipset or interface) configured to provide fast data communication between the computer motherboard 106 and the ultrasonic engine 108. An exemplary communication chipset 1120 may be an IEEE 1394b 800 Mbit/s interface. Other serial or parallel interfaces 1706 may alternatively be provided, such as USB3, Thunderbolt, PCIe, and the like. A power module 1708 is provided to supply power to the computer motherboard 106, manage a battery charging environment, and perform power management operations.
An exemplary computer motherboard 106 can be accommodated within an exemplary planar size of about 12 cm x about 10 cm, occupying an exemplary area of about 120 cm2.
FIG. 18 is a perspective view of an exemplary portable ultrasound system 100 provided according to an exemplary embodiment. The system 100 includes a housing 102 having a tablet PC form factor as shown in FIG. 18, but the housing 102 may have any other suitable form factor. An exemplary housing 102 may have a thickness of less than 2 cm, and preferably between 0.5 cm and 1.5 cm. A front panel of the housing 102 includes a multi-touch LCD touch screen display 104 configured to recognize and distinguish one or more multi-point and/or simultaneous touches on a surface of the touch screen display 104. The surface of the display 104 can be touched using a user's finger, a user's hand, or one or more optional styluses 1802. The housing 102 includes one or more I/O port connectors 116, which may include (but are not limited to) one or more USB connectors, one or more SD card slots, one or more network mini display ports, and a DC power input. The embodiment of the housing 102 in FIG. 18 can also be configured with a palm-sized form factor of 150 mm x 100 mm x 15 mm (a volume of 225,000 mm3) or smaller. The housing 102 may have a weight of less than 200 g. Optionally, the cabling between the sensor array and the display housing may include an interface circuit 1020 as described herein. The interface circuit 1020 may include, for example, a beamforming circuit and/or an A/D circuit in a pod suspended from the tablet computer. Detachable connectors 1025, 1027 can be used to connect the hanging pod to the sensor probe cable. The connector 1027 may include a probe identification circuit as described herein. The unit 102 may include a camera, a microphone, a speaker, and a wireless telephone circuit for voice and data communication, as well as software that can be used to control activation of the ultrasound imaging operations described herein.
The housing 102 includes or is coupled to a probe connector 114 that facilitates the connection of at least one ultrasound probe/sensor 150. The ultrasound probe 150 includes a sensor housing that contains one or more sensor arrays 152. The ultrasound probe 150 may be coupled to the probe connector 114 using a housing connector 1804 provided along a flexible cable 1806. Those of ordinary skill will recognize that the ultrasound probe 150 may be coupled to the housing 102 using any other suitable mechanism (e.g., an interface housing containing circuitry for performing ultrasound-specific operations such as beamforming). Other exemplary embodiments of the ultrasound system are described in further detail in WO 03/079038 A2, entitled "Ultrasound Probe with Integrated Electronics", filed on March 11, 2003, the entire contents of which are expressly incorporated herein by reference. A preferred embodiment may use a wireless connection between the handheld sensor probe 150 and the display housing. Beamformer electronics may be incorporated into the probe housing 150 to provide beamforming of a sub-array in a 1D or 2D sensor array as described herein. The display housing can be sized to be held in the palm of a user's hand and can include wireless network connectivity to a public access network, such as the Internet.
FIG. 19 illustrates an exemplary view of a main graphical user interface (GUI) 1900 presented on the touch screen display 104 of the portable ultrasound system 100 of FIG. 18. This main GUI 1900 may be displayed when the ultrasound system 100 is activated. To assist a user in browsing the main GUI 1900, the GUI can be regarded as including four exemplary working areas: a menu bar 1902, an image display window 1904, an image control bar 1906, and a toolbar 1908. Additional GUI components may be provided on the main GUI 1900 to, for example, enable a user to close, resize, and exit the GUI and/or windows in the GUI.
The menu bar 1902 enables a user to select ultrasound data, images, and/or videos for display in the image display window 1904. The menu bar 1902 may include, for example, GUI components for selecting one or more files in a patient folder directory and an image folder directory. The image display window 1904 displays ultrasound data, images, and/or videos and provides patient information as needed. The toolbar 1908 provides functionality associated with image or video display, including (but not limited to): a save button for saving the current image and/or video to a file, a save loop button for saving a maximum allowable number of prior frames as a cine loop, a print button for printing the current image, a freeze button for freezing an image, a playback toolbar for controlling cine-loop playback, and the like. The exemplary GUI functionality available in the main GUI 1900 is described in further detail in WO 03/079038 A2, entitled "Ultrasound Probe with Integrated Electronics", filed on March 11, 2003, the entire contents of which are expressly incorporated herein by reference.
The image control bar 1906 includes touch controls that can be operated by a user directly applying touches and touch gestures to the surface of the display 104. Exemplary touch controls may include (but are not limited to): a 2D touch control 408, a gain touch control 410, a color touch control 412, a store touch control 414, a split touch control 416, a PW imaging touch control 418, a beam steering touch control 420, an annotation touch control 422, a dynamic range touch control 424, a Teravision™ touch control 426, a map touch control 428, and a needle guide touch control 430. These exemplary touch controls are described in further detail with reference to FIGS. 4a to 4c.
FIG. 20A depicts an illustrative embodiment of an exemplary medical ultrasound imaging device 2000 implemented in a tablet computer form factor according to one embodiment of the present invention. The tablet can have a size of 12.5" x 1.25" x 8.75" (31.7 cm x 3.175 cm x 22.22 cm), but it can also have any other suitable form factor, with a weight of less than 8 lbs. As shown in FIG. 20A, the medical ultrasound imaging device 2000 includes a housing 2030 and a touch screen display 2010 on which an ultrasound image and ultrasound data 2040 can be displayed, with ultrasound controls 2020 configured to be operated via the touch screen display 2010. The housing 2030 may have a front panel 2060 and a rear panel 2070. The touch screen display 2010 forms the front panel 2060 and includes a multi-touch LCD touch screen that can recognize and distinguish one or more multi-point and/or simultaneous touches by a user on the touch screen display 2010. The touch screen display 2010 may have a capacitive multi-touch and AVAH LCD screen. For example, a capacitive multi-touch and AVAH LCD screen allows a user to view images from multiple angles without loss of resolution. In another embodiment, the user can use a stylus to input data on the touch screen. The tablet computer may include an integrated foldable stand that allows a user to rotate the stand from a storage position conformal to the tablet computer's form factor, so that the device can lie flat on the rear panel, or, alternatively, the user can rotate the stand to enable the tablet computer to stand upright at one of a plurality of tilt angles relative to a support surface.
The capacitive touch screen module includes an insulator (for example, glass) coated with a transparent conductor (such as indium tin oxide). Manufacturing may include a bonding process between the glass, an x-sensor film, a y-sensor film, and a liquid crystal material. The tablet is configured to allow a user to perform multi-touch gestures (such as pinch and spread) while wearing a dry glove or a wet glove. The surface of the screen registers electrical conductors that come into contact with it; such contact distorts the screen's electrostatic field, resulting in a measurable change in capacitance. A processor then interprets the change in the electrostatic field. Responsiveness can be increased by using "in-cell" technology, which reduces the number of layers by building the capacitors into the display itself. Applying "in-cell" technology reduces the visible distance between the user's finger and the touch screen target, producing a more direct contact with the displayed content and giving tap gestures an increased response.
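The interpretation step above can be sketched in software. This is a minimal illustration, assuming a grid of per-cell capacitance deltas reported by the sensor; the threshold value, function names, and gesture labels are all assumptions, not the device's actual firmware interface:

```python
# Hypothetical sketch: locate touches from capacitance changes and
# classify a two-finger pinch/spread gesture. Values are illustrative.

def find_touches(deltas, threshold=5.0):
    """Return (row, col) cells whose capacitance change exceeds threshold."""
    touches = []
    for r, row in enumerate(deltas):
        for c, d in enumerate(row):
            if d > threshold:
                touches.append((r, c))
    return touches

def classify_two_finger_gesture(prev_pts, cur_pts):
    """Classify pinch vs. spread from the change in inter-touch distance."""
    def dist(pts):
        (r1, c1), (r2, c2) = pts
        return ((r1 - r2) ** 2 + (c1 - c2) ** 2) ** 0.5
    if len(prev_pts) != 2 or len(cur_pts) != 2:
        return None
    change = dist(cur_pts) - dist(prev_pts)
    if change > 0:
        return "spread"   # fingers moving apart, e.g. zoom in
    if change < 0:
        return "pinch"    # fingers moving together, e.g. zoom out
    return "hold"
```

A real controller would additionally cluster adjacent cells into centroids, filter noise, and track touch identities across frames; gloved operation depends on the sensing hardware, not this logic.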
FIG. 20B depicts an illustrative embodiment of an exemplary medical ultrasound imaging device 2000 implemented in a tablet PC form factor and configured to receive a wireless SIM card, according to one embodiment of the present invention. In this particular embodiment, the ultrasound imaging device 2000 includes a SIM card port 2080 configured to receive a SIM card 2084 and connect the SIM card circuitry to a wireless communication circuit within the device. The SIM card port 2080 in this embodiment includes internal metal contacts that connect the ID circuit of the SIM card 2084 to the circuitry of the device 2000. In this particular example, a SIM card tray 2082 is configured to accept the SIM card 2084 and connect it to the SIM card port 2080. In some embodiments, the SIM card port 2080 and/or the SIM card tray 2082 may be configured to accept a standard SIM card, a mini SIM card, a micro SIM card, a nano SIM card, or another similar wireless identification/authorization card or circuit.
FIG. 21 illustrates a preferred cart system for a modular ultrasound imaging system according to an embodiment of the present invention. The cart system 2100 uses a base assembly 2122 including a docking rack that receives a tablet computer. The cart configuration 2100 is configured to interface a tablet computer 2104, including a touch screen display 2102, to a cart 2108, which may include a full operator console 2124. Docking the tablet 2104 to the cart stand 2108 forms a full-featured system. This full-featured system may include an adjustable height device 2106, a gel holder 2110 and a storage bin 2114, a plurality of wheels 2126, a thermal probe holder 2120, and the operator console 2124. The control devices may include a keyboard 2112 on the operator console 2124, to which other peripheral devices (such as a printer, a video interface, or other control devices) may also be added.
FIG. 22 illustrates a preferred cart system for an embodiment of a modular ultrasound imaging system according to an embodiment of the present invention. The cart system 2200 can be configured using a vertical support member 2212 coupled to a horizontal support member. An auxiliary device connector 2218 having a position for an auxiliary device attachment 2214 may be configured to connect to the vertical support member 2212. A 3-port probe MUX connection device 2216 can also be configured to connect to the tablet. A storage box 2224 may be configured to attach to the vertical support member 2212 by a storage box attachment mechanism 2222. The cart system may also include a cable management system 2226 configured to attach to the vertical support member. The cart assembly 2200 includes the support beam 2212 mounted on a base 2228. The support beam 2212 has wheels 2232 and a battery 2230 that provides power for extended operation of the tablet. The assembly may also include an accessory holder 2224 mounted using a height adjustment device 2226. Holders 2210 and 2218 can be mounted on the beam 2212 or the console panel 2214. The multi-port probe multiplexer device 2216 is connected to the tablet computer to provide the user with simultaneous connection of several sensor probes, selected sequentially by a displayed virtual switch. A touch gesture (such as a three-finger swipe) applied to a displayed image, or a touch on a displayed virtual button or icon, can be used to switch between the connected probes.
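The probe-switching behavior described above can be sketched as a small selection state. This is an illustrative sketch only; the class name, gesture string, and port names below are hypothetical, not the patent's implementation:

```python
# Illustrative sketch of cycling among probes attached to a multi-port
# multiplexer via a gesture or a displayed virtual button.

class ProbeMux:
    def __init__(self, ports):
        self.ports = ports      # e.g. ["linear", "phased", "convex"]
        self.active = 0         # index of the probe currently routed

    def on_gesture(self, gesture):
        """A three-finger swipe advances to the next connected probe."""
        if gesture == "three_finger_swipe":
            self.active = (self.active + 1) % len(self.ports)
        return self.ports[self.active]

    def on_virtual_button(self, port_name):
        """A virtual button or icon selects a specific probe by name."""
        self.active = self.ports.index(port_name)
        return self.ports[self.active]
```

The modulo wrap-around means repeated swipes cycle through all connected probes and return to the first, matching the "selected sequentially" behavior of the displayed virtual switch.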
FIG. 23A illustrates a preferred cart mount system for a modular ultrasound imaging system according to an embodiment of the present invention. Configuration 2300 depicts a tablet computer 2302 coupled to a docking station 2304. The docking station 2304 is fixed to an attachment mechanism 2306. The attachment mechanism 2306 may include a hinge member 2308 that allows the display to be tilted to a position desired by the user. The attachment mechanism 2306 is attached to the vertical member 2312. A tablet computer 2302 as described herein may be mounted on a base coupling unit 2304, which is mounted on a mounting assembly 2306 on top of the beam 2212. The base unit 2304 includes a bracket 2310, an electrical connector 2305, and a port 2307 that connect the system 2302 to the battery 2230 and the multiplexer device 2216.
FIG. 23B illustrates a cart mount system for a modular ultrasound imaging system configured to receive a wireless SIM card according to an embodiment of the present invention. In this particular embodiment, the docking station 2304 includes a SIM card port 2080 configured to receive a SIM card 2084 and connect the SIM card circuitry to a wireless communication circuit located in the docking station 2304 or the tablet computer 2302. In this particular example, a SIM card 2084 can be inserted directly into the SIM card port 2080, while in other examples a SIM card tray (such as the SIM card tray shown in FIG. 20B) can be used to connect the SIM card 2084 to the metal contacts in the SIM card port 2080. In some embodiments, the SIM card port 2080 and/or the SIM card tray 2082 may be configured to accept a standard SIM card, a mini SIM card, a micro SIM card, a nano SIM card, or another similar wireless identification/authorization card or circuit.
FIG. 24 illustrates a preferred cart system 2400 for a modular ultrasound imaging system according to an embodiment of the present invention, in which a tablet computer 2402 is connected to a mounting assembly 2406 using a connector 2404. The configuration 2400 depicts the tablet computer 2402 coupled to a vertical support member 2408 via an attachment mechanism 2404, without a docking station 2304. The attachment mechanism 2404 may include a hinge 2406 for display adjustment.
FIGS. 25A and 25B illustrate a multifunctional docking station system 2500. FIG. 25A illustrates a docking station 2502 and a tablet computer 2504 having a base assembly 2506 that mates with the docking station 2502. The tablet computer 2504 and the docking station 2502 can be electrically connected. The tablet computer 2504 can be released from the docking station 2502 by engaging a release mechanism 2508. The docking station 2502 may include a sensor port 2512 for connecting a sensor probe 2510. The docking station 2502 may include three USB 3.0 ports, a LAN port, a headphone jack, and a power connector for charging. FIG. 25B illustrates a side view of the tablet computer 2504 and the docking station 2502 with a stand according to a preferred embodiment of the present invention. The docking station may include an adjustable stand/grip 2526. The adjustable stand/grip 2526 can be tilted for multiple viewing angles and can be flipped up for transport. The side view also shows the sensor port 2512 and a sensor probe connector 2510.
Referring to FIG. 26A, the integrated probe system 2600 includes a front-end probe 2602, a host computer 2604, and a portable information device, such as a personal digital assistant (PDA) 2606. The PDA 2606 (such as a Palm Pilot device or other handheld computing device) serves as a remote display and/or recording device. In the illustrated embodiment, the front-end probe 2602 is connected to the host computer 2604 via a communication link 2608, which is a wired link. The host computer 2604 (a computing device) is connected to the PDA 2606 through a communication link or interface 2610, which is a wireless link.
Because the integrated ultrasound probe system 2600 in the described embodiment has a Windows®-based host computer 2604, the system can utilize the wide selection of software available for the Windows® operating system. One potentially useful application is to connect the ultrasound system electronically, allowing a physician to use the system to send and receive messages, diagnostic images, instructions, and reports, or even to remotely control the front-end probe 2602.
Connections over the communication links or interfaces 2608 and 2610 can be wired (e.g., through an Ethernet network) or wireless (such as, but not limited to, IEEE 802.11a, IEEE 802.11b, HiperLAN, or HomeRF). FIG. 26A shows a wired link for the communication link 2608 and a wireless link for the communication link 2610. It should be appreciated that other wired embodiments or protocols may be used.
The wireless communication link 2610 may use a variety of different protocols, such as an RF link, and may use all or part of a dedicated protocol stack, such as the IEEE 1394 protocol stack or the Bluetooth system protocol stack. IEEE 1394 is a preferred interface for high-bandwidth applications, such as high-quality digital video editing of ultrasound imaging data. The Bluetooth protocol uses a combination of circuit and packet switching. Time slots can be reserved for synchronous packets. Bluetooth can support an asynchronous data channel, up to three simultaneous synchronous voice channels, or a channel that simultaneously carries asynchronous data and synchronous voice. Each synchronous channel supports a 64 kb/s synchronous (voice) channel in each direction. The asynchronous channel can support up to 723.2 kb/s asymmetric or 433.9 kb/s symmetric operation.
The Bluetooth system consists of a radio unit, a link control unit, and a support unit for link management and host terminal interface functions. The link controller implements the baseband protocols and other low-level link routines.
The Bluetooth system provides a point-to-point connection (involving only two Bluetooth units) or a point-to-multipoint connection, in which a channel is shared among several Bluetooth units. Two or more units sharing the same channel form a piconet. One Bluetooth unit acts as the master of the piconet, and the other units act as slaves. Up to seven slaves can be active in a piconet.
The Bluetooth link controller has two main states: STANDBY and CONNECTION. In addition, there are seven sub-states: page, page scan, inquiry, inquiry scan, master response, slave response, and inquiry response. These sub-states are temporary states used to add new slave units to a piconet.
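The states above can be pictured as a toy state machine. The sketch below is illustrative only: it models one simplified transition (a successful page moving a unit from STANDBY to CONNECTION) and omits the real baseband timing, hopping, and response handshakes:

```python
# Toy sketch of the Bluetooth link controller's main states and sub-states.

MAIN_STATES = {"STANDBY", "CONNECTION"}
SUB_STATES = {"page", "page_scan", "inquiry", "inquiry_scan",
              "master_response", "slave_response", "inquiry_response"}

class LinkController:
    def __init__(self):
        self.state = "STANDBY"   # both units start in STANDBY

    def page(self, slave_answers=True):
        """Master pages a slave; on success both units enter CONNECTION.

        The intermediate sub-states (page -> slave_response ->
        master_response) are transient and not modeled here.
        """
        if self.state != "STANDBY":
            raise RuntimeError("can only page from STANDBY")
        if slave_answers:
            self.state = "CONNECTION"
        return self.state
```

Inquiry works analogously through the inquiry sub-states, letting a master discover nearby units before paging them into the piconet.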
Links can also be implemented using, but not limited to, the HomeRF or IEEE 802.11 wireless LAN specifications. For more information on the IEEE 802.11 wireless LAN specification, see the IEEE Standard for Wireless LAN, which is incorporated herein by reference. IEEE standards can be found at the Uniform Resource Locator (URL) www.ieee.org. For example, hardware supporting the IEEE 802.11b standard provides a communication link between two PCs at 2 Mbps and 11 Mbps. The frequency band allocated for signal transmission and reception is approximately 2.4 GHz. In comparison, the IEEE 802.11a standard provides 54 Mbps communication, with a frequency allocation of approximately 5 GHz. Recently, vendors (such as Proxim) have manufactured PC cards and access points (base stations) that use a proprietary data-doubling chipset technology to achieve 108 Mbps communication; one such data-doubling chip is the AR5000 manufactured by Atheros Communications. As with any radio system, the actual data rate maintained between two computers depends on the physical distance between the transmitter and receiver.
The wireless link 2610 may also take other forms, such as, for example, an infrared communication link as defined by the Infrared Data Association (IrDA). Depending on the desired communication type (i.e., Bluetooth, infrared, etc.), the host computer 2604 and the remote display and/or recording device 2606 each have a corresponding communication port.
FIG. 26B shows the communication link 2608 between the probe 2602 and the host computer 2604 as a wireless link. The communication link 2610 between the host computer 2604 and the PDA 2606 is shown as a wired link.
The integrated probe system 2600 of FIG. 26C has wireless links for both the communication link 2608 between the probe 2602 and the host computer 2604 and the communication link 2610 between the host computer 2604 and the PDA 2606. It should be appreciated that wired and wireless links may be used together in the system 2600, or alternatively the links may be purely wired or purely wireless.
In the integrated probe system 2600 of FIG. 27, the remote display and/or recording device 2606 is a remote computing system 2612. In addition to providing remote display and/or recording capabilities, the remote computing system 2612 can remotely control the probe 2602. The communication link 2610 is shown as a wireless link, and the communication link 2608 between the probe 2602 and the host computer 2604 is shown as a wired link.
An example of a remote control system includes a wearable computer (such as a personal computer manufactured by Xybernaut), a pair of high-speed wireless PC cards (such as those provided by Proxim), the ultrasound software, and the probe 2602. Such a portable, network-connected ultrasound system can be configured to weigh less than 2.5 pounds. Using a program similar to Microsoft® NetMeeting, an instant connection can be established between a remote PC and the portable computer. The remote host can monitor all interactions with the portable computer, including real-time ultrasound imaging (at a display rate of up to about 4 frames per second). NetMeeting can also be used to "control" the portable computer and manage an ultrasound session from the remote PC in real time. In addition, images and executable software instructions archived to a hard disk on the portable computer can be transferred to the host computer at 108 Mbps. With this technology, real-time ultrasound diagnosis can be performed at a rate comparable to a hard-wired 100 megabits per second (100 Mbps) local area network (LAN) and relayed to a remote viewer.
FIG. 28 illustrates an integrated probe system 2800 having a hub 2802 for connecting a plurality of remote devices 2606 to a host computer 2604. The communication links 2804 from the hub 2802 to the remote devices are shown as both wireless and wired links. It should be recognized that a fully wired network (such as a LAN or Ethernet) can be used. Alternatively, a wireless transceiver and port in each of the computers (remote devices) 2606 makes it easy to build a wireless network/communication system. With the recent emergence of high-speed wireless standards (such as IEEE 802.11a), the communication between the remote and local machines is comparable to wired, 100 Mbps local area network (LAN) communication. Another alternative is to use a Bluetooth system to form a piconet.
The increased use of combined audio-visual and computer data has led to a greater need for multimedia networking capability, and solutions incorporated in the preferred embodiments of the present invention have begun to appear. Standardization of multimedia networking is underway, and IEEE 1394 has emerged as an important contender, capable of interfacing with many audio-visual (AV) computers and other digital consumer electronics devices and providing transmission bandwidths of up to 400 Mbps.
The preferred embodiment uses IEEE 1394 technology with a wireless solution for transmitting the 1394 protocol over IEEE 802.11, an emerging standard for wireless data transmission in corporate environments and, increasingly, in the home. In a preferred embodiment, the IEEE 1394 system is implemented as a protocol adaptation layer (PAL) on top of 802.11 radio hardware and Ethernet protocols, bringing these important technologies together. This protocol interface layer enables a PC to operate as a wireless 1394 device. The engineering goal is to make the actual IEEE 1394 bandwidth available so that a single high-definition MPEG2 video stream (or multiple standard-definition MPEG2 video streams) can be transmitted from one room in a facility to another.
A preferred embodiment of the present invention includes the use of Wi-LAN's wideband orthogonal frequency division multiplexing (W-OFDM) technology for wireless transmission of IEEE 1394 at 2.4 GHz. This development establishes W-OFDM, a highly bandwidth-efficient wireless transmission technology, as one of the technologies that can provide the data rates required for home multimedia networking.
The wireless IEEE 1394 system includes an MPEG-2 data stream generator that feeds a multi-program transport stream to, for example, a set-top box (STB) provided by Philips Semiconductors. The STB converts this signal into an IEEE 1394 data stream and applies it to a W-OFDM radio system, such as that provided by Wi-LAN™. The radio transmitter then sends the IEEE 1394 data stream over the air to a corresponding W-OFDM receiver in, for example, the host computer. On the receiving side, the IEEE 1394 signal is demodulated and sent to two STBs, which display the contents of different MPEG-2 data streams on two separate TV monitors. Using IEEE 1394 as the interface for the wired part of the network optimizes the entire system for transmitting isochronous information (voice, live video) and provides an ideal interface to multimedia devices in the facility. W-OFDM technology is essentially unaffected by multipath effects. Like all modulation schemes, OFDM encodes data in a radio frequency (RF) signal. Radio communications are often hindered by noise, stray signals, and reflected signals. By simultaneously transmitting high-speed signals on different frequencies, OFDM technology provides robust communication. OFDM-capable systems are highly tolerant of noise and multipath, making multi-point coverage in wide areas and homes possible. In addition, because these systems use bandwidth very efficiently, more high-speed channels are possible within one frequency band. W-OFDM is a cost-effective variant of OFDM that allows much greater throughput than conventional OFDM by using a wide band. W-OFDM further processes the signal to maximize range. These improvements to conventional OFDM have led to a sharp increase in transmission rates.
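The core OFDM idea above (many orthogonal subcarriers carrying data in parallel, with a guard interval against multipath echoes) can be sketched with an inverse FFT. This is a conceptual sketch only; W-OFDM's wideband processing, 802.11 framing, and RF up-conversion are all omitted, and the parameter values are illustrative:

```python
import numpy as np

# Conceptual OFDM baseband sketch: data symbols are placed on orthogonal
# subcarriers via an inverse FFT, and a cyclic prefix serves as the guard
# interval that makes the scheme tolerant of multipath reflections.

def ofdm_modulate(symbols, cp_len=16):
    """symbols: one complex data symbol per subcarrier."""
    time_signal = np.fft.ifft(symbols)
    cyclic_prefix = time_signal[-cp_len:]    # copy of the tail as guard
    return np.concatenate([cyclic_prefix, time_signal])

def ofdm_demodulate(signal, n_subcarriers, cp_len=16):
    """Strip the cyclic prefix and recover the subcarrier symbols."""
    body = signal[cp_len:cp_len + n_subcarriers]
    return np.fft.fft(body)
```

On a clean channel, demodulation recovers the transmitted symbols exactly; in a multipath channel, echoes shorter than the cyclic prefix fall inside the guard interval instead of corrupting the FFT window.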
OFDM technology is becoming more and more visible because US and European standardization bodies are choosing it as the only technology that can provide reliable wireless high-data-rate connections. European terrestrial digital video broadcasting uses OFDM, and the IEEE 802.11 working group recently selected OFDM for its 6 Mbps to 54 Mbps wireless LAN standards. The European Telecommunications Standards Institute is considering W-OFDM for the ETSI BRAN standard. More information about Wi-LAN™ can be found at the website http://www.wi-lan.com/. Philips Semiconductors, based in Eindhoven, Netherlands, is a division of Royal Philips Electronics. Additional information about Philips Semiconductors can be obtained by accessing its homepage at http://www.semiconductors.philips.com/.
In addition, the preferred embodiment may also use NEC's wireless transmission technology for the IEEE 1394 high-speed serial bus, which is capable of reaching 400 megabits per second (Mbps) at a transmission range of up to 7 meters through an interior wall and up to 12 meters within line of sight. With the development of an ASK modulation scheme and a low-cost transceiver, this embodiment uses 60 GHz millimeter-wave transmission, which does not require any kind of license. This embodiment incorporates the echo detection function of NEC's PD72880 400 Mbps long-distance transmission physical layer device to prevent the effects of signal reflection, a significant obstacle to the stable operation of IEEE 1394 over a wireless connection.
Wireless IEEE 1394 can play an important role in bridging a PC to a cluster of interconnected IEEE 1394 devices, which can be in another room in the facility. Three exemplary applications are sourcing video or audio streams from a PC, providing Internet content and connectivity to an IEEE 1394 cluster, and providing command, control, and configuration capabilities to the cluster. In the first application, the PC can provide information to someone in another room in a facility. In the second, the PC can provide a channel through which 1394-enabled devices access the Internet. In the third, the PC orchestrates activities in the 1394 cluster, routing data within the cluster and via the bridge (although the actual data does not flow through the PC).
FIG. 29 is a diagram showing the deployment of wireless access to images generated by an ultrasound imaging system, and the associated architecture 2902, according to a preferred embodiment. The imaging system 2906 exports patient information and images to files in a corresponding folder 2908. The executable software instructions provide all the functionality needed to implement the ultrasound imaging methods described above.
The wireless agent 2910 detects the patient directory and image files and opens a port through which a wireless client can obtain a connection. After a connection 2914 is established, the agent sends the patient list and corresponding images back to the client. For example, the wireless agent 2910 may include a data interface circuit, and the data interface circuit may include a first port (such as an RF interface port).
A wireless viewer 2912 residing on the handheld device can establish a connection to the wireless agent 2910 and retrieve patient and image information. After the user selects a patient and an image, the viewer initiates the file transfer from the wireless agent. After receiving an image, the viewer 2912 displays the image along with the patient information. The image is stored on the handheld device for future use. Users of the handheld device can view images captured in a previous session or can request a new image transmission.
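The agent/viewer exchange described above can be sketched as a simple request/response protocol. The message verbs (LIST, GET) and the in-memory transport below are assumptions for illustration, not the actual wireless agent's interface; a real deployment would run this over the wireless LAN socket the agent opens:

```python
# Hypothetical sketch of the wireless agent/viewer exchange.

class WirelessAgent:
    """Watches the patient/image folders and answers viewer requests."""
    def __init__(self, patient_images):
        self.patient_images = patient_images    # {patient: {image: bytes}}

    def handle(self, request):
        verb, *args = request.split()
        if verb == "LIST":                      # list available patients
            return sorted(self.patient_images)
        if verb == "GET":                       # GET <patient> <image>
            patient, image = args
            return self.patient_images[patient][image]
        raise ValueError(f"unknown request: {verb}")

class WirelessViewer:
    """Handheld-side client that caches received images for later review."""
    def __init__(self, agent):
        self.agent = agent
        self.cache = {}

    def list_patients(self):
        return self.agent.handle("LIST")

    def fetch(self, patient, image):
        data = self.agent.handle(f"GET {patient} {image}")
        self.cache[(patient, image)] = data     # kept for a later session
        return data
```

The cache mirrors the described behavior of storing received images on the handheld device so that a user can review a previous session without a new transfer.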
FIG. 33 is a block diagram of a portable information device (such as a personal digital assistant (PDA) or any computing device) according to an exemplary embodiment of the present invention. The link interface or data interface circuit 3310 is illustrated as (but is not limited to) a link interface for establishing a wireless link to another device. The wireless link is preferably an RF link defined by the IEEE 1394 communication specification; however, the wireless link may take other forms, such as an infrared communication link as defined by the Infrared Data Association (IrDA). The PDA includes a processor 3360 capable of executing an RF stack 3350. The RF stack 3350 communicates with the data interface circuit 3310 through a bus 3308. The processor 3360 is also connected to the user interface circuit 3370, the data storage 3306, and the memory 3304 through the bus 3308.
The data interface circuit 3310 includes a port (such as an RF interface port). The RF link interface may include a first connection 3312, which includes a radio frequency (RF) circuit 3314 for converting a signal into a radio frequency output and for receiving a radio frequency input. The RF circuit 3314 can send and receive RF data communications through a transceiver built into the communication port 1026. An RF communication signal received by the RF circuit 3314 is converted into an electrical signal and relayed to the RF stack 3350 in the processor 3360 via the bus 3308. The radio interfaces 3314, 3316 and the links between the laptop personal computer (PC) (host computer) and the PDA can be implemented by, but are not limited to, the IEEE 1394 specification.
Similarly, the PC host computer has an RF stack and circuitry capable of communicating with remotely located image viewers. In a preferred embodiment, the remote image viewer can be used to monitor and/or control ultrasound imaging operations, rather than merely displaying the resulting imaging data.
The current market offers many different options for wireless connectivity. In a preferred embodiment, a spread-spectrum wireless LAN is used. The most advanced wireless LAN solution is the 802.11b standard, and many manufacturers offer 802.11b-compliant devices. Compatibility with the selected handheld device is a major criterion when choosing among the wireless connectivity options.
The handheld device market likewise offers a variety of devices. For imaging purposes, it is important to have a high-quality screen and sufficient processing power to run the image display software. Considering these factors, in a preferred embodiment a Compaq iPAQ is used, and in particular a Compaq iPAQ 3870, together with a wireless PC card compatible with the handheld device (such as Compaq's Wireless PC Card WL110 and the corresponding wireless access point).
FIG. 30 illustrates an image viewer 3020 in communication with a personal computer in a preferred embodiment, or with a probe in an alternative embodiment. The image viewer has user interface buttons 3022, 3024, 3026, and 3028 that allow a user to interface with an ultrasound imaging system computer or probe head according to a preferred embodiment of the present invention. In a preferred embodiment, a communication interface (such as button 3022) allows the user to initiate a connection with the ultrasound imaging application. Similarly, button 3024 is used to terminate a connection with the ultrasound imaging application. Button 3026 serves as a selection button providing a selectable list of patients and corresponding images. These images are stored locally or remotely; if selected, remotely stored images are transferred to the viewer. The selected image is displayed on the viewer 3030.
An additional communication interface button (such as button 3028) acts as an options button, which may, for example, allow changes to configuration parameters (such as an Internet Protocol (IP) address).
FIG. 31 is a diagram of a preferred embodiment ultrasonic image collection and distribution system 3140 that includes four main software components. The main hardware components of the system are the ultrasonic probes 3142a to 3142n. The probes, in communication with laptops 3144a to 3144n, generate ultrasound images and related patient information and deliver the images and information to an image/patient information distribution server 3146. The distribution server utilizes a SQL database server 3148 to store and retrieve images and related patient information. The SQL server provides decentralized database management: multiple workstations can manipulate data stored on the server, while the server coordinates operations and performs resource-intensive calculations.
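The server's association of images with patient records can be sketched with a minimal relational schema. The table and column names below are illustrative assumptions (the patent states only that a SQL server stores images and related patient information), and an in-memory SQLite database stands in for the SQL database server 3148.

```python
import sqlite3

def create_distribution_db():
    """Minimal illustrative schema for the image/patient distribution server."""
    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE patient (
            patient_id INTEGER PRIMARY KEY,
            name       TEXT NOT NULL
        );
        CREATE TABLE image (
            image_id   INTEGER PRIMARY KEY,
            patient_id INTEGER NOT NULL REFERENCES patient(patient_id),
            modality   TEXT,    -- e.g. '2D', 'color Doppler'
            pixels     BLOB     -- scan-converted frame data
        );
    """)
    return db

def store_image(db, patient_name, modality, pixels):
    """Insert an image under its patient, creating the patient row if new."""
    row = db.execute("SELECT patient_id FROM patient WHERE name = ?",
                     (patient_name,)).fetchone()
    if row is None:
        cur = db.execute("INSERT INTO patient (name) VALUES (?)",
                         (patient_name,))
        patient_id = cur.lastrowid
    else:
        patient_id = row[0]
    db.execute("INSERT INTO image (patient_id, modality, pixels) "
               "VALUES (?, ?, ?)", (patient_id, modality, pixels))

def images_for(db, patient_name):
    """Workstation-side query: modalities of all images for one patient."""
    return [m for (m,) in db.execute(
        "SELECT modality FROM image JOIN patient USING (patient_id) "
        "WHERE name = ? ORDER BY image_id", (patient_name,))]
```

A workstation would call `images_for()` to retrieve a patient's stored studies; the server itself performs the join, matching the coordination role described above.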
Image viewing software or executable instructions may be implemented in two different embodiments. In a first embodiment, a full-featured version of the image viewer described in FIG. 30 may reside on a workstation or laptop computer equipped with a high-bandwidth internet connection. In a second embodiment, a lightweight version of the image viewer may reside on a small handheld personal computer (PocketPC) 3020 equipped with an IEEE 802.11b- and/or IEEE 802.11a-compliant network card. The handheld PC image viewer implements only the limited functionality needed for basic image viewing operations. A wireless network protocol 3150 (such as IEEE 802.11) can be used to transmit information to a handheld device or other computing device 3152 in communication with a hospital network.
This preferred embodiment describes an ultrasound imaging system that covers a wide range of image collection and acquisition needs in hospitals, and it also provides instant access to non-imaging patient-related information. To support information exchange between hospitals, the image distribution server has the ability to maintain mutual connectivity across a wide area network.
In another preferred embodiment, the probe 3262 can communicate directly with a remote computing device (such as a PDA 3264) over a wireless communication link 3266, as shown in the system 3260 of FIG. 32. This communication link can use the IEEE 1394 protocol. Both the probe and the PDA 3302 have the RF stack and circuitry described with reference to FIG. 33 for communicating using wireless protocols. The probe includes a transducer array, beamforming circuitry, a transmit/receive module, a system controller, and a digital communication control circuit. Ultrasound image data is provided for processing (including scan conversion) in the PDA.
FIG. 34 shows a schematic diagram 3440 of an imaging and telemedicine system of an integrated ultrasound system 3442 according to a preferred embodiment of the present invention. A preferred embodiment of the system outputs real-time RF digital data or front-end data.
As will be understood, the various wireless connections described herein (such as the connections 3444 to the ultrasound clinic 3442 (e.g., for ultrasound capture), the community clinic 3446 (e.g., for general telemedicine capture), the radiology clinic 3448 (e.g., for digital film capture), and the cardiology clinic 3450 (e.g., for echocardiographic image capture), as well as the connections shown in FIGS. 26A to 29 and FIGS. 31 to 32) include 3G, 4G, GSM, CDMA, CDMA2000, W-CDMA, or any other suitable wireless connection using any number of communication protocols. In some cases, the ultrasound imaging device may be configured to initiate a wireless connection with a remote computer or other electronic device in response to a command entered by the user via the touch-sensitive user interface (UI) of the ultrasound imaging screen. For example, during, before, or after an ultrasound procedure, a user may initiate a wireless connection with a hospital or doctor to transmit ultrasound imaging data. The data can be transmitted in real time as it is generated, or stored and transmitted later. An audio and/or video connection can also be initiated over the same wireless network so that the user of the ultrasound imaging device can consult a hospital or doctor while performing the procedure. These options can be navigated and selected through the touch screen UI provided on the ultrasound imaging screen. In some embodiments, all or part of the touch screen UI may be delivered to the user over the wireless network using, for example, JavaScript or some other browser-based technology. Parts of the UI can be executed on the device or remotely, and various device-side and/or server-side technologies can be implemented to provide the UI features described herein.
FIG. 35 illustrates a 2D imaging operation mode using a modular ultrasonic imaging system according to an embodiment of the present invention. The touch screen of the tablet 2504 can display an image obtained by a two-dimensional sensor head using 256 digital beamformer channels. The 2D image window 3502 depicts a 2D image scan 3504. A two-dimensional image can be obtained using the flexible frequency control 3506, with the control parameters presented on the tablet computer.
FIG. 36 illustrates a motion operation mode using a modular ultrasonic imaging system according to an embodiment of the present invention. The touch screen display 3600 of the tablet computer can display images obtained in the motion operation mode, and can simultaneously display two-dimensional imaging 3606 and motion imaging 3608. The touch screen display 3600 can display a two-dimensional image window 3604 having a two-dimensional image 3606. The flexible frequency control 3506, displayed using the graphical user interface, can be used to adjust the frequency from 2 MHz to 12 MHz.
FIG. 37 illustrates a color Doppler operation mode using a modular ultrasound imaging system according to an embodiment of the present invention. The tablet's touchscreen display 3700 displays images obtained in the color Doppler operating mode. A 2D image window 3706 is used as the base display, and the color-coded information 3708 is superimposed on the two-dimensional image 3710. Ultrasound-based imaging of red blood cells is derived from the received echo of the transmitted signal. The main characteristics of the echo signal are frequency and amplitude. The amplitude depends on the amount of moving blood in the volume sampled by the ultrasound beam. The display can be used to select a high frame rate or high resolution, controlling the quality of the scan. Higher frequencies, produced by fast blood flow, can be displayed in lighter colors, while lower frequencies are displayed in darker colors. Flexible frequency controls 3704 and color Doppler scan information 3702 can be displayed on the tablet computer display 3700.
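The color coding described above can be sketched as a per-pixel mapping from the mean Doppler shift to an overlay color, following the conventional red-toward/blue-away scheme with lighter shades for faster flow. The 0-255 channel scaling and the 5 kHz full-scale shift are illustrative assumptions, not values from the patent.

```python
def doppler_color(shift_hz, max_shift_hz=5000.0):
    """Map a mean Doppler shift (Hz) to an RGB overlay pixel.

    Positive shifts (flow toward the transducer) map to red, negative
    shifts (flow away) to blue; larger |shift| gives a lighter shade,
    matching the description that faster flow is rendered lighter.
    """
    mag = min(abs(shift_hz) / max_shift_hz, 1.0)  # normalized flow speed
    level = int(round(255 * mag))                 # lighter shade = faster flow
    if shift_hz > 0:
        return (level, 0, 0)   # red: flow toward the transducer
    if shift_hz < 0:
        return (0, 0, level)   # blue: flow away from the transducer
    return (0, 0, 0)           # no detected motion: leave the 2D pixel as-is
```

In a full implementation this mapping would be applied only inside the color box and blended over the grayscale 2D image 3710.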
FIG. 38 illustrates a pulse wave Doppler operation mode using a modular ultrasound imaging system according to an embodiment of the present invention. The touch screen display 3800 of the tablet computer can display images obtained in the pulse wave Doppler operation mode. A pulse wave Doppler scan generates a series of pulses used to analyze blood flow in a small region along a desired ultrasound cursor, referred to as the sample volume or sample gate 3812. The tablet computer display 3800 can depict a two-dimensional image 3802 with the sample volume/sample gate 3812 overlaid, and may use a mixed operating mode 3806 to depict the two-dimensional image 3802 together with the time/Doppler shift 3810. If an appropriate angle between the beam and the blood flow is known, the time/Doppler frequency shift 3810 can be converted into velocity and blood flow. The gray shading 3808 in the time/Doppler shift 3810 may indicate the strength of the signal, and the thickness of the spectral signal can indicate laminar or turbulent blood flow. The tablet computer display 3800 can depict adjustable frequency controls 3804.
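The shift-to-velocity conversion mentioned above is conventionally done with the pulsed-wave Doppler equation, v = (Δf · c) / (2 · f0 · cos θ), where θ is the angle between the beam and the flow. The patent states only that the shift "can be converted into velocity" when the angle is known; the formula, the 1540 m/s soft-tissue sound speed, and the function below are the standard textbook treatment, shown here as an illustrative sketch.

```python
import math

SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, the standard soft-tissue assumption

def doppler_velocity(shift_hz, tx_freq_hz, angle_deg):
    """Convert a measured Doppler shift to blood-flow velocity in m/s.

    shift_hz:   measured Doppler frequency shift (Hz)
    tx_freq_hz: transmitted ultrasound frequency f0 (Hz)
    angle_deg:  angle between the beam and the flow direction (degrees)
    """
    theta = math.radians(angle_deg)
    if abs(math.cos(theta)) < 1e-6:
        # Beam perpendicular to flow: no axial component, velocity undefined.
        raise ValueError("beam perpendicular to flow: velocity undefined")
    return shift_hz * SPEED_OF_SOUND_TISSUE / (2.0 * tx_freq_hz * math.cos(theta))
```

For example, a 1.3 kHz shift at a 2 MHz transmit frequency and a 60 degree beam-to-flow angle corresponds to roughly 1.0 m/s, a plausible arterial velocity.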
FIG. 39 illustrates a triplex scan operation mode using a modular ultrasonic imaging system according to an embodiment of the present invention. The tablet computer display 3900 may include a two-dimensional window 3902 capable of displaying two-dimensional images alone or in combination with color Doppler or directional Doppler features. The touch screen display 3900 can display images obtained in the color Doppler operation mode. A 2D image window 3902 is used as a base display, with the color-coded information 3904 forming an overlay 3906 on the two-dimensional image 3916. Pulse wave Doppler features can be used alone or in combination with two-dimensional imaging or color Doppler imaging. The tablet computer display 3900 may include a pulse wave Doppler scan represented by a sample volume/sample gate 3908 overlaid on the two-dimensional image 3916 or the color-coded overlay 3906 (alone or in combination). The tablet display 3900 can depict a split screen representing the time/Doppler shift 3912. If an appropriate angle between the beam and the blood flow is known, the time/Doppler frequency shift 3912 can be converted into velocity and blood flow. The gray shading 3914 in the time/Doppler frequency shift 3912 may indicate the strength of the signal, and the thickness of the spectral signal can indicate laminar or turbulent blood flow. The tablet display 3900 can also depict a flexible frequency control 3910.
FIG. 40 illustrates a GUI home screen interface 4000 for a user operation mode using a modular ultrasound imaging system according to an embodiment of the present invention. The screen interface 4000 can be displayed when the ultrasound system is activated. To assist a user in navigating the GUI home screen 4000, the home screen can be regarded as including three exemplary working areas: a menu bar 4004, an image display window 4002, and an image control bar 4006. Additional GUI components may be provided on the GUI home screen 4000 to enable a user to close, resize, and exit the GUI home screen and/or windows within the GUI home screen.
The menu bar 4004 enables the user to select ultrasound data, images, and / or videos for display in the image display window 4002. The menu bar may include components for selecting one or more files in a patient folder directory and an image folder directory.
The image control bar 4006 includes touch controls that a user can operate by applying touch and touch gestures directly to the surface of the display. Exemplary touch controls may include, but are not limited to: a depth control touch control 4008, a two-dimensional gain touch control 4010, a full-screen touch control 4012, a text touch control 4014, a split-screen touch control 4016, an ENV touch control 4018, a CD touch control 4020, a PWD touch control 4022, a freeze touch control 4024, a store touch control 4026, and an optimize touch control 4028.
FIG. 41 illustrates a GUI function screen interface 4100 for a user operation mode using a modular ultrasound imaging system according to an embodiment of the present invention. When the ultrasound system is activated and a menu selection mode is triggered from the menu bar 4104, a screen interface 4100 for a user operation mode may be displayed. To assist a user in navigating the GUI main screen 4100, the main screen can be regarded as including three exemplary working areas: a menu bar 4104, an image display window 4102, and an image control bar 4120. Additional GUI components may be provided on the main GUI menu screen 4100 to, for example, enable a user to close, resize, and exit the GUI menu screen and/or windows within the GUI menu screen.
The menu bar 4104 enables the user to select ultrasound data, images, and/or videos for display in the image display window 4102. The menu bar 4104 may include touch controls for selecting one or more files in a patient folder directory and an image folder directory. The menu bar depicted in an extended format 4106 may include exemplary touch controls, such as a patient touch control 4108, a preset touch control 4110, a view touch control 4112, a report touch control 4114, and a setup touch control 4116.
The image control bar 4120 includes touch controls that a user can operate by applying touch and touch gestures directly to the surface of the display. Exemplary touch controls may include, but are not limited to: a depth control touch control 4122, a two-dimensional gain touch control 4124, a full-screen touch control 4126, a text touch control 4128, a split-screen touch control 4130, an ENV touch control 4132, a CD touch control 4134, a PWD touch control 4136, a freeze touch control 4138, a store touch control 4140, and an optimize touch control 4142.
FIG. 42 illustrates a GUI patient data screen interface 4200 for a user operation mode using a modular ultrasound imaging system according to an embodiment of the present invention. When the ultrasound system is activated and a patient selection mode is triggered from the menu bar 4202, a screen interface 4200 for a user operation mode may be displayed. To assist a user in navigating the GUI patient data screen 4200, the patient data screen can be considered to include five exemplary work areas: a new patient touch screen control 4204, a new study touch screen control 4206, a study list touch screen control 4208, a worklist touch screen control 4210, and an edit touch screen control 4212. Within each touch screen control, further information input fields 4214, 4216 are available. For example, the patient information section 4214 and the study information section 4216 may be used to record data.
In the patient data screen 4200, the image control bar 4218 includes touch controls that the user can operate by applying touch and touch gestures directly to the surface of the display. Exemplary touch controls may include, but are not limited to: an accept study touch control 4220, a close study touch control 4222, a print touch control 4224, a print preview touch control 4226, a cancel touch control 4228, a two-dimensional touch control 4230, a freeze touch control 4232, and a store touch control 4234.
FIG. 43 illustrates a GUI patient data screen interface 4300 (such as a preset parameter screen interface) for a user operation mode using a modular ultrasound imaging system according to an embodiment of the present invention. When the ultrasound system is activated and the preset selection mode 4304 is triggered from the menu bar 4302, a screen interface 4300 for a user operation mode may be displayed.
In the preset screen 4300, the image control bar 4308 includes touch controls that the user can operate by applying touch and touch gestures directly to the surface of the display. Exemplary touch controls may include, but are not limited to: a save settings touch control 4310, a delete touch control 4312, a CD touch control 4314, a PWD touch control 4316, a freeze touch control 4318, a store touch control 4320, and an optimize touch control 4322.
FIG. 44 illustrates a GUI viewing screen interface 4400 for a user operation mode using a modular ultrasound imaging system according to an embodiment of the present invention. When the ultrasound system is activated and the view extended selection 4404 is triggered from the menu bar 4402, a screen interface 4400 for a user operation mode may be displayed.
Within the viewing screen 4400, the image control bar 4416 includes touch controls that the user can operate by applying touch and touch gestures directly to the surface of the display. Exemplary touch controls may include, but are not limited to: a thumbnail touch control 4418, a sync touch control 4420, a select touch control 4422, a previous image touch control 4424, a next image touch control 4426, a two-dimensional image touch control 4428, a pause image touch control 4430, and a store image touch control 4432.
An image display window 4406 allows the user to view images in multiple formats. The image display window 4406 may allow a user to view the images 4408, 4410, 4412, 4414 in combination, as a subset, or individually. The image display window 4406 can be configured to display up to four images 4408, 4410, 4412, 4414 simultaneously.
FIG. 45 illustrates a GUI report screen interface for a user operation mode using a modular ultrasound imaging system according to an embodiment of the present invention. When the ultrasound system is activated and the report extended view 4504 is triggered from the menu bar 4502, a screen interface 4500 for a user operation mode may be displayed. Display 4506 contains ultrasound report information 4526. The user can use the worksheet selection in the ultrasound report 4526 to enter notes, patient information, and study information.
In the report screen 4500, the image control bar 4508 includes touch controls that the user can operate by applying touch and touch gestures directly to the surface of the display. Exemplary touch controls may include, but are not limited to: a save touch control 4510, a save as touch control 4512, a print touch control 4514, a print preview touch control 4516, an end study touch control 4518, a two-dimensional image touch control 4520, a freeze image touch control 4522, and a store image touch control 4524.
FIG. 46A illustrates a GUI setup screen interface for a user operation mode using a modular ultrasound imaging system according to an embodiment of the present invention. When the ultrasound system is activated and the setup extended view 4604 is triggered from the menu bar 4602, a screen interface 4600 for a user operation mode may be displayed.
In the setup expansion screen 4604, the setting control bar 4644 includes touch controls that the user can operate by applying touch and touch gestures directly to the surface of the display. Exemplary touch controls may include, but are not limited to: a general touch control 4606, a display touch control 4608, a measurement touch control 4610, an annotation touch control 4612, a print touch control 4614, a store/retrieve touch control 4616, a DICOM touch control 4618, an export touch control 4620, and a study information touch control 4622. These touch controls may include a display screen that allows the user to enter configuration information. For example, the general touch control 4606 includes a configuration screen 4624 in which a user can input configuration information. In addition, the general touch control 4606 contains an option 4626 that allows the user to configure the soft-key docking position. FIG. 46B depicts a soft key control 4652 with a right alignment. FIG. 46B further illustrates that activating the soft key control arrow 4650 changes the key alignment to the opposite side (in this case, left alignment). FIG. 46C depicts the left alignment of the soft key control 4662. The user can use the soft key control arrow 4660 to initiate a direction change back to the right alignment.
In the viewing screen 4600, the image control bar 4628 includes touch controls that a user can operate by applying touch and touch gestures directly to the surface of the display 4664. Exemplary touch controls may include, but are not limited to: a thumbnail touch control 4630, a sync touch control 4632, a select touch control 4634, a previous image touch control 4636, a next image touch control 4638, a two-dimensional image touch control 4640, and a pause image touch control 4642.
FIG. 47 illustrates a GUI setup screen interface for a user operation mode using a modular ultrasound imaging system according to an embodiment of the present invention. When the ultrasound system is activated and the setup extended view 4704 is triggered from the menu bar 4702, a screen interface 4700 for a user operation mode may be displayed.
In the setup expansion screen 4704, the setting control bar 4744 includes touch controls that the user can operate by applying touch and touch gestures directly to the surface of the display. Exemplary touch controls may include, but are not limited to, a plurality of icons, such as a general touch control 4706, a display touch control 4708, a measurement touch control 4710, an annotation touch control 4712, a print touch control 4714, a store/retrieve touch control 4716, a DICOM touch control 4718, an export touch control 4720, and a study information touch control 4722. These touch controls may include a display screen that allows the user to enter store/retrieve information. For example, the store/retrieve touch control 4716 includes a configuration screen 4702 in which a user can input configuration information. A touch activation brings up a virtual keyboard that allows the user to enter alphanumeric characters into the various fields. In addition, the store/retrieve touch control 4702 contains an option that allows the user to enable retrospective capture 4704. When the user enables the store function, the system by default saves a prospective cine loop. If the user enables retrospective capture, the store function collects the cine loop retrospectively.
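The difference between the two capture modes above can be sketched with a small frame buffer: retrospective capture saves frames already acquired when store is pressed, while the prospective default collects the frames that follow. The class name, buffer length, and frame representation are illustrative assumptions.

```python
from collections import deque

class CineCapture:
    """Illustrative sketch of prospective vs. retrospective cine-loop saving."""

    def __init__(self, loop_len=4, retrospective=False):
        self.loop_len = loop_len
        self.retrospective = retrospective
        self.history = deque(maxlen=loop_len)  # rolling buffer of live frames
        self.pending = None    # prospective frames being collected after store
        self.saved = None      # the last saved cine loop

    def new_frame(self, frame):
        """Called for every live frame produced by the scanner."""
        self.history.append(frame)
        if self.pending is not None:
            self.pending.append(frame)
            if len(self.pending) == self.loop_len:
                self.saved = list(self.pending)  # prospective loop complete
                self.pending = None

    def store(self):
        """User presses the store touch control."""
        if self.retrospective:
            # Loop is taken from frames already acquired before the press.
            self.saved = list(self.history)
        else:
            # Default: start collecting the frames that come next.
            self.pending = []
```

With `retrospective=True` the loop ends at the store press; with the default setting it begins there.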
In the setup screen 4700, the image control bar 4728 includes touch controls that the user can operate by applying touch and touch gestures directly to the surface of the display. Exemplary touch controls may include, but are not limited to: a thumbnail touch control 4730, a sync touch control 4732, a select touch control 4734, a previous image touch control 4736, a next image touch control 4738, a two-dimensional image touch control 4740, and a pause image touch control 4742.
A preferred embodiment of a miniaturized PC-enabled ultrasound imaging system runs on an industry-standard PC and the Windows® 2000 operating system (OS), making it network-ready and an ideal, cost-effective telemedicine solution. Open architecture support is provided for embedding and therefore for integration with third-party applications. The preferred embodiment includes an improved application programming interface (API), a common interface for third-party applications, providing export support for solutions such as, but not limited to, radiation therapy planning, image-guided surgery, integrated computing, 3D, and report packaging. The API provides a set of software interrupts, calls, and data formats that applications use to initiate communication with network services, host communication programs, telephone equipment, or program-to-program communication. Software-based feature enhancements reduce hardware obsolescence and provide effective upgrades.
In addition, the preferred embodiment includes a system-on-a-chip integrated circuit (IC) that runs on a PC and provides a large number of channels, a wide dynamic range, high image quality, a complete feature set, extensive diagnostic coverage, minimal supply-chain requirements, a simplified design for easy testing and high reliability, and very low maintenance costs.
As previously described herein, the preferred embodiment includes a PC-based design that is intuitive, has a simple graphical user interface, is easy to use and to train on, and takes advantage of PC industry know-how, sound electronics, high-quality displays, and low manufacturing costs. It also provides support for software-controlled communication with other embedded applications that manage patient data, scanner images, and current procedural terminology (CPT) codes. CPT is a digital coding system that physicians use to record all procedures and services, the physician's plan, and outcome evaluation reports on an integrated PC. Healthcare-reform pressure to reduce costs has highlighted the need for first-visit/in-field diagnostics and for data storage and retrieval solutions. Technological innovations (such as the Digital Imaging and Communications in Medicine (DICOM) standard for data storage and retrieval, broadband, and picture archiving and communication systems (PACS)) are driving changes in patient record storage, retrieval, and transmission, and these innovations, together with lower-cost ultrasound data acquisition and handheld devices, all enable the preferred embodiments of the present invention. The DICOM standard facilitates the dissemination and viewing of medical images such as ultrasound, magnetic resonance imaging (MRI), and computed tomography (CT) scans. Broadband is a wide-area-network term referring to transmission facilities that provide bandwidth greater than 45 Mbps. Broadband systems are generally fiber-based.
A preferred embodiment of the present invention provides image acquisition and end-user applications (e.g., radiation therapy, surgery, angiography), all executed on the same platform. This provides low-cost, user-friendly control through a common software interface. The ultrasound system has a scalable user interface for advanced users and an intuitive Windows®-based PC interface. A preferred embodiment of the ultrasound system also provides improved diagnostic capability owing to its one-stop capture, analysis, storage, retrieval, and transmission of data and images. High image quality is provided with 128-channel bandwidth. In addition to being easy to use, the ultrasound system provides access to the patient at any time, anywhere, and with any tool. A 10-ounce probe according to one of the preferred embodiments of the present invention is used to provide point-of-care imaging. Data storage and retrieval capabilities are based on DICOM standards and are compatible with existing third-party analysis and patient record systems. The ultrasound system according to a preferred embodiment also provides direct image transmission capabilities using, for example, but not limited to, e-mail, LAN/WAN, DICOM, and a digital imaging network picture archiving and communication system (DIN-PACS). Options for displaying captured images include, but are not limited to: a desktop computer, a laptop computer, a portable personal computer, and a handheld device (such as a personal digital assistant).
As described above, the ultrasound system of the present invention is used in minimally invasive and robotic surgery methods, including, but not limited to: biopsy procedures, catheterization for diagnostic and therapeutic angiography, fetal imaging, cardiac imaging, vascular imaging, imaging during endoscopic procedures, imaging for telemedicine applications, imaging for veterinary applications, radiation therapy, and cryotherapy. These embodiments use a computer-based tracking system together with CT and MR images to pinpoint the precise location of the target area. An alternative preferred embodiment of the ultrasound system can provide imaging at lower cost, with a smaller-footprint device, immediately before, during, and immediately after the procedure. The preferred embodiment eliminates the need to move a separate ultrasound device into the procedure room, as well as the methods and devices otherwise required to move the image from the ultrasound system to a tracking position and to register the target against previously captured CT and MR images. A preferred embodiment of the ultrasound system provides a fully integrated solution because it can run its ultrasound application on the same platform as any third-party application that processes the image. The system includes a streaming video interface (an interface between a third-party application and the system's ultrasound application). A key aspect of this system is that the two applications run on the same computer platform using the same operating system (OS), for example a Windows®-based platform (other platforms, such as Linux, can also be used), thereby providing seamless integration of the two applications. The details of moving the image from the system's ultrasound application to another software interface are described below.
The preferred embodiment includes control and data transfer methods that allow a third-party Windows®-based application to control, for example, a portable Windows®-based ultrasound system by running the ultrasound application as a background task, sending control commands to the ultrasound application, and receiving images (data) in exchange. In addition, the embodiment configures the portable Windows®-based ultrasound application as a server that supplies ultrasound image frames to another Windows®-based application, which acts as a client. The client application receives these ultrasound image frames and processes them further. In an alternative embodiment, the portable Windows®-based ultrasound application is activated and controlled by a third party (hereinafter interchangeably referred to as external, or a client) through two communication mechanisms: a component object model (COM) automation interface for Windows®-based applications and a high-speed shared memory interface for delivering live ultrasound images. Through these, the portable ultrasound server interacts with a third-party client application.
A preferred embodiment includes a shared memory interface configured as a streaming video interface between the portable Windows®-based ultrasound application and another, third-party Windows®-based application. This streaming video interface is designed to provide ultrasound images to a third-party client in real time.
A preferred embodiment allows third-party Windows®-based applications to control the flow rate of images from the portable Windows®-based ultrasound application through a shared memory interface within the same PC platform, as well as the amount of memory required to implement this interface. These controls consist of setting the number of image buffers, the size of each buffer, and the rate of image transmission. This flow rate control can be set for zero data loss, ensuring that every frame is delivered from the ultrasound system to the third-party Windows®-based application, or it can be set for lowest latency, so that the ultrasound system delivers only the most recently generated frames to the third-party Windows®-based application.
A preferred embodiment formats the ultrasound image frame so that when a third-party Windows®-based application acquires images (generated by the portable Windows®-based ultrasound application) from the shared memory interface, the third-party Windows®-based application can interpret header, spatial, and temporal information. The actual image data passed between the server (that is, the portable ultrasound application) and the client application (a third-party Windows®-based application) is a device-independent bitmap (DIB) with 8-bit pixels and a 256-entry color table. The image frame also contains a header that provides additional information such as (but not limited to): probe type, probe serial number, frame serial number, frame rate, frame time stamp, frame trigger time stamp, image width (in pixels), image height (in pixels), pixel size (in X and Y), pixel origin (the x, y position of the first pixel in the image relative to the sensor head), and direction (the spatial direction along or across the lines of the image).
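The frame layout described above can be sketched as a C++ structure. This is an illustrative model only: the patent lists the header fields but not their binary layout, so the names, types, and ordering below are assumptions.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Hypothetical header layout for one ultrasound image frame; field names and
// types are assumptions, only the field list itself comes from the text.
struct UsFrameHeader {
    uint16_t probeType;        // probe type code
    uint32_t probeSerial;      // probe serial number
    uint32_t frameNumber;      // frame serial number
    float    frameRate;        // frames per second
    uint64_t frameTimestamp;   // frame time stamp
    uint64_t triggerTimestamp; // frame trigger time stamp
    uint32_t widthPixels;      // image width in pixels
    uint32_t heightPixels;     // image height in pixels
    float    pixelSizeX;       // pixel size along X
    float    pixelSizeY;       // pixel size along Y
    float    originX;          // x of first pixel relative to the sensor head
    float    originY;          // y of first pixel relative to the sensor head
    uint8_t  direction;        // spatial direction along/across image lines
};

// The DIB payload follows the header: 8-bit pixels plus a 256-entry color
// table (256 four-byte RGBQUAD entries), so one frame occupies roughly:
inline size_t frameBytes(uint32_t w, uint32_t h) {
    const size_t colorTable = 256 * 4;
    return sizeof(UsFrameHeader) + colorTable + static_cast<size_t>(w) * h;
}
```

A client that reads a frame from shared memory would interpret the leading bytes as this header before decoding the DIB pixels that follow.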
In addition, the preferred embodiment controls the shared memory interface for transmitting ultrasound images between the Windows®-based portable ultrasound system and a third-party Windows®-based system by using ActiveX controls. The Windows®-based portable ultrasound application contains an ActiveX control that writes a frame to shared memory and sends a Windows® event (which contains a pointer to the frame just written) to the third-party Windows®-based application. The third-party application has a corresponding ActiveX control that receives this event and retrieves the image frame from shared memory.
The graphical user interface includes one or more control programs, each of which is preferably a self-contained (for example) client-side script. These control programs are independently configured to (among other functions) generate graphical or text-based user controls in the user interface, to generate a display area in the user interface as directed by those user controls, or to display processed streaming media. These control programs can be implemented as ActiveX controls, Java applets, or any other self-contained and/or self-executing applications or portions thereof that can operate in a media gateway container environment and can be controlled through web pages.
The ultrasound content can be displayed in a frame in a graphical user interface. In one embodiment, the program generates an instance of an ActiveX control. ActiveX refers to a set of object-oriented programming technologies and tools provided by the Microsoft® Corporation of Redmond, Washington. The core part of ActiveX technology is the Component Object Model (COM). A program running under the ActiveX environment is called a "component": a self-sufficient program that can run anywhere in the network (as long as the program is supported). Such a component is often called an "ActiveX control". Therefore, an ActiveX control is a component program object that can be reused by many applications in a computer, or in several computers on a network, regardless of the programming language in which it was generated. An ActiveX control runs in a so-called container, which is an application that uses one of the COM programming interfaces.
One advantage of using a component is that it can be reused by many applications (which are referred to as "component containers"). Another advantage is that an ActiveX control can be generated using one of several well-known languages or development tools (including C++, Visual Basic, or PowerBuilder) or using a scripting tool (such as VBScript). ActiveX controls can be downloaded as, for example, small executable programs, or as self-executing code for web animation. An applet is similar to an ActiveX control and suitable for client-side scripting; it is usually a self-contained, self-executing computer program written in Java™ (a web-based object-oriented programming language released by Sun Microsystems of Sunnyvale, California).
The control program can be stored and accessed locally at the client system, or it can be downloaded from the network, usually by encapsulating the control program in one or more markup-language-based files. The control program can also be used for any task normally required by an application running in one of several operating system environments. Windows®, Linux, and Macintosh are examples of operating system environments that can be used in the preferred embodiment.
A preferred embodiment of the ultrasound imaging system has a specific software architecture for image streaming capabilities. The ultrasound imaging system is an application program that controls an ultrasound probe of a preferred embodiment and allows visual images to be obtained and displayed for medical purposes. The imaging system has its own graphical user interface. This interface is feature-rich and suitably organized to provide maximum flexibility for working with separate images and image streams. Some possible medical applications require the development of graphical user interfaces with significantly different characteristics; this involves integrating the imaging system into other, more complex medical systems. The preferred embodiment allows the imaging data to be exported in a highly efficient and convenient manner for direct access by an original equipment manufacturer (OEM).
The quality of the video streaming solution according to a preferred embodiment is measured by criteria such as data transfer performance. Imaging data consumes a great deal of memory and processor power, and a large number of separate image frames is required to generate a live medical video patient examination. It is therefore very important to minimize data handling operations when transferring data from the process that generates video data to the process that consumes it. The second criterion is support for industry-standard imaging formats: because third-party companies are intended to develop applications that consume the video imaging data, the data should be represented in an industry-standard format. A third criterion is convenience: imaging data should be presented through a programming interface that is easy to use and does not require additional learning.
In addition, the criteria include scalability and extensibility. The streaming data architecture can be easily extended to accommodate new data types, and it provides a basic architecture for future extension of video streaming to more than one data-receiving process.
The image streaming architecture of the preferred embodiment provides a method for data transmission between two processes. The video streaming architecture defines operating parameters for adjusting the data transmission process and describes a mechanism for transmitting parameters between processes. One method of transmitting operating parameters from a third-party client application to the imaging system of a preferred embodiment is by using an existing COM interface.
In a preferred embodiment, the image transfer architecture uses an object-oriented programming methodology built on the interprocess capabilities of the Microsoft Windows® operating system. This object-oriented methodology provides the necessary foundation for an architectural solution that allows the requirements to be met. It also lays the foundation for future enhancements and extensions, making modification relatively simple and backward compatible.
Video imaging data represents complex data structures with interdependencies between different data elements, and it allows (and often requires) different interpretations of the same data element. The following preferred embodiment of the image transmission architecture includes a shared memory for physical data exchange. For example, the Windows® shared memory system is a fast and economical way to exchange data between processes. In addition, in some embodiments, the shared memory may be subdivided into separate segments of fixed size; each segment then serves as the minimum controllable unit. Further, imaging data can be abstracted into objects: each frame of imaging data can be represented by a separate object, and these objects can then be mapped to segments of the shared memory.
The preferred embodiment may include locking and unlocking of object segments. The programming API uses an event-driven notification mechanism, implemented using C++ pure virtual functions.
In a preferred embodiment, the image transmission architecture is composed of three layers: an application programming interface (API) layer, a programming interface implementation and shared memory access layer, and a physical shared memory layer. The application programming interface layer provides two different C++ class library interfaces to the application, one on the client and one on the server. All associated instruction sequences that belong to the application itself are also part of this layer. Application-derived classes and their implementations are key elements of the application programming interface layer. The server, serving as the imaging data provider, uses, for example, the object transmitter class and related derived and base classes. The client, as the imaging data consumer, uses, for example, an Object Factory class and related derived and base classes.
The programming interface implementation layer provides two different dynamic link library (DLL) implementation classes for applications. This layer maps an object of a class associated with the application to an internal implementation object that accesses the physical shared memory system object. This layer allows all implementation-specific member variables and functions to be hidden from the scope of the application. As a result, the application programming interface layer is less cluttered, easier to understand, and easier to use. Server-side applications can use, for example, ObjectXmitter.DLL, and client-side applications can use, for example, ObjectFactory.DLL.
The physical shared memory layer represents operating system objects that implement the shared memory functionality. It also describes the structure of shared memory, its segments, and memory control blocks.
Regarding the organization of shared memory: since the shared memory is intended for interprocess communication, the operating system assigns it a unique name at the time of its creation. To manage the shared memory, other interprocess communication (IPC) system objects are required, and they too need unique names. To simplify the unique-name generation process, only one base name is required; all other names are derived from the base name by the implementation code. Therefore, the application programming interface requires the specification of only one base name for the logical shared memory objects. The same unique name can be used on both the server side and the client side of the application.
The server side of the application is responsible for the generation of the shared memory. During generation, not only must the unique name of the shared memory be specified, but also other configuration parameters. These parameters include (but are not limited to): the segment count specifying the number of segments to be allocated, the segment size, and the operation flags. There are three such flags in a preferred embodiment. The first flag specifies the order in which segments are submitted and retrieved. The order can be last-in-first-out (LIFO), first-in-first-out (FIFO), or last-in-out (LIO). LIO is a modification of ordinary LIFO in which, whenever a new frame arrives, any frame that is ready for retrieval but not locked for retrieval is erased. The second flag specifies the shared memory implementation's behavior under the condition in which a new segment allocation is requested but no segment is available; this usually happens when the receiving application processes data more slowly than the submitting application. This flag may allow overwriting of a previously allocated segment; if it does not allow overwriting of a previously allocated segment, an abnormal condition is reported back to the application. Using this flag, the application can let data be rewritten into shared memory automatically, or it can control the data-rewriting process itself. The third flag is used only when the second flag allows segments of the shared memory to be rewritten; it specifies how the segment to be rewritten is selected. By default, the shared memory implementation deletes the youngest (most recently submitted) piece of data. Alternatively, the oldest segment can be selected for rewriting.
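The creation parameters and flags above might be collected into a configuration structure along the following lines. This is a sketch under stated assumptions: none of these type or function names come from the patent, and the single-base-name rule for the auxiliary IPC objects is modeled with simple suffixes.

```cpp
#include <cassert>
#include <cstddef>
#include <string>

// First flag: submission/retrieval order of segments.
enum class QueueOrder { LIFO, FIFO, LIO };
// Second and third flags combined: behavior when no segment is free.
enum class OverwritePolicy { NotAllowed, OverwriteNewest, OverwriteOldest };

// Hypothetical creation parameters for the shared memory (names assumed).
struct SharedMemoryConfig {
    std::string baseName;   // the one base name; all other names derive from it
    int segmentCount;       // number of segments to allocate
    size_t segmentSize;     // size of each segment in bytes
    QueueOrder order;
    OverwritePolicy onFull;
};

// The text says the implementation derives the names of the other IPC objects
// (mutex, event) from the single base name; a trivial derivation might be:
std::string mutexName(const SharedMemoryConfig& c) { return c.baseName + ".mutex"; }
std::string eventName(const SharedMemoryConfig& c) { return c.baseName + ".event"; }
```

On Windows the server would pass these names to the kernel object creation calls; the client side reuses the same base name to open the same objects.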
When the shared memory is created, its physical layout is initialized. Because the operating system does not allow address calculation within physical shared memory, no data pointers are used in the shared memory. All addressing within the shared memory control block and segments is implemented as relative displacements from a virtual origin (VO). At zero displacement from VO, the shared memory header structure is allocated; it contains all the parameters listed above. FIG. 48 is a block diagram showing the structure of the physical shared memory 4880.
Immediately following the allocation of the shared memory header structure 4882, the system generates an array of headers 4884, one for each memory segment. A memory segment header contains the size occupied by the segment, a unique label for the object type mapped to the segment, and the segment status. Each segment can be in one of the following four states: an unused state, in which the segment is available for allocation; a locked-for-writing state, in which the segment is mapped to an object of a particular class and is currently being formed; a written state, in which the segment is mapped to an object of a particular class and is available for retrieval; and a locked-for-reading state, in which the segment is mapped to an object of a particular class and is currently in a data-retrieval process. Because each segment has its own state, the application can lock more than one segment for object formation and object retrieval. This allows the system to have a flexible multi-threaded architecture on both the server and client sides of the application. In addition, the ability to place more than one segment in the "written" state provides a "buffering" mechanism that cancels or minimizes performance differences between the server and client applications.
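The four segment states and the per-segment header can be modeled as follows. This is an illustrative sketch; the enum and struct names are assumptions, and the real header lives inside the shared memory header array described above.

```cpp
#include <cassert>
#include <cstddef>

// The four states a memory segment can occupy, per the description above.
enum class SegmentState { Unused, LockedForWrite, Written, LockedForRead };

// Hypothetical per-segment header: occupied size, object-type tag, and state.
struct SegmentHeader {
    size_t usedBytes;    // size occupied within the segment
    int objectTypeTag;   // unique label of the mapped object's class
    SegmentState state;
};

// A segment may be handed to a writer only from the Unused state, and to a
// reader only from the Written state; because each segment carries its own
// state, several segments can be locked concurrently by different threads.
bool canLockForWrite(const SegmentHeader& s) { return s.state == SegmentState::Unused; }
bool canLockForRead(const SegmentHeader& s)  { return s.state == SegmentState::Written; }
```

The "buffering" effect mentioned in the text corresponds to several segments sitting in the Written state at once while the reader catches up.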
The last element in the physical shared memory layout contains the memory segments 4888. In addition to the physical shared memory, the logical shared memory also contains a physical system mutex 4886 and a system event 4890. The physical mutex provides mutually exclusive access to the physical shared memory. The physical event is of the manual-reset type: it remains at the level "high" as long as at least one of the segments is in the "written" state, and goes to the level "low" only when no segment is in the "written" state. This mechanism allows "written" objects to be retrieved from shared memory without passing control to the operating system within the same time slice allocated to the thread.
In a preferred embodiment, the object transfer programming interface is composed of the following three classes: AObjectXmitter, USFrame, and BModeFrame. The AObjectXmitter class allows the object transfer service to be initialized with the desired operating parameters. Once an AObjectXmitter class object is instantiated, initialized objects of the USFrame and BModeFrame classes can be generated. The USFrame class constructor requires a reference to an object of the AObjectXmitter class. The first action that must be completed after instantiating a USFrame object is to establish an association between the object and one of the segments in the shared memory. The function Allocate() maps an object to an unused shared memory segment and locks this segment for use by the current object. When mapping an object, a bitmap size can be provided by the application. The size provided represents only the size required for the bitmap data (not including the memory required for the other data elements of the object).
The BModeFrame class is derived from the USFrame class and inherits all the methods and functionality of that base class. The only additional functionality provided by the BModeFrame class is an additional method that allows information specific to B-mode operation to be provided.
After instantiating a USFrame or BModeFrame class object and mapping it to a shared memory segment, the application can populate all the required data elements of the object. It is not necessary to provide a value for each data element: when an object is mapped to a shared memory segment, all data elements of the object are initialized with default values. The only data element that is not initialized after mapping is the bitmap data element. When the server side of the application has provided all the required data elements, it can deliver the object to the client of the application by calling a method, for example Submit().
USFrame or BModeFrame objects can be reused by subsequent remapping and resubmitting. Alternatively, when the application has finished with an object, the object may be deleted and a new object generated. Because object instantiation does not require any interprocess communication mechanism, it is as simple as the memory allocation for an ordinary variable.
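The server-side lifecycle described above (construct a transmitter, Allocate() a frame onto a free segment, fill it, Submit() it, then remap and reuse) can be sketched with minimal in-process stubs. The real classes live in ObjectXmitter.DLL; these mocks only model the call sequence and the exhaustion of free segments, and all internal details here are assumptions.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Mock of the transmitter: tracks free segments and submitted frames only.
class AObjectXmitter {
public:
    explicit AObjectXmitter(int segments) : free_(segments), written_(0) {}
    bool acquire() { if (free_ == 0) return false; --free_; return true; }
    void markWritten() { ++written_; }
    int writtenFrames() const { return written_; }
private:
    int free_, written_;
};

// Mock frame object: Allocate() maps it to an unused segment and locks the
// segment; Submit() hands the filled frame to the client side.
class USFrame {
public:
    explicit USFrame(AObjectXmitter& x) : xmit_(x), mapped_(false) {}
    bool Allocate(size_t bitmapBytes) {   // size covers the bitmap data only
        pixels_.assign(bitmapBytes, 0);
        return mapped_ = xmit_.acquire();
    }
    void Submit() { if (mapped_) { xmit_.markWritten(); mapped_ = false; } }
private:
    AObjectXmitter& xmit_;
    bool mapped_;
    std::vector<unsigned char> pixels_;
};
```

In this mock, submitted segments are never released (as when a client holds frames), so Allocate() eventually fails, which is the frame-overrun condition discussed later.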
There are at least two advantages of the architecture of the preferred embodiment. Because the AObjectXmitter class does not have knowledge of the USFrame or BModeFrame classes, it is very simple to introduce additional classes that are similar to, or derived directly or indirectly from, the USFrame class. This allows future versions of the object transfer programming interface to be generated without any modification of the code or instruction sequences developed for use with existing embodiments. In addition, the object transfer programming interface classes do not have any member variables. This provides two other benefits to the interface. The first benefit is that these classes are well suited to COM object interfaces and can be used directly for COM object interface specifications and implementations. The second benefit is that these classes effectively hide all implementation-specific details, making the interface very clear, easy to understand, and easy to use.
ObjectXmitter.DLL implements the object transfer programming interface. For each object generated by the application, there is a mirrored implementation object generated by code residing in ObjectXmitter.DLL. Because each programming interface class has a corresponding mirror class in the implementation, modification is facilitated, and the implementation can be extended to new image types by generating a corresponding image class in the implementation DLL 4910. Implementation objects handle the mapping between shared memory and programming interface objects. One embodiment of the present invention includes a DLL that allows instantiation of only one ObjectXmitter class object, using only one communication channel with a client application. The object transfer implementation transfers not only the object data but also additional information describing the type of the object being transferred.
The Object Factory programming interface consists of three classes: AObjectFactory, USFrame, and BModeFrame. The class AObjectFactory contains three pure virtual member functions, which makes it an abstract class that cannot be instantiated by an application. The application must define its own class derived from the AObjectFactory class. It is not necessary to define any "special" class derived from the AObjectFactory class: because the application intends to process the images it receives, it will very likely already have an image-processing class of some kind, and such an image-processing class can conveniently be derived from the AObjectFactory class.
Classes derived from the AObjectFactory class must define and implement only the pure virtual functions, such as, for example, OnFrameOverrun(), OnUSFrame(), and OnBModeFrame(). For example, a derived class can be defined as follows:


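The original listing is not reproduced in this text. A hedged reconstruction, using only what the surrounding description states (three pure virtual callbacks on an abstract AObjectFactory base, with an image-processing class derived from it), might look like this; the exact signatures and the counter members are assumptions.

```cpp
#include <cassert>

class USFrame;      // frame types delivered by the factory implementation
class BModeFrame;

// Abstract base as described: three pure virtual member functions make it
// non-instantiable; the application must derive from it.
class AObjectFactory {
public:
    virtual ~AObjectFactory() {}
    virtual void OnFrameOverrun() = 0;
    virtual void OnUSFrame(USFrame* frame) = 0;
    virtual void OnBModeFrame(BModeFrame* frame) = 0;
};

// An image-processing class derived from the abstract factory; the counters
// here are illustrative, standing in for real image processing.
class ImageProcessor : public AObjectFactory {
public:
    void OnFrameOverrun() override { ++overruns_; }
    void OnUSFrame(USFrame* /*frame*/) override { ++frames_; }
    void OnBModeFrame(BModeFrame* /*frame*/) override { ++frames_; }
    int framesSeen() const { return frames_; }
    int overrunsSeen() const { return overruns_; }
private:
    int frames_ = 0, overruns_ = 0;
};
```

After instantiation, the application would call the base-class Open() with the shared memory name, and the implementation DLL then invokes these callbacks as frames arrive.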
After instantiating a class object, the image processor base class member function Open() can be called. This function is given a shared memory name that matches the shared memory name used by the server side of the application. The function Open() connects the client application to the server application via the specified shared memory.
At any time after the shared memory is opened, the application can expect calls to the virtual functions OnFrameOverrun(), OnUSFrame(), and OnBModeFrame(). Each call of the OnUSFrame() function carries an object of type USFrame as an argument. Each call of the OnBModeFrame() function carries an object of type BModeFrame as an argument. It is not necessary for the application to instantiate objects of the USFrame or BModeFrame classes: USFrame and BModeFrame objects are "given" to the application by the underlying implementation of the AObjectFactory class.
The only action the application needs to complete is to process the received frame and release the "given" object. The application must not attempt to delete a frame object, because deletion is performed by the underlying implementation. The member function Release() of the USFrame object should be called only when the application has completed all data processing and no longer needs the USFrame object or objects of derived classes.
Once the application has received an object of class USFrame or BModeFrame, it can retrieve the imaging data and process it appropriately. The application needs to be aware that frame object data is processed in a separate thread, and it must ensure that processing functions are written using thread-safe programming techniques. Because each of the pure virtual functions is called within a separate thread generated by the implementation DLL, subsequent calls are not possible until the virtual function returns control to the calling thread. This means that as long as the application has not returned control to the thread generated by the implementation, the application cannot receive any new frames. At the same time, the server side of the application can continue to submit additional frames; this eventually results in shared memory overflow and prevents any new frame transfers.
While the application is processing frame data, it holds shared memory resources locked against subsequent remapping. The more frames the application has not released, the fewer shared memory segments are available to the object transfer interface on the server side of the application. If frame objects are not released at an appropriate rate, the client application eventually locks all the memory segments of the shared memory. At that point, the image transfer application stops sending new frames or overwrites frames that are not locked by the receiving application. If the receiving application has locked every segment, the transmitting application cannot even choose to overwrite an existing frame.
The function OnFrameOverrun() is called when a frame overrun condition is raised by the server application. This condition is raised whenever the server application attempts to submit a new frame and there is no available shared segment to which an object can be mapped. The condition can be cleared only by the client side of the application, by calling the function ResetFrameOverrun(). If the client application does not call this function, the frame overrun condition is raised again and the OnFrameOverrun() pure virtual function is called again.
The Object Factory interface has the same advantages outlined above in describing the object transmitter interface. In addition to these advantages, it implements an event-driven programming method that minimizes programming tasks and maximizes performance. There are also functions such as USFrames(), BModeFrames(), GetUSFrame(), and GetBModeFrame(); these functions can be used to implement a less efficient "polling" programming method.
The Object Factory programming interface is implemented by ObjectFactory.DLL 4918. This DLL retrieves object class type information and object-related data from the shared memory. It generates an object of the type used by the transmitter, and the Object Factory implementation maps the newly generated object to the corresponding data. The Object Factory implementation has a separate thread that delivers each newly generated and mapped object via pure-virtual-function events. The application "owns" the object throughout processing and indicates that it no longer needs the object by calling the Release() function. The factory implementation then locally releases the resources allocated for the object, as well as the shared memory resources.
The process flow 4900 described above is shown diagrammatically in the block diagram of FIG. 49. The preferred embodiment provides simplicity of code maintenance and of enhancements to the image transmission mechanism. The object transmitter interface 4908 and the Object Factory interface 4916 and their implementations allow such modifications to be made at relatively low development cost. Regarding object modification, the shared memory implementation is completely independent of the type of data transmitted; therefore, no type modification requires any changes to the underlying code that controls the shared memory. Because the transmitted data is encapsulated within a class of a specific type, the only action required to modify an object is to modify the corresponding class that defines the object. Because the objects form a class derivation tree, any modification of a base class causes an appropriate change in every object of the derived classes. These modifications to an object type do not affect application code that is unrelated to the modified object class.
New object types can be introduced by deriving a new class from one of the existing classes. A newly derived class can be derived from the appropriate level of the base class. An alternative way to generate a new object type is by generating a new base class; this approach can be advantageous in situations where the newly defined class is significantly different from the existing classes.
Regarding multiple object transmission channels, alternative preferred embodiments may support more than one AObjectXmitter class object and more than one corresponding communication channel. The architecture can also be extended in a way that allows communication of objects in the opposite direction. This allows applications to distribute imaging data to more than one client application and to accept incoming communications that control image generation and probe operation.
In addition, wireless and remote video streaming channels can be accommodated in the preferred embodiment. An identical object transfer programming interface may be implemented to transmit images not via shared memory but via a high-speed wireless communication network (for example, IEEE 802.11a). The same object transfer programming interface can also be used to send images across a wired Ethernet connection. Remote and wireless video streaming assumes that recipient computing systems can differ in performance; this makes the choice of the recipient's device model an important factor for successful implementation.
Therefore, the streaming imaging included in the preferred embodiment utilizes a shared memory client-server architecture that provides high bandwidth at low overhead.
A software application of the ultrasound imaging system in a preferred embodiment is used by a client application 4904 as a server 4902 of live ultrasound image frames. This client-server relationship is supported by the two communication mechanisms described above. The client application uses a COM automation interface to launch and control the ultrasound imaging system application 4906. A high-speed shared memory interface 4912 delivers live ultrasound images, with probe identification and spatial and temporal information, from the application to the client application.
A simple ActiveX COM API (TTFrameReceiver) encapsulates the complexity of the shared memory implementation for the client application. Shared memory communication has flexible parameters specified by the client application: the queue order, buffer count, buffer size, and overwrite permission are specified by the client when the image frame streaming is opened. The queue order mode can be specified as first-in-first-out (FIFO), last-in-first-out (LIFO), or last-in-out (LIO). In general, when zero data loss is more important than minimum delay, the FIFO mode is better. The LIO mode delivers only the most recent image frame and is better when the lowest latency is more important than data loss. The LIFO mode can be used when minimum delay and minimum data loss are equally important; however, in LIFO mode, frames may not always be delivered in sequential order, and a more sophisticated client application is required to sort the frames after receiving them. The overwrite permission specifies what happens when all the shared memory buffers are full: overwriting not allowed, overwrite the oldest frame, or overwrite the latest frame.
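The three queue order modes can be illustrated with a toy delivery-order model. The real queue lives in shared memory; this sketch (all names assumed) only shows which frame each mode hands to the client.

```cpp
#include <cassert>
#include <deque>

enum class Order { FIFO, LIFO, LIO };

// Toy frame queue: FIFO delivers oldest first, LIFO delivers newest first,
// LIO keeps only the most recent frame, discarding unretrieved older ones.
class FrameQueue {
public:
    explicit FrameQueue(Order o) : order_(o) {}
    void submit(int frameId) {
        if (order_ == Order::LIO) q_.clear();   // erase frames not yet retrieved
        q_.push_back(frameId);
    }
    int retrieve() {                            // precondition: queue is non-empty
        int id;
        if (order_ == Order::FIFO) { id = q_.front(); q_.pop_front(); }
        else                       { id = q_.back();  q_.pop_back();  }
        return id;
    }
    bool empty() const { return q_.empty(); }
private:
    Order order_;
    std::deque<int> q_;
};
```

This matches the trade-off in the text: FIFO preserves every frame in order (zero loss), LIO minimizes latency by dropping stale frames, and LIFO sits in between but can deliver frames out of sequence.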
Each image frame contains a single ultrasound image, probe identification information, pixel spatial information, and timing information. The image format is a standard Microsoft device-independent bitmap (DIB) with 8-bit pixels and a 256-entry color table.
The TTFrameReceiver ActiveX control provides two schemes for receiving image frames. The first scheme is event-driven: a COM event FrameReady is triggered when a frame has been received. After the FrameReady event, the data access methods of the interface can be used to read the image and associated data. After the image and other data have been copied, the client releases the frame by calling the ReleaseFrame method. The next FrameReady event does not occur until the previous frame is released. In another embodiment, the client can use the WaitForFrame method to poll for the next available frame.
In a preferred embodiment, both the client application and the server application run on the same computer. This computer can run, for example and without limitation, the Microsoft® Windows® 2000/XP operating system. The sample client application (USAutoView) can be developed using Microsoft® Visual C++ 6.0 and MFC, and its source code can be compiled in, for example, Visual Studio 6.0. The server-side COM automation interface and the TTFrameReceiver ActiveX control are compatible with other MS Windows® software development environments and languages.
In one embodiment of the present invention, the name of the server-side COM automation interface (ProgID) is, for example, "Ultrasound.Document", and the interface is registered on the computer when the application program is run for the first time. The dispatch interface can be imported from a type library into a client application.
In a preferred embodiment, the automation interface is extended to support frame streaming by adding methods such as void OpenFrameStream(BSTR* queueName, short numBuffers, long bufferSize, BSTR* queueOrder, short overwritePermission), which opens the frame stream transmitter on the server side and opens the shared memory interface with the client application. Here, queueName is a unique name for the shared memory "file" and is the same name used when the receiver is opened; numBuffers is the number of buffers in the shared memory queue; bufferSize is the size in bytes of each buffer in the shared memory queue, where the buffer size is 5120 bytes larger than the largest image that can be transmitted; queueOrder is "LIO", "FIFO", or "LIFO"; and overwritePermission is 0 if overwriting is not allowed, 1 to overwrite the oldest frame, or 2 to overwrite the newest frame. Note that OpenFrameStream must be called before the TTFrameReceiver control is opened.
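The bufferSize argument can be estimated from the DIB frame format described earlier (8-bit pixels plus a 256-entry color table) together with the 5120-byte margin stated above. The sketch below is a sizing aid under those assumptions; the function and constant names are illustrative, not part of the actual API, and it assumes the standard 40-byte BITMAPINFOHEADER and 4-byte-aligned DIB scan lines.

```cpp
#include <cassert>
#include <cstddef>

// Assumed layout constants for an 8-bit DIB (standard Win32 values).
const std::size_t kBitmapInfoHeaderBytes = 40;   // sizeof(BITMAPINFOHEADER)
const std::size_t kColorTableBytes = 256 * 4;    // 256 RGBQUAD entries
const std::size_t kFrameStreamMargin = 5120;     // margin stated in the text

// Total bytes of an 8-bit DIB: header + color table + padded pixel rows.
// DIB scan lines are padded to 4-byte boundaries.
std::size_t dibImageBytes(std::size_t widthPixels, std::size_t heightPixels) {
    std::size_t stride = (widthPixels + 3) & ~static_cast<std::size_t>(3);
    return kBitmapInfoHeaderBytes + kColorTableBytes + stride * heightPixels;
}

// Candidate bufferSize for OpenFrameStream: largest image plus the margin.
std::size_t requiredBufferSize(std::size_t maxWidth, std::size_t maxHeight) {
    return dibImageBytes(maxWidth, maxHeight) + kFrameStreamMargin;
}
```

For example, a 640 × 480 frame needs 40 + 1024 + 640 × 480 = 308264 bytes of DIB data, suggesting a buffer of 313384 bytes under these assumptions.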
The next additional methods include: void CloseFrameStream(), which closes the frame stream transmitter on the server side; void StartTransmitting(), which tells the server side to start transmitting ultrasound frames; void StopTransmitting(), which tells the server side to stop transmitting ultrasound frames; and short GetFrameStreamStatus(), which obtains the status of the frame stream transmitter. It is important to check that the stream transmitter is open before opening TTFrameReceiver; the COM automation interface is non-blocking, so the OpenFrameStream call may not have completed at the moment it returns to the client application.
In a preferred embodiment, the TTFrameReceiver ActiveX control is a client-side application programming interface for live ultrasound frame streaming. The frame stream control methods include boolean Open(BSTR name), which opens the frame stream receiver; the frame stream receiver cannot be opened until the frame stream transmitter on the server has been opened. The frame stream control methods also include: boolean Close(), which closes the frame stream receiver; long WaitForFrame(long timeoutms), which waits until a frame is ready or until the timeout period ends; and boolean ReleaseFrame(), which releases the current image frame. Once all the required data has been copied, the current frame can be released; the next frame cannot be received until the current frame has been released. The return values of the other data access functions are not valid until the next FrameReady event after the current frame is released.
In a preferred embodiment, the data access methods for an image include long GetPtrBitmapInfo(), which obtains a pointer to the header (with color table) of the DIB containing the image. The ultrasound image is stored as a standard Microsoft device-independent bitmap (DIB); the returned pointer can be cast to the BITMAPINFO and BITMAPINFOHEADER structures as needed. The memory for the BITMAPINFO structure is allocated in the shared memory and must not be deallocated; instead, ReleaseFrame() is called to return the memory to the shared memory mechanism. A further method is long GetPtrBitmapBits(), which obtains a pointer to the image pixels. The returned pointer can be cast for use with the Microsoft DIB API as needed. The memory for the bitmap pixels is likewise allocated in the shared memory and must not be deallocated; instead, ReleaseFrame() is called to return the memory to the shared memory mechanism.
Methods related to probe identification include: short GetProbeType(), which gets the type of ultrasound probe used; BSTR GetProbeName(), which gets the name of the probe; and long GetProbeSN(), which gets the serial number of the probe used.
Regarding timing information, the methods include short GetSequenceNum(), which obtains the sequence number of the current frame. The sequence number is derived from an 8-bit counter and therefore repeats every 256 frames. It is useful for detecting gaps in a frame sequence and for reordering received frames when the LIFO buffer ordering mode is used. In addition, double GetRate() obtains the frame rate, which, combined with the sequence number, provides accurate relative timing for received frames. BSTR GetTimestamp() obtains a timestamp of the current frame, which provides an absolute time that may be useful when synchronizing to external events; the resolution is about one millisecond. Timestamps can be averaged and used in conjunction with the rate and sequence numbers to achieve higher accuracy. Finally, BSTR GetTriggerTimestamp() obtains a timestamp of the start of the ultrasound scan; the ultrasound probe is stopped when the image is "frozen", and a new trigger timestamp is recorded when live imaging is resumed.
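Because the sequence number comes from an 8-bit counter that wraps every 256 frames, gap detection must be done modulo 256. A minimal sketch of that arithmetic follows; the function name is illustrative and not part of the actual API, and it assumes frames arrive in transmission order (FIFO mode).

```cpp
#include <cassert>

// Number of frames dropped between two consecutively received frames,
// given 8-bit sequence numbers (0..255) that wrap around.
// Returns 0 when currentSeq directly follows previousSeq.
int droppedFramesBetween(int previousSeq, int currentSeq) {
    // Modulo-256 distance, minus one for the expected increment.
    return ((currentSeq - previousSeq) & 0xFF) - 1;
}
```

For example, receiving sequence number 2 after 254 means three frames (255, 0, 1) were missed.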
The spatial information in the preferred embodiment has the following methods: short GetXPixels(), which obtains the width of the image in pixels; short GetYPixels(), which obtains the height of the image in pixels; double GetXPixelSize(), which obtains the size of each pixel in the x direction (the x direction is defined as horizontal and parallel to each image line); and double GetYPixelSize(), which obtains the size of each pixel in the y direction (the y direction is defined as vertical and perpendicular to each image line). In addition, double GetXOrigin() obtains the x position of the first pixel in the image relative to the sensor head, and double GetYOrigin() obtains the y position of the first pixel in the image relative to the sensor head. Further methods include short GetXDirection(), which obtains the spatial direction along the lines of the image (the positive x direction is defined as pointing away from the probe head mark), and short GetYDirection(), which obtains the spatial direction across the lines of the image (the positive y direction is defined as pointing away from the sensor head into the patient's body).
The spatial position of any pixel in the image relative to the sensor head can be easily calculated as follows:
PX = OX + NX * SX * DX
PY = OY + NY * SY * DY
where,
P = the position of the pixel relative to the sensor head,
O = origin,
N = the index value of the pixel in the image,
S = pixel size,
D = pixel orientation.
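The per-axis formula above translates directly into code. The sketch below is illustrative; the struct and function names are assumptions, with the struct fields corresponding to the spatial accessors (GetXOrigin(), GetXPixelSize(), GetXDirection(), and their y counterparts) described earlier.

```cpp
#include <cassert>

// Spatial metadata for one frame (field names are illustrative).
struct SpatialInfo {
    double originX, originY;        // GetXOrigin(), GetYOrigin()
    double pixelSizeX, pixelSizeY;  // GetXPixelSize(), GetYPixelSize()
    int directionX, directionY;     // GetXDirection(), GetYDirection(): +1 or -1
};

// Position of pixel (nx, ny) relative to the sensor head:
// P = O + N * S * D, applied independently on each axis.
void pixelPosition(const SpatialInfo& s, int nx, int ny,
                   double& px, double& py) {
    px = s.originX + nx * s.pixelSizeX * s.directionX;
    py = s.originY + ny * s.pixelSizeY * s.directionY;
}
```

For instance, with an x origin of -8.0, a 0.25 pixel size, and a positive x direction, pixel column 64 lies at x = -8.0 + 64 × 0.25 = 8.0 relative to the sensor head.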
In addition, the event void FrameReady() in a preferred embodiment fires when a frame is ready and its data can be read. The handler copies the data using the data access methods and then calls ReleaseFrame(). Any kind of indefinite processing in the handler (for example, a function that runs a message loop) should be avoided. Furthermore, void FrameOverrun() fires when the server cannot send a frame or has to overwrite a frame in the buffer (because the buffers are full). This applies only to the FIFO and LIFO modes, because LIO automatically releases the old buffer. This event is useful for determining whether the client application reads frames quickly enough and whether the number of allocated buffers is sufficient for the client's latency.
In a preferred embodiment, USAutoView is a sample client application that enables the client to automate and display live ultrasound image frames. It demonstrates starting and stopping the server, hiding and showing the server, switching graphics on the displayed image on and off, freezing and resuming ultrasound acquisition, loading a preset examination, changing the designated patient size, changing the image size and spatial information, and inverting the image.
FIG. 50 is a view of a graphical user interface 4950 for a USAutoView UI according to a preferred embodiment of the present invention. The USAutoView program is a Windows® dialog application with three ActiveX components: TTFrameReceiver, which provides an ActiveX interface for receiving ultrasound image frames; TTAutomate, which encapsulates server-side automation; and TTSimpleImageWnd, which is an image display window. CUSAutoViewDlg is the main dialog. It manages server-side automation through the TTAutomate control, receives ultrasound frames through TTFrameReceiver, and displays images through TTSimpleImageWnd. The OnStartUS() method of CUSAutoViewDlg calls the TTAutomate and TTFrameReceiver methods required to start or stop server-side automation and data transmission.
The method OnFrameReady() handles the FrameReady event from TTFrameReceiver. It copies the required data from TTFrameReceiver and then releases the frame using TTFrameReceiver's ReleaseFrame() method. It avoids any function that performs indefinite processing (such as a function that runs a message loop).
TTAutomate is an ActiveX control that encapsulates the server-side automation functions. The native COM automation interface on the server side is non-blocking and requires waiting with GetStatusFlags to coordinate functions. TTAutomate wraps each function in the required waiting loop. These wait loops allow Windows® messages to be processed so that the user interface thread of the client application is not blocked while waiting. Although the automation methods in TTAutomate do not return until the function is completed, other Windows® messages are processed before the function completes. It is recommended to prevent multiple concurrent calls to the TTAutomate methods from message handlers, because coordination with the server side is generally not re-entrant. The source code for this control is included in the USAutoView workspace and can be reused or modified as needed.
TTSimpleImageWnd is an ActiveX control that provides a display window for device-independent bitmaps (DIB). The two properties of the display interface are long DIBitmapInfo and long DIBits. DIBitmapInfo is a pointer to a block of memory containing a BITMAPINFO structure for the DIB, and DIBits is a pointer to a block of memory containing the image pixels. To load a new image, DIBitmapInfo is first set to a pointer to the bitmap information of the DIB, and DIBits is then set to a pointer to the bitmap bits. When DIBits is set, the pointer set for DIBitmapInfo must still be valid; both the bitmap information and the bitmap bits are copied internally for display on the screen. Setting DIBitmapInfo and DIBits to zero clears the image. The source code for this control is included in the USAutoView workspace and can be reused or modified as needed.
The preferred embodiment of the present invention includes a plurality of probe head types. For example, these probes include (but are not limited to): a convex linear sensor array operating between 2 MHz and 4 MHz, a phased linear sensor array operating between 2 MHz and 4 MHz, a convex linear cavity sensor array operating between 4 MHz and 8 MHz, a linear sensor array operating between 4 MHz and 8 MHz, and a linear sensor array operating between 5 MHz and 10 MHz.
A preferred embodiment of the portable ultrasound system of the present invention provides high-resolution images during an examination in modes such as the following: B-mode, M-mode, color Doppler (CD), pulsed wave Doppler (PWD), directional power Doppler (DirPwr), and power Doppler (PWR). Once the system software is installed, the probe device is connected to a desktop or laptop computer. The probe can be an industry-standard transducer connected to a 28 oz. box containing the beamforming hardware of the system. If the probe is connected to a laptop, a 4-pin FireWire cable is connected to an IEEE 1394 serial connection located on a built-in MediaBay. However, if the probe is connected to a desktop computer, the computer may not be equipped with a MediaBay; in that case, an external DC module (EDCM) connector can be used to connect the probe head. Before connecting the probe, ensure that the FireWire cable is connected on both the right and left sides of the computer.
In one embodiment, the EDCM is designed to accept a 6-pin IEEE 1394 (also known as FireWire) cable at one end and a Lemo connector from the probe at the other end. The EDCM accepts an input DC voltage of +10 volts to +40 volts. In addition, in one embodiment, the system can be connected to a host computer using IEEE 1394. The 6-pin IEEE 1394 input to the EDCM can be derived from any IEEE 1394-equipped host computer running, for example, the Windows® 2000 operating system. An external IEEE 1394 hub may also be necessary to provide the required DC voltage to the EDCM. A host computer equipped with IEEE 1394 has one of two types of IEEE 1394 connector: 4-pin or 6-pin. The 6-pin connector is most commonly found in PC-based workstations that use internal PCI bus cards. Usually, the 6-pin connector provides the required DC voltage to the EDCM, and a 6-pin male to 6-pin male IEEE 1394 cable is used to connect the host computer to the EDCM.
According to one of the preferred embodiments, the 4-pin connector is found on laptops that contain neither a MediaBay nor a DC voltage output. When this connector type is used, an external IEEE 1394 hub can be used to power the EDCM and the probe.
When power is not provided from the host computer, an external IEEE-1394 hub can be used between the host computer and the EDCM. The hub derives its power from a wall socket and is connected using a medical grade power supply that complies with IEC 60601-1 electrical safety standards.
To connect the hub to the host computer, a 4-pin male to 6-pin male or a 6-pin male to 6-pin male IEEE 1394 cable is required. Insert the appropriate connector (4-pin or 6-pin) into the host computer and the 6-pin connector into the hub. Next, use a 6-pin male to 6-pin male IEEE 1394 cable to connect the hub to the EDCM. An IEEE 1394 hub is required only if the host computer cannot supply at least +10 volts to +40 volts direct current (DC) and 10 watts of power to the EDCM. If the host computer can supply sufficient voltage and power, a 6-pin male to 6-pin male IEEE 1394 cable can be used to connect the computer directly to the EDCM.
FIG. 51 illustrates a view of a home screen display of a graphical user interface according to a preferred embodiment of the present invention. When the user activates the system according to the present invention, the home screen 5170 is displayed. To help users navigate, the home screen can be thought of as four separate work areas that provide information to help the user perform tasks. These work areas include a menu bar 5172, an image display window 5174, an image control bar 5176, and toolbars 5178 to 5186.
To adjust the size of windows and areas, the user can click the small buttons at the top right of the window to close, resize, and exit the program. One interface button closes the window but leaves the program running (minimizing the window); a system button then appears at the bottom of the screen in an area called the taskbar, and clicking the system button in the taskbar reopens the window. Another interface button enlarges the window to fill the entire screen (called maximizing); however, when the window is at its maximum size, the frame rate can be reduced. A further interface button returns the window to its previous size before maximizing, and the system program can be closed by another interface button.
Users can increase or decrease the width of each area of the application to meet their needs. For example, to make the Explorer window narrower, place the cursor at either edge of the area and click and drag to the new desired size. The size and location of each area can also be changed so that they become floating windows. To create a floating window, the user simply clicks with the mouse on the double-edge boundary of a specific area and drags it until it appears as a floating window. To restore the floating window to its original form, the user double-clicks in the window. These functionalities are depicted in FIGS. 52A-52C, which are views of a graphical user interface 5200, 5208, 5220 according to a preferred embodiment of the present invention.
The Explorer window provides a nested file directory 5202 for all patient folders of user-generated images that have been generated and saved. The folder directory structure includes (but is not limited to) a patient folder and an image folder. The patient folder directory is where the patient information files are stored along with any associated images. The image folder directory contains images organized by date and examination type; the images in this directory are not associated with a patient and are produced without patient information. FIGS. 53A to 53B illustrate a patient folder 5340 and an image folder 5350 according to a preferred embodiment of the present invention. The menu bar at the top of the screen provides nine options that the user can use to perform basic tasks. To access a menu option, simply select the menu name to display the drop-down menu options. Users can also access any menu by using its shortcut key combination.
The image display window provides two tabs: Image Display and Patient Information. The user clicks on the Image Display tab to view the ultrasound image. The image is displayed in the window according to the defined control settings. Once the image is saved, when the user recalls it, the classification, date, and time of the image are also displayed in the image display window. The Patient Information tab is used to enter new patient information that will later be stored in a patient folder. Users can also access this tab to modify and update patient information.
FIGS. 54A and 54C illustrate an XY dual-plane probe head composed of two one-dimensional, multi-element arrays. The arrays can be constructed by stacking one on the other, with the polarization axes of the two arrays aligned in the same direction. The elevation axes of the two arrays may be at right angles, or orthogonal, to each other. Exemplary embodiments may employ a sensor assembly such as the sensor assembly described in U.S. Patent No. 7,066,887 (the entire contents of which are incorporated herein by reference) or commercially available sensors sold from Tours Cedex, France. As shown in FIG. 54A, the array orientation is represented by the configuration 5400. The polarization axes (5408, 5422) of the two arrays are aligned with the z-axis 5406. The elevation axis of the bottom array is in the y direction 5402, and the elevation axis of the top array is in the x direction 5404.
As further illustrated in FIG. 54B, a one-dimensional multi-element array forms an image as depicted in configuration 5412. A one-dimensional array having its elevation axis 5410 in the y direction 5402 forms an ultrasound image 5414 in the plane of the x-axis 5404 and z-axis 5406, while a one-dimensional array having its elevation axis in the x direction 5404 forms an ultrasound image in the plane of the y-axis 5402 and z-axis 5406. Thus, a one-dimensional sensor array having an elevation axis 5410 along the y-axis 5402 and a polarization axis 5408 along the z-axis 5406 results in an ultrasound image 5414 formed in the xz plane 5404, 5406. An alternative embodiment illustrated by FIG. 54C depicts a one-dimensional sensor array having an elevation axis 5420 on the x-axis 5404 and a polarization axis 5422 in the z-axis 5406 direction; an ultrasound image 5424 is formed in the yz plane 5402, 5406.
FIG. 55 illustrates the operation of a dual-plane image forming xy probe, wherein the array 5512 has a high voltage applied to it to form an image. High-voltage drive pulses 5506, 5508, 5510 can be applied to the bottom array 5504, which has its elevation axis along the y-axis. This generates a transmission pulse for forming a received image in the xz plane, while the elements of the top array 5502 are kept at ground level. These probes enable a 3D imaging mode that uses simpler electronics than a full 2D sensor array. As described herein, a touch-screen-enabled user interface may employ screen icons and gestures to activate 3D imaging operations. These imaging operations can be augmented by software running on the tablet computer data processor, which processes the image data into 3D ultrasound images. This image processing software may use smoothing, filtering, and/or interpolation operations known in the art. Beam steering can also be used to enable 3D imaging operations. A preferred embodiment uses a plurality of 1D sub-array sensors configured for dual-plane imaging.
FIG. 56 also illustrates the operation of a dual-plane image forming xy probe, showing an array 5610 with a high voltage applied to it to form an image. High-voltage pulses 5602, 5604, 5606 can be applied to the top array 5612, which has its elevation axis along the x-axis, thereby generating a transmission pulse for forming a received image in the yz plane, while the elements 5614 of the bottom array are kept grounded. This embodiment may also utilize orthogonal 1D sensor arrays using a sub-array beamforming operation as described herein.
FIG. 57 shows the circuit requirements for a dual-plane image forming xy probe, and the receive beamforming requirements are described for a dual-plane probe. A connection is made to the receiving electronics 5702: elements selected from the bottom array 5704 and from the top array 5708 share one connection to a channel of the receiving electronics 5702. A two-to-one multiplexer circuit can be integrated on the high-voltage drivers 5706, 5710, 5712. A receive beam is formed for each transmission beam. The dual-plane system requires a total of 256 transmission beams: 128 transmission beams are used to form the xz-plane image and another 128 transmission beams are used to form the yz-plane image. Multiple-receive beamforming techniques can be used to improve the frame rate. An ultrasound system with dual receive-beam capability forms two receive beams for each transmission beam; the dual-plane probe then needs only a total of 128 transmission beams to form the two orthogonal plane images, of which 64 transmission beams are used to form the xz-plane image and another 64 transmission beams are used to form the yz-plane image. Similarly, for an ultrasound system with quadruple receive-beam capability, the probe requires only 64 transmission beams to form the two orthogonal plane images.
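The transmit-beam counts above follow from simple arithmetic: two orthogonal planes of 128 lines each divided by the number of parallel receive beams formed per transmission. The sketch below restates that relationship; the function name is illustrative and assumes the line count divides evenly by the receive-beam count.

```cpp
#include <cassert>

// Transmit beams needed by a dual-plane probe: two orthogonal plane images
// of linesPerPlane lines each, with parallelReceiveBeams receive beams
// formed per transmit beam (1 = single, 2 = dual, 4 = quad receive).
int transmitBeamsRequired(int linesPerPlane, int parallelReceiveBeams) {
    int totalLines = 2 * linesPerPlane;        // two orthogonal planes
    return totalLines / parallelReceiveBeams;  // lines shared per transmit
}
```

With 128 lines per plane this reproduces the counts in the text: 256 transmit beams for single-receive, 128 for dual-receive, and 64 for quad-receive beamforming.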
FIGS. 58A to 58B illustrate an application for simultaneous dual-plane evaluation. The ability to use echocardiographic imaging to measure LV mechanical dyssynchrony can help identify patients who are more likely to benefit from cardiac resynchronization therapy. The LV parameters that need to be quantified include Ts-(lateral-septal), Ts-SD, and Ts-peak. Ts-(lateral-septal) can be measured on a 2D apical 4-chamber view echo image, and Ts-SD, Ts-peak (medial), Ts-onset (medial), Ts-peak (basal), and Ts-onset (basal) can be obtained on two separate parasternal short-axis views with 6 segments each (providing a total of 12 segments) at the mitral valve and papillary muscle levels, respectively. FIGS. 58A to 58B depict an apical four-chamber image 5804 and an apical two-chamber image 5802 viewed simultaneously with an xy probe.
FIGS. 59A to 59B illustrate ejection fraction (EF) measurement techniques with the dual-plane probe. The dual-plane probe provides EF measurement, with visualization of two orthogonal planes ensuring an on-axis view. Automatic boundary detection algorithms provide quantified echo results to select implant responders and guide AV delay parameter settings. As depicted in FIG. 59A, the xy probe acquires real-time simultaneous images from two orthogonal planes, and the images 5902 and 5904 are displayed on a split screen. A manual contour tracing or automatic border tracking technique can be used to trace the endocardial borders (from which EF is calculated) at both end-systole and end-diastole. The LV areas (A1 and A2, respectively) in the apical 2CH view 5902 and the apical 4CH view 5904 are measured at end-diastole and end-systole. LVEDV (left ventricular end-diastolic volume) and LVESV (left ventricular end-systolic volume) are calculated using the following formula:
LVEDV = (8 × A1,ED × A2,ED) / (3π × L,ED) and LVESV = (8 × A1,ES × A2,ES) / (3π × L,ES), where L is the length of the left ventricular long axis at end-diastole (ED) and end-systole (ES), respectively. The ejection fraction is then calculated as EF = (LVEDV − LVESV) / LVEDV × 100%.
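The calculation can be sketched in code using the standard biplane area-length formula V = 8·A1·A2/(3πL), with A1 and A2 the LV areas from the apical 2CH and 4CH views and L the long-axis length; this formula is an assumption based on the areas defined above (the source equation is garbled), and the function names are illustrative.

```cpp
#include <cassert>
#include <cmath>

// Biplane area-length LV volume: V = 8 * A1 * A2 / (3 * pi * L).
// a1, a2: LV areas (cm^2) from the two orthogonal apical views;
// longAxis: LV long-axis length (cm). Result in mL (cm^3).
double biplaneVolume(double a1, double a2, double longAxis) {
    return 8.0 * a1 * a2 / (3.0 * M_PI * longAxis);
}

// Ejection fraction in percent from end-diastolic and end-systolic volumes.
double ejectionFractionPercent(double lvedv, double lvesv) {
    return (lvedv - lvesv) / lvedv * 100.0;
}
```

For example, an LVEDV of 120 mL and an LVESV of 48 mL give an ejection fraction of 60%.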
FIG. 60 illustrates an exemplary method for wirelessly transmitting data to and from a portable ultrasound imaging device according to an embodiment of the present invention. The method may begin with selecting 6001 a wireless communication menu option to present the various wireless connections available to the user. For example, a user may wish to connect to a WiFi network, a 3G or 4G cellular network, or some other wireless network. The method can continue with selecting 6002 a desired wireless connection. The method may further include selecting 6003 one or more destinations for transmitting ultrasound data. In some embodiments, this selection may be performed by using a touch screen UI to select one or more hospitals, doctors, clinics, and the like, similar to the way a contact is selected from a telephone contact list. The method may further include determining 6004 whether an audio connection and/or a video connection is desired. In some embodiments, in addition to transmitting ultrasound and other medical information between the portable ultrasound device and a remote hospital or clinic, users can also establish an audio and/or video connection over the same wireless network. This feature allows users of the ultrasound imaging device, for example, to execute and transmit ultrasound data remotely while in direct audio and/or video contact with a hospital or medical professional. In one example, initiating an audio and/or video call with a hospital may allow a user of a portable ultrasound imaging device to receive guidance and/or advice from a doctor while performing an ultrasound procedure. If an audio and/or video call is desired, the method may further include initiating 6005 an audio/video call with the desired destination.
If no audio/video connection is desired, or after an audio/video call is initiated, the method may further include determining 6006 whether ultrasound imaging data is to be transmitted in real time, or whether the user wishes to transmit only previously generated ultrasound data. If a real-time connection is desired, the method may further include starting 6007 an ultrasound scan and transmitting 6008 ultrasound data to the desired destination(s) in real time. If real-time ultrasound data transmission is not required, the method can continue by selecting 6009 the desired file(s) and transmitting 6010 the selected file(s) to the desired destination(s).
In some embodiments, a user can perform the method described above by navigating through various windows, folders, sub-folders, menus, and/or sub-menus presented through a touch-sensitive UI. Touch screen gestures can be performed on an icon (by dragging and dropping an icon from one region to another, selecting or deselecting one or more check boxes, or performing any other sufficiently unique or distinguishable touch screen command) to execute various UI commands for selecting a menu option, destination, file, and so on. In some embodiments, the various touch screen commands described herein may be user-configurable, while in other embodiments they are hard-coded. As will be understood, the various elements of the methods described herein may be performed in any desired order. For example, in some embodiments, a user may select 6003 one or more destinations before selecting 6009 the file(s) to be transmitted, while in other embodiments, a user may select 6009 one or more files before selecting 6003 the desired destination(s). Similarly, other elements of the methods described above may be performed in a variety of sequences or simultaneously, and the methods described herein are not intended to be limited to any particular sequence unless otherwise stated.
It should be noted that the operations described herein are purely illustrative and do not imply any particular order. In addition, such operations may be used in any sequence as appropriate and/or may be used in part. Exemplary flowcharts are provided herein for illustrative purposes and are non-limiting examples of methods. Those of ordinary skill will recognize that an exemplary method may include more or fewer steps than those shown in an illustrative flowchart, and the steps in an exemplary flowchart may be performed in a different order than the one shown.
In describing exemplary embodiments, specific terminology is used for clarity. For the purpose of description, specific terms are intended to include at least all technical and functional equivalents that operate in a similar manner to accomplish a similar purpose. Furthermore, in some instances where a particular exemplary embodiment includes a plurality of system elements or method steps, a single element or step may be substituted for those elements or steps. Similarly, a single element or step may be replaced by a plurality of elements or steps serving the same purpose. In addition, where parameters for various properties are specified for the exemplary embodiments herein, unless otherwise specified, these parameters may be adjusted up or down by one-twentieth, one-tenth, one-fifth, one-third, one-half, and so on, or by rounded approximations thereof.
In view of the above illustrative embodiments, it should be understood that these embodiments may employ various computer-implemented operations that involve transferring or storing data in a computer system. These operations require physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic, and/or optical signals capable of being stored, transferred, combined, compared, and/or otherwise manipulated.
Moreover, any of the operations described herein that form part of the illustrative embodiments are useful machine operations. The illustrative embodiments also relate to a device or apparatus for performing such operations. The device may be specially constructed for the required purpose, or it may be a general-purpose computer device selectively activated or configured by a computer program stored in the computer. In particular, various general-purpose machines employing one or more processors coupled to one or more computer-readable media may be used with computer programs written in accordance with the teachings disclosed herein, or it may be more convenient to construct a more specialized device to perform the required operations.
The foregoing description has been directed to specific illustrative embodiments of the invention. It will be understood, however, that other variations and modifications of the described embodiments may be made to obtain some or all of their associated advantages. In addition, the programs, processes, and / or modules described herein may be implemented in hardware, software (embodied as a computer-readable medium with program instructions), firmware, or a combination thereof. For example, one or more of the functions described herein may be performed by a processor that executes program instructions from a memory or other storage device.
Those skilled in the art will understand that modifications and changes may be made to the systems and methods described above without departing from the inventive concepts disclosed herein. Accordingly, the invention should not be viewed as limited except by the scope and spirit of the appended claims.

3‧‧‧Probe
5‧‧‧Host computer
9‧‧‧Remote display and/or recording device
100‧‧‧Medical ultrasound imaging apparatus/apparatus/portable ultrasound system/system/ultrasound system
101‧‧‧Front panel
102‧‧‧Housing/unit
103‧‧‧Rear panel
104‧‧‧Touch screen display/split touch screen display/display/multi-touch LCD touch screen display
105‧‧‧Surface
106‧‧‧Computer motherboard/computing circuitry
107‧‧‧Touch sensor/sensor
108‧‧‧Ultrasound engine/128-channel ultrasound engine circuit board
109‧‧‧Touch processor
110‧‧‧Battery
112‧‧‧Communication link/link/high-speed serial interface
114‧‧‧Probe connector/connector
115‧‧‧Probe attachment/detachment lever
116‧‧‧Input/output (I/O) port connector
118‧‧‧Communication circuitry/subscriber identity module (SIM) interface circuit
119‧‧‧Subscriber identity module (SIM) card port
120‧‧‧Subscriber identity module (SIM) card
140‧‧‧System
150‧‧‧Transducer housing/probe/ultrasound probe/transducer/handheld transducer probe/probe housing
152‧‧‧Transducer element array/transducer array
154‧‧‧Probe identification circuit
302‧‧‧Tap gesture
304‧‧‧Pinch gesture
306‧‧‧Flick gesture/dynamic, continuous flick gesture
308‧‧‧Rotate gesture
310‧‧‧Double-tap gesture
312‧‧‧Spread gesture
314‧‧‧Flick gesture/dynamic, continuous flick gesture
316‧‧‧Rotate gesture
318‧‧‧Drag gesture/dynamic, continuous drag gesture
320‧‧‧Press gesture
322‧‧‧Press-and-drag gesture
324‧‧‧Palm gesture
340‧‧‧Ultrasound beamforming and imaging operations
342‧‧‧Beamforming and image processing operations
344‧‧‧Select a first display operation
346‧‧‧Adjust beamforming parameters
348‧‧‧Update and display the displayed image
350‧‧‧Perform a different gesture having a different velocity characteristic (direction or speed or both) to adjust a second characteristic of the first ultrasound display operation
352‧‧‧Update the displayed image
402‧‧‧Subset
404‧‧‧Subset
406‧‧‧Subset
408‧‧‧Touch control/2D touch control
410‧‧‧Touch control/gain touch control
412‧‧‧Touch control/color touch control
414‧‧‧Touch control/save touch control
416‧‧‧Touch control/split touch control
418‧‧‧Touch control/PW imaging touch control
420‧‧‧Touch control/beam steering touch control
422‧‧‧Touch control/annotation touch control
424‧‧‧Touch control/dynamic range operation touch control
426‧‧‧Touch control/Teravision™ touch control
428‧‧‧Touch control/map operation touch control
430‧‧‧Touch control/needle guide touch control

502‧‧‧Liver
504‧‧‧Cystic lesion/magnified cystic lesion
506‧‧‧Virtual window
508‧‧‧Finger
602‧‧‧Heart
604‧‧‧Endocardial border
606‧‧‧Left ventricle
607‧‧‧Cursor
608‧‧‧Dotted line
610‧‧‧Finger
612‧‧‧Finger
702‧‧‧Liver
704‧‧‧Cystic lesion
706‧‧‧Virtual window
707‧‧‧First cursor
709‧‧‧Second cursor
710‧‧‧Finger
712‧‧‧Finger
802‧‧‧Liver
804‧‧‧Cystic lesion
806‧‧‧Virtual window
807‧‧‧First cursor/cursor
809‧‧‧Second cursor
810‧‧‧Finger
811‧‧‧Connecting line
812‧‧‧Finger
900‧‧‧Software flowchart
904‧‧‧Step
906‧‧‧Step
908‧‧‧Step
910‧‧‧Step
912‧‧‧Step
914‧‧‧Step
916‧‧‧Step
918‧‧‧Step
920‧‧‧Step
921‧‧‧Step
922‧‧‧Step
924‧‧‧Step
926‧‧‧Step
928‧‧‧Step
930‧‧‧Step
932‧‧‧Step
934‧‧‧Step
936‧‧‧Step
938‧‧‧Step
940‧‧‧Step
942‧‧‧Step
944‧‧‧Step
946‧‧‧Step
948‧‧‧Step
950‧‧‧Step

952‧‧‧Time integral of spectral Doppler mean velocity/handheld palmtop personal computer
956‧‧‧Needle
958‧‧‧System
960‧‧‧Ultrasound transducer element/transducer element
962‧‧‧Needle guide
964‧‧‧Ultrasound reflector disk/reflector disk
966‧‧‧Needle guide mounting bracket
970‧‧‧Ultrasound imaging probe assembly
972‧‧‧Ultrasound waves
974‧‧‧Reflected ultrasound waves
976‧‧‧Distance
978‧‧‧Linear ultrasound acoustic array/ultrasound imaging probe assembly for imaging
980‧‧‧Ultrasound transducer array/transducer element
982‧‧‧Ultrasound imaging probe assembly for imaging a patient's body/ultrasound imaging probe assembly
984‧‧‧Ultrasound transducer array
986‧‧‧System
1010‧‧‧Personal computer (PC)/host computer
1012‧‧‧USB connection/custom or USB3 chipset
1014‧‧‧Microprocessor
1020‧‧‧Interface unit/two-stage beamforming system/interface circuit
1022‧‧‧USB connection/custom USB3 chipset/USB3 chipset/custom or USB3 chipset
1024‧‧‧System controller
1025‧‧‧Connector
1026‧‧‧Field-programmable gate array/field-programmable gate array (FPGA) digital beamforming/communication port
1027‧‧‧Connector
1028‧‧‧A/D converter
1030‧‧‧A/D converter
1032‧‧‧Memory
1034‧‧‧DC-DC converter
1040‧‧‧Integrated ultrasound probe/ultrasound probe/transducer/two-stage beamforming system
1042‧‧‧Power supply
1044‧‧‧Controller
1046‧‧‧Memory
1048‧‧‧Multiplexer 1
1050‧‧‧Transmit driver 1
1052‧‧‧Subarray/aperture/subarray beamformer 1
1054‧‧‧Transmit driver m
1056‧‧‧Multiplexer m
1058‧‧‧Memory
1060‧‧‧Subarray beamformer n
1062‧‧‧1D transducer array
1064‧‧‧Image target
1066‧‧‧Cable
1068‧‧‧Cable
1075‧‧‧Device
1082‧‧‧Host computer
1102‧‧‧Image target
1104‧‧‧Cable
1106‧‧‧High-voltage transmit/receive (TR) module/transmit/receive (TR) module
1108‧‧‧Preamplifier/time-gain compensation (TGC) module
1110‧‧‧Sampled-data beamformer/sample-interpolate receive beamformer/beamformer
1112‧‧‧First-in first-out (FIFO) buffer module
1114‧‧‧Memory
1116‧‧‧System controller/beamformer control processor
1118‧‧‧Communication chipset
1120‧‧‧Communication chipset
1122‧‧‧Core computer-readable memory/memory
1124‧‧‧Microprocessor/beamformer control processor
1126‧‧‧Display controller
1200‧‧‧Circuit board
1202‧‧‧High-density interconnect (HDI) substrate/substrate
1204‧‧‧First integrated circuit chip
1206‧‧‧First spacer layer
1208‧‧‧Second integrated circuit chip
1210‧‧‧Metal frame
1212‧‧‧Wiring
1214‧‧‧Wiring
1216‧‧‧Package

1302‧‧‧Step
1304‧‧‧Step
1306‧‧‧Step
1308‧‧‧Step
1310‧‧‧Step
1312‧‧‧Step
1600‧‧‧Multi-chip module
1602‧‧‧Transmit/receive (TR) chip/chip
1604‧‧‧Amplifier chip/chip
1606‧‧‧Beamformer chip/chip
1608‧‧‧First spacer layer
1610‧‧‧Second spacer layer
1612‧‧‧Metal frame
1614‧‧‧Substrate
1702‧‧‧Wireless network adapter/adapter
1704‧‧‧Input/output (I/O) and graphics chipset
1706‧‧‧Serial or parallel interface
1708‧‧‧Power module
1710‧‧‧First multi-chip module/multi-chip module
1712‧‧‧Second multi-chip module/multi-chip module
1714‧‧‧Clock-generation complex programmable logic device (CPLD)
1718‧‧‧Delay profile and waveform generator field-programmable gate array (FPGA)
1720‧‧‧Memory
1722‧‧‧Scan sequence control field-programmable gate array (FPGA)
1724‧‧‧Power module
1802‧‧‧Stylus
1804‧‧‧Housing connector
1806‧‧‧Flexible cable
1900‧‧‧Main graphical user interface (GUI)
1902‧‧‧Menu bar
1904‧‧‧Image display window
1906‧‧‧Image control bar
1908‧‧‧Toolbar

2000‧‧‧Medical ultrasound imaging apparatus/ultrasound imaging apparatus/device
2010‧‧‧Touch screen display/ultrasound image
2020‧‧‧Ultrasound controls
2030‧‧‧Housing
2040‧‧‧Ultrasound data
2060‧‧‧Front panel
2070‧‧‧Rear panel
2080‧‧‧Subscriber identity module (SIM) card port
2082‧‧‧Subscriber identity module (SIM) card tray
2084‧‧‧Subscriber identity module (SIM) card
2100‧‧‧Cart system/cart configuration
2102‧‧‧Touch screen display
2104‧‧‧Tablet computer
2106‧‧‧Height-adjustment device
2108‧‧‧Cart/cart stand
2110‧‧‧Gel holder
2112‧‧‧Keyboard
2114‧‧‧Storage bin
2120‧‧‧Thermal probe holder
2122‧‧‧Base assembly
2124‧‧‧Full operator console/operator console
2200‧‧‧Cart system/cart assembly
2210‧‧‧Holder
2212‧‧‧Vertical support member/support beam/beam
2214‧‧‧Console panel
2216‧‧‧Multi-port probe multiplexer device/multiplexer device
2218‧‧‧Holder
2222‧‧‧Storage bin attachment mechanism
2224‧‧‧Storage bin/accessory holder
2226‧‧‧Cord management system/height-adjustment device
2228‧‧‧Base
2230‧‧‧Battery
2232‧‧‧Wheels
2300‧‧‧Configuration
2302‧‧‧Tablet computer/system
2304‧‧‧Docking station/base docking unit/base unit/docking element
2305‧‧‧Electrical connector
2306‧‧‧Attachment mechanism/mount assembly
2307‧‧‧Port
2308‧‧‧Hinged member
2310‧‧‧Bracket
2312‧‧‧Vertical member/beam
2400‧‧‧Cart system/configuration
2402‧‧‧Tablet computer
2404‧‧‧Connector/attachment mechanism
2406‧‧‧Mounting assembly/hinged member
2408‧‧‧Vertical support member
2502‧‧‧Docking station
2504‧‧‧Tablet computer
2506‧‧‧Base assembly
2508‧‧‧Release mechanism
2510‧‧‧Transducer probe/transducer probe connector
2512‧‧‧Transducer port
2526‧‧‧Adjustable stand/grip

2600‧‧‧Integrated probe system/integrated ultrasound probe system/system
2602‧‧‧Front-end probe/probe
2604‧‧‧Host computer/Windows®-based host computer
2606‧‧‧Personal digital assistant (PDA)/remote display and/or recording device/remote device/computer (remote device)
2608‧‧‧Communication link/communication link or interface
2610‧‧‧Communication link or interface/wireless link/communication link/wireless communication link
2612‧‧‧Remote computing system
2802‧‧‧Hub
2804‧‧‧Communication link
2906‧‧‧Imaging system
2910‧‧‧Wireless agent
2912‧‧‧Wireless viewer/viewer
3020‧‧‧Image viewer
3024‧‧‧User interface button
3026‧‧‧User interface button
3028‧‧‧User interface button
3030‧‧‧Viewer
3142a–3142n‧‧‧Ultrasound probes
3144a–3144n‧‧‧Laptop computers
3146‧‧‧Image/patient information distribution server
3148‧‧‧Structured query language (SQL) database server
3152‧‧‧Handheld device or other computing device
3264‧‧‧Personal digital assistant (PDA)
3266‧‧‧Wireless communication link
3304‧‧‧Memory
3306‧‧‧Data storage
3308‧‧‧Bus
3310‧‧‧Link interface or data interface circuit/data interface circuit
3312‧‧‧First connection
3314‧‧‧Radio frequency (RF) circuit/radio interface
3316‧‧‧Radio interface
3350‧‧‧Radio frequency (RF) stack
3370‧‧‧User interface circuit
3440‧‧‧Schematic diagram
3446‧‧‧Clinic
3448‧‧‧Clinic
3450‧‧‧Clinic
3502‧‧‧Two-dimensional image window
3504‧‧‧Two-dimensional image scan
3506‧‧‧Elastic frequency scan
3600‧‧‧Touch screen display of a tablet computer
3604‧‧‧Two-dimensional image window
3606‧‧‧Two-dimensional mode imaging/two-dimensional image
3608‧‧‧Motion mode imaging
3700‧‧‧Touch screen display of a tablet computer/tablet computer display
3702‧‧‧Color Doppler scan information
3704‧‧‧Elastic frequency control
3706‧‧‧Two-dimensional image window
3708‧‧‧Color-coded information
3710‧‧‧Two-dimensional image
3800‧‧‧Touch screen display of a tablet computer/tablet computer display
3802‧‧‧Two-dimensional image
3804‧‧‧Adjustable frequency control
3806‧‧‧Mixed operating mode
3808‧‧‧Gray shading
3810‧‧‧Time/Doppler frequency shift
3812‧‧‧Sample volume or sample gate
3900‧‧‧Touch screen display of a tablet computer/tablet computer display
3902‧‧‧Two-dimensional window
3904‧‧‧Color-coded information
3906‧‧‧Overlay/color-coded overlay
3908‧‧‧Sample volume/sample gate
3910‧‧‧Elastic frequency control
3912‧‧‧Time/Doppler frequency shift
3916‧‧‧Two-dimensional image

4000‧‧‧Graphical user interface (GUI) home screen interface/screen interface for a user operating mode/graphical user interface (GUI) home screen/main graphical user interface (GUI) home screen
4002‧‧‧Image display window
4004‧‧‧Menu bar
4006‧‧‧Image control bar
4008‧‧‧Depth control touch control
4010‧‧‧Two-dimensional gain touch control
4012‧‧‧Full-screen touch control
4014‧‧‧Text touch control
4016‧‧‧Split-screen touch control
4018‧‧‧ENV touch control
4022‧‧‧Pulsed-wave Doppler (PWD) touch control
4024‧‧‧Freeze touch control
4026‧‧‧Save touch control
4028‧‧‧Optimize touch control
4100‧‧‧Graphical user interface (GUI) menu screen interface/screen interface for a user operating mode/graphical user interface (GUI) home screen/main graphical user interface (GUI) menu screen
4102‧‧‧Image display window
4104‧‧‧Menu bar
4108‧‧‧Patient touch control
4110‧‧‧Preset touch control
4112‧‧‧Review touch control
4114‧‧‧Report touch control
4116‧‧‧Setup touch control
4120‧‧‧Image control bar
4122‧‧‧Depth control touch control
4124‧‧‧Two-dimensional gain touch control
4126‧‧‧Full-screen touch control
4128‧‧‧Text touch control
4130‧‧‧Split-screen touch control
4132‧‧‧Needle visualization ENV touch control
4136‧‧‧Pulsed-wave Doppler (PWD) touch control
4138‧‧‧Freeze touch control
4140‧‧‧Save touch control
4142‧‧‧Optimize touch control
4200‧‧‧Graphical user interface (GUI) patient data screen interface/screen interface for a user operating mode/graphical user interface (GUI) patient data screen/patient data screen
4202‧‧‧Menu bar
4204‧‧‧New patient touch screen control
4206‧‧‧New study touch screen control
4208‧‧‧Study list touch screen control
4210‧‧‧Worklist touch screen control
4212‧‧‧Edit touch screen control
4214‧‧‧Patient information section
4216‧‧‧Study information section
4218‧‧‧Image control bar
4220‧‧‧Accept study touch control
4222‧‧‧Close study touch control
4224‧‧‧Print touch control
4226‧‧‧Print preview touch control
4228‧‧‧Clear touch control
4230‧‧‧Two-dimensional touch control
4232‧‧‧Freeze touch control
4234‧‧‧Save touch control
4300‧‧‧Graphical user interface (GUI) patient data screen interface/screen interface for a user operating mode/preset screen
4302‧‧‧Menu bar
4304‧‧‧Preset selection mode
4308‧‧‧Image control bar
4310‧‧‧Save settings touch control
4312‧‧‧Delete touch control
4314‧‧‧Color Doppler (CD) touch control
4316‧‧‧Pulsed-wave Doppler (PWD) touch control
4318‧‧‧Freeze touch control
4320‧‧‧Save touch control
4322‧‧‧Optimize touch control

4400‧‧‧Graphical user interface (GUI) review screen interface/screen interface for a user operating mode/review screen
4402‧‧‧Menu bar
4404‧‧‧Preset expanded view
4406‧‧‧Image display window
4408‧‧‧Image
4410‧‧‧Image
4412‧‧‧Image
4414‧‧‧Image
4416‧‧‧Image control bar
4418‧‧‧Thumbnail settings touch control
4420‧‧‧Sync touch control
4422‧‧‧Select touch control
4424‧‧‧Previous image touch control
4426‧‧‧Next image touch control
4428‧‧‧Two-dimensional image touch control
4430‧‧‧Pause image touch control
4432‧‧‧Save image touch control
4500‧‧‧Screen interface for a user operating mode/report screen
4502‧‧‧Menu bar
4504‧‧‧Report expanded view
4506‧‧‧Display screen
4508‧‧‧Image control bar
4510‧‧‧Save touch control
4512‧‧‧Save-as touch control
4514‧‧‧Print touch control
4516‧‧‧Print preview touch control
4518‧‧‧Close study touch control
4520‧‧‧Two-dimensional image touch control
4522‧‧‧Freeze image touch control
4524‧‧‧Save image touch control
4600‧‧‧Screen interface for a user operating mode/review screen
4602‧‧‧Menu bar
4604‧‧‧Report expanded view/setup expanded screen
4606‧‧‧General touch control
4608‧‧‧Display touch control
4610‧‧‧Measurement touch control
4612‧‧‧Annotation touch control
4614‧‧‧Print touch control
4616‧‧‧Save/retrieve touch control
4618‧‧‧Digital Imaging and Communications in Medicine (DICOM) touch control
4620‧‧‧Export touch control
4622‧‧‧Study information image touch control
4624‧‧‧Configuration screen
4626‧‧‧Softkey docking position
4628‧‧‧Image control bar
4630‧‧‧Thumbnail settings touch control
4632‧‧‧Sync touch control
4634‧‧‧Select touch control
4636‧‧‧Previous image touch control
4638‧‧‧Next image touch control
4640‧‧‧Two-dimensional image touch control
4642‧‧‧Pause image touch control
4644‧‧‧Settings control bar
4650‧‧‧Softkey control arrow
4652‧‧‧Softkey control
4662‧‧‧Softkey control
4700‧‧‧Screen interface for a user operating mode/setup screen
4702‧‧‧Menu bar/configuration screen
4704‧‧‧Report expanded view/setup expanded screen/retrospective acquisition
4706‧‧‧General touch control
4708‧‧‧Display touch control
4710‧‧‧Measurement touch control
4712‧‧‧Annotation touch control
4714‧‧‧Print touch control
4716‧‧‧Save/retrieve touch control
4718‧‧‧Digital Imaging and Communications in Medicine (DICOM) touch control
4720‧‧‧Export touch control
4722‧‧‧Study information image touch control
4728‧‧‧Image control bar
4730‧‧‧Thumbnail settings touch control
4732‧‧‧Sync touch control
4734‧‧‧Select touch control
4736‧‧‧Previous image touch control
4738‧‧‧Next image touch control
4740‧‧‧Two-dimensional image touch control
4742‧‧‧Pause image touch control
4744‧‧‧Settings control bar

4880‧‧‧Physical shared memory
4882‧‧‧Shared memory header structure
4884‧‧‧Header array
4886‧‧‧Memory segment
4888‧‧‧Physical system mutex
4890‧‧‧System event
4908‧‧‧Object transfer interface
4916‧‧‧Object Factory interface
5170‧‧‧Home screen
5172‧‧‧Menu bar
5174‧‧‧Image display window
5176‧‧‧Image control bar
5178‧‧‧Toolbar
5180‧‧‧Toolbar
5182‧‧‧Toolbar
5184‧‧‧Toolbar
5186‧‧‧Toolbar
5202‧‧‧Nested-level file directory
5400‧‧‧Configuration
5402‧‧‧y-direction/y-axis/y-plane
5404‧‧‧x-direction/x-axis/x-plane
5406‧‧‧z-axis/z-plane
5408‧‧‧Polarization axis
5410‧‧‧Elevation axis
5412‧‧‧Configuration
5414‧‧‧Ultrasound image
5420‧‧‧Elevation axis
5422‧‧‧Polarization axis
5424‧‧‧Ultrasound image
5502‧‧‧Top array
5504‧‧‧Bottom array
5506‧‧‧High-voltage drive pulse
5508‧‧‧High-voltage drive pulse
5510‧‧‧High-voltage drive pulse
5512‧‧‧Array
5602‧‧‧High-voltage pulse
5604‧‧‧High-voltage pulse
5606‧‧‧High-voltage pulse
5610‧‧‧Array
5612‧‧‧Top array
5614‧‧‧Bottom array
5704‧‧‧Select bottom array
5708‧‧‧Select top array
5710‧‧‧High-voltage driver
5712‧‧‧High-voltage driver
5802‧‧‧Apical two-chamber image
5804‧‧‧Apical four-chamber image
5902‧‧‧Image/apical 2CH view
5904‧‧‧Image/apical 4CH view
6002‧‧‧Step
6003‧‧‧Step
6004‧‧‧Step
6005‧‧‧Step
6006‧‧‧Step
6007‧‧‧Step
6008‧‧‧Step
6009‧‧‧Step
6010‧‧‧Step
A, a‧‧‧Method
B, b‧‧‧Method
C, c‧‧‧Method
D, d‧‧‧Method

The foregoing and other objects, aspects, features, and advantages of the exemplary embodiments will be more fully understood and appreciated from the following description taken in conjunction with the accompanying drawings, in which:

圖1係根據本發明之一例示性實施例之例示性醫療超聲波成像設備之一平面視圖; 1 is a plan view of an exemplary medical ultrasound imaging apparatus according to an exemplary embodiment of the present invention;

圖2A及圖2B係根據本發明之較佳實施例之醫療超聲波成像系統之側視圖; 2A and 2B are side views of a medical ultrasound imaging system according to a preferred embodiment of the present invention;

圖3A繪示根據本發明之較佳實施例之可經採用作為至醫療超聲波成像系統之使用者輸入之例示性單點及多點手勢; 3A illustrates exemplary single-point and multi-point gestures that can be adopted as user input to a medical ultrasound imaging system according to a preferred embodiment of the present invention;

圖3B繪示根據本發明之較佳實施例之用於操作一平板電腦超聲波系統之一處理程序流程圖; 3B illustrates a flowchart of a processing procedure for operating a tablet computer ultrasound system according to a preferred embodiment of the present invention;

圖3C至圖3K繪示調整波束成形及顯示操作之觸控螢幕手勢之細節; 3C to 3K illustrate details of touch screen gestures for adjusting beamforming and display operations;

圖4A至圖4C繪示根據本發明之較佳實施例之可實施於醫療超聲波成像系統上之觸控控制項之例示性子集; 4A to 4C illustrate an exemplary subset of touch control items that can be implemented on a medical ultrasound imaging system according to a preferred embodiment of the present invention;

圖5A及圖5B係根據本發明之較佳實施例之在醫療超聲波成像系統之一觸控螢幕顯示器上之具有一囊性病變之一肝臟之例示性表示; 5A and 5B are exemplary representations of a liver with a cystic lesion on a touch screen display of a medical ultrasound imaging system according to a preferred embodiment of the present invention;

圖5C及圖5D係圖5A及圖5B之觸控螢幕顯示器上之肝臟及囊性病變之例示性表示,包含對應於該肝臟之一經放大部分之一虛擬視窗; 5C and 5D are exemplary representations of liver and cystic lesions on the touch screen display of FIGS. 5A and 5B, including a virtual window corresponding to an enlarged portion of the liver;

圖6A係醫療超聲波成像系統之觸控螢幕顯示器上之一心臟之一心尖四(4)腔室視圖之一例示性表示; FIG. 6A is an exemplary representation of a view of one of the four apical four (4) chambers of a heart on a touch screen display of a medical ultrasound imaging system;

圖6B至圖6E繪示圖6A之觸控螢幕顯示器上之心臟之一左心室之一心內膜邊界之一例示性手動追蹤; 6B to 6E illustrate an exemplary manual tracking of one of the left ventricles and one of the endocardial boundaries of the heart on the touch screen display of FIG. 6A;

圖7A至圖7C繪示圖5C及圖5D之虛擬視窗內之肝臟上之囊性病變之尺寸之一例示性量測; 7A to 7C illustrate an exemplary measurement of the size of a cystic lesion on the liver in the virtual windows of FIGS. 5C and 5D;

圖8A至圖8C繪示圖5C及圖5D之虛擬視窗內之肝臟上之囊性病變之一例示性卡尺量測; 8A to 8C illustrate an exemplary caliper measurement of one of cystic lesions on the liver in the virtual windows of FIGS. 5C and 5D;

圖9A繪示附接至處理器殼體之複數個傳感器陣列之一者; 9A illustrates one of a plurality of sensor arrays attached to a processor housing;

FIG. 9B shows a software flow chart of a transducer management module within an ultrasound application, according to an exemplary embodiment;

FIG. 9C shows a perspective view of a needle sensing positioning system, according to an exemplary embodiment;

FIG. 9D shows a perspective view of a needle guide, according to an exemplary embodiment;

FIG. 9E shows a perspective view of a needle sensing positioning system, according to an exemplary embodiment;

FIG. 9F illustrates an exemplary system configured to accept a subscriber identity module (SIM) card for wireless communication;

FIG. 10A shows an exemplary method of measuring heart wall motion;

FIG. 10B shows a schematic block diagram of an integrated ultrasound probe, according to an exemplary embodiment;

FIG. 10C shows an alternative schematic block diagram of an integrated ultrasound probe, according to an exemplary embodiment;

FIG. 11 is a detailed schematic block diagram of an exemplary embodiment of an ultrasound engine (i.e., the front-end ultrasound-specific circuitry) and of a computer motherboard (i.e., the host computer) of an exemplary ultrasound device;

FIG. 12 depicts a schematic side view of a circuit board including a multi-chip module assembled in a vertically stacked configuration;

FIG. 13 is a flow chart of an exemplary method of fabricating a circuit board including a multi-chip module assembled in a vertically stacked configuration;

FIG. 14A is a schematic side view of a multi-chip module including four vertically stacked dies, in which the dies are spaced apart from one another by passivated silicon layers with a 2-in-1 dicing die-attach film (D-DAF);

FIG. 14B is a schematic side view of a multi-chip module including four vertically stacked dies, in which the dies are spaced apart from one another by DA film-based adhesives acting as die-to-die spacers;

FIG. 14C is a schematic side view of a multi-chip module including four vertically stacked dies, in which the dies are spaced apart from one another by DA paste- or film-based adhesives acting as die-to-die spacers;

FIG. 15 is a flow chart of another exemplary method of die-to-die stacking using (a) passivated silicon layers with a 2-in-1 dicing die-attach film (D-DAF), (b) DA paste, (c) thick DA film, and (d) film-over-wire (FOW) including a 2-in-1 D-DAF;

FIG. 16 is a schematic side view of a multi-chip module including an ultrasound transmit/receive IC chip, an amplifier IC chip, and an ultrasound beamformer IC chip vertically integrated in a vertically stacked configuration;

FIG. 17 is a detailed schematic block diagram of an exemplary embodiment of an ultrasound engine (i.e., the front-end ultrasound-specific circuitry) and of a computer motherboard (i.e., the host computer) provided as a single-board complete ultrasound system;

FIG. 18 is a perspective view of an exemplary portable ultrasound system provided in accordance with exemplary embodiments;

FIG. 19 illustrates an exemplary view of a main graphical user interface (GUI) presented on a touch screen display of the exemplary portable ultrasound system of FIG. 18;

FIG. 20A is a top view of a medical ultrasound imaging system according to another preferred embodiment of the invention;

FIG. 20B is a top view of a medical ultrasound imaging system configured to receive a wireless SIM card, according to another embodiment of the invention;

FIG. 21 illustrates a preferred cart system for a tablet ultrasound system, according to preferred embodiments of the invention;

FIG. 22 illustrates a preferred cart system for a modular ultrasound imaging system, according to preferred embodiments of the invention;

FIG. 23A illustrates a preferred cart system for a modular ultrasound imaging system, according to preferred embodiments of the invention;

FIG. 23B illustrates an alternative cart system for a modular ultrasound imaging system configured to receive a wireless SIM card, according to another embodiment of the invention;

FIG. 24 illustrates a preferred cart system for a modular ultrasound imaging system, according to preferred embodiments of the invention;

FIGS. 25A-25B illustrate a multi-function docking base for a tablet ultrasound device;

FIG. 26A illustrates an integrated probe system configured according to an embodiment of the invention;

FIG. 26B illustrates a wireless communication link between a probe and a host computer, according to an embodiment of the invention;

FIG. 26C illustrates a wireless ultrasound system according to an embodiment of the invention;

FIG. 27 illustrates an alternative wireless ultrasound system according to an embodiment of the invention;

FIG. 28 illustrates an alternative integrated probe system configured according to another embodiment of the invention;

FIG. 29 illustrates the provision of wireless access to images generated by an ultrasound imaging system, according to an embodiment of the invention;

FIG. 30 illustrates an image viewer communicating with a personal computer, according to an embodiment of the invention;

FIG. 31 illustrates an exemplary ultrasound image collection and distribution system;

FIG. 32 illustrates an ultrasound imaging system having a wireless communication link between a remote computing device and a probe, according to an embodiment of the invention;

FIG. 33 illustrates a data processing and storage system for wireless operation;

FIG. 34 is a schematic diagram illustrating an imaging and telemedicine system integrating an ultrasound system, according to an embodiment of the invention;

FIG. 35 illustrates a 2D imaging mode of operation using a modular ultrasound imaging system, according to an embodiment of the invention;

FIG. 36 illustrates a motion mode of operation using a modular ultrasound imaging system, according to an embodiment of the invention;

FIG. 37 illustrates a color Doppler mode of operation using a modular ultrasound imaging system, according to an embodiment of the invention;

FIG. 38 illustrates a pulsed-wave Doppler mode of operation using a modular ultrasound imaging system, according to an embodiment of the invention;

FIG. 39 illustrates a triplex scan mode of operation using a modular ultrasound imaging system, according to an embodiment of the invention;

FIG. 40 illustrates a GUI Home Screen interface for a user mode of operation of a modular ultrasound imaging system, according to an embodiment of the invention;

FIG. 41 illustrates a GUI Menu screen interface for a user mode of operation of a modular ultrasound imaging system, according to another embodiment of the invention;

FIG. 42 illustrates a GUI Patient Data screen interface for a user mode of operation of a modular ultrasound imaging system, according to an embodiment of the invention;

FIG. 43 illustrates a GUI Presets screen interface for a user mode of operation of a modular ultrasound imaging system, according to an embodiment of the invention;

FIG. 44 illustrates a GUI Review screen interface for a user mode of operation of a modular ultrasound imaging system, according to an embodiment of the invention;

FIG. 45 illustrates a GUI Report screen interface for a user mode of operation of a modular ultrasound imaging system, according to an embodiment of the invention;

FIGS. 46A-46C illustrate GUI Setup display screen interfaces for a user mode of operation of a modular ultrasound imaging system, according to an embodiment of the invention;

FIG. 47 illustrates a GUI Setup Store/Acquire screen interface for a user mode of operation of a modular ultrasound imaging system, according to an embodiment of the invention;

FIG. 48 is a block diagram illustrating the structure of a physical shared memory, according to an embodiment of the invention;

FIG. 49 illustrates a shared memory system enabling communication between ultrasound and non-ultrasound operations;

FIG. 50 is a view of a graphical user interface configured according to an embodiment of the invention;

FIG. 51 illustrates a home screen display of a graphical user interface, according to an embodiment of the invention;

FIGS. 52A-52C show alternate displays of a graphical user interface, according to another embodiment of the invention;

FIGS. 53A-53B illustrate a patient folder and an image folder of a graphical user interface, according to an embodiment of the invention;

FIGS. 54A-54C illustrate an XY biplane probe including two one-dimensional (1D) multi-element arrays, according to a preferred embodiment of the invention;

FIG. 55 illustrates the operation of a biplane image-forming xy probe, according to an embodiment of the invention;

FIG. 56 illustrates the operation of a biplane image-forming xy probe, according to another embodiment of the invention;

FIG. 57 illustrates a high-voltage drive circuit of a biplane image-forming xy probe, according to an embodiment of the invention;

FIGS. 58A-58B illustrate simultaneous biplane assessment of left ventricular condition, according to an embodiment of the invention;

FIGS. 59A-59B illustrate ejection fraction probe measurement techniques, according to preferred embodiments of the invention; and

FIG. 60 illustrates an exemplary method of wirelessly transmitting data to and from a portable ultrasound imaging device.
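The ejection fraction probe measurement techniques of FIGS. 59A-59B build on the standard volumetric definition of left-ventricular ejection fraction, EF = (EDV − ESV) / EDV, where EDV and ESV are the end-diastolic and end-systolic volumes. A minimal sketch of that arithmetic (the function name and the example volumes are illustrative assumptions, not values from the specification):

```python
def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """Left-ventricular ejection fraction from end-diastolic (EDV) and
    end-systolic (ESV) volumes in milliliters: EF = (EDV - ESV) / EDV."""
    if edv_ml <= 0 or esv_ml < 0 or esv_ml > edv_ml:
        raise ValueError("volumes must satisfy 0 <= ESV <= EDV and EDV > 0")
    return (edv_ml - esv_ml) / edv_ml

# Example: illustrative volumes only
print(f"{ejection_fraction(120.0, 50.0):.0%}")  # prints "58%"
```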

Claims (37)

1. A mobile medical ultrasound imaging device, comprising: a transducer probe including a transducer array; a tablet housing, the housing having a front panel; a computer in the housing, the computer including at least one processor and at least one memory; a touch screen display that displays an ultrasound image using a graphical user interface (GUI), the touch screen display being positioned on the front panel; the computer being communicatively connected to ultrasound beamformer processing circuitry that receives image signals from the transducer array, the computer being operable, in response to a first gesture input from the touch screen display, to alter an operation of the ultrasound beamformer processing circuitry; wherein the graphical user interface includes a plurality of icons displayed on the display relative to an ultrasound image display window, and wherein the touch screen display includes touch-actuated control of scan depth.

2. The device of claim 1, wherein the first gesture input corresponds to a movement gesture on the touch screen display.

3. The device of claim 1, wherein the transducer array comprises a biplane transducer array.

4. The device of claim 3, further comprising a second input, the second input comprising a double-tap gesture against the touch screen display.

5. The device of claim 3, further comprising, in response to the second input from the touch screen display, displaying a first cursor within a region of a virtual window, the virtual window displaying a magnified image.
6. The device of claim 5, further comprising receiving, at the computer, a third input from the touch screen display, the third input being received within the region of the virtual window.

7. The device of claim 6, wherein the third input corresponds to a drag gesture on the touch screen display.

8. The device of claim 6, further comprising, in response to the third input from the touch screen display, moving the first cursor to a first position within the region of the virtual window.

9. The device of claim 1, wherein a further input corresponds to a press gesture against the touch screen display.

10. The device of claim 9, further comprising receiving, at the computer, a second further input from the touch screen display, the second further input being received substantially simultaneously with the further input.

11. The device of claim 1, further comprising a plurality of transducer arrays operated by beamformer processing circuitry in the tablet housing.

12. The device of claim 10, further comprising, in response to the second further input from the touch screen display, fixing the first cursor at the first position within the region of the virtual window.

13. The device of claim 12, further comprising performing, by the computer, at least one measurement on the ultrasound image based at least in part on the first cursor at the first position.

14. The device of claim 12, further comprising receiving, at the computer, a third further input from the touch screen display.
15. The device of claim 14, wherein the third further input corresponds to a double-tap gesture against the touch screen display.

16. The device of claim 14, further comprising, in response to the third further input from the touch screen display, displaying a second cursor at a second position within the region of the virtual window.

17. The device of claim 16, wherein the computer processes at least one measurement of the ultrasound image based at least in part on the respective positions of the first and second cursors within the region of the virtual window.

18. The device of claim 9, wherein the further input corresponds to a press-and-drag gesture against the touch screen display.

19. The device of claim 1, further comprising a needle guide device.

20. The device of claim 1, further comprising a plurality of transducer connectors, the device being operable to select a transducer from among a plurality of connected transducers.

21. The device of claim 19, further comprising simultaneously operating an imaging process and a networked communication protocol in a second window on the display.

22. The device of claim 1, further comprising a transducer array connected to the housing using a transducer connector, the transducer array communicating with the ultrasound beamformer processing circuitry in the transducer probe.

23. The device of claim 1, wherein the housing has a volume of less than 2500 cubic centimeters.
24. The device of claim 1, further comprising a wireless network connection connecting the device to a public access network such as the Internet.

25. The device of claim 1, further comprising a wireless card port and a card reader.

26. A method of operating a portable medical ultrasound imaging apparatus, the portable medical ultrasound imaging system comprising: a transducer probe; a housing in a tablet form factor, the housing having a front panel; a computer disposed in the housing, the computer including at least one processor and at least one memory; a touch screen display for displaying an ultrasound image, the touch screen display being disposed on the front panel; the computer being connected to an ultrasound beamformer circuit, and the touch screen display being communicatively coupled to the computer, the method comprising the steps of: receiving, at the computer, a first input from the touch screen display; in response to the first input from the touch screen display, tracing a feature of the displayed ultrasound image; receiving, at the computer, a second input from the touch screen display, the second input being received substantially simultaneously with a portion of the first input; and in response to the second input from the touch screen display, completing the tracing of the feature of the displayed ultrasound image.

27. The method of claim 26, wherein the first input corresponds to a press-and-drag gesture against the touch screen display.
28. The method of claim 26, wherein the second input corresponds to a tap gesture against the touch screen display.

29. The method of claim 26, further comprising receiving, at the computer, a third input actuating an icon of a graphical user interface operating with the touch screen display.

30. The method of claim 29, wherein the third input corresponds to a double-tap gesture against the touch screen display.

31. The method of claim 29, further comprising, in response to the third input from the touch screen display, displaying a first cursor within a region of the touch screen display.

32. The method of claim 26, further comprising receiving, at the computer, a fourth input from the touch screen display.

33. The method of claim 32, wherein the fourth input corresponds to a drag gesture on the touch screen display.

34. The method of claim 32, further comprising, in response to the fourth input from the touch screen display, moving the first cursor to a first position within the region of the touch screen display.

35. The method of claim 26, wherein the tracing of the feature of the ultrasound image begins with a first cursor at a first position within a region of a display window on the touch screen display.

36. The method of claim 26, further comprising performing, by the computer, at least one measurement on the ultrasound image based at least in part on the tracing of the predetermined feature of the ultrasound image.
37. The method of claim 26, further comprising connecting to a wireless data network using a card inserted into the tablet computer having a card reader circuit.
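The gesture-driven tracing method recited in the claims can be pictured as a small state machine: press-and-drag events extend a trace of a feature over the displayed image, a tap completes the trace, and the computer then derives a measurement from the traced points. The sketch below is an illustrative assumption of one such implementation (the class and method names are not from the patent), not the claimed apparatus:

```python
from dataclasses import dataclass, field
from math import hypot
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in image coordinates

@dataclass
class FeatureTrace:
    """Accumulates press-and-drag touch points tracing a feature of the
    displayed ultrasound image, and reports a measurement once a tap
    completes the trace (hypothetical sketch)."""
    points: List[Point] = field(default_factory=list)
    complete: bool = False

    def on_press_drag(self, point: Point) -> None:
        # First input: each press/drag event extends the trace.
        if not self.complete:
            self.points.append(point)

    def on_tap(self) -> None:
        # Second input: a tap completes tracing of the feature.
        self.complete = True

    def length(self) -> float:
        # Example measurement: cumulative path length of the traced feature.
        return sum(
            hypot(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(self.points, self.points[1:])
        )

trace = FeatureTrace()
for p in [(0.0, 0.0), (3.0, 4.0), (6.0, 8.0)]:  # drag along the feature
    trace.on_press_drag(p)
trace.on_tap()                                  # complete the trace
print(trace.complete, trace.length())           # prints "True 10.0"
```

In pixel coordinates the path length would still need scaling by the scan-depth calibration before it represents a physical distance; that conversion is omitted here.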
TW108112049A 2013-09-25 2014-09-25 Tablet ultrasound system TWI710356B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/037,106 2013-09-25
US14/037,106 US9877699B2 (en) 2012-03-26 2013-09-25 Tablet ultrasound system

Publications (2)

Publication Number Publication Date
TW201927247A true TW201927247A (en) 2019-07-16
TWI710356B TWI710356B (en) 2020-11-21

Family

ID=51703410

Family Applications (2)

Application Number Title Priority Date Filing Date
TW108112049A TWI710356B (en) 2013-09-25 2014-09-25 Tablet ultrasound system
TW103133359A TWI659727B (en) 2013-09-25 2014-09-25 Tablet ultrasound system

Family Applications After (1)

Application Number Title Priority Date Filing Date
TW103133359A TWI659727B (en) 2013-09-25 2014-09-25 Tablet ultrasound system

Country Status (2)

Country Link
TW (2) TWI710356B (en)
WO (1) WO2015048327A2 (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9877699B2 (en) 2012-03-26 2018-01-30 Teratech Corporation Tablet ultrasound system
US10667790B2 (en) 2012-03-26 2020-06-02 Teratech Corporation Tablet ultrasound system
EP3122257B1 (en) * 2014-03-27 2021-02-17 B-K Medical ApS Ultrasound imaging system touchscreen user interface
US20190365350A1 (en) * 2016-11-16 2019-12-05 Teratech Corporation Portable ultrasound system
EP3360486A1 (en) 2017-02-13 2018-08-15 Koninklijke Philips N.V. Ultrasound evaluation of anatomical features
US10856843B2 (en) 2017-03-23 2020-12-08 Vave Health, Inc. Flag table based beamforming in a handheld ultrasound device
US11446003B2 (en) 2017-03-27 2022-09-20 Vave Health, Inc. High performance handheld ultrasound
US10469846B2 (en) 2017-03-27 2019-11-05 Vave Health, Inc. Dynamic range compression of ultrasound images
US11531096B2 (en) 2017-03-23 2022-12-20 Vave Health, Inc. High performance handheld ultrasound
CN107728980A (en) * 2017-09-30 2018-02-23 广州广电银通金融电子科技有限公司 Veneer double-screen intelligent terminal control method is avoided the peak hour with device, intelligence joins cabinet system
IL277247B1 (en) * 2018-04-24 2024-03-01 Supersonic Imagine Ultrasound imaging system
US11237867B2 (en) 2018-04-27 2022-02-01 Mitsubishi Electric Corporation Determining an order for launching tasks by data processing device, task control method, and computer readable medium
CN109512457B (en) * 2018-10-15 2021-06-29 东软医疗系统股份有限公司 Method, device and equipment for adjusting gain compensation of ultrasonic image and storage medium
US11543508B2 (en) * 2018-11-30 2023-01-03 Fujifilm Sonosite, Inc. System and method for time-gain compensation control
CN109480903A (en) * 2018-12-25 2019-03-19 无锡祥生医疗科技股份有限公司 Imaging method, the apparatus and system of ultrasonic diagnostic equipment
WO2021020038A1 (en) * 2019-07-26 2021-02-04 富士フイルム株式会社 Ultrasonic diagnostic device and method for controlling ultrasonic diagnostic device
GB2595469A (en) * 2020-05-26 2021-12-01 Sezanne Marine Ltd An imaging system head, the imaging system, and associated methods
WO2022035931A1 (en) * 2020-08-12 2022-02-17 DTEN, Inc. Mode control and content sharing

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9402601B1 (en) 1999-06-22 2016-08-02 Teratech Corporation Methods for controlling an ultrasound imaging procedure and providing ultrasound images to an external non-ultrasound application via a network
US6969352B2 (en) 1999-06-22 2005-11-29 Teratech Corporation Ultrasound probe with integrated electronics
US7066887B2 (en) 2003-10-21 2006-06-27 Vermon Bi-plane ultrasonic probe
EP2267482A1 (en) * 2003-11-26 2010-12-29 Teratech Corporation Modular portable ultrasound systems
EP1817653A1 (en) * 2004-10-12 2007-08-15 Koninklijke Philips Electronics N.V. Ultrasound touchscreen user interface and display
US20090198132A1 (en) * 2007-08-10 2009-08-06 Laurent Pelissier Hand-held ultrasound imaging device having reconfigurable user interface
US9414804B2 (en) * 2007-08-24 2016-08-16 General Electric Company Diagnostic imaging device having protective facade and method of cleaning and disinfecting same
US20090177086A1 (en) * 2008-01-09 2009-07-09 Erik Normann Steen Method and apparatus for selectively enhancing ultrasound image data
TWI406684B (en) * 2008-01-16 2013-09-01 Univ Chang Gung Apparatus and method for real-time temperature measuring with the focused ultrasound system
WO2010051587A1 (en) * 2008-11-07 2010-05-14 Signostics Limited Dynamic control of medical device user interface
TWI380014B (en) * 2008-11-18 2012-12-21 Ind Tech Res Inst An ultrasonic imaging equipment and method
JP5566766B2 (en) * 2009-05-29 2014-08-06 株式会社東芝 Ultrasonic diagnostic apparatus, image display apparatus, image display method, and display method
TWI378255B (en) * 2009-09-30 2012-12-01 Pai Chi Li Ultrasonic image processing system and ultrasonic image processing method thereof
KR101948645B1 (en) * 2011-07-11 2019-02-18 삼성전자 주식회사 Method and apparatus for controlling contents using graphic object

Also Published As

Publication number Publication date
WO2015048327A3 (en) 2015-07-02
TW201531283A (en) 2015-08-16
TWI710356B (en) 2020-11-21
WO2015048327A8 (en) 2016-01-14
TWI659727B (en) 2019-05-21
WO2015048327A2 (en) 2015-04-02

Similar Documents

Publication Publication Date Title
TWI659727B (en) Tablet ultrasound system
US20220304661A1 (en) Tablet ultrasound system
US20160228091A1 (en) Tablet ultrasound system
US20210052256A1 (en) Ultrasound probe with integrated electronics
US20210015456A1 (en) Devices and Methods for Ultrasound Monitoring
US20190365350A1 (en) Portable ultrasound system
US20190336101A1 (en) Portable ultrasound system
US11547382B2 (en) Networked ultrasound system and method for imaging a medical procedure using an invasive probe
US10847264B2 (en) Resource management in a multi-modality medical system
JP6081446B2 (en) Multidisciplinary medical sensing system and method
US20230181160A1 (en) Devices and methods for ultrasound monitoring
JP2014519861A5 (en)
TWI834668B (en) Portable ultrasound system
TW202004774A (en) Portable ultrasound system