CN112741648A - Method and system for multi-mode ultrasound imaging - Google Patents

Method and system for multi-mode ultrasound imaging

Info

Publication number
CN112741648A
CN112741648A (Application CN202011106392.1A)
Authority
CN
China
Prior art keywords
mode
imaging system
ultrasound imaging
ultrasound
images
Prior art date
Legal status
Pending
Application number
CN202011106392.1A
Other languages
Chinese (zh)
Inventor
Yelena Viktorovna Tsymbalenko (叶莲娜·维克托洛夫娜·齐姆巴连科)
Michael Washburn (迈克尔·沃什伯恩)
Current Assignee
GE Precision Healthcare LLC
Original Assignee
GE Precision Healthcare LLC
Priority date
Filing date
Publication date
Application filed by GE Precision Healthcare LLC
Publication of CN112741648A

Classifications

    • A61B 6/505: Clinical applications involving diagnosis of bone
    • A61B 8/06: Measuring blood flow
    • A61B 5/4887: Locating particular structures in or on the body
    • A61B 5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 8/14: Echo-tomography
    • A61B 8/4254: Determining the position of the probe using sensors mounted on the probe
    • A61B 8/468: Special input means allowing annotation or message recording
    • A61B 8/469: Special input means for selection of a region of interest
    • A61B 8/488: Diagnostic techniques involving Doppler signals
    • A61B 8/5223: Processing of medical diagnostic data for extracting a diagnostic or physiological parameter
    • A61B 8/5246: Combining images from the same or different imaging techniques, e.g. color Doppler and B-mode
    • A61B 8/54: Control of the diagnostic device
    • A61B 8/58: Testing, adjusting or calibrating the diagnostic device
    • A61B 5/065: Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe


Abstract

The invention provides a method and system for multi-mode ultrasound imaging. Various methods and systems are provided for automatically or semi-automatically adjusting one or more ultrasound imaging parameters for imaging in a second mode based on an image obtained in a first mode. In one example, a method includes operating the ultrasound imaging system in a first operating mode, determining an anatomical structure imaged by the ultrasound imaging system in the first operating mode, and, in response to an operating mode transition request, adjusting an imaging parameter of the ultrasound imaging system in a second operating mode based on the first operating mode and the anatomical structure imaged in the first operating mode.

Description

Method and system for multi-mode ultrasound imaging
Technical Field
Embodiments of the subject matter disclosed herein relate to ultrasound imaging.
Background
Medical diagnostic ultrasound imaging systems typically include a set of selectable imaging modes, such as B-mode and color flow Doppler mode. The ultrasound imaging system may operate in a selected imaging mode and may be adjusted to operate in a different imaging mode according to user preferences. For B-mode imaging, the ultrasound imaging system generates a two-dimensional image of the tissue, where the brightness of each pixel corresponds to the intensity of the echo. Alternatively, in color flow imaging mode, the Doppler effect is used to detect the presence of blood flow in the body. The flow velocity at a given location in a vessel can be estimated from the measured Doppler shift after correcting for the Doppler angle between the ultrasound beam and the vessel orientation.
Disclosure of Invention
In one embodiment, a method includes operating an ultrasound imaging system in a first operating mode, determining an anatomical structure imaged by the ultrasound imaging system in the first operating mode, and adjusting imaging parameters of the ultrasound imaging system in a second operating mode based on the first operating mode and the anatomical structure imaged in the first operating mode in response to an operating mode transition request.
It should be appreciated that the brief description above is provided to introduce in simplified form a selection of concepts that are further described in the detailed description. It is not meant to identify key or essential features of the claimed subject matter, the scope of which is defined uniquely by the claims that follow the detailed description. Furthermore, the claimed subject matter is not limited to implementations that solve any disadvantages noted above or in any part of this disclosure.
Drawings
This patent or patent application document contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the office upon request and payment of the necessary fee.
The disclosure will be better understood from a reading of the following description of non-limiting embodiments with reference to the attached drawings, in which:
Fig. 1 illustrates an exemplary ultrasound imaging system according to one embodiment.
Fig. 2 shows a flow chart illustrating a method for controlling ultrasound imaging parameters based on user input.
Fig. 3 shows an ultrasound image of a first anatomical structure acquired in a first imaging mode.
Fig. 4 shows an ultrasound image of the first anatomical structure of fig. 3 acquired in the second imaging mode without assisted selection of ultrasound imaging parameters.
Fig. 5 shows an ultrasound image of the first anatomical structure of fig. 3-4 acquired in a second imaging mode with assisted selection of ultrasound imaging parameters.
Fig. 6 shows an ultrasound image of a second anatomical structure acquired in a first imaging mode.
Fig. 7 shows an ultrasound image of the second anatomical structure of fig. 6 acquired in a second imaging mode without assisted selection of ultrasound imaging parameters.
Fig. 8 shows an ultrasound image of the second anatomical structure of fig. 6-7 acquired in a second imaging mode with assisted selection of ultrasound imaging parameters.
Detailed Description
The following description relates to various embodiments of ultrasound imaging using an ultrasound imaging system, such as the ultrasound imaging system shown in Fig. 1. The ultrasound imaging system is configured to operate in at least a first imaging mode and a second imaging mode, such as B-mode and color flow Doppler mode. The ultrasound imaging system may transition between the two modes in response to input from an operator of the ultrasound imaging system, such as a clinician. Following a request to transition from the first mode to the second mode, the ultrasound imaging system generates recommended imaging parameters for operating in the second imaging mode based on images acquired while operating in the first imaging mode, as shown in the flow chart of Fig. 2. The recommended imaging parameters may be generated according to an algorithm stored in a memory of the ultrasound imaging system and may be based on the anatomy being imaged.
For example, when imaging a first anatomical structure in a first mode as shown in Fig. 3, the operator may transition the ultrasound imaging system to imaging in a second mode, and the ultrasound imaging system may provide recommended imaging parameters (e.g., imaging parameter values) for imaging the same structure in the second imaging mode, as shown in Fig. 5. The operator may accept the recommended imaging parameters, or may reject them and continue imaging in the second mode without the recommended imaging parameters, as shown in Fig. 4. When a different, second anatomical structure is imaged in the first mode, as shown in Fig. 6, transitioning the ultrasound imaging system to the second mode may produce different recommended imaging parameters than those recommended when the first anatomical structure was imaged, as shown in Fig. 8. The operator may likewise reject the recommended imaging parameters and continue imaging in the second mode without them, as shown in Fig. 7.
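The transition workflow described above can be illustrated with a minimal sketch. The code below is not taken from the patent; the helper name classify_anatomy, the preset table, and all parameter values are hypothetical placeholders for the model and presets the description refers to.

```python
# Minimal sketch of the mode-transition workflow described above.
# All helper names and data values are hypothetical placeholders.

RECOMMENDED_PRESETS = {
    # (anatomy, target_mode) -> example imaging-parameter values
    ("kidney", "color_flow"): {"pulse_repetition_frequency_hz": 1000, "wall_filter": "low"},
    ("aorta", "color_flow"): {"pulse_repetition_frequency_hz": 3500, "wall_filter": "high"},
}

def classify_anatomy(recent_images):
    """Stand-in for the deep-learning/image-processing model that
    identifies the anatomy from the last few acquired images."""
    return "kidney"  # placeholder result

def handle_mode_transition_request(recent_images, target_mode):
    anatomy = classify_anatomy(recent_images)
    recommendation = RECOMMENDED_PRESETS.get((anatomy, target_mode))
    if recommendation is None:
        return None  # no recommendation available; operator sets parameters manually
    # The operator may accept or reject the recommendation (Figs. 4-5, 7-8).
    return {"anatomy": anatomy, "recommended_params": recommendation}

print(handle_mode_transition_request(recent_images=[], target_mode="color_flow"))
```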
By providing recommended imaging parameters when transitioning between imaging modes, the image quality of an ultrasound imaging system may be improved. For example, providing recommended imaging parameters based on the anatomical structure being imaged may increase the clarity of various features of the anatomical structure in images generated by the ultrasound imaging system. Furthermore, by providing recommended imaging parameters when transitioning between imaging modes, an operator may more easily select imaging parameters that will increase image sharpness without having to individually adjust each imaging parameter. Thus, the cognitive load of the operator can be reduced, and the amount of time to image the subject can be reduced.
Referring now to Fig. 1, a schematic diagram of an ultrasound imaging system 100 is shown, according to one embodiment. The ultrasound imaging system 100 includes a transmit beamformer 101 and a transmitter 102 that drive elements 104 within a transducer array of an ultrasound probe 106 to transmit pulsed ultrasound signals into the body of a subject (e.g., a patient, not shown). The ultrasound probe 106 may, for example, comprise a linear array probe, a curved array probe, a sector probe, or any other type of ultrasound probe configured to acquire both two-dimensional (2D) B-mode data and 2D color blood flow data, or both 2D B-mode data and data from another ultrasound mode that detects blood flow velocity along the direction of the vessel axis. Accordingly, the elements 104 of the ultrasound probe 106 may be arranged in a one-dimensional (1D) or 2D array. The pulsed ultrasonic signals are backscattered from structures in the body, such as blood cells or muscle tissue, to produce echoes that return to the elements 104. The echoes are converted into electrical signals, or ultrasound data, by the elements 104, and the electrical signals are received by the receiver 108. The electrical signals representing the received echoes pass through a receive beamformer 110, which outputs ultrasound data. According to some implementations, the probe 106 may include electronic circuitry to perform all or part of transmit beamforming and/or receive beamforming. For example, all or part of the transmit beamformer 101, the transmitter 102, the receiver 108, and the receive beamformer 110 may be located within the ultrasound probe 106. In this disclosure, the terms "scan" or "scanning" may also be used to refer to acquiring data through the process of transmitting and receiving ultrasound signals. In the present disclosure, the terms "data" and "ultrasound data" may be used to refer to one or more data sets acquired with an ultrasound imaging system.
The user interface 115 may be used to control the operation of the ultrasound imaging system 100, including for controlling the entry of patient data, for changing scanning or display parameters, for selecting various modes, operations, and parameters, and so forth. The user interface 115 may include one or more of the following: a rotator, a mouse, a keyboard, a trackball, hard keys linked to a particular action, soft keys that may be configured to control different functions, a graphical user interface displayed on the display device 118 (in embodiments where the display device 118 comprises a touch-sensitive display device or touch screen), and the like. In some examples, user interface 115 may include a proximity sensor configured to detect objects or gestures within a few centimeters of the proximity sensor. The proximity sensor may be located on the display device 118 or as part of a touch screen. The user interface 115 may include, for example, a touch screen positioned in front of the display device 118, or the touch screen may be separate from the display device 118.
Physical controls of the user interface 115, such as buttons, sliders, knobs, keyboards, mice, trackballs, etc., may be included alone or in combination with graphical user interface icons displayed on the display device 118. The display device 118 may be configured to display a Graphical User Interface (GUI) according to instructions stored in the memory 120. The GUI may include user interface icons representing commands and instructions. The user interface icons of the GUI are configured such that a user can select a command associated with each particular user interface icon in order to initiate the various functions controlled by the GUI. For example, various user interface icons may be used to represent windows, menus, buttons, cursors, scroll bars, and the like. According to embodiments in which the user interface 115 comprises a touch screen, the touch screen may be configured to interact with a GUI displayed on the display device 118. The touch screen may be a single-touch screen configured to detect a single point of contact at a time, or the touch screen may be a multi-touch screen configured to detect multiple points of contact at a time. For embodiments in which the touchscreen is a multi-touch screen, the touchscreen may be configured to detect multi-touch gestures involving contact from two or more fingers of the user at a time. The touch screen may be a resistive touch screen, a capacitive touch screen, or any other type of touch screen configured to receive input from a stylus or one or more fingers of a user. According to other embodiments, the touch screen may comprise an optical touch screen that uses techniques such as infrared light or other frequencies of light to detect one or more points of contact initiated by a user.
According to various embodiments, the user interface 115 may include off-the-shelf consumer electronic devices, such as smartphones, tablets, laptops, and the like. For the purposes of this disclosure, the term "off-the-shelf consumer electronic device" is defined as an electronic device designed and developed for general consumer use, rather than specifically designed for a medical environment. According to some embodiments, the consumer electronic device may be physically separate from the rest of the ultrasound imaging system 100. The consumer electronic device may communicate with the processor 116 via wireless protocols such as Wi-Fi, Bluetooth, wireless local area network (WLAN), near field communication, etc. According to one embodiment, the consumer electronic device may communicate with the processor 116 through an open Application Programming Interface (API).
The ultrasound imaging system 100 also includes a processor 116 to control the transmit beamformer 101, the transmitter 102, the receiver 108, and the receive beamformer 110. The processor 116 is configured to receive input from the user interface 115. The receive beamformer 110 may comprise a conventional hardware beamformer or a software beamformer according to various embodiments. If the receive beamformer 110 is a software beamformer, the receive beamformer 110 may include one or more of the following: a Graphics Processing Unit (GPU), a microprocessor, a Central Processing Unit (CPU), a Digital Signal Processor (DSP), or any other type of processor capable of performing logical operations. The receive beamformer 110 may be configured to perform conventional beamforming techniques as well as techniques such as Retrospective Transmit Beamforming (RTB). If the receive beamformer 110 is a software beamformer, the processor 116 may be configured to perform some or all of the functions associated with the receive beamformer 110.
The processor 116 is in electronic communication with the ultrasound probe 106. For purposes of this disclosure, the term "electronic communication" may be defined to include both wired and wireless communications. The processor 116 may control the ultrasound probe 106 to acquire data. The processor 116 controls which of the elements 104 are active and the shape of the beam emitted from the ultrasound probe 106. The processor 116 is also in electronic communication with a display device 118, and the processor 116 may process the data into images for display on the display device 118. According to one embodiment, the processor 116 may include a CPU. According to other embodiments, the processor 116 may include other electronic components capable of performing processing functions, such as a GPU, a microprocessor, a DSP, a Field Programmable Gate Array (FPGA), or any other type of processor capable of performing logical operations.
According to other embodiments, the processor 116 may include a plurality of electronic components capable of performing processing functions. For example, the processor 116 may include two or more electronic components selected from a list of electronic components including: a CPU, a DSP, an FPGA, and a GPU. According to another embodiment, the processor 116 may further include a complex demodulator (not shown) that demodulates the RF data and generates raw data. In another embodiment, the demodulation may be performed earlier in the processing chain. The processor 116 is adapted to perform one or more processing operations on the data according to a plurality of selectable ultrasound modalities. The data may be processed in real time during a scanning session as the echo signals are received.
For the purposes of this disclosure, the term "real-time" is defined to include processes that are performed without any intentional delay. For example, embodiments may acquire images at a real-time rate of 7 volumes/second to 20 volumes/second. The ultrasound imaging system 100 can acquire 2D data for one or more planes at a significantly faster rate. However, it should be understood that the real-time volume rate may depend on the length of time it takes to acquire data per volume for display. Thus, when acquiring relatively large volumes of data, the real-time volume rate may be slow. Thus, some embodiments may have a real-time volume rate significantly faster than 20 volumes/second, while other embodiments may have a real-time volume rate slower than 7 volumes/second. The data may be temporarily stored in a buffer (not shown) during the scan session and processed in a less real-time manner in a real-time or offline operation. Some embodiments of the invention may include a plurality of processors (not shown) to process processing tasks processed by the processor 116 according to the exemplary embodiments described above. For example, a first processor may be used to demodulate and extract the RF signal, while a second processor may be used to further process the data prior to displaying the image. It should be understood that other embodiments may use different processor arrangements.
The ultrasound imaging system 100 may acquire data continuously at a volume rate of, for example, 10Hz to 30 Hz. Images generated from the data may be refreshed at a similar frame rate. Other embodiments may acquire and display data at different rates. For example, some embodiments may acquire data at a volume rate of less than 10Hz or greater than 30Hz, depending on the size of the volume and the intended application. A memory 120 is included for storing the processed volume of acquired data. In an exemplary embodiment, the memory 120 has sufficient capacity to store at least several seconds of ultrasound data volume. The data volumes are stored in a manner that facilitates retrieval according to their order or time of acquisition. Memory 120 may include any known data storage media.
Optionally, embodiments of the invention may be implemented using contrast agents. When ultrasound contrast agents, including microbubbles, are used, contrast imaging generates enhanced images of anatomical structures and blood flow in the body. After acquiring data using a contrast agent, image analysis includes separating harmonic components and linear components, enhancing the harmonic components, and generating an ultrasound image by using the enhanced harmonic components. Separation of the harmonic components from the received signal is performed using a suitable filter. The use of contrast agents for ultrasound imaging is well known to those skilled in the art and will therefore not be described in detail.
In various embodiments of the present invention, the data may be processed by the processor 116 through other or different mode-dependent modules (e.g., B-mode, color flow Doppler mode, M-mode, color M-mode, spectral Doppler, elastography, TVI, strain rate, etc.) to form 2D or 3D data. For example, one or more modules may generate B-mode, color flow Doppler mode, M-mode, color M-mode, spectral Doppler, elastography, TVI, strain rate, combinations thereof, and the like. The image lines and/or volumes are stored in memory, and timing information indicating the time at which the data was acquired may be recorded. These modules may include, for example, a scan conversion module to perform a scan conversion operation to convert the image volumes from beam space coordinates to display space coordinates. A video processor module may be provided that reads the image volumes from memory and displays the images in real time while a procedure is being performed on the patient. The video processor module may store the images in an image memory, from which the images are read and displayed.
As mentioned above, the ultrasound probe 106 may comprise a linear probe or a curved array probe. Fig. 1 further depicts a longitudinal axis 188 of the ultrasound probe 106. The longitudinal axis 188 of the ultrasound probe 106 extends through and parallel to the handle of the ultrasound probe 106. Further, the longitudinal axis 188 of the ultrasound probe 106 is perpendicular to the array face of the elements 104.
The ultrasound imaging system 100 is configured to provide recommended imaging parameters to a user (e.g., an operator, such as a clinician) of the ultrasound imaging system 100 during conditions in which the user transitions operation of the ultrasound imaging system 100 between imaging modes. For example, a user may transition the ultrasound imaging system 100 from operating in a first imaging mode (e.g., B-mode) to operating in a second imaging mode (e.g., color flow Doppler mode). In response to the user-input request to transition to the second imaging mode, the ultrasound imaging system 100 provides recommended imaging parameters for imaging in the second imaging mode based on the anatomical structure imaged while in the first imaging mode.
In some examples, the ultrasound imaging system 100 may periodically track the anatomy being imaged while operating in the first imaging mode, and may store the tracked anatomy in memory. As one example, at a given time when imaging in the first imaging mode (referred to herein as t1), the ultrasound imaging system may analyze a predetermined number of images (e.g., five images) stored in memory that were acquired by the ultrasound imaging system in sequence immediately prior to time t1. The ultrasound imaging system may determine the anatomical structure (e.g., kidney) imaged at time t1 based on the images acquired prior to time t1 according to an algorithm or model (e.g., a deep learning model) stored in memory of the ultrasound imaging system, as described further below, and the determined anatomical structure may be stored in memory as the anatomical structure currently being imaged. After a period of time (e.g., 5 seconds) has elapsed following time t1, at time t2 the ultrasound imaging system may again determine the anatomical structure being imaged based on images acquired in sequence immediately prior to time t2 (e.g., during the elapsed time between time t1 and time t2), and the ultrasound imaging system 100 may store the determined anatomical structure in memory as the anatomical structure currently being imaged. In response to a user request to transition the ultrasound imaging system 100 from imaging in the first imaging mode to imaging in the second imaging mode, the ultrasound imaging system 100 may provide recommended imaging parameters based on the most recently determined anatomical structure stored in the memory.
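A minimal sketch of this periodic tracking, assuming hypothetical get_recent_images and classify_anatomy helpers and an illustrative 5 second period, might look as follows.

```python
import time

# Hypothetical sketch of the periodic tracking described above: at times t1,
# t2, ... the system classifies the anatomy from the images acquired just
# before that time and stores the latest result for later transition requests.

def get_recent_images(n=5):
    return [f"frame_{i}" for i in range(n)]      # placeholder acquisition

def classify_anatomy(images):
    return "kidney"                              # placeholder deep-learning model

class PeriodicAnatomyTracker:
    def __init__(self, period_s=5.0):
        self.period_s = period_s
        self.current_anatomy = None              # "anatomy currently being imaged"
        self._next_update = float("-inf")

    def tick(self):
        """Call repeatedly (e.g., once per acquired frame)."""
        now = time.monotonic()
        if now >= self._next_update:
            self.current_anatomy = classify_anatomy(get_recent_images())
            self._next_update = now + self.period_s

tracker = PeriodicAnatomyTracker()
tracker.tick()
print(tracker.current_anatomy)   # anatomy most recently stored in memory
```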
As another example, the ultrasound imaging system 100 may not periodically track the anatomy being imaged, but instead may determine the anatomy being imaged in response to a user request to transition from the first imaging mode to the second imaging mode. For example, when a user inputs a request to transition from the first imaging mode to the second imaging mode, the ultrasound imaging system 100 may analyze a first predetermined number of images (e.g., five images) acquired in sequence immediately prior to the transition request in order to determine the anatomical structure being imaged. The ultrasound imaging system 100 then provides recommended imaging parameters for the second imaging mode to the user based on the anatomical structure determined from the analysis of the images acquired while imaging in the first imaging mode. In some examples, if the ultrasound imaging system 100 is unable to determine the anatomical structure being imaged based on the first predetermined number of images analyzed in response to the transition request (e.g., due to a large amount of noise and/or anatomical variation associated with the images), the ultrasound imaging system 100 may expand the number of images analyzed to a second, larger number of images (e.g., ten images) acquired in sequence immediately prior to the transition request, and may analyze those images in an attempt to determine the anatomical structure being imaged. In some examples, the ultrasound imaging system 100 may repeat this process, expanding the number of analyzed images each time, until the anatomy being imaged is determined.
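The expanding-window behavior described above could be sketched as follows; the window sizes, confidence threshold, and classify_with_confidence helper are illustrative assumptions rather than values from the patent.

```python
# Hypothetical sketch of the on-demand determination described above: try a
# small set of the most recent images first, and widen the window if the
# classifier cannot reach a confident answer.

def classify_with_confidence(images):
    """Placeholder for the model: returns (anatomy, confidence in [0, 1])."""
    return ("kidney", 0.9) if len(images) >= 10 else (None, 0.0)

def determine_anatomy_on_request(recent_images, window_sizes=(5, 10, 20),
                                 min_confidence=0.8):
    for n in window_sizes:
        window = recent_images[-n:]              # images acquired just before the request
        anatomy, confidence = classify_with_confidence(window)
        if anatomy is not None and confidence >= min_confidence:
            return anatomy
    return None                                  # could not be determined; see text above

frames = [f"frame_{i}" for i in range(30)]
print(determine_anatomy_on_request(frames))
```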
The recommended imaging parameters provided by the ultrasound imaging system 100 in response to a mode transition request (e.g., a user request to transition from one imaging mode to another) are based on the anatomy being imaged and the mode to which the user requested the transition. For example, when transitioning from a first mode (e.g., B-mode) to a second mode (e.g., color flow Doppler mode), the ultrasound imaging system 100 determines the anatomy being imaged, as described above. The ultrasound imaging system 100 then provides recommended imaging parameters for imaging in the second mode based on the anatomy being imaged. However, the recommended imaging parameters provided when transitioning from the first mode to the second mode may be different from the recommended imaging parameters provided when transitioning from the second mode to the first mode. For example, in a situation where the anatomical structure being imaged is a kidney (as one non-limiting example) and the user inputs a request to transition from B-mode to color flow Doppler mode, the ultrasound imaging system 100 may provide a first set of recommended imaging parameters (e.g., first imaging presets), where the first set of recommended imaging parameters is configured to improve imaging quality in color flow Doppler mode. In a situation where the anatomy being imaged is the same kidney but the user inputs a request to transition from color flow Doppler mode to B-mode, the ultrasound imaging system may provide a second set of recommended imaging parameters (which may be different from the first set), where the second set of imaging parameters is configured to improve imaging quality in B-mode. Although B-mode and color flow Doppler mode are described above as examples, other modes are possible (e.g., M-mode, color M-mode, spectral Doppler mode, etc.).
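One possible way to capture the direction-dependent recommendations described above is a lookup keyed on the anatomy together with the source and target modes. The preset values below are invented placeholders, not values from the patent.

```python
# Hypothetical sketch: the recommendation depends on the anatomy and on the
# direction of the transition, so the key includes source and target modes.

RECOMMENDED_PRESETS = {
    ("kidney", "b_mode", "color_flow"): {"prf_hz": 1200, "gain_db": 10},
    ("kidney", "color_flow", "b_mode"): {"depth_cm": 14, "gain_db": 45},
    ("aorta",  "b_mode", "color_flow"): {"prf_hz": 4000, "gain_db": 8},
}

def recommend_parameters(anatomy, source_mode, target_mode):
    return RECOMMENDED_PRESETS.get((anatomy, source_mode, target_mode))

print(recommend_parameters("kidney", "b_mode", "color_flow"))
print(recommend_parameters("kidney", "color_flow", "b_mode"))   # different preset
```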
The ultrasound imaging system 100 determines the anatomical structure being imaged via one or more algorithms stored in memory (e.g., image processing algorithms such as edge detection, machine learning models, deep neural networks, etc.) and provides recommended imaging parameters to the user using the determination of the anatomical structure being imaged with the one or more algorithms. For example, the ultrasound imaging system 100 may identify features of images associated with various anatomical structures and/or regions of the body (such as bones, blood vessels, organs, etc.) acquired during imaging (e.g., during a scan of a patient) based on the shape, relative proximity, apparent depth, orientation, etc. of the features. Based on the identified features, the ultrasound imaging system 100 may determine the anatomical structure being imaged. As one example, in a situation in which a user operates the ultrasound imaging system 100 to image the aorta of a patient, the ultrasound imaging system 100 may determine that the aorta is being imaged based on the motion and/or orientation of the aorta relative to surrounding structures (such as the brachiocephalic artery) via a deep learning classification model trained to identify the motion and/or orientation of the aorta and other anatomical structures.
In some embodiments, the ultrasound imaging system 100 may perform a verification of the anatomical structure being imaged immediately after the transition from the first imaging mode to the second imaging mode. For example, in response to an imaging mode transition request (e.g., a user input indicating a request to transition from a first imaging mode, such as B-mode, to a second imaging mode, such as color flow Doppler mode), the ultrasound imaging system 100 may, as described above, analyze a predetermined number of images (e.g., five images) stored in memory that were acquired by the ultrasound imaging system in sequence immediately prior to the transition request, which may be referred to as a first set of images. The ultrasound imaging system may determine the anatomical structure (e.g., the kidney) being imaged after the transition request and before transitioning from the first imaging mode to the second imaging mode, where the determination is based on the images acquired before the transition request and an algorithm or model (e.g., a deep learning model) stored in a memory of the ultrasound imaging system.
After determining the anatomical structure being imaged, the ultrasound imaging system may transition from the first imaging mode to the second imaging mode. However, after transitioning to the second imaging mode, the ultrasound imaging system 100 may perform verification (e.g., re-determination) of the anatomical structure being imaged by sequentially acquiring a second set of images (e.g., via the probe 106) while operating in the second imaging mode. In some examples, the second set of images may include the same number of images as the first set of images, while in other examples, the second set of images may include a different number of images (e.g., ten images).
The algorithm or model that determines the anatomical structure being imaged prior to the transition request may additionally analyze image information (e.g., color information) of one or more images in the second set of images acquired after the transition to the second imaging mode and compare the image information to an expected amount of image information in order to verify that the anatomical structure imaged while in the first imaging mode is the same as the anatomical structure imaged while in the second imaging mode. For example, in the case where the second imaging mode is a color flow Doppler mode, the algorithm or model may compare the amount of color information in each image in the second set of images to an expected amount of color information, where the expected amount of color information may be different for different anatomical structures. In some examples, the expected amount of color information may be stored in a look-up table in a memory of the controller (e.g., where the input is the anatomical structure being imaged and the output is the expected amount of color information), and in some examples, the expected amount of color information may be a function of both the anatomical structure being imaged and imaging parameters, such as the depth of the anatomical structure, patient age, patient weight, and the like. In other examples where the second imaging mode is different from the color flow Doppler mode (e.g., M-mode), the algorithm or model may validate the anatomical structure being imaged in the second imaging mode differently, such as by comparing elasticity information of the anatomical structure in each image in the second set of images to an expected elasticity, where the expected elasticity may be different for different anatomical structures. Other examples are possible.
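A minimal sketch of the color-information comparison, assuming a hypothetical look-up table of expected color fractions and a per-pixel color mask for each image, might look like this.

```python
# Hypothetical sketch of the verification step described above: measure how
# much color-flow information the post-transition images contain and compare
# it with the amount expected for the anatomy identified in the first mode.
# The expected values and tolerance are illustrative placeholders.

EXPECTED_COLOR_FRACTION = {      # stand-in for the look-up table in memory
    "kidney": 0.30,              # fraction of pixels expected to carry color data
    "aorta": 0.10,
}

def color_fraction(color_mask):
    """color_mask: list of 0/1 flags, one per pixel, 1 = color-flow data present."""
    return sum(color_mask) / len(color_mask)

def anatomy_consistent_with_color(anatomy, second_set_masks, tolerance=0.15):
    expected = EXPECTED_COLOR_FRACTION[anatomy]
    measured = sum(color_fraction(m) for m in second_set_masks) / len(second_set_masks)
    return abs(measured - expected) <= tolerance

masks = [[1, 0, 1, 0, 0, 1, 0, 0, 0, 0]] * 5     # toy second set of images
print(anatomy_consistent_with_color("kidney", masks))
```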
After determining the anatomical structure being imaged based on the first set of images, the ultrasound imaging system 100 stores the determined anatomical structure in memory. After verifying the anatomical structure imaged in the second imaging mode as described above, the ultrasound imaging system 100 may compare the result of the verification (e.g., the anatomical structure determined based on the second set of images) with the anatomical structure determined while imaging in the first imaging mode (e.g., the anatomical structure determined based on the first set of images, before transitioning to the second imaging mode and in response to the transition request). If the anatomical structure has not changed (e.g., the anatomical structure stored in memory and determined via the first set of images matches the anatomical structure determined via the second set of images), the ultrasound imaging system 100 may retain the anatomical structure stored in memory (e.g., without changing the anatomical structure stored in memory). However, if the anatomical structure has changed (e.g., the anatomical structure stored in the memory does not match the anatomical structure determined via the second set of images), the ultrasound imaging system 100 may update the anatomical structure stored in the memory to the anatomical structure determined via the second set of images and may update the recommended imaging parameters generated by the ultrasound imaging system 100 using the updated anatomical structure. In further examples, if the anatomical structure determined from the second set of images does not match the anatomical structure determined from the first set of images (or the anatomical structure in the second set of images cannot be determined), and the system has high confidence that the ultrasound probe has not moved significantly (e.g., based on feedback from a probe motion sensor, an in-room camera, or another probe motion tracking mechanism), the mismatched or unidentified anatomical structure in the second set of images may be attributed to the recommended imaging parameters (e.g., recommended based on the first set of images) generating low-quality images. In such examples, the system may recommend different imaging parameters or may request that the operator update the imaging parameters.
In some examples, one or more algorithms (e.g., a deep neural network) used by the ultrasound imaging system 100 to provide recommended imaging parameters in response to imaging mode transition requests may analyze previous inputs by a user of the ultrasound imaging system 100 in order to provide more accurate recommended parameters. For example, in a situation in which a user operating the ultrasound imaging system 100 inputs an imaging mode transition request, the ultrasound imaging system 100 provides recommended imaging parameters based on the anatomy being imaged, as described above. If the user rejects the recommended imaging parameters (e.g., selects imaging parameters other than the recommended imaging parameters), the ultrasound imaging system 100 may adjust future recommended imaging parameters based on the imaging parameters selected by the user. As one example, one or more algorithms of the ultrasound imaging system 100 may identify that, during imaging of the kidney, the user frequently rejects the recommended imaging parameters provided by the ultrasound imaging system 100 and instead enters customized imaging parameters. Accordingly, the ultrasound imaging system 100 may adjust the recommended imaging parameters to be closer to the values of the imaging parameters entered by the user over the plurality of transition requests. However, in other examples, recommended imaging parameters for different anatomical structures may be predetermined (e.g., stored in a data table), and the ultrasound imaging system 100 may provide the same recommended imaging parameters for a given anatomical structure being imaged, regardless of previous user input.
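The adaptation to repeated operator rejections could be sketched as below; the exponential-moving-average blending rule and the learning rate are illustrative assumptions, not the patent's method.

```python
# Hypothetical sketch of the adaptation described above: when the operator
# repeatedly rejects a recommendation for a given anatomy and enters custom
# values, nudge future recommendations toward those values.

def adapt_recommendation(recommended, user_selected, learning_rate=0.25):
    """Move each numeric recommended parameter part-way toward the value the
    operator actually chose; keys not present in both dicts are left unchanged."""
    adapted = dict(recommended)
    for key, user_value in user_selected.items():
        if key in adapted and isinstance(adapted[key], (int, float)):
            adapted[key] = adapted[key] + learning_rate * (user_value - adapted[key])
    return adapted

recommended = {"prf_hz": 1200.0, "gain_db": 10.0}
user_chosen = {"prf_hz": 2000.0, "gain_db": 10.0}
print(adapt_recommendation(recommended, user_chosen))   # prf drifts toward 2000 Hz
```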
As described above, the recommended imaging parameters provided by the ultrasound imaging system 100 are based on the determined anatomical structure being imaged. In some examples, the anatomical structure being imaged, as determined by the ultrasound imaging system 100, may be a less localized region of the body, such as the abdomen. In other examples, the anatomical structure being imaged may be a more localized structure within a region, such as the appendix within the abdomen. As such, the anatomical structure being imaged, as determined by the ultrasound imaging system 100, may include both a less localized region (such as the abdomen) and a more localized structure (such as the appendix). When a user requests a transition between imaging modes, if the ultrasound imaging system 100 has determined the more localized structure being imaged (e.g., based on images acquired in sequence immediately prior to the transition request, as described above), the ultrasound imaging system 100 may base the recommended imaging parameters on the more localized structure. However, in situations where the ultrasound imaging system 100 is unable to determine the more localized structure being imaged (e.g., due to increased noise, movement, etc.), the ultrasound imaging system 100 may instead provide recommended imaging parameters based on the less localized region being imaged (e.g., the abdomen).
Further, in some examples, the probe 106 of the ultrasound imaging system 100 may include a position sensor configured to sense a position of the probe 106 relative to one or more reference positions. For example, while the anatomy being imaged is determined, the position sensor of the probe 106 may continuously track movement of the probe 106 (e.g., rotation, translation, orientation, etc. of the probe 106) relative to a reference position. The position sensor may transmit the position information to the ultrasound imaging system 100, and in response to an imaging mode transition request, the ultrasound imaging system 100 may utilize the position information in conjunction with the analysis of a predetermined number of images acquired in sequence immediately prior to the transition request (as described above) in order to increase the accuracy of the determination of the anatomical structure being imaged.
In some examples, in a condition in which the position of the probe 106 is tracked via the position sensor and the probe 106 has not moved between separate imaging mode transition requests, the ultrasound imaging system 100 may determine that the same anatomical structure is being imaged at each transition request. For example, in response to a first imaging mode transition request (e.g., a transition from B-mode to color flow Doppler mode) at time t1, the ultrasound imaging system 100 determines the anatomical structure being imaged based on a predetermined number of images acquired in sequence immediately prior to the first transition request, as described above. At the time of the request, the probe 106 may be at a given position. After a period of time (e.g., 10 seconds, 20 seconds, etc.) has elapsed, the operator may input a second imaging mode transition request (e.g., to transition from color flow Doppler mode back to B-mode) at time t2 without moving the probe 106 from the given position. Because the probe 106 has not moved after the first transition request (e.g., has not moved between time t1 and time t2), the ultrasound imaging system 100 may determine, in response to the second imaging mode transition request, that the anatomical structure being imaged at time t2 is the same as the anatomical structure imaged at time t1.
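A minimal sketch of this position-based shortcut, assuming a hypothetical probe-position reading in millimeters and an illustrative 5 mm movement threshold, follows.

```python
import math

# Hypothetical sketch of the shortcut described above: if the probe has not
# moved (beyond a small tolerance) since the anatomy was last determined,
# reuse that anatomy for the next transition request instead of re-running
# the classifier. Threshold and helper names are illustrative.

def displacement_mm(p1, p2):
    return math.dist(p1, p2)     # probe positions as (x, y, z) in millimeters

def anatomy_for_transition(last_anatomy, last_probe_pos, current_probe_pos,
                           classify_from_recent_images, moved_threshold_mm=5.0):
    if (last_anatomy is not None
            and displacement_mm(last_probe_pos, current_probe_pos) < moved_threshold_mm):
        return last_anatomy                       # probe essentially stationary
    return classify_from_recent_images()          # fall back to image-based determination

print(anatomy_for_transition("kidney", (0, 0, 0), (1, 0, 0),
                             classify_from_recent_images=lambda: "unknown"))
```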
In some examples, the ultrasound imaging system 100 may additionally analyze an amount of image information of one or more images acquired after the transition to the second imaging mode and compare the amount of image information to an expected amount of image information in order to update the recommended imaging parameters. For example, the ultrasound imaging system 100 may acquire a second set of sequential images after transitioning to the second imaging mode (e.g., similar to the second set of images described above), and in a situation in which the second imaging mode is color flow Doppler mode, the ultrasound imaging system 100 may compare the amount of color information in each image in the second set of images to an expected amount of color information, where the expected amount of color information may be different for different anatomical structures (similar to the example described above). For example, the expected amount of color information may be a function of both the anatomy being imaged and the imaging parameters used to acquire the second set of images in the second imaging mode. If the amount of color information is different from the expected amount of color information, the ultrasound imaging system 100 may update the recommended imaging parameters based on the difference between the acquired amount of color information and the expected amount of color information. As one example, the ultrasound imaging system 100 may modify (e.g., update) a given parameter (e.g., pulse repetition frequency) of the recommended imaging parameters via a function stored in memory, with the difference between the acquired amount of color information and the expected amount of color information as one input, the unmodified recommended pulse repetition frequency as a second input, and the updated recommended pulse repetition frequency as an output. Other examples are possible. In further examples, the intensity and/or distribution of color information in the acquired color flow images may provide a verification check as to whether the correct anatomical structure was identified. For example, when imaging a kidney in color flow Doppler mode, color information may be distributed over the kidney, with inflow/outflow colors present somewhat evenly (see, e.g., Fig. 8). In contrast, when the aorta is imaged in color flow Doppler mode, the color information may be less evenly distributed and may include a concentrated region of inflow and a concentrated region of outflow (see, e.g., Fig. 5). If the color information is not distributed as expected, the method may determine that the initial anatomy identification is incorrect and re-identify the anatomy (e.g., using the second set of images) and/or ask the operator to confirm the imaged anatomy, from which the imaging parameters may be adjusted (e.g., if the anatomy is incorrect and thus the wrong imaging parameters were applied).
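The pulse repetition frequency update described above (difference in color information plus the unmodified recommendation in, updated recommendation out) might be sketched as follows; the linear correction and clamping range are illustrative stand-ins for the function stored in memory.

```python
# Hypothetical sketch of the parameter update described above. The direction
# and magnitude of the correction here are assumptions for illustration only.

def update_recommended_prf(recommended_prf_hz, measured_color_fraction,
                           expected_color_fraction, sensitivity=2000.0):
    # Too little color relative to expectation -> lower the PRF so slower flow
    # is not filtered out; too much -> raise it. Clamp to a plausible range.
    difference = measured_color_fraction - expected_color_fraction
    updated = recommended_prf_hz + sensitivity * difference
    return max(250.0, min(10000.0, updated))

print(update_recommended_prf(1200.0, measured_color_fraction=0.10,
                             expected_color_fraction=0.30))   # -> lower PRF
```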
Referring now to fig. 2, a flow chart is shown illustrating a method 200 for controlling the operation of an ultrasound imaging system. In at least one example, the ultrasound imaging system referenced by the method 200 is the ultrasound imaging system 100 described above with reference to fig. 1, and the method 200 may be implemented by the ultrasound imaging system 100. In some embodiments, the method 200 may be implemented as executable instructions in a memory of an ultrasound imaging system (such as the memory 120 of fig. 1).
At 202, an operating mode of the ultrasound imaging system is determined. The operating mode may include an imaging mode of a probe of the ultrasound imaging system (e.g., probe 106 shown in Fig. 1 and described above). The operating mode may include any of the imaging modes described above, such as B-mode, color flow Doppler mode, M-mode, color M-mode, spectral Doppler, elastography, TVI, strain rate, and the like. For each mode of operation, the ultrasound imaging system may control the probe in a different way (e.g., transmit different control signals to, or differently process signals received from, the transducer of the probe). Determining the operating mode may include determining an imaging mode selected by a user (e.g., an operator, such as a clinician) of the ultrasound imaging system via a user input device (e.g., a button or other physical control of the probe, an input to a GUI displayed at a touchscreen display device, etc.). In some examples, the user input device may be the user interface 115 referenced above with respect to Fig. 1. The selected imaging mode may be stored in a memory of the ultrasound imaging system, and the determination of the operating mode may include retrieving the selected imaging mode from the memory.
At 204, one or more ultrasound images and/or cine loops are acquired (e.g., via the ultrasound probe) and an image set is stored in memory. The memory may be a memory of the ultrasound imaging system, such as memory 120 described above with reference to Fig. 1. The image set may include a predetermined number of images and/or loops (e.g., five images, ten images, twenty images, etc.) acquired in sequence by the ultrasound imaging system. In some examples, the image set may be continuously updated by the ultrasound imaging system as new images are acquired. For example, the image set may include five images acquired in sequence by the ultrasound imaging system. Acquiring a sixth image may result in the oldest image being deleted from the image set, such that the images stored within the image set are the most recently acquired images in sequential order (e.g., acquisition order).
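The continuously updated image set at 204 can be sketched as a fixed-size rolling buffer; the five-image size is the example used in the text, and the class itself is a hypothetical illustration.

```python
# Hypothetical sketch of the stored image set: a fixed-size, sequentially
# ordered buffer in which acquiring a new image pushes out the oldest one.

class ImageSet:
    def __init__(self, max_images=5):
        self.max_images = max_images
        self.images = []                 # kept in acquisition order, oldest first

    def add(self, image):
        self.images.append(image)
        if len(self.images) > self.max_images:
            self.images.pop(0)           # acquiring a sixth image deletes the oldest

image_set = ImageSet()
for i in range(6):
    image_set.add(f"frame_{i}")
print(image_set.images)                  # ['frame_1', ..., 'frame_5']
```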
At 206, it is determined whether an operating mode transition is requested. An operating mode transition includes a user input to adjust the ultrasound imaging system from a first imaging mode (e.g., B-mode) to a second, different imaging mode (e.g., color flow Doppler mode). As described above, the mode of operation may be selected by the user via a user input device, which may include a button or other physical control of the probe, an input to a GUI displayed at a touchscreen display device, or the like. The user input device may be the user interface 115 referenced above with respect to Fig. 1. As one example, a user may operate the ultrasound imaging system in the first imaging mode and may press a button on the probe (e.g., the probe 106 shown in Fig. 1) in order to input a transition request to operate the ultrasound imaging system in the second mode. As another example, a user may operate the ultrasound imaging system in the first imaging mode and may enter a mode transition at a GUI of a display device of the ultrasound imaging system (e.g., via a mouse, keyboard, trackball, etc., or via touch in configurations in which the display device includes a touchscreen) in order to transition (adjust) the ultrasound imaging system to operate in the second imaging mode.
If an operating mode transition is not requested at 206, the ultrasound imaging parameters are maintained at 208. Maintaining the parameters may include not transitioning the imaging mode of the ultrasound imaging system (e.g., keeping the ultrasound imaging system in the first imaging mode and not transitioning the ultrasound imaging system to the second imaging mode).
However, if an operating mode transition is requested at 206, the anatomy and/or scan plane being imaged is determined at 210 based on the stored image set. For example, at 206, a user of the ultrasound imaging system may input a selection to transition operation of the ultrasound imaging system from a first mode to a second mode. Accordingly, at 210, the ultrasound imaging system analyzes the set of images stored in memory to determine the anatomy being imaged. The determination of the anatomy being imaged may be similar to the examples described above with reference to Fig. 1. For example, the ultrasound imaging system may utilize one or more algorithms (e.g., machine learning models, deep neural networks, etc.) to analyze the image set and determine the anatomy being imaged, where the analysis of the image set may include identifying various anatomical structures and/or regions of the body shown in the images, such as bones, vessels, organs, etc., based on the shape, relative proximity, apparent depth, orientation, etc., of the imaged features. In addition, the view or scan plane being imaged may be identified. For example, the system may determine that a heart is being imaged and then further identify that a four-chamber view of the heart is currently being imaged (e.g., as opposed to a two-chamber view, a three-chamber view, etc.). By analyzing the set of images instead of a single image, the confidence in the detected anatomical structure may be increased. Furthermore, motion of one or more structures in the images may be detected by comparing the location, shape, size, etc. of the identified structures across the set of images, which may help identify certain anatomical features. Further, a first number of images (e.g., five) may be analyzed when switching from the first mode of operation to the second mode of operation, while a different number of images (e.g., ten) may be analyzed when switching from the second mode of operation to the first mode of operation. For example, when operating in color flow Doppler mode, the image quality of underlying anatomical features may not be as high as when imaging in B-mode, and therefore more images may be analyzed when switching from color flow Doppler mode than when switching from B-mode in order to increase confidence in the anatomical structure detection.
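A minimal sketch of the set-based analysis at 210, assuming a hypothetical per-image classifier, a mode-dependent image count, and a simple majority vote to aggregate results, follows.

```python
from collections import Counter

# Hypothetical sketch: the number of images analyzed depends on the mode being
# left (e.g., 5 when leaving B-mode, 10 when leaving color flow Doppler mode),
# and per-image classifications are aggregated over the set to raise confidence.
# Counts, helper names, and the majority-vote rule are illustrative.

IMAGES_TO_ANALYSE = {"b_mode": 5, "color_flow": 10}

def classify_single_image(image):
    return "heart_4_chamber"                      # placeholder per-image classifier

def determine_anatomy_and_plane(image_set, source_mode):
    n = IMAGES_TO_ANALYSE.get(source_mode, 5)
    votes = Counter(classify_single_image(img) for img in image_set[-n:])
    label, count = votes.most_common(1)[0]
    confidence = count / min(n, len(image_set))
    return label, confidence

frames = [f"frame_{i}" for i in range(10)]
print(determine_anatomy_and_plane(frames, "color_flow"))
```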
At 212, recommended ultrasound imaging parameters are generated based on the requested operational mode transition and the determined anatomical structure and/or scan plane being imaged, and the recommended ultrasound imaging parameters are displayed at a display device. Similar to the example described above with reference to the ultrasound imaging system 100 shown in fig. 1, the recommended ultrasound imaging parameters may be different for different operating modes and different anatomical structures and/or scan planes. For example, in a situation where the anatomical structure being imaged is a kidney (as one non-limiting example) and the user inputs a request to transition from B mode to color flow doppler mode, a first set of recommended imaging parameters may be generated, wherein the first set of recommended imaging parameters is configured to improve imaging quality in color flow doppler mode. In a situation where the anatomy being imaged is the same kidney but the user inputs a request to transition from color flow doppler mode to B mode, the ultrasound imaging system may provide a second set of recommended imaging parameters (which may be different from the first set of imaging parameters), where the second set of imaging parameters is configured to improve imaging quality in B mode. The image acquired by the ultrasound imaging system in color flow doppler mode may include gray scale information (e.g., monochromatic image data) as well as color information (e.g., color flow image data rendered according to a color reference, such as color reference 703 shown in fig. 7 and described below).
In some examples, when switching from color flow doppler mode to B mode, the ultrasound imaging system may analyze only grayscale information or only color flow information of images acquired when operating in color flow doppler mode to generate recommended imaging parameters for B mode. Further, when transitioning between other modes (such as transitioning from an elastography mode to a TVI mode), the ultrasound imaging system may analyze only a portion of the image information acquired in the operating mode prior to the transition (e.g., the elastography mode) in order to generate recommended imaging parameters for the transitioned mode (e.g., the TVI mode).
In some examples, when determining an anatomical structure being imaged in response to a transition request to transition from color flow doppler mode to B mode, the ultrasound imaging system may analyze only the grayscale information portion of the image acquired in color flow doppler mode or only the color flow information to determine the imaged anatomical structure. Similarly, when transitioning between other modes (such as transitioning from an elastography mode to a TVI mode), the ultrasound imaging system may analyze only a portion of the image information acquired in the operating mode prior to the transition (e.g., the elastography mode) in order to determine the anatomy being imaged. Analyzing only a portion of the image information may reduce the load (e.g., processing load and/or analysis time) on the ultrasound imaging system.
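As a hedged illustration of analyzing only a portion of the duplex image information, the sketch below assumes that frames acquired in color flow doppler mode are stored with separate grayscale and color-flow planes; the field names are illustrative, not the system's actual data layout.

```python
def select_analysis_data(frame: dict, source_mode: str, use_grayscale_only: bool = True):
    """Return only the portion of a stored frame that will be analyzed.

    Restricting the analysis to a single plane of a duplex (color flow) frame
    reduces the processing load and analysis time, as described above.
    """
    if source_mode == "color_flow":
        # A color flow frame is assumed to carry both planes.
        return frame["grayscale"] if use_grayscale_only else frame["color_flow"]
    # B-mode (and similar) frames are assumed to be single-plane grayscale data.
    return frame["grayscale"]
```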
In further examples, the above-described anatomical structure determination may only work as expected for images acquired in certain modes (such as B-mode). When transitioning from an imaging mode in which the image information is not suitable for anatomy detection (e.g., M-mode), the method may include transiently operating in an imaging mode in which suitable images may be acquired for anatomy detection, such as transiently operating in B-mode to acquire a set of images, which may then be input into a model to determine the anatomy being imaged. Once the images for anatomy detection are acquired, a mode switch to the requested mode of operation may be initiated. In some examples, prior to inputting the images into the model to detect anatomical structures, the B-mode images may be displayed and the operator may be requested to confirm that the imaged view/scan plane is the desired view/scan plane.
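A staged transition of the kind just described might be sequenced as sketched below, reusing the `detect_anatomy` sketch from above; the `system` methods (`set_mode`, `acquire_frames`, `operator_confirms_view`, `recent_frames`) are hypothetical names for illustration only.

```python
MODES_SUITABLE_FOR_DETECTION = {"b_mode"}  # assumption: only B-mode images feed the detection model

def transition_with_detection(system, current_mode: str, requested_mode: str):
    if current_mode not in MODES_SUITABLE_FOR_DETECTION:
        system.set_mode("b_mode")                      # transiently operate in B-mode
        frames = system.acquire_frames(count=5)        # acquire a set of images for detection
        if not system.operator_confirms_view(frames):  # optional scan-plane confirmation
            return None                                # operator did not confirm the view
        source_mode = "b_mode"
    else:
        frames = system.recent_frames()
        source_mode = current_mode
    anatomy, _ = detect_anatomy(frames, source_mode)
    system.set_mode(requested_mode)                    # then initiate the requested mode switch
    return anatomy
```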
The recommended imaging parameters may be displayed at a display device, such as display device 118 of fig. 1. The user may interact with the display device via a user interface (e.g., user interface 115 of fig. 1) to confirm or reject the recommended imaging parameters. Examples of recommended imaging parameters based on the requested mode transition and the anatomy being imaged that may be displayed at the display device are illustrated by fig. 5 and 8 and described further below.
The recommended imaging parameters may be stored in a lookup table or other data structure in the memory of the ultrasound system, indexed by the anatomy and the requested imaging mode. In other examples, the recommended imaging parameters may be determined by a model that uses the identified anatomical structure as input to determine the recommended imaging parameters. The model may also use previous user imaging parameters as input, which may allow the model to be customized according to the preferred imaging parameters of a particular user. In some examples, the model may use as input the imaging parameters used to acquire the set of images in the first mode of operation and the identified anatomical structure to determine the recommended imaging parameters. For example, the imaging parameters (e.g., depth, frequency) used to acquire the B-mode image may provide an indication of certain patient-specific features, such as patient thickness, which may affect image quality but may not be apparent from the B-mode image/identified anatomical structure itself. In this way, by considering the imaging parameters used to acquire images in the first mode of operation, the recommended imaging parameters for the second mode of operation may be fine-tuned for the particular patient.
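One way the lookup-table variant described above could be realized is sketched below: recommendations are indexed by (anatomy, requested imaging mode), and an entry may then be fine-tuned from the parameters used in the first mode (depth here serves as the patient-specific hint mentioned above). The table entries mirror the example values described later with reference to figs. 5 and 8; the fine-tuning rule and threshold are assumptions for illustration.

```python
# Recommendations indexed by (anatomy, requested imaging mode); values mirror
# the examples of figs. 5 and 8 and are not actual system presets.
RECOMMENDED = {
    ("aorta",  "color_flow"): {"prf": 2.3, "wall_filter": 263, "spatial_filter": 5, "packet_size": 8},
    ("kidney", "color_flow"): {"prf": 1.0, "wall_filter": 108, "spatial_filter": 3, "packet_size": 12},
}

def recommend_parameters(anatomy: str, requested_mode: str, first_mode_params: dict = None) -> dict:
    params = dict(RECOMMENDED.get((anatomy, requested_mode), {}))
    if first_mode_params is not None and "prf" in params:
        # Illustrative patient-specific fine-tuning: a deeper first-mode acquisition
        # (suggesting a thicker patient) receives a slightly lower recommended PRF.
        if first_mode_params.get("depth_cm", 0) > 15:
            params["prf"] = round(params["prf"] * 0.9, 2)
    return params
```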
In some examples, the image itself may be input into a model that determines the recommended imaging parameters. The model may be selected from a plurality of models based on the anatomy being imaged, the current imaging mode, and the requested imaging mode. For example, a first model may be selected when a kidney is imaged in B mode and the user requests imaging in color flow doppler mode; a second model may be selected when an aorta is being imaged in B mode and the user requests imaging in color flow doppler mode; and a third model may be selected when the kidney is imaged in color flow doppler mode and the user requests imaging in B mode. Each model may be a machine learning model (e.g., a convolutional neural network) trained using images acquired in a first mode, with the recommended imaging parameters and image quality for a second mode serving as ground truth labels. For example, the training data used to train the first, kidney-specific model may include a plurality of data sets, where each data set includes a first image of the kidney acquired in a first mode (e.g., B-mode) and a second image of the kidney acquired immediately after the first image but in a second mode (e.g., color flow doppler mode). The imaging parameters used to acquire the second image may be included as a ground truth annotation of the first image, along with an expert (e.g., physician) annotation indicating the relative image quality of the second image.
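The per-(anatomy, current mode, requested mode) model selection and the training-record structure described above might be organized as in the sketch below; the registry keys, class name, and field names are assumptions for illustration, not the system's actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional, Tuple
import numpy as np

# One trained model per (anatomy, current imaging mode, requested imaging mode).
# The None placeholders stand in for loaded model objects.
MODEL_REGISTRY: Dict[Tuple[str, str, str], Optional[Callable]] = {
    ("kidney", "b_mode", "color_flow"): None,   # first model in the example above
    ("aorta",  "b_mode", "color_flow"): None,   # second model
    ("kidney", "color_flow", "b_mode"): None,   # third model
}

def select_model(anatomy: str, current_mode: str, requested_mode: str) -> Optional[Callable]:
    return MODEL_REGISTRY[(anatomy, current_mode, requested_mode)]

@dataclass
class TrainingRecord:
    """One training sample, mirroring the data set structure described above."""
    first_mode_image: np.ndarray    # image acquired in the first mode (e.g., B-mode)
    second_mode_params: dict        # parameters of the second-mode image acquired immediately after
    expert_quality: float           # expert-annotated relative image quality of the second-mode image
```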
At 214, it is determined whether the user accepts the recommended ultrasound imaging parameters. For example, displaying the recommended imaging parameters at 212 may include displaying buttons at a GUI of the ultrasound imaging system, wherein a first button is configured to confirm (e.g., accept) the recommended ultrasound imaging parameters and a second button is configured to reject (e.g., discard) the recommended ultrasound imaging parameters. Determining whether the recommended ultrasound imaging parameters are accepted by the user may include determining whether the user has entered a command via the user interface device to accept the recommended ultrasound imaging parameters (e.g., by selecting the first button) or to decline the recommended ultrasound imaging parameters (e.g., by selecting the second button).
If the user does not accept the recommended ultrasound imaging parameters at 214, the ultrasound imaging parameters are maintained at 208. For example, maintaining the parameters may include not adjusting the ultrasound imaging parameters based on the recommended ultrasound imaging parameters (e.g., maintaining the ultrasound imaging parameters utilized prior to the operating mode transition request). In some examples, in response to the user declining the recommended ultrasound imaging parameters, a menu or other graphical user interface feature may be displayed via which the user may select/adjust any desired ultrasound imaging parameters. In some examples, default ultrasound imaging parameters for the imaging mode may be utilized in response to the user declining the recommended ultrasound imaging parameters.
However, if the user accepts the recommended ultrasound imaging parameters at 214, then operation of the ultrasound imaging system is adjusted based on the recommended ultrasound imaging parameters at 216. For example, prior to the operational mode transition request, the probe of the ultrasound imaging system may be operated in B mode at a first (e.g., default) Pulse Repetition Frequency (PRF), such as 2.5 pulses per second. In response to an operational mode transition request to transition to imaging in color flow doppler mode, the ultrasound imaging system determines the anatomy being imaged (as described above) and generates a recommended PRF based on imaging the anatomy in color flow doppler mode. As one non-limiting example, the recommended PRF may be 2.1 pulses per second. In response to determining that the user has accepted the recommended ultrasound imaging parameters, the first PRF value is replaced with the recommended PRF value such that operation of the probe is adjusted from the first PRF to the recommended PRF (e.g., from 2.5 pulses per second to 2.1 pulses per second). Although PRF is described herein as an exemplary parameter, the recommended ultrasound imaging parameters may include one or more other parameters (e.g., packet size, wall filter settings, spatial filter settings, etc.). Adjusting the operation of the ultrasound imaging system using the recommended imaging parameters may increase the sharpness of the images generated by the ultrasound imaging system. Non-limiting exemplary images and parameters are described below with reference to fig. 3-8.
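The accept/reject branch at 214-216 and the parameter replacement described in this paragraph could be wired together roughly as follows; `active_params` and `defaults` stand in for the live imaging configuration and the mode's default preset, and are assumptions of this sketch.

```python
def apply_mode_transition(active_params: dict, recommended: dict,
                          user_accepted: bool, defaults: dict = None) -> dict:
    """Return the parameter set to use after the operating mode transition.

    If the operator accepts, each recommended value replaces the current one
    (e.g., PRF 2.5 -> 2.1 in the example above); otherwise the prior parameters
    are maintained (or the mode defaults are used, if configured).
    """
    if not user_accepted:
        return dict(defaults) if defaults is not None else dict(active_params)
    updated = dict(active_params)
    updated.update(recommended)   # e.g., prf, packet_size, wall_filter, spatial_filter
    return updated
```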
Referring now collectively to fig. 3-5, different ultrasound images of a first anatomical structure acquired via an ultrasound imaging system are shown. The ultrasound images shown in fig. 3-5 may be acquired by the ultrasound imaging system 100 shown in fig. 1 and described above. The first anatomical structure shown in the ultrasound images of fig. 3-5 is the aorta of the patient. The ultrasound images shown in fig. 3-5 may be displayed at a display device of an ultrasound imaging system, such as the display device 118 of the ultrasound imaging system 100 shown in fig. 1 and described above.
Fig. 3 shows an ultrasound image 300 of the aorta 302 of a patient, wherein the ultrasound image 300 is acquired while the ultrasound imaging system is operating in the B mode. The ultrasound image 300 may represent an image acquired by the ultrasound imaging system prior to an imaging mode transition request, where the imaging mode transition request is similar to the transition request described above. For example, the ultrasound image 300 may be one of a set of images acquired sequentially while imaging with the ultrasound imaging system in B mode before transitioning the ultrasound imaging system to imaging in color flow doppler mode. The imaging mode transition request may include receiving an input from an operator (e.g., a clinician) of the ultrasound imaging system at a user input device (e.g., user interface 115 shown in fig. 1 and described above) indicating a desired transition from imaging in the B mode to imaging in the color flow doppler mode. Although not shown in fig. 3, the image 300 may be displayed by a display device along with imaging parameters and/or other imaging data (e.g., patient information). Further, in some examples, an operator of the ultrasound imaging system may update or change the imaging parameters by entering the updated parameters into the ultrasound imaging system via a user interface. As one example, changing a Pulse Repetition Frequency (PRF) value of the ultrasound imaging system may include selecting a PRF field of a GUI displayed at a display device via one or more input devices (e.g., a mouse, a keyboard, a touch screen, etc.), and inputting the updated PRF value via the one or more input devices. Exemplary GUI features are described below with reference to fig. 4-5.
Referring to fig. 4, an ultrasound image 400 of the aorta 302 of a patient is shown, wherein the ultrasound image 400 is acquired while the ultrasound imaging system is operating in a color flow doppler mode. In particular, the ultrasound imaging system acquires the ultrasound image 400 without using the recommended imaging parameters generated by the ultrasound imaging system (as described above with reference to fig. 1-2). For example, the ultrasound image 400 may be acquired by the ultrasound imaging system in a situation where the operator rejects the recommended imaging parameters generated by the ultrasound imaging system after an imaging mode transition request (e.g., a request to transition from B-mode to color flow doppler mode input by the operator). Fig. 4 additionally shows a GUI 402 comprising a list of imaging parameters 404 and a color reference 403. The imaging parameters 404 are the imaging parameters of the ultrasound imaging system during the acquisition of the image 400. The imaging parameters 404 may be default imaging parameters (e.g., unadjusted imaging parameters) associated with the color flow doppler mode. The ultrasound imaging system does not adjust the imaging parameters 404 based on the anatomical structure being imaged (e.g., the aorta 302). The color reference 403 may be used by an operator or other clinician (e.g., a cardiologist) in conjunction with the color flow map 405 to determine the direction and/or velocity of blood flowing within the anatomy being imaged.
The sharpness of the color flow map 405 (e.g., the resolution or amount of tone scale of the color flow map 405) may be determined by the values of the imaging parameters 404. Although the ultrasound imaging system may utilize default imaging parameter values in situations where the operator rejects the recommended imaging parameter values, the default imaging parameter values may not provide increased clarity of the color flow map 405 for various anatomical structures. For example, when imaging a first anatomical structure (e.g., the aorta 302) using the default imaging parameter values, the sharpness of the color flow map may differ relative to a condition in which a second anatomical structure (e.g., the kidney) is imaged using the default imaging parameter values. Further, in some examples, the default imaging parameter values may not provide sufficient image clarity for various anatomical features (e.g., an operator may have difficulty viewing the anatomy being imaged or diagnosing a patient due to poor image clarity caused by the default imaging parameter values). Although the operator may manually input different imaging parameter values in an attempt to increase image sharpness, manually inputting values may increase the cognitive load of the operator and increase imaging time. Further, manually inputting values may involve a trial-and-error approach in which the operator attempts a variety of different imaging parameter values due to uncertainty as to whether a particular imaging parameter value should be increased, decreased, or maintained, which may further increase imaging time and operator cognitive load.
However, the ultrasound imaging system is configured to generate recommended imaging parameter values in response to an imaging mode transition request, similar to the examples described above. By accepting the recommended imaging parameter values, the sharpness of the images generated by the ultrasound imaging system may be increased, resulting in reduced imaging time, reduced cognitive load on the operator, and/or improved accuracy of patient diagnosis. As an example of an image with increased sharpness due to the recommended imaging parameters generated by the ultrasound imaging system, fig. 5 shows an ultrasound image 500 of the aorta 302. The ultrasound image 500 is provided for relative comparison with the ultrasound image 400 shown in fig. 4. As shown in fig. 5, the GUI 402 includes updated imaging parameters 504, wherein the unadjusted (e.g., default) imaging parameter values are shown in dashed text and the recommended (e.g., adjusted) imaging parameter values generated by the ultrasound imaging system are shown in bold text. The ultrasound image 500 is generated by the ultrasound imaging system using the recommended imaging parameter values. Thus, the sharpness (e.g., the resolution and/or amount of tone scale) of the color flow map 505 is increased relative to the color flow map 405 of the ultrasound image 400 shown in fig. 4. As shown in fig. 5, the recommended imaging parameters may include a PRF of 2.3 (compared to the default value of 2.6), a wall filter of 263 (compared to the default 191), a spatial filter of 5 (compared to the default 3), and a packet size of 8 (compared to the default 12).
Further, as described above, the ultrasound imaging system determines the anatomy being imaged based on images acquired prior to transitioning to the color flow doppler imaging mode (e.g., based on images acquired when imaging in B mode immediately prior to transitioning to imaging in color flow doppler mode). In some examples, the GUI 402 may display the name of the anatomical structure being imaged via an anatomical structure identification field 506. In addition, the GUI 402 may display a notification area 508 that includes a confirm button 510 and a reject button 512. The operator may input a selection at the notification area 508 to accept the recommended imaging parameter values (e.g., by selecting the confirm button 510 to replace the default imaging parameter values with the recommended imaging parameter values) or to reject the recommended imaging parameter values (e.g., by selecting the reject button 512 to revert to the default imaging parameter values).
Referring now to fig. 6, an ultrasound image 600 is shown. The ultrasound image 600 is acquired while imaging a patient in B-mode using the same ultrasound imaging system described above with reference to fig. 3-5. However, while the ultrasound image 300 shown in fig. 3 illustrates the aorta 302 of a patient, the ultrasound image 600 shown in fig. 6 illustrates the kidney 602 of a patient. The ultrasound image 600 may be one of a plurality of ultrasound images that are sequentially acquired by the ultrasound imaging system when operating in the B mode.
Fig. 7 shows an ultrasound image 700 of the patient's kidney 602, where the ultrasound image 700 is acquired while operating the ultrasound imaging system in the color flow doppler imaging mode with imaging parameters 704. The imaging parameters 704 may be default (e.g., unadjusted) imaging parameters of the ultrasound imaging system. For example, in response to a request to transition the ultrasound imaging system from imaging in B-mode to imaging in color flow doppler mode, the ultrasound imaging system generates recommended imaging parameters for the color flow doppler mode based on the anatomical structure being imaged (e.g., the kidney 602), wherein the anatomical structure is determined based on images acquired while operating in B-mode (e.g., similar to the examples described above). The ultrasound image 700 shown in fig. 7 is an image acquired without using the recommended imaging parameters. As one example, the operator may reject the recommended imaging parameters and may proceed with imaging using the default imaging parameters (e.g., imaging parameters 704), which may result in reduced clarity of the ultrasound image 700 (e.g., a reduction in the resolution and/or amount of tone scale of the color flow map 705). In some examples, the imaging parameters 704 may be the same as the imaging parameters 404 shown in fig. 4.
The GUI 702 includes a color reference 703 similar to the color reference 403 shown in fig. 4 and described above. The color reference 703 may be used by an operator or other clinician (e.g., cardiologist) in conjunction with the color flow map 705 to determine the direction and/or velocity of blood flowing within the anatomy being imaged (e.g., the kidney 602).
Referring now to fig. 8, an ultrasound image 800 is shown. The ultrasound image 800 is acquired by the ultrasound imaging system using recommended imaging parameters generated by the ultrasound imaging system based on the anatomy being imaged (e.g., the kidney 602) and in response to an imaging mode transition request (e.g., an operator input indicating a desired transition of the ultrasound imaging system from imaging in B-mode to imaging in color flow doppler mode). The ultrasound image 800 has increased image clarity relative to the ultrasound image 700, which was acquired without using the recommended imaging parameters.
As shown in fig. 8, the GUI 702 includes updated imaging parameters 804, where the unadjusted (e.g., default) imaging parameter values are shown in dashed text and the recommended (e.g., adjusted or updated) imaging parameter values generated by the ultrasound imaging system are shown in bold text. Similar to the example described above with reference to fig. 5, the GUI 702 may display, via an anatomy identification field 806, the name of the anatomy (e.g., the kidney) being imaged. Further, the GUI 702 may display a notification area 808 including a confirm button 810 and a reject button 812, similar to the notification area 508, the confirm button 510, and the reject button 512 of fig. 5, respectively.
The recommended imaging parameters (e.g., recommended imaging parameter values) generated by the ultrasound imaging system as shown in fig. 8 are based on the anatomy (e.g., the kidney) being imaged. Thus, the recommended imaging parameters shown in fig. 8 differ from those shown in fig. 5, which are based on imaging a different anatomical structure (e.g., the aorta). For example, the recommended parameters for the kidney may include a PRF of 1, a wall filter of 108, a spatial filter of 3, and a packet size of 12. In situations where a third, different anatomical structure is being imaged (e.g., different from both the aorta and the kidney), the ultrasound imaging system provides recommended imaging parameter values based on the third anatomical structure, which may differ from the recommended imaging parameters shown in figs. 5 and 8. In this way, the recommended imaging parameters provided by the ultrasound imaging system may increase the clarity of images acquired by the ultrasound imaging system for a variety of different anatomical structures.
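As a usage note for the earlier lookup sketch, the two anatomies discussed with reference to figs. 5 and 8 yield different recommendations for the same B-mode to color flow doppler transition (values taken from the figures as described above):

```python
print(recommend_parameters("aorta", "color_flow"))
# {'prf': 2.3, 'wall_filter': 263, 'spatial_filter': 5, 'packet_size': 8}
print(recommend_parameters("kidney", "color_flow"))
# {'prf': 1.0, 'wall_filter': 108, 'spatial_filter': 3, 'packet_size': 12}
```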
The technical effect of generating recommended imaging parameters based on the anatomy being imaged is to increase the clarity of images acquired by the ultrasound imaging system while reducing the cognitive load on the operator of the ultrasound system.
As used herein, an element or step recited in the singular and proceeded with the word "a" or "an" should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly recited. Furthermore, references to "one embodiment" of the present invention are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments "comprising," "including," or "having" an element or a plurality of elements having a particular property may include additional such elements not having that property. The terms "including" and "in which" are used as the plain-language equivalents of the respective terms "comprising" and "wherein." Furthermore, the terms "first," "second," and "third," etc. are used merely as labels, and are not intended to impose numerical requirements or a particular positional order on their objects.
This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the relevant art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those of ordinary skill in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.

Claims (20)

1. A method, comprising:
operating the ultrasound imaging system in a first mode of operation;
determining an anatomical structure imaged by the ultrasound imaging system in the first mode of operation; and
adjusting imaging parameters of the ultrasound imaging system in a second mode of operation based on the first mode of operation and the anatomical structure imaged in the first mode of operation in response to an operating mode transition request.
2. The method of claim 1, wherein determining the anatomical structure imaged by the ultrasound imaging system in the first mode comprises acquiring a plurality of sequential ultrasound images of the anatomical structure while operating the ultrasound imaging system in the first mode.
3. The method of claim 2, wherein determining the anatomical structure imaged by the ultrasound imaging system in the first mode further comprises analyzing the plurality of sequential ultrasound images via a machine learning model stored in a memory of the ultrasound imaging system.
4. The method of claim 1, further comprising: in response to the operating mode transition request and while operating in the second operating mode, determining the anatomical structure imaged by the ultrasound imaging system in the second operating mode based on images acquired by the ultrasound imaging system in the second operating mode, and further adjusting the imaging parameters if the anatomical structure imaged by the ultrasound imaging system in the second operating mode is different from the anatomical structure imaged by the ultrasound imaging system in the first operating mode.
5. The method of claim 1, wherein the operational mode transition request comprises selecting the second operational mode via a user interface device of the ultrasound imaging system.
6. The method of claim 1, wherein adjusting imaging parameters of the ultrasound imaging system in the second operating mode based on the first operating mode and the anatomical structure imaged in the first operating mode comprises generating recommended imaging parameters for the second operating mode based on the anatomical structure imaged in the first operating mode.
7. The method of claim 6, wherein generating recommended imaging parameters for the second mode of operation based on the anatomical structure imaged in the first mode of operation further comprises: acquiring at least one image in the second operation mode, comparing a determined amount of image information of the at least one image with an expected amount of image information, and updating the recommended imaging parameters based on a difference between the determined amount of image information and the expected amount of image information.
8. The method of claim 6, further comprising displaying the recommended imaging parameters at a display device of the ultrasound imaging system.
9. The method of claim 8, wherein adjusting imaging parameters of the ultrasound imaging system in the second mode of operation comprises replacing default imaging parameters with the recommended imaging parameters.
10. The method of claim 1, wherein determining the anatomical structure imaged by the ultrasound imaging system in the first mode of operation comprises determining a position of a probe of the ultrasound imaging system via a probe position sensor while operating the ultrasound imaging system in the first mode of operation.
11. The method of claim 1, wherein the first mode of operation comprises a B-mode and the second mode of operation comprises a color flow doppler mode.
12. An ultrasound imaging system comprising:
an ultrasonic probe; and
a controller having computer readable instructions stored on a non-transitory memory that, when executed, cause the controller to:
in response to a request to transition from a first mode of operation to a second mode of operation, acquiring a plurality of images with the ultrasound probe while in the first mode of operation;
determining an anatomical structure currently being imaged by the ultrasound imaging system based on one or more of the plurality of images; and
acquiring one or more images with the ultrasound probe in the second mode of operation, the one or more images acquired in the second mode of operation being acquired with imaging parameters of the ultrasound imaging system selected based on the determined anatomical structure.
13. The ultrasound imaging system of claim 12, wherein the imaging parameters of the ultrasound imaging system selected based on the determined anatomical structure are automatically selected by the ultrasound imaging system.
14. The ultrasound imaging system of claim 12, wherein the imaging parameters of the ultrasound imaging system selected based on the determined anatomical structure are automatically identified by the ultrasound imaging system and are selected, in response to a user input accepting the imaging parameters, for acquiring the one or more images in the second mode of operation.
15. The ultrasound imaging system of claim 12, wherein the first mode of operation comprises a B-mode and the second mode of operation comprises a color flow doppler mode.
16. A method, comprising:
acquiring one or more first mode images via an ultrasound probe of an ultrasound imaging system operating in a first mode;
determining that a user has requested to operate the ultrasound imaging system in a second mode;
automatically adjusting one or more imaging parameters of the ultrasound imaging system based on the one or more first mode images in response to the request; and
acquiring, via the ultrasound probe of the ultrasound imaging system operating in the second mode, one or more second mode images with the one or more adjusted imaging parameters.
17. The method of claim 16, wherein the first mode comprises a B-mode and the second mode comprises a color flow doppler mode.
18. The method of claim 17, wherein automatically adjusting one or more imaging parameters of the ultrasound imaging system based on the one or more first mode images comprises automatically adjusting one or more of a pulse repetition frequency, a wall filter, a spatial filter, and a data packet size of the ultrasound imaging system.
19. The method of claim 16, wherein automatically adjusting one or more imaging parameters of the ultrasound imaging system based on the one or more first mode images comprises automatically adjusting the one or more imaging parameters of the ultrasound imaging system based on anatomical features identified in the one or more first mode images.
20. The method of claim 16, wherein automatically adjusting one or more imaging parameters of the ultrasound imaging system based on the one or more first mode images comprises automatically adjusting the one or more imaging parameters of the ultrasound imaging system based on an identified scan plane of the one or more first mode images.

