CN114983468A - Imaging system and method using a real-time inspection completeness monitor - Google Patents

Imaging system and method using a real-time inspection completeness monitor

Info

Publication number: CN114983468A
Application number: CN202210607970.2A
Authority: CN (China)
Legal status: Pending
Prior art keywords: imaging, image data, views, specified, protocol
Other languages: Chinese (zh)
Inventor: 斯韦恩·阿恩·奥瑟
Current Assignee: General Electric Co
Original Assignee: General Electric Co
Application filed by General Electric Co

Classifications

    All A61B codes below fall under A (Human necessities), A61B (Diagnosis; surgery; identification), A61B 8/00 (Diagnosis using ultrasonic, sonic or infrasonic waves); G06T and G06V codes fall under G (Physics), G06 (Computing; calculating or counting).

    • A61B 8/0883: Detecting organic movements or changes, e.g. tumours, cysts, swellings, for diagnosis of the heart
    • A61B 8/0891: Detecting organic movements or changes, e.g. tumours, cysts, swellings, for diagnosis of blood vessels
    • A61B 8/4411: Constructional features of the diagnostic device; device being modular
    • A61B 8/4444: Constructional features related to the probe
    • A61B 8/4483: Constructional features characterised by features of the ultrasound transducer
    • A61B 8/461: Special arrangements for interfacing with the operator or the patient; displaying means of special interest
    • A61B 8/463: Displaying means characterised by displaying multiple images or images and diagnostic data on one display
    • A61B 8/465: Displaying means adapted to display user selection data, e.g. icons or menus
    • A61B 8/481: Diagnostic techniques involving the use of contrast agent, e.g. microbubbles introduced into the bloodstream
    • A61B 8/485: Diagnostic techniques involving measuring strain or elastic properties
    • A61B 8/488: Diagnostic techniques involving Doppler signals
    • A61B 8/5207: Data or image processing involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • A61B 8/5215: Data or image processing involving processing of medical diagnostic data
    • A61B 8/543: Control of the diagnostic device involving acquisition triggered by a physiological signal
    • A61B 8/14: Echo-tomography
    • G06T 11/00: 2D [two-dimensional] image generation
    • G06T 7/0014: Biomedical image inspection using an image reference approach
    • G06V 10/751: Image or video pattern matching by comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06T 2207/10132: Indexing scheme, image acquisition modality: ultrasound image
    • G06T 2207/20084: Indexing scheme, special algorithmic details: artificial neural networks [ANN]

Abstract

The invention provides an imaging system and method utilizing a real-time inspection completeness monitor. The invention relates to an ultrasound imaging system (100) and method that access an imaging protocol for an ultrasound imaging session. The imaging protocol includes specified views to be obtained to complete the imaging protocol. Image data is acquired with the ultrasound imaging system (100), and artificial intelligence is used to identify a portion (402) of the image data corresponding to at least one of the specified views of the imaging protocol. The identified portions of the image data corresponding to the one or more specified views of the imaging protocol are automatically saved, and a graphical completion progress indicator of the imaging protocol is displayed, the graphical completion progress indicator indicating one or more of the following: that a specified view of the imaging protocol has been acquired, and/or that one or more additional specified views of the imaging protocol have not yet been acquired.

Description

Imaging system and method using a real-time inspection completeness monitor
This application is a divisional application of the invention patent application entitled "imaging System and method utilizing a real-time inspection completeness monitor" having application number 201910547364.4.
Technical Field
The subject matter disclosed herein relates generally to imaging systems.
Background
The imaging system may generate image data representative of the imaged body. Some imaging systems are real-time imaging systems that can generate and display images of a body while continuously obtaining image data. An ultrasound imaging system is an example of such a real-time imaging system. These types of imaging systems differ from other imaging systems that capture image data of a body, which is then displayed to an operator of the imaging system after an imaging session is completed (e.g., all of the required body image data has been obtained).
During an imaging session, an operator of the imaging system may wish to obtain certain views of the body being imaged. For example, an operator using an ultrasound imaging system may wish to obtain an apical two-chamber view of a human heart, an apical four-chamber view of a human heart, and an apical long-axis view of a human heart to complete an imaging session. However, the operator may forget which views have been acquired, may forget which views have not been acquired, and/or may be distracted by one view and begin acquiring other views that are not needed to complete the imaging session (while not acquiring the views needed to complete the imaging session). This may occur where an operator is required to complete several different imaging protocols on one person during a single imaging session. When the operator is required to obtain many different views simultaneously under different parameters of a number of different protocols, it may be difficult for the operator to keep track of the parameters and views that have been obtained. Thus, the imaging session may be terminated without obtaining all of the required body views. This may require an additional imaging session to be performed, which may interrupt and delay the imaging session of others.
Disclosure of Invention
In one embodiment, a method of ultrasound imaging includes accessing an imaging protocol for an ultrasound imaging session. The imaging protocol includes one or more specified views to be obtained to complete the imaging protocol. A specified view may also be referred to as a specified imaging view, a specified acquisition view, or a specified insonification view. The method also includes acquiring image data with an ultrasound imaging system. The image data comprises a plurality of different acquired views from a plurality of different locations. The method also includes automatically identifying (with artificial intelligence) a portion of the image data corresponding to one of the one or more specified views of the imaging protocol, automatically storing (in memory) the portion of the image data corresponding to the one of the one or more specified views of the imaging protocol, and displaying (on a display device) a graphical completion progress indicator of the imaging protocol indicating one or more of the following: that the one of the one or more specified views of the imaging protocol has already been acquired, or that one or more additional specified views of the imaging protocol have not yet been acquired.
In one embodiment, an ultrasound imaging system includes an ultrasound imaging probe configured to acquire image data during an ultrasound imaging session. The image data includes a plurality of different acquired views from a plurality of different locations. The imaging system also includes one or more processors configured to access an imaging protocol for the ultrasound imaging session. The imaging protocol includes one or more specified views to be obtained to complete the imaging protocol. The one or more processors are further configured to automatically identify a portion of the image data corresponding to one of the one or more specified views of the imaging protocol. The imaging system also includes a memory configured to automatically store a portion of the image data corresponding to one of the one or more specified views of the imaging protocol. The one or more processors are configured to instruct the display device to display a graphical completion progress indicator of the imaging protocol that indicates one or more of: the one of the one or more specified views of the imaging protocol has already been acquired, or one or more additional specified views of the imaging protocol have not yet been acquired.
In one embodiment, a method of imaging includes accessing an imaging protocol for an ultrasound imaging session. The imaging protocol includes one or more specified views to be obtained to complete the imaging protocol. The method also includes acquiring image data with an imaging system. The image data includes a plurality of different acquired views from a plurality of different locations. The method further includes automatically identifying (with artificial intelligence) a portion of the image data corresponding to one of the one or more specified views of the imaging protocol, automatically storing (in a memory) the portion of the image data corresponding to one of the one or more specified views of the imaging protocol, and displaying (on a display device) a graphical completion progress indicator of the imaging protocol and the portion of the image data corresponding to one of the one or more specified views of the imaging protocol, the graphical completion progress indicator indicating one or more of: the one of the one or more specified views of the imaging protocol has already been acquired, or one or more additional specified views of the imaging protocol have not yet been acquired.
Drawings
The inventive subject matter described herein will be better understood by reading the following description of non-limiting embodiments with reference to the accompanying drawings, in which:
FIG. 1 is a schematic diagram of an ultrasound imaging system according to one embodiment;
FIG. 2 shows a flow chart of a method of automatically capturing an imaging body view required by an imaging protocol while continuously obtaining image data of the body;
FIG. 3 illustrates an example of a graphical completion progress indicator displayed on the user interface shown in FIG. 1;
FIG. 4 illustrates an additional example of a graphical completion progress indicator displayed on the user interface shown in FIG. 1; and
FIG. 5 illustrates additional examples of graphical completion progress indicators displayed on the user interface shown in FIG. 1.
Detailed Description
The subject matter described herein relates to imaging systems and methods that access an imaging protocol that dictates which views of the body are to be obtained. The imaging protocol optionally specifies conditions under which these views are to be obtained, such as physiological parameters of the body being imaged and/or acquisition parameters (e.g., settings) of the imaging system. The imaging system is controlled to obtain image data, and the imaging system may automatically determine whether any portion of the image data obtained by the imaging system contains a view required by the imaging protocol. The imaging system may optionally determine whether the conditions required by the protocol are also met. The imaging system may automatically determine whether the desired view is obtained without requiring an operator of the imaging system to recognize or identify the view in the image data. The imaging system may automatically save a portion of the image data containing the desired view in memory and may update a graphical completion progress indicator displayed on a user interface (e.g., an electronic display device). The indicator may indicate which views required by the imaging protocol have been captured and/or which views required by the imaging protocol remain to be captured. This can help train the operator on which additional body views are needed.
At least one technical effect of the subject matter described herein is automatically identifying and capturing (e.g., saving in memory) portions of image data that include body views required by an imaging protocol while image data continues to be obtained. Another technical effect is to track and inform which views required by the protocol have been obtained and/or which additional views required by the protocol remain to be obtained.
Fig. 1 is a schematic diagram of an ultrasound imaging system 100 according to one embodiment. The ultrasound imaging system 100 includes a transmit beamformer 101 and a transmitter 102 that drives elements 104 within a probe 106 to transmit pulsed ultrasound signals into a body (not shown). According to one embodiment, the probe 106 may be a two-dimensional matrix array probe. However, according to other embodiments, another type of probe capable of acquiring four-dimensional ultrasound data may be used. The four-dimensional ultrasound data may include ultrasound data, such as a plurality of three-dimensional volumes acquired over a period of time. The four-dimensional ultrasound data may include information showing how the three-dimensional volume changes over time. Alternatively, a 1D array probe or a linear array probe may be used. Alternatively, the system 100 may not acquire four-dimensional ultrasound data, but may obtain another type of imaging data, such as M-mode (time-motion) ultrasound data, two-dimensional video loops, or the like.
The pulsed ultrasonic signals are backscattered from structures in the body, such as blood cells or muscle tissue, to produce echoes that return to the elements 104. The echoes are converted into electrical signals or ultrasound data by the elements 104, and the electrical signals are received by the receiver 108. The electrical signals representing the received echoes pass through a receive beamformer 110 which outputs ultrasound data. The probe 106 may include electronic circuitry to perform all or part of transmit and/or receive beamforming. For example, all or part of the transmit beamformer 101, transmitter 102, receiver 108 and receive beamformer 110 may be located within the probe 106. Scanning may include acquiring data through the process of transmitting and receiving ultrasound signals. The data generated by the probe 106 may include one or more data sets acquired with an ultrasound imaging system. The user interface 115 may be used to control the operation of the ultrasound imaging system 100, including for controlling the input of patient data, for changing scanning or display parameters, and the like. One example of a user interface 115 may be an electronic display device, such as a monitor, touch screen, or the like. The user interface 115 may optionally include one or more input devices, such as a keyboard, an electronic mouse, speakers, and the like.
The ultrasound imaging system 100 also includes one or more processors 116 that control the transmit beamformer 101, the transmitter 102, the receiver 108, and the receive beamformer 110. The processor 116 is in electronic communication with the probe 106 via one or more wired and/or wireless connections. The processor 116 may control the probe 106 to acquire data. The processor 116 controls which of the elements 104 are active and the shape of the beam emitted from the probe 106. The processor 116 is also in electronic communication with a display device 118, and the processor 116 may process the data into images for display on the display device 118. The processor 116 may include one or more Central Processing Units (CPUs) according to one embodiment. According to other embodiments, the processor 116 may include one or more other electronic components capable of performing processing functions, such as one or more digital signal processors, Field Programmable Gate Arrays (FPGAs), graphics boards, and/or integrated circuits. According to other embodiments, the processor 116 may include a number of electronic components capable of performing processing functions. For example, the processor 116 may include two or more electronic components selected from a list of electronic components, including: one or more central processing units, one or more digital signal processors, one or more field programmable gate arrays, and/or one or more graphics boards. According to another embodiment, the processor 116 may also include a complex demodulator (not shown) that demodulates the radio frequency data and generates raw data. In another embodiment, demodulation may be performed earlier in the processing chain.
The processor 116 is adapted to perform one or more processing operations according to a plurality of selectable ultrasound modalities on the data. In one embodiment, processor 116 may include, or may be communicatively coupled with, one or more Graphics Processing Units (GPUs) for performing analysis of image data as described herein. When echo signals are received, the data may be processed in real-time during a scanning session, for example by processing the data without any intentional delay, or while additional data is acquired during the same imaging session for the same patient. For example, one embodiment may acquire images at a real-time rate of 7 to 20 volumes per second. However, the real-time volume rate may depend on the length of time required to acquire each volume of data for display. Thus, when acquiring relatively large volumes of data, the real-time volume rate may be slow. Some embodiments may have a real-time volume rate that is significantly faster than twenty volumes per second, while other embodiments may have a real-time volume rate that is less than seven volumes per second.
The data may be temporarily stored in a buffer (not shown) during the scan session and processed in a less than real-time manner in a live or offline operation. Some embodiments of the inventive subject matter may include multiple processors (not shown) to handle the processing tasks that are handled by the processor 116 according to the exemplary embodiments described above. For example, a first processor may be used to demodulate and decimate the RF signal, while a second processor may be used to further process the data prior to displaying the image. It should be understood that other embodiments may use different processor arrangements.
The ultrasound imaging system 100 may acquire data continuously at a rate of, for example, ten to two hundred hertz. Images generated from the data may be refreshed at a similar frame rate. Other embodiments may acquire and display data at different rates. For example, some embodiments may acquire data at a rate of less than ten hertz or greater than two hundred hertz, depending on the volume size and intended application.
A memory 120 is included for storing the processed image data. In one embodiment, the memory 120 has sufficient capacity to store a large volume of ultrasound data for at least a few seconds or minutes. The image data is stored in a manner that facilitates retrieval according to its acquisition sequence or time. The memory 120 may include any known data storage medium, such as one or more tangible and non-transitory computer-readable storage media (e.g., one or more computer hard drives, disk drives, universal serial bus drives, etc.).
Alternatively, one or more embodiments of the inventive subject matter described herein can be implemented with a contrast agent. When ultrasound contrast agents, including microbubbles, are used, contrast imaging produces enhanced images of anatomical structures and blood flow in the body. After acquiring data using the contrast agent, image analysis includes separating harmonic components and linear components, enhancing the harmonic components, and generating an ultrasound image by using the enhanced harmonic components. Separation of the harmonic components from the received signal is performed using a suitable filter.
In various embodiments of the present invention, the processor 116 may process the data through other or different mode-related modules (e.g., B-mode, color Doppler, M-mode, color M-mode, spectral Doppler, elastography, TVI, strain rate, etc.) to form two-dimensional or three-dimensional image data. For example, one or more modules may generate B-mode, color Doppler, M-mode, color M-mode, spectral Doppler, elastography, TVI, strain rate images, combinations thereof, and the like. The image beams and/or volumes are stored, and timing information indicating the time at which the data was acquired may be recorded in the memory. The modules may include, for example, a scan conversion module to perform a scan conversion operation to convert the image volumes from beam-space coordinates to display-space coordinates. A video processor module may read the image volumes from memory and display an image in real time while a procedure is being performed on the patient. The video processor module may store the images in an image memory, from which the images are read and displayed.
Fig. 2 shows a flow diagram of a method 200 of automatically capturing an imaging body view required by an imaging protocol while continuously obtaining image data of the body. The method 200 may represent operations performed by the processor 116 to automatically track which views required by an imaging protocol have been obtained and/or which views required by the protocol remain to be captured. At 202, an imaging protocol is accessed. The imaging protocol may be accessed by downloading or otherwise obtaining a copy of the protocol from memory 120. Alternatively, the protocol may be provided from an operator of the imaging system 100 through the user interface 115.
The imaging protocol may be a list, table, or other memory structure that indicates or otherwise specifies the body views to be obtained to complete the imaging protocol. The views in the imaging protocol may specify the orientation of the image of the anatomical structure in the body being imaged. For example, the imaging protocol may require obtaining an apical two-chamber view, an apical four-chamber view, an apical long axis view, and a parasternal short axis view at papillary muscle level (SAX), among others.
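As a rough illustration only (not part of the patent), the list-or-table protocol structure described above could be modeled in software as a small data structure; every name here is hypothetical:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ImagingProtocol:
    """Hypothetical container for an imaging protocol: the views to obtain,
    whether order matters, and which views have been captured so far."""
    name: str
    specified_views: List[str]            # e.g., "apical 2CH", "apical 4CH", "APLAX", "SAX"
    ordered: bool = False                 # True if views must be acquired in the listed order
    acquired: List[str] = field(default_factory=list)

    def remaining_views(self) -> List[str]:
        """Views still needed to complete the protocol."""
        return [v for v in self.specified_views if v not in self.acquired]

    def is_complete(self) -> bool:
        return not self.remaining_views()

# Example: a cardiac protocol requiring four views, in any order.
protocol = ImagingProtocol(
    name="cardiac exam",
    specified_views=["apical 2CH", "apical 4CH", "apical long axis", "SAX"],
)
```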
The imaging protocol may require that some or all of the specified views be obtained in a specified order. For example, the imaging protocol may require that an apical two-chamber view be obtained before an apical four-chamber view, which in turn is obtained before a subcostal view. Alternatively, the imaging protocol does not require that the views be obtained in a specified order. For example, the protocol may require that the aortic valve view, the subcostal view, and the 4CH view be obtained in any order or sequence.
The imaging protocol may specify views to be obtained for successful completion of an imaging session. The imaging session may begin when the imaging system 100 is activated and the probe 106 begins capturing image data of the body. The imaging session may continue as long as the body continues to be imaged by the probe 106, and may terminate when the probe 106 stops capturing image data of the body and/or the imaging system 100 is deactivated. For example, the imaging session may continue as long as the body continues to be imaged by the imaging system 100.
Alternatively, the imaging protocol may specify prerequisites that must be met before or while acquiring one or more specified views of the protocol. These conditions may include one or more physiological parameters of the person being imaged. The physiological parameters may include a specified heart rate, a specified heart rate range, a specified breathing rate, a specified breathing rate range, and the like. For example, the imaging protocol may require that the person's heart rate be at least one hundred beats per minute when obtaining a 4CH view, between eighty and ninety beats per minute when obtaining a subcostal view, that the person's breathing rate be no greater than twenty breaths per minute, and so on. These conditions may include one or more acquisition parameters of the imaging system 100. Acquisition parameters may include a frame rate, a frame rate range, a resolution, an ultrasound line density, an ultrasound line density range, an imaging width, an imaging depth, an ultrasound frequency, an ultrasound pulse repetition frequency, an ultrasound pulse length, a power level, and the like. The acquisition parameters may include an imaging mode, such as an ultrasound imaging mode. For example, the acquisition parameters may specify that image data is to be acquired using an ultrasound color flow mode, an ultrasound pulsed wave Doppler mode, or the like.
The protocol may specify conditions between multiple views. For example, the protocol may require that two or more views (the same or different views) be obtained when the physiological parameter (e.g., heart rate) does not vary by more than a specified amount (e.g., 20%). If any view is obtained when the physiological parameter changes by more than this amount, the view does not meet the requirements of the protocol.
If a view required by the protocol is not obtained while the conditions required by the protocol are present, the view does not meet or comply with the requirements of the protocol. Conversely, if the view is obtained in the presence of, or during, the conditions required by the protocol, then the view does meet or conform to the requirements of the protocol.
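A minimal sketch of the conformance test just described, assuming the prerequisites are expressed as numeric ranges (the field names and units here are illustrative, not from the patent):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Range = Tuple[float, float]  # inclusive (min, max)

@dataclass
class ViewPrerequisites:
    """Hypothetical conditions that must hold while a specified view is acquired."""
    heart_rate_bpm: Optional[Range] = None       # beats per minute
    breathing_rate_bpm: Optional[Range] = None   # breaths per minute
    frame_rate_hz: Optional[Range] = None        # acquisition frame rate

def view_conforms(p: ViewPrerequisites, heart_rate: float,
                  breathing_rate: float, frame_rate: float) -> bool:
    """Return True only if every stated condition held during acquisition."""
    for bounds, value in ((p.heart_rate_bpm, heart_rate),
                          (p.breathing_rate_bpm, breathing_rate),
                          (p.frame_rate_hz, frame_rate)):
        if bounds is not None and not (bounds[0] <= value <= bounds[1]):
            return False
    return True

# Example drawn from the text: a subcostal view requiring an 80-90 bpm heart
# rate and a breathing rate no greater than 20 breaths per minute.
subcostal = ViewPrerequisites(heart_rate_bpm=(80.0, 90.0), breathing_rate_bpm=(0.0, 20.0))
print(view_conforms(subcostal, heart_rate=85.0, breathing_rate=14.0, frame_rate=30.0))  # True
```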
At 204, the method 200 optionally includes displaying a completion progress indicator. The indicator informs the operator of the imaging system 100 of the completeness of the imaging protocol used for the imaging session, and may be updated (and displayed to the operator) at the same time that the imaging system 100 acquires image data. Thus, the indicator may provide a real-time completeness monitor for an examination of a person using the imaging system 100.
The indicator may be a graphical representation of the degree of completion of the imaging protocol and/or how much of the imaging protocol remains to be completed. Alternatively, the indicator may be a graphical representation of protocol views that have been acquired, protocol views that have not been acquired, conditions required to capture one or more views, and the like. The indicator may be displayed on the user interface 115 along with image data and/or views obtained by the probe 106. For example, the indicator may be displayed with the image data as the image data is being acquired.
FIG. 3 illustrates an example of graphical completion progress indicators 300, 302, 304 shown on the user interface 115. The indicators 300, 302, 304 may be displayed simultaneously with and alongside a displayed portion 306 of the image data. In the illustrated example, the indicator 300 represents how many of the images specified by the imaging protocol associated with an Automatic Functional Imaging (AFI) protocol have been captured.
The indicator 302 represents how many of the images specified by the imaging protocol for examining or assessing myocardial infarction (MI) according to criteria set by the American Society of Echocardiography (ASE) have been captured. For example, one third of the images required by the ASE criteria to assess a person's MI have been acquired, and two thirds of the images required by the criteria remain to be acquired.
The indicator 304 represents how many of the images specified by the imaging protocol for examining or evaluating mitral valve (MV) prolapse according to criteria set by the ASE have been captured. For example, half of the images required by the ASE criteria to assess MV prolapse have been obtained, and half of the images required by the criteria remain to be obtained.
FIG. 4 shows an additional example of a graphical completion progress indicator 400 shown on the user interface 115. The indicator 400 may be displayed simultaneously with and alongside a displayed portion 406 of the image data. In the illustrated example, the indicator 400 is a textual list of views to be obtained, such as two-chamber (2CH), four-chamber (4CH), apical long-axis (APLAX), parasternal short-axis (PSAX), and parasternal long-axis (PLAX) views. Some of the views in the list of the indicator 400 may be shown in different colors, brightnesses, fonts, etc., to indicate which views have been obtained and which have not. For example, the indicator 400 shows the terms 2CH, 4CH, and PLAX in different text (e.g., brighter or differently colored text) than the terms APLAX and PSAX, indicating that the 2CH, 4CH, and PLAX views required by the imaging protocol have been obtained while the APLAX and PSAX views required by the imaging protocol have not. The interface 115 may optionally display a smaller version (e.g., a thumbnail) of the saved portion 402 of the image data corresponding to a desired view of the imaging protocol.
FIG. 5 shows an additional example of a graphical completion progress indicator 500 shown on the user interface 115. The indicator 500 may be displayed simultaneously with and alongside the displayed portion 406 of the image data. In the illustrated example, the indicator 500 is a textual checklist of the views to be obtained, such as 2CH, 4CH, APLAX, PSAX, and PLAX. The indicator 500 is a checklist with an empty graphical icon (e.g., a square, circle, etc.) next to each view in the checklist. An "X," check mark, or other symbol may be placed in the icon corresponding to a protocol view that has been obtained. Those views of the protocol that have not yet been obtained may be shown without symbols in their graphical icons.
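In the simplest text form, a checklist-style indicator like indicator 500 could be rendered along these lines (a sketch only; the patent describes a graphical display):

```python
def render_checklist(specified_views, acquired_views):
    """Text rendering of a checklist-style completion indicator (illustrative)."""
    lines = []
    for view in specified_views:
        mark = "X" if view in acquired_views else " "   # symbol placed in the icon
        lines.append(f"[{mark}] {view}")
    done = sum(1 for v in specified_views if v in acquired_views)
    lines.append(f"{done}/{len(specified_views)} views acquired")
    return "\n".join(lines)

print(render_checklist(["2CH", "4CH", "APLAX", "PSAX", "PLAX"], {"2CH", "4CH", "PLAX"}))
# [X] 2CH
# [X] 4CH
# [ ] APLAX
# [ ] PSAX
# [X] PLAX
# 3/5 views acquired
```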
Returning to the description of the flow chart of method 200, at 206, image data is acquired. An operator of the imaging system 100 may move the probe 106 around the person being imaged to obtain image data of one or more anatomical structures (e.g., organs, blood vessels, bones, etc.) of the person being imaged. The image data may be presented on the user interface 115 while additional image data is acquired during the same imaging session. For example, the user interface 115 may display a real-time view of the ultrasound image data as the processor 116 acquires and processes the image data. The probe 106 of the imaging system 100 may be moved relative to the subject being imaged such that the image data generated by the processor 116 of the imaging system 100 includes several different views of the imaging subject from different locations. These different views may show different orientations of physiological structures in the body.
At 208, it is determined whether any portion of the acquired image data includes a specified view required by the imaging protocol. At least a portion of the imaging protocol may have been completed if the image data includes a specified view of the imaging protocol. Accordingly, the flow of method 200 may proceed toward 210. However, if the image data acquired so far does not include the specified view of the imaging protocol, the imaging protocol has not yet been completed. Accordingly, the flow of method 200 may return to 206. For example, additional image data may be acquired and examined in a looping manner to determine when a specified view of the protocol is obtained.
In one embodiment, determining whether a portion of the acquired image data includes a specified view of the imaging protocol is performed automatically by the processor 116. For example, the processor 116 may use artificial intelligence or other machine-based learning techniques to automatically determine whether the image data represents a specified view. The artificial intelligence of the processors 116 may be embodied in one or more neural networks formed by at least some of the processors 116.
The artificial neural network formed by at least some of the processors 116 includes artificial neurons or nodes that receive input image data and perform operations (e.g., functions) on the image data, selectively passing the results of the operations on to other neurons. The neural network may be operable to classify frames of the acquired image data. For example, the neural network may examine features of a frame of image data and determine whether the frame belongs to one or more different categories of frames, such as an apical two-chamber view, an apical four-chamber view, an apical long-axis view, and the like.
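For concreteness, such a frame classifier might look like the following minimal, untrained convolutional network (a sketch assuming PyTorch; the patent does not specify an architecture):

```python
import torch
import torch.nn as nn

class ViewClassifier(nn.Module):
    """Minimal CNN mapping a grayscale ultrasound frame to per-view-class scores."""
    def __init__(self, num_views: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_views)

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        x = self.features(frame)           # (batch, 32, 1, 1)
        return self.head(x.flatten(1))     # raw scores, one per view class

# One grayscale frame: (batch, channels, height, width).
model = ViewClassifier(num_views=4)
scores = model(torch.randn(1, 1, 128, 128))
predicted = scores.argmax(dim=1)           # index into e.g. ["2CH", "4CH", "APLAX", "SAX"]
```

In practice the weights would be established during the training phase described below, using labeled frames of each specified view.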
Alternatively, the neural network may identify objects in the frames of the image data and determine what view a frame represents based on which identified objects appear in the frame. Weight values may be associated with each vector (described below) and neuron in the neural network, and these values constrain how the input image data relates to the outputs of the neurons. The weight values may be determined by an iterative flow of training image data through the neural network. For example, the weight values are established during a training phase in which the neural network learns how to identify particular classes of objects from typical input image data characteristics or from ground-truth labeled images of the objects.
A labeled training image may be image data in which all or most of the pixels or voxels forming the image data are associated with an object class. An object class is the type or category of object that appears in the image data. For example, human tissue may be one object class, human bone another object class, blood vessels another, and so forth. A pixel or voxel may be labeled (e.g., associated) with the probabilities that it represents the various object classes by a vector [a b c d], where the values of a, b, c, and d indicate the probability that the pixel or voxel represents each of the different object classes or things. In a labeled training image, a pixel or voxel labeled [1 0 0 0] indicates a 100% probability that the pixel or voxel represents at least a portion of an object of a first class (e.g., the object class human tissue, represented by probability a), a zero probability that it represents at least a portion of an object of a second, different class (e.g., the object class human bone, represented by probability b), a zero probability that it represents at least a portion of an object of a third, different class (e.g., the object class blood vessel, represented by probability c), and a zero probability that it represents at least a portion of an object of a fourth, different class (e.g., the class representing no body part, represented by probability d).
Artificial neurons in a neural network may examine individual pixels or voxels in the input image data. The processor 116 may use linear classification to calculate scores for different classes of objects. These scores may indicate the probability that a pixel or voxel represents a different class. For example, the score of a pixel or voxel may be represented as one or more of the above vectors. Each artificial neuron may apply a mathematical function (e.g., an activation function) to the same pixel or voxel, where the functions applied by different neurons affect the functions applied by other neurons, and different neurons apply different weights to different terms in the function than one or more or all other neurons. Application of the function generates a classification score for the pixel or voxel that can be used to identify an object in the input image data.
Neurons in the neural network examine characteristics of the pixels or voxels, such as intensity, color, and the like, to determine the scores for the various pixels or voxels. After the neural network determines the score vector for each pixel or voxel, the neural network examines each score vector and determines which object class has the highest probability for that pixel or voxel, or which object class has a higher probability than one or more or all of the other object classes. For example, a pixel or voxel with a score vector of [0.6 0.15 0.05 0.2] indicates that the neural network calculated a 60% probability that the pixel or voxel represents human tissue, a 15% probability that it represents human bone, a 5% probability that it represents a blood vessel, and a 20% probability that it does not represent any body part (e.g., is not tissue, bone, or a blood vessel). The processor 116 may determine that the pixel or voxel represents the object class having the highest or maximum of these probabilities. For example, the processor may determine that the pixel or voxel represents human tissue because of the 60% probability. This process may be repeated for several or all of the other pixels or voxels in the image data.
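The per-pixel decision just described is an argmax over the score vector; a tiny worked example (NumPy, illustrative only):

```python
import numpy as np

classes = ["tissue", "bone", "vessel", "background"]

# Score vectors for two pixels, as in the example above.
scores = np.array([
    [0.6, 0.15, 0.05, 0.2],   # highest probability: tissue
    [0.1, 0.70, 0.10, 0.1],   # highest probability: bone
])
labels = scores.argmax(axis=1)        # pick the highest-probability class per pixel
print([classes[i] for i in labels])   # ['tissue', 'bone']
```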
Once the neural network identifies the possible object classes represented by different pixels or voxels in the image data, the neural network may identify the shapes formed by pixels or voxels representing the same object class. These identified shapes may be compared with template shapes (e.g., stored in the memory 120) for different views associated with different anatomical structures. If the shape of an identified object class (e.g., a blood vessel) matches the shape template associated with a specified view of the blood vessel more closely than other shape templates, the processor 116 may determine that the image data shows that view of the blood vessel.
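One plausible way to implement this template comparison (a sketch; the patent does not prescribe a similarity measure) is to score each stored template mask against the segmented shape by intersection-over-union and keep the best match:

```python
import numpy as np

def best_matching_view(mask: np.ndarray, templates: dict) -> str:
    """Return the name of the template shape closest to the segmented binary mask."""
    def iou(a, b):
        inter = np.logical_and(a, b).sum()
        union = np.logical_or(a, b).sum()
        return inter / union if union else 0.0
    return max(templates, key=lambda name: iou(mask, templates[name]))

# Toy example with 4x4 binary masks standing in for vessel shapes.
templates = {"view A": np.eye(4, dtype=bool), "view B": np.fliplr(np.eye(4, dtype=bool))}
print(best_matching_view(np.eye(4, dtype=bool), templates))  # "view A"
```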
At 210, it is determined whether conditions of the imaging protocol are satisfied. As described above, the imaging protocol may require that one or more physiological parameters and/or acquisition parameters be satisfied prior to capturing a view of the imaged body (e.g., saved in memory 120). For example, the imaging protocol may require that the patient's heart rate be within a specified range and the frame rate of the imaging system 100 be a specified rate. The heart rate or other physiological parameter of the patient may be measured by one or more sensors or input by an operator of the imaging system 100. The frame rate or other acquisition parameters may be determined by the processor 116 when the processor 116 controls the operation of the imaging system 100. If the imaging protocol includes one or more physiological, acquisition, or other conditions, and these conditions are not met, the desired views of the imaging protocol may not have been obtained. For example, if the patient's heart rate has not risen to the specified heart rate range required by the imaging protocol, a view of the imaging protocol may not have been obtained. Accordingly, the flow of method 200 may proceed toward 212. However, if the imaging protocol includes one or more physiological, acquisition, or other conditions, and these conditions are met, the desired views of the imaging protocol may be captured and stored. For example, if the patient's heart rate rises within a specified heart rate range required by the imaging protocol, a view of the imaging protocol may be obtained. Accordingly, the flow of method 200 may proceed toward 214.
At 212, physiological and/or acquisition parameters are altered. For example, the physiological parameters of the person being imaged and/or the acquisition parameters of the imaging system 100 may be changed to be within the range of, or equal to, the conditions required by the imaging protocol. With respect to physiological parameter conditions, this may involve the processor 116 instructing the imaged person to increase (or decrease) their heart rate, such as by walking on a treadmill, sitting, etc. With respect to acquisition parameter conditions, this may involve the processor 116 changing one or more settings of the imaging system 100 to conform to the conditions of the imaging protocol. Once the one or more conditions of the protocol are met or satisfied, the flow of method 200 may proceed toward 214.
At 214, a portion of the image data corresponding to one or more specified views of the imaging protocol is stored. The processor 116 may automatically (e.g., without operator intervention) save a digital copy of the portion of the image data containing the view required by the imaging protocol in the memory 120. Alternatively, the processor 116 may instruct the interface 115 to display a notification in response to determining that the specified view of the imaging protocol has been obtained. The operator of the system 100 may then provide input instructing the processor 116 to save the specified view in the memory 120. Not all of the image data may be saved in the memory 120 because the image data may be continuously acquired and/or displayed to the operator while the probe 106 continues to acquire more body views. Instead, a subset or portion of the image data having the desired view for the imaging protocol may be saved in memory 120.
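A minimal sketch of the automatic save at step 214, assuming the matching portion of image data arrives as an array and is written out with a view name and timestamp (paths and file formats here are hypothetical):

```python
import json
import time
from pathlib import Path
import numpy as np

def auto_save_view(view_name: str, frames: np.ndarray, out_dir: str = "exam_views") -> Path:
    """Persist only the portion of image data that matched a specified view."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    path = out / f"{view_name}_{stamp}.npz"
    np.savez_compressed(path, frames=frames)              # the saved view itself
    meta = {"view": view_name, "saved": stamp}
    (out / f"{view_name}_{stamp}.json").write_text(json.dumps(meta))
    return path
```

A rejected view (as described next) would simply have its saved files deleted, forcing reacquisition before the protocol can complete.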
Optionally, the processor 116 may be configured to receive input from the operator to delete or remove the obtained imaging protocol view from the memory 120. For example, while a given view of an imaging protocol may be obtained, an operator may be dissatisfied with the appearance or other characteristics of the obtained view. The operator may provide input (e.g., via interface 115) indicating rejection of the acquired view and instructing processor 116 to delete the acquired view. The processor 116 may then require the operator to capture the view again before determining that the protocol is complete.
At 216, the method 200 optionally includes displaying a specified view of the imaging protocol from the acquired image data. For example, as described above, a graphical representation (such as a thumbnail view) of the saved portion 402 of image data may be displayed on the user interface 115.
At 218, the graphical completion progress indicator is updated or otherwise modified. The graphical completion progress indicator may be updated to show that at least one additional view required by the imaging protocol has been captured and saved. For example, the processor 116 may change one or more of the indicators 300, 302, 304 to indicate that a greater percentage of the corresponding imaging protocol has been completed. The manner in which one or more additional terms or words in the indicator 400 are displayed may be changed by the processor 116 to show that more views required by the imaging protocol have been obtained. The processor 116 may place check marks in one or more boxes or circles in the checklist of the indicator 500 to show that more views required by the imaging protocol have been obtained.
At 220, it is determined whether the imaging protocol is complete. The processor 116 may examine the views required by the imaging protocol and the automatically saved portions of image data (e.g., at 214) to determine whether all of the views required by the imaging protocol have been obtained. The imaging protocol may be complete if all views of the imaging protocol have been acquired. Accordingly, the flow of method 200 may proceed toward 222. However, the imaging protocol may not be complete if one or more additional views of the imaging protocol still need to be obtained. Accordingly, the flow of method 200 may return to 206. For example, the method 200 may return to obtaining additional image data to determine whether more views required by the imaging protocol are obtained.
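Pulling the steps together, the decision flow of method 200 (acquire at 206, identify at 208, check conditions at 210, save at 214, update the indicator at 218, test completion at 220) can be sketched as a simple loop; all callables are placeholders for the components described above, and condition handling (step 212) is reduced to a retry:

```python
def run_exam(protocol, acquire_frame, classify_view, conditions_met, save_view, update_indicator):
    """Illustrative event loop for method 200; reuses the ImagingProtocol sketch above."""
    while not protocol.is_complete():                                    # step 220
        frame = acquire_frame()                                          # step 206
        view = classify_view(frame)                                      # step 208
        if view in protocol.remaining_views() and conditions_met(view):  # step 210
            save_view(view, frame)                                       # step 214
            protocol.acquired.append(view)
            update_indicator(protocol)                                   # step 218
    return protocol                                                      # step 222: complete
```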
Alternatively, if the operator of the system 100 attempts to end the imaging session before one or more imaging protocols are completed, the processor 116 may generate an alert that is displayed on the interface 115. For example, if one or more views of the imaging protocol have not been obtained but the operator attempts to close or exit the interface 115, the processor 116 may instruct the interface 115 to generate a visual and/or audible alert indicating that the operator still needs to obtain one or more views of the imaging protocol.
At 222, a notification of completion of the imaging protocol may be provided to the operator. For example, the processor 116 may change the manner in which the completion progress indicator is displayed on the user interface 115 to indicate that the imaging protocol is complete. Optionally, the method 200 may terminate after 222. Alternatively, the method 200 may return to one or more of the other operations described above.
In one embodiment, a method of ultrasound imaging includes accessing an imaging protocol for an ultrasound imaging session. The imaging protocol includes one or more specified views to be obtained to complete the imaging protocol. The method also includes acquiring image data with an ultrasound imaging system. The image data includes a plurality of different acquired views from a plurality of different locations. The method also includes automatically identifying (with artificial intelligence) a portion of the image data corresponding to one of the one or more specified views of the imaging protocol, automatically storing (in memory) the portion of the image data corresponding to one of the one or more specified views of the imaging protocol, and displaying (on a display device) a graphical completion progress indicator of the imaging protocol indicating one or more of: the one of the one or more specified views of the imaging protocol has already been acquired, or one or more additional specified views of the imaging protocol have not yet been acquired.
Optionally, the one or more specified views of the imaging protocol include one or more specified orientations of the image of the anatomical structure in the body being imaged.
Optionally, a portion of the image data corresponding to one of the one or more specified views is automatically identified using a neural network.
Optionally, the method further comprises displaying, on the display device, the portion of the image data corresponding to the one or more specified views of the imaging protocol.
Optionally, the graphical completion progress indicator and the portion of the image data corresponding to the one or more specified views of the imaging protocol are displayed simultaneously on the display device.
Optionally, the imaging protocol further comprises one or more prerequisites that must be met when acquiring at least one of the one or more specified views.
Optionally, the one or more prerequisites comprise a physiological parameter of the person being imaged.
Optionally, the physiological parameter is one or more of a specified heart rate, a specified heart rate range, a specified breathing rate or a specified breathing rate range.
Optionally, the one or more prerequisites comprise acquisition parameters of the ultrasound imaging system.
Optionally, the acquisition parameters include one or more of a specified frame rate, a specified frame rate range, a specified ultrasound line density, or a specified ultrasound line density range.
In one embodiment, an ultrasound imaging system includes an ultrasound imaging probe configured to acquire image data during an ultrasound imaging session. The image data includes a plurality of different acquired views from a plurality of different locations. The imaging system also includes one or more processors configured to access an imaging protocol for an ultrasound imaging session. The imaging protocol includes one or more specified views to be obtained to complete the imaging protocol. The one or more processors are further configured to automatically identify a portion of the image data corresponding to one of the one or more specified views of the imaging protocol. The imaging system also includes a memory configured to automatically store a portion of the image data corresponding to one of the one or more specified views of the imaging protocol. The one or more processors are configured to instruct the display device to display a graphical completion progress indicator of the imaging protocol that indicates one or more of: the one of the one or more specified views of the imaging protocol has already been acquired, or one or more additional specified views of the imaging protocol have not yet been acquired.
Optionally, the one or more specified views of the imaging protocol include one or more specified orientations of the image of the anatomical structure in the body being imaged.
Optionally, the one or more processors are further configured to instruct the display device to display a portion of the image data corresponding to one or more specified views of the imaging protocol on the display device.
Optionally, the imaging protocol further comprises one or more prerequisites that must be met when acquiring at least one of the one or more specified views.
Optionally, the one or more prerequisites comprise one or more of a physiological parameter of the person being imaged or an acquisition parameter of the imaging probe.
In one embodiment, a method of imaging includes accessing an imaging protocol for an ultrasound imaging session. The imaging protocol includes one or more specified views to be obtained to complete the imaging protocol. The method also includes acquiring image data with an imaging system. The image data includes a plurality of different acquired views from a plurality of different locations. The method also includes automatically identifying (with artificial intelligence) a portion of the image data corresponding to one of the one or more specified views of the imaging protocol, automatically storing (in memory) the portion of the image data corresponding to one of the one or more specified views of the imaging protocol, and displaying (on a display device) a graphical completion progress indicator of the imaging protocol and the portion of the image data corresponding to one of the one or more specified views of the imaging protocol, the graphical completion progress indicator indicating one or more of: the one of the one or more specified views of the imaging protocol has already been acquired, or one or more additional specified views of the imaging protocol have not yet been acquired.
Optionally, a portion of the image data corresponding to one of the one or more specified views is automatically identified using a neural network.
Optionally, the imaging protocol further comprises one or more prerequisites that must be met when acquiring at least one of the one or more specified views.
Optionally, the one or more prerequisites comprise a physiological parameter of the person being imaged.
Optionally, the one or more prerequisites comprise acquisition parameters of the ultrasound imaging system.
As used herein, an element or step recited in the singular and preceded by the word "a" or "an" should be understood as not excluding a plurality of said elements or steps, unless such exclusion is explicitly recited. Furthermore, references to "one embodiment" are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments "comprising," "including," or "having" an element or a plurality of elements having a particular property may include additional such elements not having that property.
It is to be understood that the above description is intended to be illustrative, and not restrictive. For example, the above-described embodiments (and/or aspects thereof) may be used in combination with each other. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. While the dimensions and types of materials described herein are intended to define the parameters of the invention, they are by no means limiting and are exemplary embodiments. Many other embodiments will be apparent to those of skill in the art upon reading the above description. The scope of the invention should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms "including" and "in which" are used as the plain-English equivalents of the respective terms "comprising" and "wherein". Furthermore, in the following claims, the terms "first," "second," and "third," etc. are used merely as labels, and are not intended to impose numerical requirements on their objects. Furthermore, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted under 35 U.S.C. § 112(f), unless and until such claim limitations expressly use the phrase "means for" followed by a statement of function devoid of further structure.
This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.

Claims (12)

1. An ultrasound imaging method comprising:
accessing an imaging protocol for an ultrasound imaging session, the imaging protocol comprising a plurality of specified views to be obtained in any order to complete the imaging protocol, wherein the imaging protocol comprises one or more prerequisites that must be met when acquiring the plurality of specified views, the one or more prerequisites comprising at least one of acquisition parameters or physiological parameters;
displaying a graphical completion progress indicator of the imaging protocol on a display device, the indicator indicating at least one of: how much of the imaging protocol has been completed, how much of the imaging protocol remains to be completed, that one or more of the plurality of specified views have been acquired, or that one or more additional specified views of the imaging protocol have not been acquired;
acquiring image data using an ultrasound imaging system;
automatically identifying a portion of the image data corresponding to one of the plurality of specified views of the imaging protocol using artificial intelligence;
automatically determining whether the one or more prerequisites are met when the portion of the image data is acquired;
automatically rejecting the portion of the image data if the one or more prerequisites are not met when the portion of the image data is acquired; and
if the one or more prerequisites are met when the portion of the image data is acquired, automatically storing the portion of the image data and updating the graphical completion progress indicator displayed on the display device to show that the one of the plurality of specified views of the imaging protocol has been stored.
2. The ultrasound imaging method of claim 1, further comprising displaying one or more additional graphical completion progress indicators concurrently with the displaying of the graphical completion progress indicator on the display device.
3. The ultrasound imaging method of claim 1, wherein the one or more prerequisites comprise acquisition parameters, the method further comprising automatically changing the acquisition parameters in response to the rejecting of the portion of the image data.
4. The ultrasound imaging method of claim 3, wherein the acquisition parameters comprise one or more of a specified frame rate, a specified frame rate range, a specified ultrasound line density, or a specified ultrasound line density range.
5. The ultrasound imaging method of claim 1, wherein the one or more prerequisites comprise a physiological parameter, the method further comprising automatically providing instructions to a person being imaged to change the physiological parameter in response to the rejecting of the portion of the image data.
6. The ultrasound imaging method of claim 5, wherein the physiological parameter comprises one or more of a specified heart rate, a specified heart rate range, a specified respiration rate, or a specified respiration rate range.
7. The ultrasound imaging method of claim 1, wherein the imaging protocol specifies the same prerequisite for two or more of the plurality of specified views.
8. The ultrasound imaging method of claim 1, wherein the one or more prerequisites comprise acquisition parameters and physiological parameters.
9. The ultrasound imaging method of claim 1, wherein the imaging protocol comprises a first prerequisite for a first view of the plurality of specified views and a second prerequisite for a second view of the plurality of specified views, wherein the second prerequisite is different from the first prerequisite.
10. The ultrasound imaging method of claim 1, further comprising:
determining that an operator of the imaging system terminated the imaging session before the imaging protocol was completed; and
displaying an alert on the display device informing the operator that one or more specified views of the imaging protocol have not been acquired.
11. An ultrasound imaging system comprising:
an ultrasound imaging probe configured to acquire image data during an ultrasound imaging session, the image data comprising a plurality of different acquired views from a plurality of different locations;
a display device; and
a processor configured to perform the method of any one of claims 1-10.
12. A non-transitory computer readable medium storing a computer program having at least one code section executable by a machine to cause the machine to perform the steps of the method of any one of claims 1-10.
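Pulling the pieces together, the accept/reject flow recited in claim 1 can be sketched as the following loop. This is an illustration, not the patented code: it reuses the hypothetical CompletenessMonitor, identify_view, and prerequisites_met helpers from the sketches earlier in the description, and the frame stream and display interface are likewise invented.

def run_session(monitor, classifier, frames, prereqs, storage, display):
    """Illustrative accept/reject loop of claim 1.

    frames yields (frame, measured) pairs, where measured holds the
    physiological and acquisition parameters at the moment of acquisition.
    """
    display.update(monitor.progress(), monitor.remaining())  # initial indicator
    for frame, measured in frames:
        view = identify_view(classifier, frame)       # AI view identification
        if view is None or view in monitor.acquired:
            continue                                  # not a new specified view
        if not prerequisites_met(measured, prereqs):
            continue                                  # automatically reject
        storage.append((view, frame))                 # automatically store
        monitor.mark_acquired(view)
        display.update(monitor.progress(), monitor.remaining())

If the session ends while monitor.remaining() is non-empty, the method of claim 10 would display the alert listing the specified views still outstanding.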
CN202210607970.2A 2018-06-22 2019-06-24 Imaging system and method using a real-time inspection completeness monitor Pending CN114983468A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US16/015,454 US20190388060A1 (en) 2018-06-22 2018-06-22 Imaging system and method with live examination completeness monitor
US16/015,454 2018-06-22
CN201910547364.4A CN110623685B (en) 2018-06-22 2019-06-24 Imaging system and method using a real-time inspection completeness monitor

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201910547364.4A Division CN110623685B (en) 2018-06-22 2019-06-24 Imaging system and method using a real-time inspection completeness monitor

Publications (1)

Publication Number Publication Date
CN114983468A true CN114983468A (en) 2022-09-02

Family

ID=68968783

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210607970.2A Pending CN114983468A (en) 2018-06-22 2019-06-24 Imaging system and method using a real-time inspection completeness monitor
CN201910547364.4A Active CN110623685B (en) 2018-06-22 2019-06-24 Imaging system and method using a real-time inspection completeness monitor

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201910547364.4A Active CN110623685B (en) 2018-06-22 2019-06-24 Imaging system and method using a real-time inspection completeness monitor

Country Status (2)

Country Link
US (2) US20190388060A1 (en)
CN (2) CN114983468A (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10685439B2 (en) * 2018-06-27 2020-06-16 General Electric Company Imaging system and method providing scalable resolution in multi-dimensional image data
CN115334975A (en) * 2020-03-27 2022-11-11 皇家飞利浦有限公司 System and method for imaging and measuring epicardial adipose tissue
US20220280133A1 (en) * 2021-03-04 2022-09-08 GE Precision Healthcare LLC Method and system for automatically detecting an ultrasound image view and focus to provide measurement suitability feedback
CN113887677B (en) * 2021-12-07 2022-03-01 武汉大学 Method, device, equipment and medium for classifying capillary vessel images in epithelial papilla
US11944501B2 (en) 2022-02-16 2024-04-02 GE Precision Healthcare LLC Systems and methods for automatic measurements of medical images
WO2024013114A1 (en) * 2022-07-11 2024-01-18 Koninklijke Philips N.V. Systems and methods for imaging screening
US11931206B2 (en) 2022-07-13 2024-03-19 GE Precision Healthcare LLC Ultrasound imaging system and method for saving a sequence of images as a cineloop

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6953433B2 (en) * 2003-08-29 2005-10-11 Siemens Medical Solutions Usa, Inc. Protocol controller for a medical diagnostic imaging system
US20060074287A1 (en) * 2004-09-30 2006-04-06 General Electric Company Systems, methods and apparatus for dual mammography image detection
US8869259B1 (en) * 2011-05-19 2014-10-21 Zscaler, Inc. Cloud based inspection of secure content avoiding man-in-the-middle attacks
KR101431781B1 (en) * 2012-06-20 2014-08-20 삼성전자주식회사 X-ray image apparatus and control method for the same
US8777856B2 (en) * 2012-06-26 2014-07-15 General Electric Company Diagnostic system and method for obtaining an ultrasound image frame
EP3089670A4 (en) * 2014-01-02 2017-10-11 Metritrack, Inc. System and method for tracking completeness of co-registered medical image data
JP6461673B2 (en) * 2015-03-27 2019-01-30 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasonic diagnostic apparatus, control program and system thereof
AU2017230722B2 (en) * 2016-03-09 2022-08-11 EchoNous, Inc. Ultrasound image recognition systems and methods utilizing an artificial intelligence network
US20170273663A1 (en) * 2016-03-24 2017-09-28 Elwha Llc Image processing for an ultrasonic fetal imaging device
US10794873B2 (en) * 2016-04-04 2020-10-06 The Boeing Company Signal processing for ultrasound imaging systems
WO2017181288A1 (en) * 2016-04-21 2017-10-26 The University Of British Columbia Echocardiographic image analysis
CN109310396B (en) * 2016-06-20 2021-11-09 蝴蝶网络有限公司 Automatic image acquisition for assisting a user in operating an ultrasound device
KR101922180B1 (en) * 2016-12-09 2018-11-26 삼성메디슨 주식회사 Ultrasonic image processing apparatus and method for processing of ultrasonic image

Also Published As

Publication number Publication date
US20190388060A1 (en) 2019-12-26
CN110623685A (en) 2019-12-31
CN110623685B (en) 2022-06-28
US20210369241A1 (en) 2021-12-02

Similar Documents

Publication Publication Date Title
CN110623685B (en) Imaging system and method using a real-time inspection completeness monitor
US11354791B2 (en) Methods and system for transforming medical images into different styled images with deep neural networks
CN109758178B (en) Machine-assisted workflow in ultrasound imaging
CN106236133B (en) Method and apparatus for displaying ultrasound image
US10832405B2 (en) Medical image processing apparatus with awareness of type of subject pattern
CN102415902B (en) Ultrasonic diagnostic apparatus and ultrasonic image processng apparatus
KR20150111697A (en) Method and ultrasound apparatus for recognizing an ultrasound image
US11564663B2 (en) Ultrasound imaging apparatus and control method thereof
US11593933B2 (en) Systems and methods for ultrasound image quality determination
CN111683600A (en) Apparatus and method for obtaining anatomical measurements from ultrasound images
CN111345847A (en) Method and system for managing beamforming parameters based on tissue density
CN112971844A (en) Ultrasonic image acquisition quality evaluation method and ultrasonic imaging equipment
CN112603361A (en) System and method for tracking anatomical features in ultrasound images
CN114246611A (en) System and method for adaptive interface for ultrasound imaging system
CN112672695A (en) Method and system for performing fetal weight estimation
CN114680929A (en) Ultrasonic imaging method and system for measuring diaphragm
JP2020103883A (en) Ultrasound imaging system and method for displaying target object quality level
KR20180034974A (en) Method and ultrasound system for setting region of interest at contrast ultrasound image
CN111053572A (en) Method and system for motion detection and compensation in medical images
EP3838162A1 (en) Systems and methods for assessing a placenta
JP2022543540A (en) Ultrasound system sound power control using image data
KR102617378B1 (en) Method for measuring medical indicator and ultrasound diagnosis apparatus for the same
EP3848892A1 (en) Generating a plurality of image segmentation results for each node of an anatomical structure model to provide a segmentation confidence value for each node
US20230316520A1 (en) Methods and systems to exclude pericardium in cardiac strain calculations
US20240070817A1 (en) Improving color doppler image quality using deep learning techniques

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination