CN117618021A - Ultrasound imaging system and related workflow system and method - Google Patents


Info

Publication number: CN117618021A
Application number: CN202311770122.4A
Authority: CN (China)
Prior art keywords: ultrasonic, image, section, ultrasound, frame
Other languages: Chinese (zh)
Inventors: 温博, 邹耀贤
Current and Original Assignee: Shenzhen Mindray Bio Medical Electronics Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Application filed by: Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority to: CN202311770122.4A
Publication of: CN117618021A
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46: Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461: Displaying means of special interest
    • A61B 8/465: Displaying means of special interest adapted to display user selection data, e.g. icons or menus
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5207: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • A61B 8/5215: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B 8/5223: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
    • A61B 8/523: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for generating planar views from image data in a user selectable plane not corresponding to the acquisition plane

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Human Computer Interaction (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The application discloses a section processing method for an ultrasound imaging workflow and an ultrasound imaging workflow system. The ultrasound image data are automatically recognized, the section type corresponding to the ultrasound image is determined, and that section type is matched against the sections defined in the automatic workflow protocol, so the physician no longer needs to select and activate sections or pick frames manually, which improves working efficiency. In addition, the physician can keep his or her own scanning habits and is hardly constrained by the order of the automatic workflow, making the system more user-friendly. In one embodiment, a body position map, annotations and automatic measurements are automatically added to the matched ultrasound section, and the image is stored under the corresponding preset standard section in the automatic workflow, sparing the physician measurement, image-storage and similar operations and further improving efficiency.

Description

Ultrasound imaging system and related workflow system and method
Technical Field
The present disclosure relates to the field of medical ultrasound imaging, and more particularly to an ultrasound imaging system and method, an ultrasound imaging workflow system, and a method for processing sections (standard scan planes) in an ultrasound imaging workflow.
Background
With the release of various ultrasound examination guidelines and specifications, physicians need to complete a series of standard-section scans in each examination to serve as the basis for diagnosis. An ultrasound automatic workflow organizes the relevant sections in a given order or form by providing a template protocol and lets the system complete part of the work automatically, which improves physicians' working efficiency and standardizes their operating procedures.
A current ultrasound automatic workflow system is shown in Fig. 1. A group of standard sections that the physician must scan is preset in the automatic workflow and represented by section icons in the initial state. During scanning, the physician scans the standard sections one by one in the preset order, or out of order depending on the patient's condition or personal habit. After scanning a standard section, the physician must actively freeze the image, select the section frame from the cine loop, measure, and store the image. Once the image is stored the section icon changes to the completed state and the system automatically unfreezes the image so the physician can scan the next section.
However, current ultrasonic automatic workflow systems have the following problems:
1) If the scanning order of the automatic workflow is not followed, the physician has to manually match the currently scanned section with the preset section in the automatic workflow, which adds workload;
2) For every section, the physician has to dial the trackball manually to search the multi-frame cine loop for the best section image;
3) When related sections require continuous or comparative scanning, several standard sections may be present at the same time. The existing ultrasound automatic workflow system automatically unfreezes the image after only one section is stored, so the other sections have to be scanned again one by one, wasting effort.
These problems must be addressed to further improve physicians' working efficiency while preserving their scanning and diagnostic habits. Emergency physicians in particular must assess the patient's condition in the shortest possible time in order to guide the next step of treatment. Sonographers and emergency physicians who use an automatic workflow therefore need a more intelligent and efficient workflow to cope with heavy, urgent daily work.
Disclosure of Invention
In one embodiment, a method for processing a slice of an ultrasound imaging workflow is provided, the method comprising:
acquiring ultrasound image data, wherein the ultrasound image data comprises a single frame ultrasound image or comprises a sequence of ultrasound images;
automatically identifying the section type of the single-frame ultrasound image, or automatically identifying the section type of the ultrasound image sequence;
according to the identified section type of the single-frame ultrasound image, automatically associating the single-frame ultrasound image with a corresponding standard section in a workflow protocol as the ultrasound section image corresponding to that standard section; or, according to the identified section type of the ultrasound image sequence, automatically matching each frame of the ultrasound image sequence against the corresponding standard section in the workflow protocol and automatically associating the ultrasound image with the highest matching degree with that standard section as the corresponding ultrasound section image; the workflow protocol comprises a preset set of standard sections.
In one embodiment, the workflow protocol further includes predefined operations for each standard facet, the operations including at least one of: adding notes, measuring, adding a body position map, adding a probe mark and storing; the method further comprises the steps of: and according to the operation predefined by the workflow protocol, automatically performing corresponding operation on the ultrasonic section image.
In one embodiment, the method further comprises: providing a man-machine interaction interface, wherein at least one of the following information is presented on the man-machine interaction interface: the ultrasound image data, the ultrasound sectional image, and the workflow protocol.
In one embodiment, the method further comprises: and receiving an input signal through the man-machine interaction interface, wherein the input signal is used for representing confirmation or denial of the ultrasonic section image.
In one embodiment, the method further comprises providing a replay selection control on the human-machine interaction interface to replay the sequence of ultrasound images and/or to re-determine the ultrasound section image from each replayed frame.
In one embodiment, the method further comprises: providing a matching control on the human-machine interaction interface to match a selected ultrasound image that may have been missed during recognition and to identify the section type of that image.
In one embodiment, the method further comprises: and receiving at least one frame of ultrasonic image selected from the replayed ultrasonic image sequence, and automatically associating the at least one frame of ultrasonic image selected to a corresponding standard section in the workflow protocol through the matching control to serve as the ultrasonic section image corresponding to the corresponding standard section.
In one embodiment, the workflow protocol further includes various types of facet features pre-stored in a database; the step of automatically identifying the slice type of the ultrasound image sequence comprises:
extracting the characteristics of each frame of ultrasonic image in the ultrasonic image sequence;
classifying the extracted features corresponding to each frame of ultrasonic image according to the various types of the section features, thereby determining the section type of each frame of ultrasonic image.
In one embodiment, the feature extraction method includes a conventional digital image processing feature extraction method or a deep learning method; the classifying includes classifying each frame of ultrasound image with a pattern recognition algorithm and determining the section type to which each frame belongs and/or the probability of belonging to that section type.
In one embodiment, the step of automatically associating the single frame ultrasound image with a corresponding standard slice in the workflow protocol as an ultrasound slice image corresponding to the corresponding standard slice includes: and searching a corresponding standard section in the workflow protocol according to the section type of the identified single-frame ultrasonic image, and automatically determining the single-frame ultrasonic image as the ultrasonic section image corresponding to the standard section.
In one embodiment, acquiring ultrasound image data includes:
transmitting ultrasonic waves to a target to be detected, receiving ultrasonic echoes of the ultrasonic waves returned from the target to be detected, obtaining ultrasonic echo signals, and carrying out beam synthesis and signal processing on the ultrasonic echo signals to obtain ultrasonic image data; or alternatively
The ultrasound image data is imported.
In one embodiment, an ultrasound imaging method is provided, the method comprising:
transmitting ultrasonic waves to a target to be detected, receiving ultrasonic echoes of the ultrasonic waves returned by the target to be detected, and obtaining ultrasonic echo signals;
obtaining an ultrasonic image according to the ultrasonic echo signal;
determining the type of the section to which the ultrasonic image belongs;
determining a target standard section from standard sections contained in the workflow protocol according to the section type;
the target normal slice is associated with the ultrasound image.
In one embodiment, the method further comprises: if the target standard section is not identified from the standard sections contained in the workflow protocol according to the section type, returning to the matching failure.
In one embodiment, an ultrasound imaging workflow system is provided, the system comprising a workflow processor and a display, wherein:
The workflow processor is used for:
acquiring ultrasound image data, wherein the ultrasound image data comprises a single frame ultrasound image or comprises a sequence of ultrasound images;
automatically identifying the section type of the single-frame ultrasonic image or automatically identifying the section type of the ultrasonic image sequence;
according to the identified section type of the single-frame ultrasound image, automatically associating the single-frame ultrasound image with a corresponding standard section in a workflow protocol as the ultrasound section image corresponding to that standard section; or, according to the identified section type of the ultrasound image sequence, automatically matching each frame of the ultrasound image sequence against the corresponding standard section in the workflow protocol and automatically associating the ultrasound image with the highest matching degree with that standard section as the corresponding ultrasound section image; the workflow protocol comprises a preset set of standard sections;
the display is used for displaying a human-machine interface, and the human-machine interface comprises at least one of the ultrasonic image data, the ultrasonic section image and the information of the workflow protocol.
In one embodiment, the system further comprises:
an ultrasound probe,
the transmitting and receiving circuit is used for exciting the ultrasonic probe to transmit ultrasonic waves to the target to be detected, receiving ultrasonic echoes returned from the target to be detected and forming echo signals;
and the beam synthesis and signal processing module is used for carrying out beam synthesis and signal processing on the echo signals to obtain the ultrasonic image data, wherein the ultrasonic image data comprises at least one frame of ultrasonic image.
In one embodiment, a matching control is provided on the human-machine interface to match a selected ultrasound image that may have been missed during recognition and to identify the section type of that image.
In one embodiment, a frozen image control is provided on the human-machine interface, and the ultrasound sectional image is frozen or thawed by triggering the frozen image control.
In one embodiment, an ultrasound imaging system is provided, the system comprising:
an ultrasound probe,
the transmitting and receiving circuit is used for exciting the ultrasonic probe to transmit ultrasonic waves to the target to be detected, receiving ultrasonic echoes of the ultrasonic waves returned from the target to be detected and obtaining ultrasonic echo signals;
the processor is used for obtaining an ultrasonic image according to the ultrasonic echo signal, determining the type of the section to which the ultrasonic image belongs, determining a target standard section from standard sections contained in a workflow protocol according to the type of the section, and associating the target standard section with the ultrasonic image;
And a display for displaying the ultrasound image.
In one embodiment, a readable storage medium is provided having a computer program stored thereon which, when executed, performs the steps of any of the methods described above.
The beneficial effects of the invention are as follows: the ultrasound image data are automatically recognized, the section type corresponding to the ultrasound image is determined, and that section type is matched against the sections in the automatic workflow protocol, so the physician no longer needs to select and activate sections or pick frames manually, which improves working efficiency. In addition, the physician can keep his or her own scanning habits and is hardly constrained by the order of the automatic workflow, making the system more user-friendly. In one embodiment, a body position map, annotations and automatic measurements are automatically added to the matched ultrasound section, and the image is stored under the corresponding preset standard section in the automatic workflow, sparing the physician measurement, image-storage and similar operations and further improving efficiency.
Drawings
FIG. 1 is a schematic diagram of a current ultrasonic automated workflow system;
FIG. 2 is a schematic diagram of an ultrasound imaging workflow system of an embodiment of the present application;
FIG. 3 is a schematic diagram of an ultrasound imaging workflow system of yet another embodiment of the present application;
FIG. 4 is a schematic diagram of an ultrasound imaging workflow system of another embodiment of the present application;
FIG. 5 is a schematic diagram of an ultrasound imaging workflow system of yet another embodiment of the present application;
FIG. 6 is a flow chart of a method of slice processing of an ultrasound imaging workflow according to an embodiment of the present application;
FIG. 7 is a flow chart of a method of slice processing of an ultrasound imaging workflow according to another embodiment of the present application;
FIG. 8 is a flow chart of a method of slice processing of an ultrasound imaging workflow according to yet another embodiment of the present application;
FIG. 9 is a flow chart of a method of slice processing of an ultrasound imaging workflow according to yet another embodiment of the present application;
FIG. 10 is a schematic diagram of an ultrasound imaging system according to an embodiment of the present application;
fig. 11 is a flow chart of an ultrasound imaging method according to an embodiment of the present application.
Detailed Description
The invention will be described in further detail below by way of specific embodiments with reference to the drawings, in which like elements in different embodiments are given associated like reference numbers. In the following embodiments, numerous specific details are set forth to provide a better understanding of the present application. However, one skilled in the art will readily recognize that some of these features may be omitted, or replaced by other elements, materials or methods, in different situations. In some instances, operations related to the present application are not shown or described in the specification to avoid obscuring its core; a detailed description of such operations is unnecessary, since a person skilled in the art can understand them from the description in the specification and general knowledge in the art.
Furthermore, the described features, operations, or characteristics of the description may be combined in any suitable manner in various embodiments. Also, various steps or acts in the method descriptions may be interchanged or modified in a manner apparent to those of ordinary skill in the art. Thus, the various orders in the description and drawings are for clarity of description of only certain embodiments, and are not meant to be required orders unless otherwise indicated.
The numbering of the components itself, e.g. "first", "second", etc., is used herein merely to distinguish between the described objects and does not have any sequential or technical meaning. The terms "coupled" and "connected," as used herein, are intended to encompass both direct and indirect coupling (coupling), unless otherwise indicated.
An embodiment of the application provides an ultrasound imaging method and system. The method relates to section processing in an ultrasound imaging workflow: the content of the image scanned by the physician is recognized intelligently, the section type corresponding to the current image is determined, the current image is matched with a section in the protocol of the automatic workflow, a body position map, annotations and automatic measurements are added automatically to the matched section, and the image is stored under the corresponding preset standard section in the automatic workflow. This scheme spares the physician almost all of the operations of selecting and activating sections, selecting frames, measuring and storing images, which can greatly improve working efficiency; in addition, it lets physicians keep their previous scanning habits with little constraint from the automatic workflow order, making the system more user-friendly.
In the ultrasound imaging method or system of the embodiments of the present application, section processing in the ultrasound imaging workflow is performed on an ultrasound image or an ultrasound image sequence, which may be acquired in various ways, for example by real-time ultrasound scanning and imaging, or by importing. As for importing, the image data may, for example, be imported from an optical disk or a USB disk, or received over a network, and the application is not limited in this respect. The following description takes an ultrasound image sequence as an example; the sequence comprises multiple frames of ultrasound images, and a single-frame ultrasound image can be regarded as a special case of a multi-frame image.
The ultrasound image sequence of an embodiment of the present application is obtained by real-time ultrasound scanning and imaging. As shown in Fig. 2, an ultrasound imaging workflow system 10 according to an embodiment of the present application includes an ultrasound probe 110, transmit and receive circuitry 111, a beamforming and signal processing module 120, a workflow processor 130 and a display 140. In a specific implementation of the illustrated embodiment, the transmit and receive circuit 111 sends a set of delay-focused pulses to the ultrasound probe 110; the ultrasound probe 110 transmits ultrasound waves to the target to be examined, and after a certain delay the transmit and receive circuit 111 receives the ultrasound echoes reflected from the target and forms echo signals. The echo signals enter the beamforming and signal processing module 120, which completes focusing delay, weighting, channel summation and signal processing to obtain a real-time ultrasound image sequence. The workflow processor 130 classifies the currently acquired ultrasound image sequence against a database (which may be a pre-collected set of standard sections of the automatic workflow), determines the section type corresponding to the sequence, matches the sequence against the standard sections of the preset automatic workflow protocol based on that section type, and outputs the ultrasound image with the highest matching degree as the ultrasound section image. The ultrasound image sequence and the corresponding ultrasound section image may be displayed on the display 140 or stored in a memory (not shown).
In this embodiment, the display 140 of the ultrasonic imaging workflow system 10 may be a touch display screen, a liquid crystal display screen, or the like, or may be an independent display device such as a liquid crystal display, a television, or the like, which is independent of the ultrasonic imaging workflow system 10, or may be a display screen on an electronic device such as a mobile phone, a tablet computer, or the like. In addition, in the embodiment of the present application, the memory of the ultrasound imaging workflow system 10 may be a flash memory card, a solid state memory, a hard disk, or the like.
In another implementation of the embodiment shown in Fig. 2, as shown in Fig. 3, the ultrasound imaging workflow system 10 may further include a matching result processing module configured to automatically perform on the matched ultrasound section images the corresponding operations, such as annotation, body position map, measurement and image storage, preset in the automatic workflow protocol.
Fig. 4 illustrates an ultrasound imaging workflow system 10 according to another embodiment of the present application, which differs from Fig. 2 in that a human-machine interaction interface is provided for interactive feedback with the user. For example, after image matching, the matching result may be output on the display 140 for the user to confirm; each confirmed image is matched and associated with a preset standard section of the automatic workflow protocol, and the confirmed image is the ultrasound section image. Another variant is shown in Fig. 5, which differs from Fig. 4 in that the matched and associated ultrasound images or image sequences are automatically subjected to the corresponding operations of annotation, body position map, measurement and image storage preset in the automatic workflow protocol.
The ultrasound imaging method of the present application is described in detail below. The embodiments of the application provide a section processing method for an ultrasound imaging workflow, applied to the ultrasound imaging workflow system 10. It is particularly suitable for an ultrasound imaging workflow system 10 that includes a touch display screen, in which case touch-screen operations can be used as input.
Referring to fig. 6, a method for processing a slice of an ultrasound imaging workflow according to an embodiment of the present application includes step S20 and step S40.
Step S20: automatically identifying the section type, comprising: automatically identifying the section type of the single-frame ultrasound image when the ultrasound image data is a single-frame ultrasound image, and automatically identifying the section type of the ultrasound image sequence when the ultrasound image data is an ultrasound image sequence.
In step S20, a common classification and recognition approach is to learn, with a machine learning method in combination with a database, the features or rules that distinguish the different section categories, and then classify and recognize the input image according to the learned features or rules. In general, step S20 may include a database construction step and a recognition step.
In the database construction step, the database may consist of a large number of samples. In a specific implementation using a fully supervised learning method, each sample consists of a section image and the section category corresponding to that image; when a semi-supervised learning method is used, part of the samples consist of section images with their corresponding section categories, while the remaining samples contain only images without category labels. Of course, samples suitable for both the fully supervised and semi-supervised learning methods may be stored in the database.
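As a loose illustration of how such a sample database could be represented in code, the following sketch assumes a simple Python data structure; the class name `SectionSample` and its fields are hypothetical and not taken from the application.

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class SectionSample:
    """One database sample: an ultrasound section image plus, for the
    fully supervised portion of the database, its section category."""
    image: np.ndarray                    # 2-D grayscale ultrasound frame
    section_label: Optional[str] = None  # e.g. "thalamus"; None for unlabeled samples

def split_database(samples):
    """Separate labeled samples (usable for full supervision) from
    unlabeled ones (usable only by a semi-supervised learner)."""
    labeled = [s for s in samples if s.section_label is not None]
    unlabeled = [s for s in samples if s.section_label is None]
    return labeled, unlabeled
```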
In the recognition step, a machine learning algorithm is designed to learn the features or rules of the different section categories in the database, thereby recognizing the image. The image recognition step of one embodiment includes a feature extraction sub-step and a classification judgment sub-step.
In the feature extraction sub-step, the feature extraction method may be a conventional digital image processing method, such as the PCA (Principal Component Analysis) algorithm, the LDA (Linear Discriminant Analysis) algorithm, Haar features or texture features. The feature extraction method may also adopt deep learning, in which feature learning is performed on the constructed database by stacking convolution layers, pooling layers, activation layers and fully connected layers; common deep learning networks include the CNN (Convolutional Neural Network), ResNet (residual network), the MobileNet lightweight convolutional neural network, the VGG convolutional neural network, the Inception network, the DenseNet network, and the like.
In the classification judgment sub-step, the extracted features are classified in combination with the features in the database using a discriminator such as the KNN (K-Nearest Neighbor) classification algorithm, the SVM (Support Vector Machine) algorithm, a random forest or a neural network, and the standard section to which the currently processed ultrasound image belongs and/or the probability of belonging to it is determined. In general, the classification judgment sub-step can output the probability that the current image belongs to each section category.
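As a concrete, non-authoritative illustration of these two sub-steps, the sketch below uses PCA for feature extraction and an SVM with probability output as the discriminator via scikit-learn; the application names these only as examples among several alternatives (LDA, Haar or texture features, KNN, random forests, deep networks), and the pipeline structure, function names and parameters here are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def train_section_classifier(images, labels, n_components=64):
    """Fit a PCA + SVM section classifier on labeled section images.

    images: array of shape (n_samples, height*width), flattened grayscale frames
    labels: list of section-type names, one per image
    """
    clf = make_pipeline(PCA(n_components=n_components),
                        SVC(probability=True))
    clf.fit(images, labels)
    return clf

def classify_frame(clf, frame):
    """Return (predicted section type, per-class probabilities) for one frame."""
    flat = np.asarray(frame, dtype=float).reshape(1, -1)
    probs = clf.predict_proba(flat)[0]          # probability per section category
    classes = clf.classes_
    best = int(np.argmax(probs))
    return classes[best], dict(zip(classes, probs))
```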
Step S40: the automatic association matching comprises the steps of automatically associating the single-frame ultrasonic image with a corresponding standard section in an automatic workflow protocol to obtain an ultrasonic section image under the condition that the section type of the single-frame ultrasonic image is identified, automatically matching each frame of ultrasonic image in an ultrasonic image sequence with the corresponding standard section in the automatic workflow protocol under the condition that the section type of the ultrasonic sequence is identified, and selecting the ultrasonic image with the highest matching degree as the ultrasonic section image, wherein the automatic workflow protocol comprises a preset group of standard sections.
In step S40, the currently identified ultrasound image is associated with a corresponding standard cut plane in the automatic workflow protocol, specifically, if the identified ultrasound image belongs to the standard cut plane in the automatic workflow protocol, the ultrasound image is matched with the standard cut plane in the automatic workflow protocol.
For an ultrasound image sequence, adjacent frames have similar content and are usually judged to be the same section type after step S20, so the frame with the highest matching degree can be selected automatically as the ultrasound section image and displayed on the human-machine interaction interface, without the user having to select frames manually.
For an ultrasound image sequence, several section types may also be present at the same time; for example, when the fetal skull is scanned, moving the ultrasound probe can yield the cerebellum section, the thalamus section and the lateral ventricle section in one acquisition. The section types can then be identified through step S20 and displayed in turn for the physician to confirm. In both the single-frame and multi-frame cases, if the identified ultrasound image does not belong to a standard section in the automatic workflow protocol, a matching failure is returned.
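A minimal sketch of the frame-selection idea described above, assuming each frame has already been classified and that the probability of the winning section type serves as the "matching degree"; the function and variable names are illustrative, not taken from the application.

```python
def select_best_frames(frame_results):
    """Pick, for every section type seen in the sequence, the frame whose
    classification probability (matching degree) is highest.

    frame_results: list of (frame_index, section_type, probability) tuples,
                   one entry per frame of the ultrasound image sequence.
    Returns a dict: section_type -> (frame_index, probability).
    """
    best = {}
    for idx, section_type, prob in frame_results:
        if section_type not in best or prob > best[section_type][1]:
            best[section_type] = (idx, prob)
    return best

# Example: a fetal-head sweep in which thalamus, cerebellum and lateral
# ventricle sections all appear in one acquisition.
results = [(0, "thalamus", 0.62), (1, "thalamus", 0.91),
           (2, "cerebellum", 0.88), (3, "lateral_ventricle", 0.79)]
print(select_best_frames(results))
# {'thalamus': (1, 0.91), 'cerebellum': (2, 0.88), 'lateral_ventricle': (3, 0.79)}
```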
Referring to Fig. 7, the section processing method of the ultrasound imaging workflow according to another embodiment of the present application includes, in addition to steps S20 and S40, step S41: automatically performing the corresponding operations on the ultrasound section image according to the operations predefined by the automatic workflow protocol. The predefined operations of the automatic workflow protocol include adding annotations, automatic or manual measurement, adding a body position map, adding probe markers, storing the image and the annotated information, and the like. That is, after image matching the system may store the image, annotate it, add the body position map and probe markers, and perform automatic or manual measurements as required for that ultrasound section image in the automatic workflow protocol. These operations can be preset in advance and are automatically fetched and completed by the system once the corresponding section has been identified, meeting the physician's need for automation in clinical use.
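One possible way to encode the predefined per-section operations is a small protocol table that the system consults once a section has been matched, as sketched below; the section names, operation names and handler mechanism are assumptions made for illustration.

```python
# Hypothetical protocol definition: each standard section lists the
# operations the system should perform automatically after matching.
WORKFLOW_PROTOCOL = {
    "four_chamber_view": ["add_annotation", "add_body_mark", "measure", "store_image"],
    "thalamus":          ["add_annotation", "add_body_mark", "measure", "store_image"],
    "cerebellum":        ["add_annotation", "store_image"],
}

def apply_predefined_operations(section_type, ultrasound_section_image, handlers):
    """Run the operations predefined for the matched standard section.

    handlers maps an operation name to a callable taking the section image,
    e.g. handlers["measure"] could launch an automatic measurement routine.
    """
    for op in WORKFLOW_PROTOCOL.get(section_type, []):
        handlers[op](ultrasound_section_image)
```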
Since automatic section type recognition has a certain probability of error, an association confirmation step may be provided after the matching association of step S40, in which the user confirms the association result. Referring to Fig. 8, the section processing method of the ultrasound imaging workflow according to another embodiment of the present application includes, in addition to steps S20 and S40, the steps of the branch of step S61 and/or the branch of step S62.
In step S61, an input function and a replay selection function are provided on the human-machine interaction interface. Through the input function, the human-machine interaction interface receives an input signal that represents the user's confirmation or rejection of the ultrasound section image output by step S40, so the accuracy of the matching can be known from the input signal. Through the replay selection function, the ultrasound image sequence can be replayed and/or the ultrasound section image can be re-selected from the replayed frames. That is, the input function and the replay selection function make it possible to determine whether the user is satisfied with the ultrasound section image output in step S40; if the user is not satisfied with the current matching result, for example because the currently processed multi-frame cine contains other images that match the section better, the user can directly replay the cine (i.e. the multi-frame video) manually to search for them. In one embodiment, the replay selection function may be presented to the user through a control (referred to as a replay selection control) on the human-machine interaction interface, implemented as a button, a menu item or the like; the input function may be implemented by accepting ticking, selection or confirmation of the ultrasound section images presented on the display via input devices such as a trackball, a mouse or a keyboard.
In step S62, a matching function is provided on the human-machine interaction interface to match a selected ultrasound image that may have been missed during recognition and to identify its section type. In a specific implementation, the matching function is provided through a control (such as a button) on the human-machine interaction interface, and the user can select an ultrasound image considered to have been missed: if the user is not satisfied with the current matching result, for example because the multi-frame cine still contains an image that could be matched to a section but was not recognized by the system, the user can replay the multi-frame video to that image and click the corresponding standard section icon to be matched (the standard section icon is one of a group of icons presented on the human-machine interaction interface, the group corresponding to the set of standard sections preset by the automatic workflow protocol), or click the standard section icon first and then replay to the image to be matched. In one example, when an uncompleted ultrasound section image is clicked, it may be marked as completed after it is stored; when a completed ultrasound section image is clicked, the ultrasound section image is stored and a new ultrasound section image is automatically copied and displayed on the display.
For example, the user acquires an image of the transparent septum (cavum septi pellucidi); the current image is identified as a transparent-septum image in step S20, and step S40 matches and associates the image with the transparent-septum section in the automatic workflow protocol. Then, in one embodiment, as shown in Fig. 8, annotation, measurement, adding a body position map and similar operations are performed in the image area according to the requirements of the transparent-septum section in the automatic workflow protocol, after which the user confirms whether the matching association is accurate; if the user confirms that it is, the image is stored directly, or stored after measurement. In another embodiment, as shown in Fig. 9, the user first confirms whether the matching association is accurate; if so, annotation, measurement, adding a body position map and similar operations are performed in the image area according to the requirements of the transparent-septum section in the automatic workflow protocol, and the image is then stored.
In the foregoing embodiments of the application, a freeze-image control may also be provided on the human-machine interaction interface, and freezing and thawing of the ultrasound section image can be achieved only by triggering the freeze-image control. In the frozen state, clicking controls such as a section button or an image storage button on the human-machine interaction interface does not thaw the image; the image always stays frozen, and when a section button is clicked only the preset annotation and body position map are shown on the currently frozen image. The image is not thawed unless the user triggers the freeze-image control again. In this way the user can adjust and choose the ultrasound section image considered most suitable, and the relevant standard sections of the automatic workflow protocol can be matched against the ultrasound image sequence multiple times, making full use of the ultrasound images the user has scanned.
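Read literally, the freeze behaviour described here amounts to a small piece of state that only the freeze-image control may toggle, while section and store-image controls leave it unchanged; the sketch below models that reading, with invented class and method names.

```python
class FreezeState:
    """Image stays frozen until the freeze-image control is triggered again;
    section buttons and store-image buttons never thaw it."""

    def __init__(self):
        self.frozen = False

    def on_freeze_control(self):
        # Only this control freezes or thaws the image.
        self.frozen = not self.frozen

    def on_section_button(self, annotation, body_mark):
        if self.frozen:
            # Show the preset annotation and body position map on the
            # currently frozen image, but do not thaw it.
            return {"annotation": annotation, "body_mark": body_mark}
        return None

    def on_store_image(self):
        # Storing an image does not change the frozen state either.
        return self.frozen
```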
In addition, as shown in Fig. 10, an embodiment of the present application also provides an ultrasound imaging system 20 that includes an ultrasound probe 210, transmit and receive circuitry 220, a processor 230 and a display 240. In the ultrasound imaging system 20, the transmit and receive circuit 220 may excite the ultrasound probe 210 to transmit ultrasound waves to the object to be examined and receive the ultrasound echoes returned from the object to obtain ultrasound echo signals; the processor 230 may obtain at least one frame of ultrasound image from the ultrasound echo signals, determine the section type to which the at least one frame belongs, determine at least one target standard section from the standard sections contained in the automatic workflow protocol according to that section type, and associate the at least one target standard section with the at least one frame of ultrasound image; and the display 240 may display the at least one frame of ultrasound image. The implementation of each component may refer to the related content of the ultrasound imaging workflow system and its section processing method, and is not described in detail here.
Based on the ultrasound imaging system shown in fig. 10, an embodiment of the present application further provides an ultrasound imaging method, as shown in fig. 11, comprising the following steps S100-S108.
Step S100: transmitting ultrasonic waves to a target to be detected, receiving ultrasonic echoes of the ultrasonic waves returned by the target to be detected, and obtaining ultrasonic echo signals;
step S102: obtaining at least one frame of ultrasonic image according to the ultrasonic echo signal;
step S104: determining the section type of at least one frame of ultrasonic image;
step S106: determining at least one target standard section from standard sections contained in the automatic workflow protocol according to the section type;
step S108: at least one target standard cut plane is associated with at least one frame of ultrasound image.
The implementation of step S100 to step S106 may refer to the related content related to the aforementioned ultrasound imaging workflow system and the tangent plane processing method thereof.
In step S108, if the ultrasound image obtained in step S102 is a single-frame ultrasound image, the single-frame image is automatically associated with the corresponding standard section (i.e. the target standard section) in the set of standard sections preset by the automatic workflow protocol, and the association can be implemented by feature extraction, image matching and the like as described above. If the ultrasound image obtained in step S102 is a multi-frame ultrasound image, the following cases may arise:
(1) If the multiple frames of ultrasound images are similar and identified as the same section type, the frame with the highest matching degree is selected automatically and associated with the target standard section;
(2) If the multi-frame ultrasound image is identified as several types of sections, each is automatically associated with its corresponding standard section;
(3) If the multi-frame ultrasound image is identified as several types of sections and several frames are identified as the same section type A, the frame with the highest matching degree is selected automatically and associated with section A as in (1), and the remaining frames are associated automatically as in (2);
(4) If the multi-frame ultrasound image is identified as several types of sections, the user may also be asked to confirm and associate them manually.
If the identified ultrasound image does not belong to a standard section in the automatic workflow protocol, a matching failure is returned; the sketch below illustrates one way of putting these cases together.
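The following sketch combines cases (1)-(4) and the failure branch into one possible association routine; the function names, the `(frame_index, section_type, probability)` representation of classified frames and the `confirm_with_user` callback are assumptions, not the applicant's implementation.

```python
def associate_sequence(frame_results, protocol_sections, confirm_with_user=None):
    """Associate a classified ultrasound image sequence with standard sections.

    frame_results: list of (frame_index, section_type, probability) per frame.
    protocol_sections: set of standard section names in the workflow protocol.
    confirm_with_user: optional callback used when several sections are found
                       and the physician should confirm the associations (case 4).
    Returns a dict section_type -> frame_index, or None on matching failure.
    """
    # Cases (1) and (3): keep only the highest-probability frame per section type.
    best = {}
    for idx, section_type, prob in frame_results:
        if section_type not in best or prob > best[section_type][1]:
            best[section_type] = (idx, prob)

    # Case (2): associate each identified section type with its standard section,
    # discarding types that are not part of the workflow protocol.
    matched = {s: idx for s, (idx, _) in best.items() if s in protocol_sections}
    if not matched:
        return None  # no identified section belongs to the protocol: matching failure

    # Case (4): several sections found at once, let the user confirm or adjust.
    if len(matched) > 1 and confirm_with_user is not None:
        matched = confirm_with_user(matched)
    return matched
```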
With the method and system provided by the embodiments of the present application, after the user scans a segment of images the system can automatically select from them the suitable images that match each section in the protocol, and the user can review them in a certain order. For confirmed matches, the system automatically applies the section settings, annotates, adds the body position map, measures (if needed) and stores the image, and then proceeds to confirming the next match. For unconfirmed matches, the user is free to replay the cine for re-matching, and can discard or add matches. Compared with existing ultrasound automatic workflows, this saves the time spent on repeated scanning and image matching, noticeably reduces the physician's operations on the ultrasound equipment, greatly improves working efficiency, shortens examination time, and is more user-friendly, fully respecting the user's scanning habits.
Embodiments of the present application also provide a computer-readable storage medium storing a plurality of program instructions which, when invoked and executed by the workflow processor 130 or the processor 230, may perform part or all of the steps, or any combination of the steps, of the method for automatically processing sections of an ultrasound imaging workflow in the various embodiments of the present application. In one embodiment, the computer-readable storage medium may be the aforementioned memory, which may be a non-volatile storage medium such as a flash memory card, a solid-state memory or a hard disk.
In the embodiments of the present application, the workflow processor 130 of the ultrasound imaging workflow system 10 and the processor 230 of the ultrasound imaging system 20 may be implemented in software, hardware, firmware or a combination thereof, and may use circuitry, one or more application specific integrated circuits (ASICs), one or more general-purpose integrated circuits, one or more microprocessors, one or more programmable logic devices, a combination of the foregoing circuits or devices, or other suitable circuits or devices, so that the processor can perform the respective steps of the method for automatically processing sections of an ultrasound imaging workflow in the foregoing embodiments.
Reference is made to various exemplary embodiments herein. However, those skilled in the art will recognize that changes and modifications may be made to the exemplary embodiments without departing from the scope herein. For example, the various operational steps and components used to perform the operational steps may be implemented in different ways (e.g., one or more steps may be deleted, modified, or combined into other steps) depending on the particular application or taking into account any number of cost functions associated with the operation of the system.
Additionally, as will be appreciated by one of skill in the art, the principles herein may be reflected in a computer program product on a computer readable storage medium preloaded with computer readable program code. Any tangible, non-transitory computer readable storage medium may be used, including magnetic storage devices (hard disks, floppy disks, etc.), optical storage devices (CD-ROMs, DVDs, blu-Ray disks, etc.), flash memory, and/or the like. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including means which implement the function specified. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified.
While the principles herein have been shown in various embodiments, many modifications of structure, arrangement, proportions, elements, materials, and components, which are particularly adapted to specific environments and operative requirements, may be used without departing from the principles and scope of the present disclosure. The above modifications and other changes or modifications are intended to be included within the scope of this document.
The foregoing detailed description has been described with reference to various embodiments. However, those skilled in the art will recognize that various modifications and changes may be made without departing from the scope of the present disclosure. Accordingly, the present disclosure is to be considered as illustrative and not restrictive in character, and all such modifications are intended to be included within the scope thereof. Also, advantages, other advantages, and solutions to problems have been described above with regard to various embodiments. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature. The terms "comprises," "comprising," and any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, system, article, or apparatus. Furthermore, the term "couple" and any other variants thereof are used herein to refer to physical connections, electrical connections, magnetic connections, optical connections, communication connections, functional connections, and/or any other connection.
Those skilled in the art will recognize that many changes may be made to the details of the above-described embodiments without departing from the underlying principles of the invention. Accordingly, the scope of the invention should be determined from the claims.

Claims (19)

1. A method for processing a slice of an ultrasound imaging workflow, comprising:
acquiring ultrasound image data, wherein the ultrasound image data comprises a single frame ultrasound image or comprises a sequence of ultrasound images;
automatically identifying the section type of the single-frame ultrasonic image or automatically identifying the section type of the ultrasonic image sequence;
according to the identified section type of the single-frame ultrasound image, automatically associating the single-frame ultrasound image with a corresponding standard section in a workflow protocol as the ultrasound section image corresponding to that standard section; or, according to the identified section type of the ultrasound image sequence, automatically matching each frame of the ultrasound image sequence against the corresponding standard section in the workflow protocol and automatically associating the ultrasound image with the highest matching degree with that standard section as the corresponding ultrasound section image; the workflow protocol comprises a preset set of standard sections.
2. The method of claim 1, wherein the workflow protocol further comprises predefined operations for each standard facet, the operations comprising at least one of: adding notes, measuring, adding a body position map, adding a probe mark and storing;
the method further comprises the steps of: and according to the operation predefined by the workflow protocol, automatically performing corresponding operation on the ultrasonic section image.
3. The method of claim 1, wherein the method further comprises: providing a human-computer interaction interface, wherein at least one of the following information is presented on the human-computer interaction interface: and the ultrasonic image data, the ultrasonic section image and the relevant information of the workflow protocol.
4. A method as claimed in claim 3, wherein the method further comprises: and receiving an input signal through the man-machine interaction interface, wherein the input signal is used for representing confirmation or denial of the ultrasonic section image.
5. The method of claim 4, further comprising providing a replay selection control on the human-machine interaction interface to replay the sequence of ultrasound images and/or to re-determine ultrasound sectional images for each frame of ultrasound images replayed.
6. The method of claim 5, wherein the method further comprises:
and providing a matching control on the man-machine interaction interface to match the selected ultrasonic images which are likely to be missed in recognition, and identifying the section type of the ultrasonic images which are likely to be missed in recognition.
7. The method of claim 6, wherein the method further comprises:
and receiving at least one frame of ultrasonic image selected from the replayed ultrasonic image sequence, and automatically associating the at least one frame of ultrasonic image selected to a corresponding standard section in the workflow protocol through the matching control to serve as the ultrasonic section image corresponding to the corresponding standard section.
8. The method of claim 1, wherein the workflow protocol further comprises various types of facet features pre-stored in a database; the step of automatically identifying the section type of the ultrasonic image sequence comprises the following steps:
extracting features of each frame of ultrasonic image in the ultrasonic image sequence;
and classifying the extracted features corresponding to each frame of ultrasonic image according to the various types of the section features, thereby determining the section type of each frame of ultrasonic image.
9. The method of claim 8, wherein the feature extraction method comprises a conventional digital image processing feature extraction method or a deep learning method; the classifying comprises classifying each frame of ultrasound image with a pattern recognition algorithm and determining the section type to which each frame belongs and/or the probability of belonging to that section type.
10. The method of claim 1, wherein the step of automatically associating the single frame ultrasound image to a corresponding standard cut plane in a workflow protocol as an ultrasound cut plane image corresponding to the corresponding standard cut plane comprises: and searching a corresponding standard tangent plane in the workflow protocol according to the identified tangent plane type of the single-frame ultrasonic image, and automatically determining the single-frame ultrasonic image as the ultrasonic tangent plane image corresponding to the standard tangent plane.
11. The method of claim 1, wherein acquiring the ultrasonic image data comprises:
transmitting ultrasonic waves to a target to be detected, receiving ultrasonic echoes of the ultrasonic waves returned from the target to be detected to obtain ultrasonic echo signals, and performing beamforming and signal processing on the ultrasonic echo signals to obtain the ultrasonic image data; or
importing the ultrasonic image data.
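The beamforming and signal-processing step itself is not spelled out in the claim; as a generic, hypothetical illustration of the post-beamforming part of a B-mode chain (envelope detection and log compression), not the patent's own processing:

```python
import numpy as np
from scipy.signal import hilbert

def rf_to_bmode(beamformed_rf, dynamic_range_db=60.0):
    """Generic post-beamforming B-mode processing: envelope detection via the
    Hilbert transform along the axial axis, then log compression.
    beamformed_rf: array shaped (axial_samples, scan_lines)."""
    envelope = np.abs(hilbert(beamformed_rf, axis=0))
    envelope /= envelope.max() + 1e-12                  # normalise to [0, 1]
    bmode_db = 20.0 * np.log10(envelope + 1e-12)
    return np.clip(bmode_db, -dynamic_range_db, 0.0)    # image values in dB
```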
12. An ultrasound imaging method, comprising:
transmitting ultrasonic waves to a target to be detected, and receiving ultrasonic echoes of the ultrasonic waves returned by the target to be detected to obtain ultrasonic echo signals;
obtaining an ultrasonic image according to the ultrasonic echo signal;
determining the section type of the ultrasonic image;
determining a target standard section from standard sections contained in a workflow protocol according to the section type;
associating the target standard section with the ultrasonic image.
13. The method as recited in claim 12, further comprising:
if no target standard section is identified from the standard sections contained in the workflow protocol according to the section type, returning a matching failure.
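A minimal sketch of the matching step in claims 12 and 13, assuming the protocol's standard sections are indexed by section type; the MatchingFailure exception is one hypothetical way to "return a matching failure".

```python
class MatchingFailure(Exception):
    """Raised when no standard section in the workflow protocol matches."""

def match_target_section(section_type, protocol_sections):
    """Return the target standard section for the identified section type,
    or signal a matching failure if the protocol contains no such section."""
    try:
        return protocol_sections[section_type]
    except KeyError as err:
        raise MatchingFailure(f"no standard section for type {section_type!r}") from err
```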
14. An ultrasound imaging workflow system comprising a workflow processor and a display, wherein:
the workflow processor is configured to:
acquiring ultrasound image data, wherein the ultrasound image data comprises a single frame ultrasound image or comprises a sequence of ultrasound images;
automatically identifying the section type of the single-frame ultrasonic image or automatically identifying the section type of the ultrasonic image sequence;
according to the identified section type of the single-frame ultrasonic image, automatically associating the single-frame ultrasonic image with a corresponding standard section in a workflow protocol as the ultrasonic section image corresponding to that standard section; or, according to the identified section type of the ultrasonic image sequence, automatically matching each frame of ultrasonic image in the ultrasonic image sequence against a corresponding standard section in the workflow protocol, and automatically associating the ultrasonic image with the highest matching degree with that standard section as the ultrasonic section image corresponding to that standard section; wherein the workflow protocol comprises a preset group of standard sections;
the display is configured to display a human-machine interface, and the human-machine interface comprises at least one of: the ultrasonic image data, the ultrasonic section image, and information of the workflow protocol.
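For the sequence branch of claim 14 (associating, with each standard section, the frame having the highest matching degree), a minimal sketch assuming the classifier has already produced a per-section matching score for every frame; the data layout is hypothetical.

```python
def associate_best_frames(frames, frame_scores, standard_sections):
    """frames: list of images; frame_scores: one dict per frame mapping a
    section type to that frame's matching degree. For every standard section,
    keep the frame whose matching degree for that section is highest."""
    association = {}
    if not frame_scores:
        return association
    for section in standard_sections:
        scored = [(scores.get(section, float("-inf")), idx)
                  for idx, scores in enumerate(frame_scores)]
        best_score, best_idx = max(scored)
        if best_score > float("-inf"):                 # at least one frame scored
            association[section] = frames[best_idx]
    return association
```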
15. The system as recited in claim 14, further comprising:
an ultrasonic probe;
a transmitting and receiving circuit, configured to excite the ultrasonic probe to transmit ultrasonic waves to a target to be detected, and to receive ultrasonic echoes returned from the target to be detected to form echo signals; and
a beamforming and signal processing module, configured to perform beamforming and signal processing on the echo signals to obtain the ultrasonic image data, wherein the ultrasonic image data comprises at least one frame of ultrasonic image.
16. The system of claim 14, wherein a matching control is provided on the human-machine interface for matching a selected ultrasonic image whose section type may have been missed during recognition and identifying the section type of that ultrasonic image.
17. The system of claim 14, wherein a freeze-image control is provided on the human-machine interface, and the ultrasonic section image is frozen or unfrozen by triggering the freeze-image control.
18. An ultrasound imaging system, comprising:
an ultrasonic probe;
a transmitting and receiving circuit, configured to excite the ultrasonic probe to transmit ultrasonic waves to a target to be detected, and to receive ultrasonic echoes of the ultrasonic waves returned from the target to be detected to obtain ultrasonic echo signals;
a processor, configured to obtain an ultrasonic image according to the ultrasonic echo signals, determine the section type to which the ultrasonic image belongs, determine a target standard section from the standard sections contained in a workflow protocol according to the section type, and associate the target standard section with the ultrasonic image; and
a display, configured to display the ultrasonic image.
19. A readable storage medium, characterized in that it has stored thereon a computer program which, when executed, implements the steps of the method according to any of claims 1 to 13.
CN202311770122.4A 2018-12-29 2018-12-29 Ultrasound imaging system and related workflow system and method Pending CN117618021A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311770122.4A CN117618021A (en) 2018-12-29 2018-12-29 Ultrasound imaging system and related workflow system and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202311770122.4A CN117618021A (en) 2018-12-29 2018-12-29 Ultrasound imaging system and related workflow system and method
CN201811637350.3A CN111374698A (en) 2018-12-29 2018-12-29 Ultrasound imaging system and related workflow system and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201811637350.3A Division CN111374698A (en) 2018-12-29 2018-12-29 Ultrasound imaging system and related workflow system and method

Publications (1)

Publication Number Publication Date
CN117618021A (en) 2024-03-01

Family

ID=71222456

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202311770122.4A Pending CN117618021A (en) 2018-12-29 2018-12-29 Ultrasound imaging system and related workflow system and method
CN201811637350.3A Pending CN111374698A (en) 2018-12-29 2018-12-29 Ultrasound imaging system and related workflow system and method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201811637350.3A Pending CN111374698A (en) 2018-12-29 2018-12-29 Ultrasound imaging system and related workflow system and method

Country Status (1)

Country Link
CN (2) CN117618021A (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112529083A (en) * 2020-12-15 2021-03-19 深圳开立生物医疗科技股份有限公司 Ultrasonic scanning method, device, equipment and storage medium
CN113274056A (en) * 2021-06-30 2021-08-20 深圳开立生物医疗科技股份有限公司 Ultrasonic scanning method and related device
CN113693625B (en) * 2021-09-03 2022-11-08 深圳迈瑞软件技术有限公司 Ultrasonic imaging method and ultrasonic imaging apparatus
CN114334095A (en) * 2021-12-31 2022-04-12 深圳度影医疗科技有限公司 Intelligent identification method and system for ultrasonic examination and terminal equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104680481B (en) * 2013-11-28 2018-09-11 深圳迈瑞生物医疗电子股份有限公司 A kind of ultrasonic wave added checking method and system
CN103955698B (en) * 2014-03-12 2017-04-05 深圳大学 The method of standard tangent plane is automatically positioned from ultrasonoscopy
EP3127486B1 (en) * 2014-03-20 2024-07-31 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Method and device for automatic identification of measurement item and ultrasound imaging apparatus
CN106580368B (en) * 2016-11-26 2020-06-23 汕头市超声仪器研究所有限公司 Auxiliary scanning method for ultrasonic imaging equipment
CN107569257A (en) * 2017-09-29 2018-01-12 深圳开立生物医疗科技股份有限公司 Ultrasonoscopy processing method and system, ultrasonic diagnostic equipment
CN108567446A (en) * 2018-05-10 2018-09-25 深圳开立生物医疗科技股份有限公司 Cardiac ultrasonic equipment and its quickly method of selection cardiac cycle phase correspondence image

Also Published As

Publication number Publication date
CN111374698A (en) 2020-07-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination