CN117547302A - Ultrasonic scanning device, method and storage medium - Google Patents

Ultrasonic scanning device, method and storage medium

Info

Publication number
CN117547302A
CN117547302A (application CN202311755649.XA)
Authority
CN
China
Prior art keywords
model
ultrasonic
current
ultrasound
prompt
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311755649.XA
Other languages
Chinese (zh)
Inventor
蔡璐 (Cai Lu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan United Imaging Healthcare Co Ltd
Original Assignee
Wuhan United Imaging Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan United Imaging Healthcare Co Ltd filed Critical Wuhan United Imaging Healthcare Co Ltd
Priority to CN202311755649.XA
Publication of CN117547302A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48 Diagnostic techniques
    • A61B8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5207 Devices using data or image processing involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • A61B8/5215 Devices using data or image processing involving processing of medical diagnostic data
    • A61B8/5223 Devices using data or image processing involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
    • A61B8/523 Devices using data or image processing involving processing of medical diagnostic data for generating planar views from image data in a user selectable plane not corresponding to the acquisition plane
    • A61B8/5238 Devices using data or image processing involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B8/5261 Devices using data or image processing involving combining images from different diagnostic modalities, e.g. ultrasound and X-ray
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G06T2207/10132 Ultrasound image
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30096 Tumor; Lesion

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physiology (AREA)
  • Quality & Reliability (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

Embodiments of this specification provide an ultrasonic scanning device, an ultrasonic scanning method, and a storage medium. The device comprises a 3D model unit, an ultrasonic scanning unit, and an auxiliary lesion localization unit. The 3D model unit is configured to acquire a 3D model of the relevant part of a patient, the 3D model being determined based on 3D medical examination data of the patient. The ultrasonic scanning unit is configured to acquire, under the prompting of the auxiliary lesion localization unit, a target ultrasound image of that part, where the target ultrasound image contains a lesion and the lesion is included in the 3D model of that part. The auxiliary lesion localization unit is configured to determine the current position in the 3D model corresponding to the current ultrasound image, and to give a prompt for scanning by the ultrasonic scanning unit based on the current position and the lesion position in the 3D model.

Description

Ultrasonic scanning device, method and storage medium
Technical Field
The present disclosure relates to the field of medical imaging technologies, and in particular, to an ultrasound scanning apparatus, an ultrasound scanning method, and a storage medium.
Background
Medical devices are a critical part of modern medicine and play a vital role in clinical diagnosis and treatment. Among them, ultrasound scanning is an important means of preclinical diagnosis: by scanning a specific part of a patient, a lesion can be located and the exact position of the pathological change determined for subsequent treatment. At present, however, conventional ultrasound scanning is two-dimensional in nature and relies on the physician's operating experience; for the same patient, how efficiently the scan is completed depends on the individual physician's experience, and lesions in hidden locations or of small size are difficult for a physician to find.
Accordingly, there is a need to provide an ultrasound scanning apparatus, method and storage medium.
Disclosure of Invention
One or more embodiments of this specification provide an ultrasound scanning apparatus comprising a 3D model unit, an ultrasound scanning unit, and an auxiliary lesion localization unit. The 3D model unit is configured to obtain a 3D model of the relevant part of a patient, the 3D model being determined based on 3D medical examination data of the patient. The ultrasound scanning unit is configured to acquire, under the prompting of the auxiliary lesion localization unit, a target ultrasound image of that part, the target ultrasound image containing a lesion that is included in the 3D model of that part. The auxiliary lesion localization unit is configured to determine the current position in the 3D model corresponding to the current ultrasound image, and to give a prompt for scanning by the ultrasound scanning unit based on the current position and the lesion position in the 3D model.
One or more embodiments of this specification provide an ultrasound scanning method. The method comprises: acquiring a 3D model of the relevant part of a patient, the 3D model being determined based on 3D medical examination data of the patient; determining the current position in the 3D model corresponding to the current ultrasound image, and determining a scanning prompt based on the current position and the lesion position in the 3D model; and, based on the prompt, acquiring a target ultrasound image of that part, the target ultrasound image containing a lesion that is included in the 3D model of that part.
One or more embodiments of this specification provide a computer-readable storage medium storing computer instructions that, when read by a computer, cause the computer to perform the above ultrasound scanning method.
Drawings
This specification is further illustrated by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are not limiting; in the drawings, like numerals denote like structures, wherein:
FIG. 1 is an exemplary schematic diagram of an application scenario of an ultrasound scanning apparatus shown in some embodiments of the present description;
FIG. 2 is an exemplary schematic diagram of an ultrasound scanning apparatus according to some embodiments of the present description;
FIG. 3 is an exemplary flow chart of an ultrasound scanning method shown in accordance with some embodiments of the present description;
FIG. 4 is an exemplary schematic diagram of lesion localization and navigation according to some embodiments of the present description;
FIG. 5 is a general flow chart of an ultrasound scanning method according to some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings that are required to be used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and it is possible for those of ordinary skill in the art to apply the present specification to other similar situations according to the drawings without inventive effort. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that "system," "apparatus," "unit" and/or "module" as used herein is one method for distinguishing between different components, elements, parts, portions or assemblies at different levels. However, if other words can achieve the same purpose, the words can be replaced by other expressions.
As used in this specification and the claims, the singular forms "a," "an," and "the" do not denote the singular only but may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprise" and "include" merely indicate that explicitly identified steps and elements are present; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
This specification uses flowcharts to describe the operations performed by the system according to its embodiments. It should be understood that these operations are not necessarily performed precisely in order; steps may instead be processed in reverse order or concurrently, and other operations may be added to or removed from these procedures.
During ultrasound scanning, a physician must scan different organs or parts by changing hand technique and switching the probe's position or angle, which places extremely high demands on operating experience. For the same patient, different physicians may scan different sections. For some patients the lesion site is relatively hidden, or the lesion is small, so it is easily overlooked and missed by the sonographer.
At present, ultrasound scanning is mainly realized in the following ways. First, two-dimensional ultrasound scanning, which can hardly provide three-dimensional data. Second, three-dimensional ultrasound, which, limited by frame rate, image quality, and so on, cannot provide accurate 3D information or segmentation of its regions (organs or lesions). Third, fusion navigation combining ultrasound with 3D medical examination data, which can solve the detection or localization of some hidden lesions but requires additional devices and a fusion registration step that takes time to perform, so it is relatively laborious and lacks the preconditions for universal application.
In view of the above problems, some embodiments of this specification provide an ultrasound scanning apparatus, method, and storage medium. 3D medical examination data acquired by medical imaging devices such as computed tomography (CT) and magnetic resonance imaging (MRI) are turned into a model by an AI algorithm, and a friendly human-machine interface combining the 3D model with 2D sections is provided to the user. This enables operations such as locating particular lesions, section guidance, and section identification, which can greatly lower the entry threshold of ultrasound operation, improve physician efficiency and patient satisfaction, and further assist ultrasound image quality control.
Fig. 1 is an exemplary schematic diagram of an application scenario of an ultrasound scanning apparatus according to some embodiments of the present description. In some embodiments, the application scenario 100 of the ultrasound scanning apparatus may include a storage device 110, a network 120, a processor 130, and an ultrasound device 140.
Storage device 110 may store data, instructions, and/or any other information. In some embodiments, the storage device may store data and/or instructions related to the application scenario 100 of the ultrasound scanning apparatus. For example, the storage device 110 may store computer instructions for performing ultrasound scanning. For another example, the storage device 110 may store 3D medical examination data, target ultrasound images, and the like.
In some embodiments, the storage device 110 may be a storage device internal or external to the ultrasound scanning apparatus. In some embodiments, the storage device may be part of the processor 130. In some embodiments, the storage device 110 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. In some embodiments, the storage device 110 may be connected to the network 120 for communication with the processor 130. In some embodiments, the storage device 110 may be communicatively connected to the ultrasound device 140, for example, the storage device 110 may transmit stored 3D medical examination data directly to the ultrasound device 140.
The network 120 may connect the components of the application scenario 100 and/or connect the application scenario 100 with external resources. The network enables communication among the components and with other components outside the apparatus, facilitating the exchange of data and/or information. For example, 3D medical examination data stored in the storage device 110 may be transmitted over the network 120 to the processor 130 for processing. As another example, the processor 130 may transmit a generated target ultrasound image to the storage device 110 over the network 120, and the ultrasound device 140 may transmit an acquired current ultrasound image to the storage device 110 over the network 120.
In some embodiments, network 120 may be any one or more of a wired network or a wireless network. For example, the network 120 may include a cable network, a fiber optic network, a telecommunications network, the internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a bluetooth network, a ZigBee network, a Near Field Communication (NFC), an intra-device bus, an intra-device line, a cable connection, and the like, or any combination thereof. The network connection between the parts can be in one of the above-mentioned ways or in a plurality of ways.
The processor 130 may be used to process data related to the ultrasound scanning apparatus. For example, the processor 130 may acquire a target ultrasound image of the relevant part of a patient by executing the ultrasound scanning method disclosed in this specification: it may obtain a 3D model of that part; determine the current position in the 3D model corresponding to the current ultrasound image and determine a scanning prompt based on the current position and the lesion position in the 3D model; and, based on the prompt, acquire the target ultrasound image of that part.
In some embodiments, the processor 130 may include a 3D model unit, an ultrasound scanning unit, and an auxiliary lesion localization unit. For more on the 3D model unit, the ultrasound scanning unit and the auxiliary lesion localization unit, reference may be made to fig. 2 and its related description.
In some embodiments, the processor 130 may be a single server or a group of servers. The server farm may be centralized or distributed. In some embodiments, the processor 130 may be local or remote. In some embodiments, the processor 130 may be implemented on a cloud platform. For example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an internal cloud, a multi-layer cloud, or the like, or any combination thereof.
The ultrasound device 140 is a medical instrument that diagnoses disease on the principle of ultrasonic waves. The ultrasound device 140 may take various forms. In some embodiments, it may include, but is not limited to, console ultrasound devices, portable ultrasound devices, handheld ultrasound devices, and the like, such as a fully digital color ultrasonic diagnostic apparatus or a portable color Doppler ultrasonic diagnostic apparatus.
In some embodiments, the ultrasound device 140 may include an ultrasound probe 140-1, a device host 140-2, an ultrasound display device 140-3, an input apparatus 140-4, and the like. The ultrasonic probe 140-1 may be used to transmit and receive ultrasonic waves, and perform conversion between an electrical signal and an ultrasonic signal. The device host 140-2 may be used to process signals received from the probe. The ultrasound display device 140-3 may be used to display images and information involved in the ultrasound scanning process. In some embodiments, the ultrasound display device 140-3 may include an ultrasound display interface. The input device 140-4 may be used to implement a human-machine interaction function. The input device 140-4 may include a keyboard, speakers, etc. For example, the doctor may input instructions to the ultrasound device 140 through a keyboard, the ultrasound device 140 may alert and guide the doctor through a speaker, etc. The components of the ultrasonic device may be adjusted according to the specific circumstances, and are not limited herein.
The ultrasonic display interface is an interface for displaying an ultrasonic image and a position of a section. In some embodiments, the ultrasound display interface may be part of an ultrasound display device.
In some embodiments, any location of the ultrasound display interface may be used to display an ultrasound image or a section position. For example, the main area of the ultrasound display interface may display the current ultrasound image, and any other position of the interface may display the position in the 3D model corresponding to the current section; for instance, the sidebars (left, right, top, bottom, etc.) may all be used for this purpose. The layout of the ultrasound display interface may be set according to the specific case and is not limited here. For more on the current section, the 3D model, and the section position, see fig. 3 and its related description.
In some embodiments, the user may interact via the ultrasound device 140 and the components of the application scenario 100 of the ultrasound scanning apparatus. Wherein the user refers to the person involved in using the ultrasound device. For example, the user may refer to a doctor or the like. In some embodiments, the user may input instructions through the input device 140-4 to operate the ultrasound apparatus 140. For example, the user may operate the ultrasound device 140 via a keyboard to view the reminder content. In some embodiments, the processor 130 may be part of the ultrasound device 140.
For more on the 3D medical examination data, target ultrasound images, cues, etc., described above, see fig. 3-5 and their associated description.
It should be noted that the above description of the application scenario 100 of the ultrasound scanning apparatus is for illustrative purposes only and is not intended to limit the scope of the present description. Various alterations and modifications will occur to those skilled in the art in light of the present description. However, such changes and modifications do not depart from the scope of the present specification. For example, the processor 130 and the ultrasonic device 140 may share one storage device 110, or may have respective storage devices 110, or the like.
Fig. 2 is an exemplary schematic diagram of an ultrasound scanning apparatus according to some embodiments of the present description. As shown in fig. 2, the ultrasound scanning apparatus 200 may include a 3D model unit 210, an ultrasound scanning unit 220, and an auxiliary lesion localization unit 230. The ultrasound scanning apparatus 200 according to the embodiment of the present specification will be described in detail below. It should be noted that the following examples are only for explaining the present specification, and do not constitute a limitation of the present specification.
In some embodiments, the ultrasound device 140 may be part of a component of the ultrasound scanning apparatus 200. In some embodiments, the ultrasound device 140 may include a 3D model unit 210, an ultrasound scanning unit 220, and an auxiliary lesion localization unit 230.
The 3D model unit 210 refers to a unit for acquiring a 3D model.
In some embodiments, the 3D model unit 210 may be configured to obtain a 3D model of the relevant part of the patient, the 3D model being determined based on 3D medical examination data of the patient.
In some embodiments, the 3D model unit 210 may be further configured to determine, based on the lesion position and/or lesion type in the 3D medical examination data, target medical examination data of the modality associated with that lesion position and/or type, and to determine the 3D model based on the target medical examination data.
The ultrasound scanning unit 220 is a unit for performing real-time ultrasound scanning.
In some embodiments, the ultrasound scanning unit 220 may be configured to acquire, under the prompting of the auxiliary lesion localization unit 230, a target ultrasound image of the relevant part of the patient, the target ultrasound image containing a lesion that is included in the 3D model of that part.
The auxiliary lesion localization unit 230 is a unit for performing operations such as section guidance and lesion guidance.
In some embodiments, the auxiliary lesion localization unit 230 may be configured to determine the current position in the 3D model corresponding to the current ultrasound image, and to give prompts for scanning by the ultrasound scanning unit based on the current position and the lesion position in the 3D model.
In some embodiments, the auxiliary lesion localization unit 230 may be further configured to determine the prompt content based on the section information of the current ultrasound image and the 3D model, the prompt content including an ultrasound probe movement mode and/or an ultrasound section adjustment mode, delivered as one or a combination of image display, image guidance, text prompts, and voice prompts.
In some embodiments, the auxiliary lesion localization unit 230 may be further configured to determine a standard section based on the 3D model, the standard section containing the lesion, and to determine the prompt content based on the difference between the standard section and the current section.
In some embodiments, the auxiliary lesion localization unit 230 may be further configured to segment the current ultrasound image to obtain an image segmentation result, segment the 3D model to obtain a model segmentation result, and give prompts based on the two segmentation results.
In some embodiments, the auxiliary lesion localization unit 230 may be further configured to display, on the ultrasound display interface, the position in the 3D model corresponding to the current section, and to prompt the next operation of the ultrasound scanning unit, where the next operation includes an ultrasound probe movement mode and/or an ultrasound section adjustment mode; the movement mode includes at least one of a movement distance and a movement direction, and the adjustment mode includes at least one of rotation and tilt.
In some embodiments, both the auxiliary lesion localization unit 230 and the 3D model unit 210 are implemented by machine learning models.
In some embodiments, the 3D model unit 210 may include a modeling model and the auxiliary lesion localization unit 230 may include an auxiliary localization model. For more on the modeling model and the aided location model, see fig. 3 and its related description.
In some embodiments of this specification, implementing the auxiliary lesion localization unit and the 3D model unit with machine learning models makes it possible to construct the 3D model of the relevant part of the patient and determine the prompt content quickly and accurately, which facilitates operations such as lesion localization, section guidance, and section identification, and improves physician efficiency and user experience.
For more details on the 3D model unit 210, the ultrasound scanning unit 220, and the auxiliary lesion localization unit 230, see fig. 3-5 and their associated descriptions.
It should be understood that the apparatus shown in fig. 2 and its units may be implemented in various ways. It should be noted that the above description of the ultrasound scanning apparatus and the units thereof is for convenience of description only and is not intended to limit the present description to the scope of the illustrated embodiments. It will be understood by those skilled in the art that it is possible, after understanding the principles of the device, to combine individual units arbitrarily or to construct sub-devices in connection with other units without departing from such principles. In some embodiments, the 3D model unit 210, the ultrasound scanning unit 220, and the auxiliary lesion localization unit 230 disclosed in fig. 2 may be different units in one device, or may be one unit to implement the functions of two or more units described above. For example, each unit may share one memory cell, or each unit may have a respective memory cell. Such variations are within the scope of the present description.
Fig. 3 is an exemplary flow chart of an ultrasound scanning method according to some embodiments of the present description. As shown in fig. 3, the process 300 may include the following steps. In some embodiments, one or more operations of the flow 300 shown in fig. 3 may be implemented in the application scenario 100 of the ultrasound scanning apparatus shown in fig. 1.
In step 310, a 3D model of the corresponding portion of the patient is acquired, the 3D model being determined based on 3D medical examination data of the patient. In some embodiments, step 310 may be performed by 3D model unit 210.
In some embodiments, the patient may refer to a patient in need of an ultrasound scan, or the like.
The 3D model is a digitized 3D model of a specific part of the patient. A part may be any location of the patient, for example any organ or body tissue: organs may include, but are not limited to, the heart, abdominal organs, gynecological organs, and the like; body tissue may include, but is not limited to, the head, neck, torso, limbs, and the like. In some embodiments, one part of a patient corresponds to one 3D model. The 3D model can represent information about a lesion within the corresponding part, such as the lesion's size and position. In some embodiments, the 3D model is a model that the ultrasound device can identify and/or position-match: it can present the lesion on the ultrasound device, realizing the conversion of the 3D medical examination data of medical imaging devices such as DR, CT, or MRI into data identifiable by the ultrasound device, and the position-matching interaction between the two.
3D medical examination data are 3D data about the patient obtained by a medical imaging device. In some embodiments, the 3D medical examination data may be of various types, for example medical images. A medical imaging device forms medical images by applying physical signals (e.g., X-rays, ultrasound, strong magnetic fields) to the human body; it may include one or any combination of computed tomography (CT), positron emission tomography (PET), direct digital radiography (DR), magnetic resonance imaging (MRI), and the like. Different medical imaging devices acquire different 3D medical examination data; for example, DR acquires DR data, MRI acquires MRI data, and so on.
In some embodiments, 3D medical examination data of the patient acquired by the medical imaging device may be acquired from the storage device 110. In some embodiments, the 3D medical examination data may also be manually or automatically imported into the ultrasound device for the 3D model unit 210 to determine a 3D model of the corresponding part of the patient. For more on the ultrasound device, see fig. 1 and its related description.
In some embodiments, the 3D model unit 210 may acquire the 3D model of the patient in various ways. For example, it may acquire a pre-stored, already reconstructed 3D model from an external device (e.g., the storage device 110 or the medical imaging device). As another example, it may determine the 3D model of the relevant part from the patient's 3D medical examination data by various methods: it may process medical images by rendering, scaling, and the like to obtain the 3D model, or it may construct, through an AI algorithm on the ultrasound device, a 3D model of the relevant part that the ultrasound device can identify and/or match. The AI algorithm may include one or any combination of machine learning algorithms, deep learning algorithms, and the like. When an obvious lesion exists in the patient's 3D medical examination data (for example, a lesion in the patient's liver), lesion-related information such as the lesion's size and position can be displayed in the 3D model of the relevant part constructed by the AI algorithm.
In some embodiments, the 3D model unit 210 may determine a 3D model of the corresponding part of the patient by means of a modeling model based on the 3D medical examination data of the patient.
The modeling model is a model for constructing the 3D model of the relevant part of the patient. In some embodiments, the modeling model may be a deep learning model, for example a convolutional neural network (CNN) or a generative adversarial network (GAN).
In some embodiments, the input of the modeling model may include 3D medical examination data of the patient, and the output may be a 3D model of the corresponding portion of the patient.
In some embodiments, the modeling model may be trained on a number of first training samples. Each first training sample includes 3D medical examination data of a sample patient, and the first label is the actual 3D model of the corresponding part of that patient. The actual 3D model may be a model, identifiable and/or position-matchable by the ultrasound device, into which 3D medical examination data from a medical imaging device such as DR, CT, or MRI have been converted by rendering, scaling, and the like. The first training samples may be obtained from a historical database; for example, a first training sample may be the historical 3D medical examination data of a historical patient, with the first label being the ultrasound-identifiable and/or position-matchable model converted from those data by rendering, scaling, and the like.
In some embodiments, the processor may input a first training sample into the modeling model to obtain an initial 3D model of the corresponding part; construct a first loss function from the initial 3D model and the first label; and iteratively update the parameters of the modeling model over a number of first training samples until the first loss function meets a preset condition, for example that it converges or that its value falls below a preset value. When the preset condition is met, training is complete and the trained modeling model is obtained.
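For illustration, the following sketch shows what one such training step could look like in PyTorch. The 3D convolutional architecture, the mean-squared-error form of the first loss function, and all names are assumptions made for the example; the description above only states that the modeling model may be a CNN or a GAN and that training runs until the first loss function meets a preset condition.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the modeling model; the concrete architecture
# is an assumption, not one fixed by this specification.
class ModelingModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(16, 1, kernel_size=3, padding=1),
        )

    def forward(self, exam_volume):
        # exam_volume: 3D medical examination data, shape (N, 1, D, H, W)
        return self.net(exam_volume)  # initial 3D model of the part

model = ModelingModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
first_loss_fn = nn.MSELoss()  # assumed form of the "first loss function"

def train_step(exam_volume, actual_3d_model):
    """One iteration: predict an initial 3D model and compare it to the
    first label (the actual 3D model of the corresponding part)."""
    optimizer.zero_grad()
    initial_3d_model = model(exam_volume)
    loss = first_loss_fn(initial_3d_model, actual_3d_model)
    loss.backward()
    optimizer.step()
    # Training ends when the loss converges or drops below a preset value.
    return loss.item()
```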
In some embodiments, for a part where no lesion has been found in the 3D medical examination data, the 3D model of that part may be a generic 3D model of the part, for example a 3D model of the heart from a hospital database. Generic models for an organ may come in several variants, e.g., multiple cardiac 3D models for different ages, weights, sexes, and so on, and the generic model for a patient may be determined from the patient's corresponding information. In some embodiments, for a part where a lesion has been found in the 3D medical examination data, the 3D model may instead be specific to that part of that patient, e.g., constructed from the patient's 3D medical examination data by an AI algorithm.
In some embodiments, the 3D model unit 210 may determine, based on the lesion position and/or lesion type in a plurality of 3D medical examination data, the target medical examination data of the modality associated with that lesion position and/or type, and determine the 3D model based on the target medical examination data.
A lesion is pathologically changed tissue of the body, such as a tumor or a stone. The lesion position is the position in the human body where the lesion is located, e.g., in a blood vessel or in the heart. The lesion type is the type to which the lesion tissue belongs. In some embodiments, the lesion position and/or type may be determined in various ways; for example, a physician may determine them empirically by observing the 3D medical examination data.
The target medical examination data are the 3D medical examination data used to construct the 3D model of the part where the patient's lesion is located. In some embodiments, the target medical examination data may be the data of one or more of the modalities associated with the lesion position and/or type, such as DR data corresponding to direct digital radiography (DR) or MRI data corresponding to magnetic resonance imaging (MRI).
In some embodiments, the 3D model unit 210 may determine the target medical examination data of the associated modality in various ways based on the lesion position and/or type in the 3D medical examination data. For example, it may look up a first preset relationship table that records the correspondence between lesion positions and/or types and target medical examination data; the correspondence may relate to the nature of the lesion, and the table may be set empirically by a physician. For instance, when the lesion is within a blood vessel, the target medical examination data may be chosen as digitized image data that readily display mediastinal structures (e.g., DR data or MRI data). For more on DR and MRI data, see the description above. In some embodiments, the target medical examination data may also be selected manually and empirically based on the lesion position and/or type in the 3D medical examination data.
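A minimal sketch of such a table lookup, assuming a plain dictionary keyed by lesion position and type; the concrete entries are illustrative and not values given in this specification:

```python
# Illustrative "first preset relationship table": lesion position and/or
# type -> preferred examination modality. All entries are made-up examples.
FIRST_PRESET_TABLE = {
    ("blood vessel", "plaque"): "DR",   # data readily displaying mediastinal structures
    ("blood vessel", "tumor"): "MRI",
    ("liver", "tumor"): "CT",
}

def select_target_exam_data(lesion_position, lesion_type, exam_data_by_modality):
    """Pick, by table lookup, the 3D medical examination data of the modality
    associated with the given lesion position and/or type."""
    modality = FIRST_PRESET_TABLE.get((lesion_position, lesion_type))
    if modality is None or modality not in exam_data_by_modality:
        # The description allows falling back to manual, empirical selection.
        raise LookupError("no table entry; select target data manually")
    return exam_data_by_modality[modality]
```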
In some embodiments, the 3D model unit 210 may determine the 3D model in a variety of ways based on the target medical examination data. For example, machine learning models, and the like. For the content of this section, reference is made to the description of determining a 3D model based on 3D medical examination data above, and no further description is given here.
In some embodiments of this specification, using the lesion position and/or lesion type in a plurality of 3D medical examination data makes it possible to reasonably determine the target medical examination data of a well-adapted modality with good imaging performance for that position and/or type, and determining the 3D model from those target data makes the model more accurate.
Step 320, determining a current position of the current ultrasound image corresponding in the 3D model, and determining a prompt for scanning based on the current position and the lesion position in the 3D model. In some embodiments, step 320 may be performed by the auxiliary lesion localization unit 230.
The current ultrasound image refers to a current image displayed by an ultrasound display device in the ultrasound device at the current time. In some embodiments, the current ultrasound image is a two-dimensional image. In some embodiments, during the ultrasound scanning process, the ultrasound scanning unit 220 may scan the patient in real time through the ultrasound probe to obtain the current ultrasound image. Fig. 4 is an exemplary schematic diagram of lesion localization and navigation according to some embodiments of the present description. The ultrasound image 410 and the ultrasound image 450 in fig. 4 may be the current ultrasound image corresponding to different current times.
The current position is the position in the 3D model (a three-dimensional image) corresponding to the current ultrasound image (a two-dimensional image).
A prompt is information reminding the user how to scan. In some embodiments, the prompt may cover moving the probe, adjusting the section, and related aspects, and can guide the user to the section where the lesion is obtained. Moving the probe refers to the movement information of the ultrasound probe, including the movement direction, movement distance, and so on; adjusting the section refers to the adjustment information of the current section, including the adjustment angle, adjustment distance, and so on. For more on the ultrasound probe and the current section, see fig. 1 and the related description below.
In some embodiments, the auxiliary lesion localization unit 230 may determine the current position of the current ultrasound image in the 3D model, and determine the scanning prompt from the current position and the lesion position in the 3D model, in various ways. For example, it may match the current ultrasound image with the 3D model to determine the current position. Matching can be done in several ways; for example, registration may be performed after identifying the positions or distances of one or more feature points shared by the current ultrasound image and the 3D model, or the size and position of an organ. Feature points may include, for example, the apex of the lung or a segment of a rib.
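For illustration, the sketch below registers matched feature points with the Kabsch (orthogonal Procrustes) method; both the choice of this particular algorithm and the assumption that the ultrasound feature points are already expressed in a 3D coordinate frame go beyond what the description specifies:

```python
import numpy as np

def register_by_feature_points(image_points, model_points):
    """Rigidly align feature points from the current ultrasound image with
    the corresponding landmarks in the 3D model via the Kabsch method.
    image_points, model_points: (N, 3) arrays of matched points."""
    p = image_points - image_points.mean(axis=0)
    q = model_points - model_points.mean(axis=0)
    u, _, vt = np.linalg.svd(p.T @ q)
    d = np.sign(np.linalg.det(vt.T @ u.T))        # guard against reflections
    rotation = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    translation = model_points.mean(axis=0) - rotation @ image_points.mean(axis=0)
    return rotation, translation                  # model_point ≈ R @ image_point + t
```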
In some embodiments, the auxiliary lesion localization unit 230 may obtain a corresponding current position of the current ultrasound image in the 3D model through the position determination model based on the current ultrasound image and the 3D model.
The position determination model may be a machine learning model, e.g., a deep neural network model, etc.
In some embodiments, the position determination model may be trained on a number of position training samples. Each position training sample includes a sample ultrasound image of a sample patient and a sample 3D model; the position label is the marked position in the sample 3D model corresponding to the sample ultrasound image, and may be obtained by manual labeling. Position training samples may be obtained from a historical database. A sample ultrasound image may be at least one of a two-dimensional ultrasound image obtained by an ultrasound device, a segmentation result of regions or feature points of the sample patient, a recognition result of the two-dimensional image, and the like; a sample 3D model may be a three-dimensional image obtained by a medical imaging device such as DR, CT, or MRI.
In some embodiments, the processor may input a position training sample into the position determination model to obtain a predicted current position of the sample ultrasound image in the sample 3D model, compare the prediction with the marked position to construct a loss function, and train the model through loss computation until a training end condition is met, e.g., the loss function converges or its value falls below a preset value. When the condition is met, training is complete and the trained position determination model is obtained; inputting the current ultrasound image and the 3D model into it yields the corresponding current position.
The auxiliary lesion localization unit 230 may determine, based on the difference between the current position and the lesion position in the 3D model, a scanning prompt that shortens that difference. The prompt can tell the ultrasound scanning unit 220 how to move and/or adjust, guiding it to find the patient's lesion as quickly as possible.
In some embodiments, the auxiliary lesion localization unit 230 may determine the prompt content based on the section information of the current ultrasound image and the 3D model.
A section is a slice surface through the human body in a specific direction; different cutting angles correspond to different sections, such as a transverse, longitudinal, or coronal section. Section information is the information related to a section; in some embodiments, it may include the anatomy, section position, section angle, and so on.
The anatomy is the anatomical information of a section. For example, for the parasternal left-ventricular long-axis section in cardiac ultrasound, the anatomy from front to back is the anterior wall of the right ventricle, the septum, the left ventricle, and so on. In some embodiments, the auxiliary lesion localization unit 230 may match the section against historical sections at the same position and angle in a historical database and determine the section's anatomy from that of the historical sections.
The section position is the position information of the section in the 3D model; the section angle is the angle information of the section in the 3D model.
The current section is the section of the patient's part corresponding to the current ultrasound image. In 430 and 470 of fig. 4, the cylinder represents the 3D model of the relevant part and the trapezoid the current section at the respective current time; the sections shown by the trapezoids in 430 and 470 correspond to the current ultrasound images shown in 410 and 450, respectively.
In some embodiments, the section information of the current ultrasound image refers to the section information of the current section, including its anatomy, section position, and section angle. In some embodiments, the auxiliary lesion localization unit 230 may identify the current section obtained in real time during scanning and match it against the 3D model in real time, thereby locating the section position and angle of the current section in the 3D model.
In some embodiments, the section position and angle can be represented in various ways. For example, the auxiliary lesion localization unit 230 may set up a rectangular coordinate system with some point of the 3D model (such as its centroid) as the origin, the axis perpendicular to the ground plane as the Y axis, an axis parallel to the ground plane as the X axis (the X and Y axes being mutually perpendicular), and the axis perpendicular to both as the Z axis; the coordinates of the intersection between the current section and its normal line passing through the origin then represent the section position and angle of the current section.
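A short sketch of this representation, assuming positions are given in the model coordinate system just described (all names illustrative):

```python
import numpy as np

def section_descriptor(point_on_section, section_normal):
    """Represent a section by the intersection of the origin-through normal
    line with the section plane (origin at, e.g., the model centroid)."""
    n = section_normal / np.linalg.norm(section_normal)
    signed_distance = float(np.dot(point_on_section, n))  # origin-to-plane distance
    # The descriptor's direction encodes the section angle,
    # its length the section position.
    return signed_distance * n
```

Two sections can then be compared simply by the difference of their descriptors.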
The prompt content is what reminds the user about the ultrasound scan. In some embodiments, it may include an ultrasound probe movement mode, i.e., how the probe should be moved, and an ultrasound section adjustment mode, i.e., how the current section should be adjusted. For more on moving the probe and adjusting the section, see the description above.
In some embodiments, the ultrasound probe movement mode and/or ultrasound section adjustment mode may be delivered through one or more of image display, image guidance, text prompts, and voice prompts. Image display shows the ultrasound image and the current section that will be obtained after moving the probe and adjusting the section; for example, directly displaying the ultrasound image shown at 420 in fig. 4 and the current section in the 3D model shown at 440. Image guidance displays in sequence the series of ultrasound images and current sections corresponding to the process of moving the probe and adjusting the section; for example, guidance through the ultrasound images shown at 410, ..., 420 in fig. 4 with the corresponding sections in the 3D model at 430, ..., 440, or, for the same part of the same patient in a different scanning procedure, through the images at 450, ..., 460 with the corresponding sections at 470, ..., 480. A text prompt uses text to indicate how to move the probe and adjust the section, e.g., displaying "move the ultrasound probe 1 cm upward and tilt it 20° to the right" on the ultrasound display interface. A voice prompt plays the same kind of instruction through a speaker, e.g., "move the ultrasound probe 1 cm upward and tilt it 20° to the right".
In some embodiments, the auxiliary lesion localization unit 230 may determine the prompt content in various ways from the section information of the current ultrasound image and the 3D model. For example, it may compare the position and angular deviation of the current section in the 3D model against the section containing the lesion, and derive prompt content that moves the ultrasound probe toward the section that contains the lesion and/or best reveals the lesion structure.
In some embodiments, the auxiliary lesion localization unit 230 may include an auxiliary localization model, through which it determines the prompt content based on the section information of the current ultrasound image and the 3D model.
The auxiliary localization model is a model for determining prompt content. In some embodiments, it may be a machine learning model, such as a convolutional neural network. Its inputs may include the section information of the current ultrasound image and the 3D model, and its output the prompt content.
In some embodiments, the auxiliary localization model may be trained on a number of second training samples. Each second training sample includes sample section information of a sample current ultrasound image and a sample 3D model; the second label is the actual prompt content corresponding to that sample. Second training samples may be obtained from a historical database: for example, a second training sample may be the section information and 3D model of a historical patient's ultrasound image, with the historical prompt content as the second label. The second label may be obtained in several ways. For example, the auxiliary lesion localization unit 230 may register the sample current ultrasound image with the sample 3D model, determine the next-action prompt from the lesion position and from the relative position and angle between the sample section and an ideal section (preset in advance) recorded in the sample section information, and use that prompt as the second label (the historical prompt content). Alternatively, the second label may be determined by manual labeling. Registration itself can be done in several ways; for example, by comparing the actual sizes of the sample current ultrasound image and the sample 3D model through a feature common to both, the two can be scaled to a consistent size and registered at the same 1:1 scale.
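A minimal sketch of the 1:1 scaling step, assuming the physical size of a feature common to both data sets has been measured (parameter names are illustrative):

```python
def scale_factor_for_1_to_1(feature_size_in_image_mm, feature_size_in_model_mm):
    """Scale factor that brings the sample current ultrasound image to the
    same 1:1 physical scale as the sample 3D model, using the measured size
    of a feature visible in both."""
    return feature_size_in_model_mm / feature_size_in_image_mm

# e.g., a rib segment measuring 18 mm in the image but 20 mm in the model
# implies the image must be scaled by 20/18 before 1:1 registration.
```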
In some embodiments, the training process of the auxiliary positioning model and the modeling model is similar, and reference may be made to the above related description, which is not repeated here.
In some embodiments of the present disclosure, the prompt content for moving the probe and adjusting the section can be accurately determined from the section information of the current ultrasound image and the 3D model, and the prompt can be given in various manners (such as images, text, and voice), which facilitates lesion localization and section guidance and can improve the diagnostic efficiency of doctors.
In some embodiments, the auxiliary lesion localization unit 230 may determine a standard section based on the 3D model, and determine the prompt content based on the difference between the standard section and the current section.
The standard section is a section that meets the requirements of the user. For example, the trapezoids shown at 440 and 480 of fig. 4 may be the same standard section, ultimately corresponding to the same site of the same patient during different scanning procedures. In some embodiments, the standard section may include, but is not limited to, the optimal section for visualizing a lesion in a patient site, a particular section that a physician wants to obtain during an interventional procedure, and the like.
In some embodiments, the auxiliary lesion localization unit 230 may determine the standard section from the 3D model by means of a preset table. The preset table may be set by system default or manually from experience. For example, for a patient with a lesion in the left ventricular outflow tract, the auxiliary lesion localization unit 230 may select the parasternal long-axis section in the 3D model as the standard section according to the preset table. As another example, for a patient with a lesion somewhere in the apex of the heart, the auxiliary lesion localization unit 230 may select a specific section in the 3D model (e.g., a section 0.5 cm from the bottom of the apex of the 3D model) as the standard section according to the preset table.
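A minimal sketch of such a preset table, mirroring the two examples above, is given below; the keys and the section encoding are assumptions introduced for illustration.

```python
# Illustrative preset table: lesion site -> standard section to extract from
# the 3D model. Keys and encoding are assumptions.
PRESET_STANDARD_SECTIONS = {
    "left_ventricular_outflow_tract": {"section": "parasternal_long_axis"},
    "cardiac_apex": {"section": "axial_offset", "offset_cm_from_apex_base": 0.5},
}

def standard_section_for(lesion_site: str) -> dict:
    try:
        return PRESET_STANDARD_SECTIONS[lesion_site]
    except KeyError:
        raise ValueError(f"no preset standard section for site: {lesion_site}")

print(standard_section_for("cardiac_apex"))
# {'section': 'axial_offset', 'offset_cm_from_apex_base': 0.5}
```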
In some embodiments, the difference between the standard section and the current section refers to the difference in section position and section angle between the standard section and the current section in the 3D model, and may include a position difference and an angle difference.
In some embodiments, the auxiliary lesion localization unit 230 may determine the prompt content in a variety of ways based on the difference between the standard section and the current section. For example, the auxiliary lesion localization unit 230 may determine the ultrasound probe movement mode from the position difference between the standard section and the current section (e.g., if the standard section is located medially to the current section, the ultrasound probe is moved medially). As another example, the auxiliary lesion localization unit 230 may determine the ultrasound section adjustment mode from the angle difference between the standard section and the current section (e.g., if the standard section is 60° to the left of the current section, the ultrasound section is rotated 60° to the left).
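The sketch below illustrates one way to turn these position and angle differences into prompt content, under the assumption that each section is encoded as a center point and a unit normal in 3D-model coordinates; the direction wording is a simplification.

```python
# Sketch: derive prompt content from the position and angle difference between
# the standard section and the current section.
import numpy as np

def prompt_from_difference(std_center, std_normal, cur_center, cur_normal):
    std_center, cur_center = np.asarray(std_center, float), np.asarray(cur_center, float)
    std_normal = np.asarray(std_normal, float) / np.linalg.norm(std_normal)
    cur_normal = np.asarray(cur_normal, float) / np.linalg.norm(cur_normal)
    prompts = []
    offset = std_center - cur_center                       # position difference
    if np.linalg.norm(offset) > 1e-3:
        prompts.append(f"move the probe by {np.round(offset, 1)} (model units)")
    angle = np.degrees(np.arccos(np.clip(std_normal @ cur_normal, -1.0, 1.0)))
    if angle > 1.0:                                        # angle difference
        prompts.append(f"rotate the section by {angle:.0f} degrees toward the standard section")
    return prompts or ["current section already matches the standard section"]

print(prompt_from_difference([0, 0, 0], [0, 0, 1], [1, 0, 0], [0, 0.5, 0.87]))
```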
In some embodiments of the present disclosure, determining the standard section from the 3D model makes the prompt content determined from the difference between the standard section and the current section more accurate, which facilitates quickly acquiring the most satisfactory section subsequently.
In some embodiments, the auxiliary lesion localization unit 230 may segment the current ultrasound image to obtain an image segmentation result, segment the 3D model to obtain a model segmentation result, and give the prompt based on the image segmentation result and the model segmentation result.
The image segmentation result refers to the result obtained by segmenting the current ultrasound image, and the model segmentation result refers to the result obtained by segmenting the 3D model. The two are obtained with the same segmentation method and have a one-to-one correspondence: a corresponding pair of image and model segmentation results corresponds to the same part of the patient. In some embodiments, the auxiliary lesion localization unit 230 may segment the current ultrasound image and the 3D model by a variety of methods to obtain the image segmentation result and the model segmentation result. For example, the auxiliary lesion localization unit 230 may perform image recognition on the current ultrasound image and the 3D model, divide them into partial images and partial models of a plurality of different parts (e.g., different anatomical structures), and take these as the image segmentation result and the model segmentation result. Image recognition techniques may include, but are not limited to, segmentation models such as the Segment Anything Model (SAM), SegGPT, and the like.
In some embodiments, the input of the segmentation model may comprise the current ultrasound image or the 3D model, and the output may be an image segmentation result or a model segmentation result.
In some embodiments, the segmentation model may be trained on a number of third training samples. A third training sample for training the segmentation model comprises a sample current ultrasound image or a sample 3D model, and the third label is the actual segmentation result corresponding to the third training sample. The third training samples and third labels may be obtained from a historical database; for example, a third training sample may be a historical ultrasound image or historical 3D model of a historical patient, with the third label being the corresponding historical segmentation result.
In some embodiments, the training process of the segmentation model is similar to that of the modeling model; reference may be made to the related description above, which is not repeated here.
In some embodiments, the auxiliary lesion localization unit 230 may give the prompt based on the image segmentation result and the model segmentation result in a variety of ways. For example, the auxiliary lesion localization unit 230 may determine, by matching (e.g., registration), the position difference between a model segmentation result containing the lesion and the corresponding image segmentation result, and then determine a prompt that shortens this position difference. For more on matching (e.g., registration), see the related description above. In some embodiments, the auxiliary lesion localization unit 230 may determine prompts regarding position and angle via the auxiliary positioning model described above; for example, the inputs of the auxiliary positioning model may include a model segmentation result containing the lesion and the corresponding image segmentation result, and the output may be a prompt regarding position and angle.
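As a hedged sketch, the following pairs each model segmentation result with the image segmentation result of the same anatomical label and derives a prompt from the centroid offset of the lesion-bearing pair; the label-to-mask representation and a shared coordinate frame after registration are assumptions.

```python
# Hedged sketch: match segmentation results by anatomical label and prompt on
# the centroid offset of the lesion-bearing pair.
import numpy as np

def centroid(mask: np.ndarray) -> np.ndarray:
    return np.argwhere(mask).mean(axis=0)   # mean voxel index of a nonempty mask

def lesion_prompt(image_segments: dict, model_segments: dict, lesion_label: str) -> str:
    """image_segments / model_segments map anatomical labels to binary masks,
    in one-to-one correspondence after registration to a common frame."""
    if lesion_label not in image_segments:
        return "lesion not in the current section yet: continue scanning"
    diff = centroid(model_segments[lesion_label]) - centroid(image_segments[lesion_label])
    return f"shift the scan plane by approximately {np.round(diff, 1)} to center the lesion"
```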
In some embodiments of the present disclosure, segmenting the current ultrasound image and the 3D model and prompting according to the corresponding segmentation results allows the user to learn the structural information of the current section more quickly, which helps improve the speed and quality of acquiring the target ultrasound image.
Step 330, based on the prompt, a target ultrasound image of the corresponding portion of the patient is acquired. In some embodiments, step 330 may be performed by the ultrasound scanning unit 220.
The target ultrasound image refers to an ultrasound image that enables accurate observation of the lesion position, lesion size, and the like. In some embodiments, the target ultrasound image may include a lesion that is included in the 3D model of the corresponding site of the patient. For example, in fig. 4, 420 and 460 may show the same target ultrasound image corresponding to different scanning procedures on the same site of the same patient, where the area shown by the dark hexagon may be the lesion. Correspondingly, 440 and 480 may show the same standard section in the 3D model of the same site of the same patient across those different scanning procedures. The dark hexagon area shown at 420 is the lesion in the target ultrasound image, and the dark hexagon area shown at 440 is the lesion cut by the standard section in the 3D model.
In some embodiments, the ultrasound scanning unit 220 may acquire the target ultrasound image of the corresponding site of the patient in a variety of ways based on the prompt. For example, following the prompt (text prompt, voice prompt, etc.), the operator moves, rotates, and tilts the ultrasound probe while observing the change of the current ultrasound image in real time, repeating the operation until the target ultrasound image is obtained.
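This acquisition process can be summarized as a simple loop, sketched below; `acquire_image`, `compute_prompt`, `is_target`, and `show` are hypothetical callables standing in for the device I/O and the units described earlier.

```python
# Sketch of the guided acquisition loop implied above.
def guided_scan(acquire_image, compute_prompt, is_target, show, max_steps=100):
    for _ in range(max_steps):
        image = acquire_image()        # current ultrasound image, in real time
        if is_target(image):           # e.g., section difference below threshold
            return image               # target ultrasound image
        show(compute_prompt(image))    # image/text/voice prompt to the operator
    raise RuntimeError("target section not reached within the step budget")
```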
In some embodiments, the schematic diagram of fig. 4 may represent lesion localization and navigation on the same site of the same patient during different ultrasound scanning procedures. For example, 410, 420, 430, and 440 may represent, respectively, the current ultrasound image, the target ultrasound image, a schematic of the current section in the 3D model, and a schematic of the standard section in the 3D model during a first ultrasound scan; 450, 460, 470, and 480 may represent the same items during a second ultrasound scan. Guided by the sequence 430, … …, 440, the sequence 410, … …, 420 can be obtained; that is, by moving the probe and adjusting the section, the current ultrasound image 410 can be guided to the target ultrasound image 420 while the current section is adjusted to the standard section in the 3D model, thereby achieving lesion localization and navigation. In some embodiments, fig. 4 may also represent other lesion localization and navigation scenarios, such as lesion localization and navigation for different patients during ultrasound scanning. The above description of fig. 4 is for illustration only and does not limit the scope of application of the present specification.
In some embodiments, the auxiliary lesion localization unit 230 may display the position of the section in the 3D model corresponding to the current section on the ultrasound display interface, and prompt the next operation method of the ultrasound scanning unit. For more details regarding ultrasound display interfaces, see the associated description of FIG. 1.
The next operation method refers to the method of moving the probe's position and angle during the ultrasound scanning process. In some embodiments, the next operation method may include an ultrasound probe movement mode and/or an ultrasound section adjustment mode; for these, reference may be made to the related description above, which is not repeated here. Methods of prompting the next operation may include, but are not limited to, one or a combination of image presentation, image guidance, text prompt, and voice prompt.
In some embodiments, the auxiliary lesion localization unit 230 may determine the next operation method in a variety of ways. For example, the auxiliary lesion localization unit 230 may determine the position difference between the position of the section in the 3D model corresponding to the current section and the position of the standard section displayed on the ultrasound display interface, determine an ultrasound probe movement mode and/or ultrasound section adjustment mode that shortens this position difference, and take the determined movement and/or adjustment mode as the next operation method. For more on this, reference may be made to the description above of how the auxiliary lesion localization unit determines the prompt content, which is not repeated here.
In some embodiments, the position of the ultrasound probe relative to the 3D model can be determined by matching the current section to the 3D model, and the ultrasound probe can then be guided to acquire the target ultrasound image based on a comparison between the current section and the standard section.
In some embodiments of the present disclosure, displaying the position of the section in the 3D model corresponding to the current section on the ultrasound display interface allows the position of the current section within the 3D model to be observed intuitively, and prompting the next operation method of the ultrasound scanning unit can greatly lower the entry threshold for ultrasound operation, facilitate human-machine interaction, quickly guide the acquisition of the target ultrasound image, and improve the working efficiency of doctors.
In some embodiments of the present disclosure, a 3D model of the corresponding site of the patient can be accurately determined from the patient's 3D medical examination data, the current position corresponding to the current ultrasound image in the 3D model can be determined, and the prompt for scanning can be determined based on the current position and the lesion position in the 3D model, which facilitates quickly and accurately acquiring the target ultrasound image of the corresponding site of the patient, realizes operations such as lesion localization, section guidance, and section identification, and improves the diagnostic efficiency of doctors and user satisfaction.
It should be noted that the above description of the process 300 is for purposes of example and illustration only and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to flow 300 will be apparent to those skilled in the art in light of the present description; such modifications and variations remain within the scope of the present description.
Fig. 5 is a general flow chart of an ultrasound scanning method according to some embodiments of the present description. As shown in fig. 5, the process 500 may include the following steps. In some embodiments, one or more operations of the flow 500 shown in fig. 5 may be implemented in the application scenario 100 of the ultrasound scanning apparatus shown in fig. 1. For example, the flow 500 shown in FIG. 5 may be stored in the storage device 110 in the form of instructions and invoked and/or executed by the processor 130.
Step 510, acquiring 3D medical examination data of a patient.
For more on this part see fig. 3 and its related description.
Step 520, importing 3D medical examination data of the patient into the ultrasound device.
In some embodiments, the user may import the 3D medical examination data of the patient into the ultrasound device in a variety of ways, such as network transmission or Bluetooth transmission. For more on the ultrasound device, see fig. 1 and its related description.
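As one hedged example of the import step, assuming the 3D medical examination data arrives as a DICOM series in a directory on shared storage, the pydicom library could read it as follows; the directory layout and the use of pydicom are illustrative assumptions, not requirements of the disclosure.

```python
# Hedged example: read a DICOM series (e.g., a CT or MR volume) for import.
from pathlib import Path
import pydicom

def import_examination(series_dir: str) -> list:
    """Read a DICOM series ordered by slice (InstanceNumber)."""
    slices = [pydicom.dcmread(p) for p in sorted(Path(series_dir).glob("*.dcm"))]
    slices.sort(key=lambda ds: int(ds.InstanceNumber))
    return slices   # handed to the 3D model unit for reconstruction
```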
Step 530, determining a 3D model of the corresponding part of the patient.
In some embodiments, the 3D model unit may determine a 3D model of the corresponding part of the patient based on the 3D medical examination data of the patient. For more details on this part, see step 310 in fig. 3 and the associated description.
Step 540, the user starts ultrasound real-time scanning and activates the auxiliary lesion localization unit.
In some embodiments, the user turns on and logs in to the ultrasound device. When two-dimensional ultrasound guidance is used for an interventional procedure, it is important to be able to find the lesion position of the patient as soon as possible. When the user starts ultrasound scanning and needs the auxiliary lesion localization unit during the scanning process, the user may choose to activate it.
In some embodiments, the user may activate the auxiliary lesion localization unit in a variety of ways. For example, the user may activate the auxiliary lesion localization unit via a key; after activation, the unit matches the current section against the 3D model in real time, realizing section guidance and lesion guidance. In some embodiments, the user may decide, according to the actual situation, whether to activate the auxiliary lesion localization unit, the point in time at which it is activated, and the period during which it is used. For more details on the auxiliary lesion localization unit, see fig. 2 and its related description.
Step 550, determining the prompt content based on the 3D model.
For more on this part see fig. 3 and its related description.
Step 560, displaying the position of the section in the 3D model corresponding to the current section, and prompting the next operation method of the ultrasonic scanning unit, thereby obtaining the target ultrasonic image of the corresponding part of the patient.
For more on this part see fig. 3 and its related description.
It should be noted that the above description of the process 500 is for purposes of illustration and description only and is not intended to limit the scope of applicability of the present disclosure. Various modifications and changes to flow 500 will be apparent to those skilled in the art in light of the present description; such modifications and variations remain within the scope of the present description.
One or more embodiments of the present description provide an ultrasound scanning system. The system comprises a 3D model module, an ultrasonic scanning module and an auxiliary focus positioning module.
In some embodiments, the 3D model module may be configured to obtain a 3D model of the corresponding portion of the patient, the 3D model being determined based on 3D medical examination data of the patient.
In some embodiments, the 3D model module may be further configured to determine, based on the lesion location and/or lesion type in the 3D medical examination data, target medical examination data of a modality associated with the lesion location and/or lesion type, and to determine the 3D model based on the target medical examination data.
The ultrasound scanning module is configured to perform real-time ultrasound scanning.
In some embodiments, the ultrasound scanning module may be configured to acquire a target ultrasound image of the patient's corresponding site, the target ultrasound image including the lesion, the lesion included in the 3D model of the patient's corresponding site, under the direction of the auxiliary lesion localization unit.
The auxiliary lesion localization module is configured to perform operations such as section guidance and lesion guidance.
In some embodiments, the auxiliary lesion localization module may be configured to determine a current location of the current ultrasound image corresponding in the 3D model, and to give a prompt for the ultrasound scanning module to scan based on the current location and the lesion location in the 3D model.
In some embodiments, the auxiliary lesion localization module may be further configured to determine the prompt content based on the section information of the current ultrasound image and the 3D model, the prompt content including an ultrasound probe movement mode and/or an ultrasound section adjustment mode; the prompt of the ultrasound probe movement mode and/or the ultrasound section adjustment mode may be given in one or a combination of image display, image guidance, text prompt, and voice prompt.
In some embodiments, the auxiliary lesion localization module may be further configured to determine a standard section based on the 3D model, the standard section including the lesion, and to determine the prompt content based on the difference between the standard section and the current section.
In some embodiments, the auxiliary lesion localization module may be further configured to segment the current ultrasound image resulting in an image segmentation result; dividing the 3D model to obtain a model division result; and giving a prompt based on the image segmentation result and the model segmentation result.
In some embodiments, the auxiliary lesion localization module may be further configured to display the position of the section in the 3D model corresponding to the current section on the ultrasound display interface, and to prompt a next operation method of the ultrasound scanning unit, where the next operation method includes an ultrasound probe movement mode and/or an ultrasound section adjustment mode.
In some embodiments, the auxiliary lesion localization module and the 3D model module are both implemented by machine learning models.
For more on the above modules see the relevant description of fig. 1-5.
One or more embodiments of the present specification provide a computer-readable storage medium storing computer instructions that, when read by a computer, cause the computer to perform the ultrasound scanning method described above.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly stated here, various modifications, improvements, and adaptations of the present disclosure may occur to those skilled in the art. Such modifications, improvements, and adaptations are suggested within this specification and are intended to fall within the spirit and scope of its exemplary embodiments.
Meanwhile, the specification uses specific words to describe the embodiments of the specification. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is associated with at least one embodiment of the present description. Thus, it should be emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various positions in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present description may be combined as suitable.
Furthermore, the order in which the elements and sequences are processed, the use of numerical letters, or other designations in the description are not intended to limit the order in which the processes and methods of the description are performed unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure, by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the present disclosure. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Likewise, it should be noted that, in order to simplify the presentation of this disclosure and thereby aid in understanding one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, does not imply that the claimed subject matter requires more features than are recited in the claims. Indeed, claimed subject matter may lie in less than all features of a single embodiment disclosed above.
In some embodiments, numbers describing quantities of components and attributes are used; it should be understood that such numbers used in the description of embodiments are in some examples modified by the terms "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows a variation of 20%. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending upon the desired properties sought by individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ ordinary rounding to preserve them. Although the numerical ranges and parameters used in some embodiments to confirm the breadth of their ranges are approximations, in specific embodiments such numerical values are set as precisely as practicable.
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.

Claims (10)

1. An ultrasonic scanning device, characterized by comprising a 3D model unit, an ultrasonic scanning unit, and an auxiliary lesion localization unit;
the 3D model unit is configured to obtain a 3D model of a corresponding part of a patient, the 3D model being determined based on 3D medical examination data of the patient;
the ultrasonic scanning unit is configured to acquire a target ultrasonic image of the corresponding part of the patient under the prompt of the auxiliary lesion localization unit, wherein the target ultrasonic image comprises a lesion, and the lesion is included in the 3D model of the corresponding part of the patient;
the auxiliary lesion localization unit is configured to determine a current position corresponding to a current ultrasonic image in the 3D model, and give the prompt for scanning by the ultrasonic scanning unit based on the current position and the lesion position in the 3D model.
2. The apparatus of claim 1, wherein the auxiliary lesion localization unit is further configured to:
determining prompt contents of the prompt based on the section information of the current ultrasonic image and the 3D model, wherein the prompt contents comprise an ultrasonic probe moving mode and/or an ultrasonic section adjusting mode;
wherein the prompt of the ultrasonic probe moving mode and/or the ultrasonic section adjusting mode comprises one or a combination of image display, image guidance, text prompt, and voice prompt.
3. The apparatus of claim 2, wherein the auxiliary lesion localization unit is further configured to:
determining a standard section based on the 3D model, the standard section including the lesion;
and determining the prompt content based on the difference between the standard section and the current section.
4. The apparatus of claim 1, wherein the 3D model unit is further configured to:
determining target medical examination data of a modality associated with the lesion location and/or the lesion type based on the lesion location and/or the lesion type in the 3D medical examination data;
the 3D model is determined based on the target medical examination data.
5. The apparatus of claim 1, wherein the auxiliary lesion localization unit is further configured to:
dividing the current ultrasonic image to obtain an image division result;
dividing the 3D model to obtain a model division result;
and giving the prompt based on the image segmentation result and the model segmentation result.
6. The apparatus of claim 1, wherein the auxiliary lesion localization unit is further configured to:
displaying the position of the section in the 3D model corresponding to the current section on an ultrasonic display interface, and prompting the next operation method of the ultrasonic scanning unit, wherein the next operation method comprises an ultrasonic probe moving mode and/or an ultrasonic section adjusting mode, the ultrasonic probe moving mode comprises at least one of a moving distance and a moving direction, and the ultrasonic section adjusting mode comprises at least one of rotation and inclination.
7. The apparatus of claim 1, wherein the auxiliary lesion localization unit and the 3D model unit are each implemented by a machine learning model.
8. An ultrasound scanning method, the method comprising:
acquiring a 3D model of a corresponding part of a patient, the 3D model being determined based on 3D medical examination data of the patient;
determining a current position corresponding to a current ultrasonic image in the 3D model, and determining a prompt for scanning based on the current position and the lesion position in the 3D model;
acquiring, based on the prompt, a target ultrasound image of the corresponding part of the patient, the target ultrasound image including a lesion that is included in the 3D model of the corresponding part of the patient.
9. The method of claim 8, wherein the determining a current position corresponding to a current ultrasound image in the 3D model and determining a prompt for scanning based on the current position and the lesion position in the 3D model comprises:
determining prompt contents of the prompt based on the section information of the current ultrasonic image and the 3D model, wherein the prompt contents comprise an ultrasonic probe moving mode and/or an ultrasonic section adjusting mode;
wherein the prompt of the ultrasonic probe moving mode and/or the ultrasonic section adjusting mode comprises one or a combination of image display, image guidance, text prompt, and voice prompt.
10. A computer readable storage medium storing computer instructions which, when read by a computer in the storage medium, perform the method of any one of claims 8-9.