CN113116384A - Ultrasonic scanning guidance method, ultrasonic device and storage medium

Ultrasonic scanning guidance method, ultrasonic device and storage medium

Info

Publication number
CN113116384A
CN113116384A (application CN201911413636.8A)
Authority
CN
China
Prior art keywords
ultrasonic
current
ultrasound
image
position information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911413636.8A
Other languages
Chinese (zh)
Inventor
赵明昌 (Zhao Mingchang)
莫若理 (Mo Ruoli)
陆振宇 (Lu Zhenyu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuxi Chison Medical Technologies Co Ltd
Original Assignee
Wuxi Chison Medical Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi Chison Medical Technologies Co Ltd filed Critical Wuxi Chison Medical Technologies Co Ltd
Priority to CN201911413636.8A priority Critical patent/CN113116384A/en
Publication of CN113116384A publication Critical patent/CN113116384A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08: Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833: Detecting organic movements or changes involving detecting or locating foreign bodies or organic structures
    • A61B 8/085: Detecting organic movements or changes involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215: Devices using data or image processing involving processing of medical diagnostic data
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60: ICT for the management or operation of medical equipment or devices
    • G16H 40/63: ICT for the operation of medical equipment or devices, for local operation

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • Theoretical Computer Science (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Radiology & Medical Imaging (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Software Systems (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Evolutionary Computation (AREA)
  • General Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • Business, Economics & Management (AREA)
  • Primary Health Care (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Vascular Medicine (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention relates to the technical field of ultrasonic navigation, and in particular to an ultrasonic scanning guiding method, ultrasonic equipment and a storage medium. The ultrasonic scanning guiding method comprises the following steps: loading a three-dimensional ultrasonic model corresponding to a target organ to be scanned of a detection object, wherein the three-dimensional ultrasonic model contains at least one standard scanning section with position information and angle information; acquiring a current ultrasonic image scanned by the ultrasonic probe; acquiring position information and angle information of the current ultrasonic image based on the current ultrasonic image; and guiding the ultrasonic probe to move to the standard scanning section according to the position information and angle information of the current ultrasonic image and of the standard scanning section. The invention improves the speed and accuracy with which the ultrasonic probe finds the standard scanning section, and improves the scanning efficiency of operators.

Description

Ultrasonic scanning guidance method, ultrasonic device and storage medium
Technical Field
The invention relates to the technical field of ultrasonic navigation, in particular to an ultrasonic scanning guiding method, ultrasonic equipment and a storage medium.
Background
Ultrasonic diagnostic apparatuses are widely used in clinical medicine and can perform ultrasound image examination and diagnosis on almost any part of the body. The quality of the ultrasound image obtained during scanning determines the quality of the subsequent diagnosis. In practice, a doctor moves the ultrasonic probe to the target organ for scanning, but doctors differ in accumulated experience and operating proficiency; a doctor with little experience or poor operating skill cannot move the probe quickly and accurately enough to obtain an ultrasound image of a standard section.
Disclosure of Invention
The invention aims to overcome the above defects in the prior art by providing an ultrasonic scanning guiding method, ultrasonic equipment and a storage medium, wherein the ultrasonic scanning guiding method can guide an ultrasonic probe to move to a standard scanning section.
As a first aspect of the present invention, there is provided an ultrasound scanning guidance method including:
loading a three-dimensional ultrasonic model corresponding to a target organ to be scanned of a detection object, wherein the three-dimensional ultrasonic model at least comprises a standard scanning section with position information and angle information;
acquiring a current ultrasonic image scanned by the ultrasonic probe;
acquiring position information and angle information of the current ultrasonic image based on the current ultrasonic image;
and guiding the ultrasonic probe to move to the standard scanning section according to the position information and the angle information of the current ultrasonic image and the standard scanning section.
Further, acquiring the position information and the angle information of the current ultrasound image based on the current ultrasound image comprises:
and inputting the current ultrasonic image and the three-dimensional ultrasonic model into a trained indexing neural network model for processing, and determining the position information and the angle information of the current ultrasonic image.
Further, the inputting of the current ultrasound image and the three-dimensional ultrasound model into the trained indexing neural network model for processing and determining the position information and the angle information of the current ultrasound image includes:
extracting a first feature vector from the current ultrasonic image through a two-dimensional convolutional neural network;
extracting a second feature vector in the three-dimensional ultrasonic model through a three-dimensional convolution neural network;
concatenating the first feature vector and the second feature vector along a dimension to obtain a first concatenated feature vector;
and inputting the first concatenated feature vector into a fully connected layer and outputting the position information and the angle information of the current ultrasonic image.
Further, acquiring the position information and the angle information of the current ultrasound image based on the current ultrasound image comprises:
and inputting the current ultrasonic image into a trained full convolution neural network model for processing, and determining the position information and the angle information of the current ultrasonic image.
Further, the inputting of the current ultrasound image into the trained fully convolutional neural network model for processing and determining the position information and the angle information of the current ultrasound image includes:
inputting the current ultrasonic image into a full convolution neural network for processing to obtain a feature map of the current ultrasonic image;
performing global maximum pooling on the feature map to obtain a third feature vector of the current ultrasonic image;
carrying out global average pooling on the feature map to obtain a fourth feature vector of the current ultrasonic image;
concatenating the third feature vector and the fourth feature vector to obtain a second concatenated feature vector;
and inputting the second concatenated feature vector into a fully connected layer and outputting the position information and the angle information of the current ultrasonic image.
Further, the guiding the ultrasound probe to move to the standard scanning section according to the position information and the angle information of the current ultrasound image and the standard scanning section includes:
planning a guide path along which the ultrasonic probe moves to the standard scanning section according to the position information and the angle information;
acquiring a real-time position of the ultrasonic probe;
judging whether the ultrasonic probe deviates from the guide path according to the real-time position of the ultrasonic probe, and if so, updating the guide path according to the real-time position;
and displaying the guide path, the standard scanning section and the ultrasonic probe in real time.
Further, the displaying the guide path, the standard scanning section and the ultrasonic probe in real time includes:
acquiring an environment image which is shot by a camera and at least comprises a detection object and an ultrasonic probe;
and highlighting the guide path, the standard scanning section and the ultrasonic probe on the environment image and/or the body surface of the detection object.
Further, in the process of guiding the ultrasonic probe to move to the standard scanning section, operation prompt information is provided, the operation prompt information comprising one or more of voice, visual and tactile operation prompts.
As a second aspect of the present invention, the present invention also provides an ultrasound device comprising at least a memory, a processor, said memory having stored thereon a computer program,
the processor, when executing the computer program on the memory, implements the steps of the ultrasound scan guidance method of any of the above.
As a third aspect of the present invention, the present invention further provides a computer storage medium, in which a computer program is stored, and the computer program is used for implementing the steps of the ultrasound scanning guidance method according to any one of the above items when being executed by a processor.
Through the indexing neural network model and the loaded three-dimensional ultrasonic model, the ultrasonic scanning guiding method can rapidly and accurately determine the position information and angle information of the current ultrasonic image acquired by the ultrasonic probe and of the standard scanning section, and can guide the ultrasonic probe to move to the standard scanning section according to the positional relationship between the two. The invention improves the speed and accuracy with which the ultrasonic probe finds the standard scanning section.
Furthermore, the ultrasonic scanning guiding method can generate a visual guiding path, and display the guiding path, the standard scanning section and the ultrasonic probe in real time, so that the scanning accuracy is improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
fig. 1 is a flowchart of the ultrasound scanning guidance method according to the present invention.
Fig. 2 is a flowchart of an ultrasound scanning guidance method according to another embodiment of the present invention.
FIG. 3 is a flowchart of the operation of the indexing neural network model process of the present invention.
FIG. 4 is a flowchart of the operation of the full convolution neural network model process of the present invention.
FIG. 5 is a schematic structural diagram of an indexing neural network model according to the present invention.
FIG. 6 is a schematic structural diagram of a full convolution neural network model according to the present invention.
Fig. 7 is a schematic view of scanning guidance on a display according to the present invention.
Fig. 8 is a schematic view of scanning guidance on a display according to another embodiment of the present invention.
Detailed Description
The present invention will be described in further detail below with reference to the detailed description and the accompanying drawings, in which like elements in different embodiments bear like reference numerals. In the following description, numerous details are set forth to provide a better understanding of the present application. Certain operations related to the present application are not shown or described in detail where they would obscure its core and where a person skilled in the art can fully understand them from the description in the specification and general knowledge in the art. Furthermore, the features, operations and characteristics described in the specification may be combined in any suitable manner to form various embodiments, and the steps or actions in the method descriptions may be reordered in ways apparent to those skilled in the art. The sequences in the specification and drawings therefore serve only to describe particular embodiments and do not imply a required order unless one is explicitly stated.
In practice, a doctor moves the ultrasonic probe to the target organ for scanning, but doctors differ in accumulated experience and operating proficiency; a doctor with little experience or poor operating skill cannot move the probe quickly and accurately enough to obtain an ultrasound image of a standard section. There is therefore a need for a method that can prompt the doctor how to operate the ultrasonic probe so as to quickly and accurately acquire a standard scanning section suitable for ultrasound diagnosis.
Fig. 1 is a flowchart of the ultrasound scanning guidance method according to the present invention. As shown in fig. 1, in a first aspect of the present invention, an ultrasound scanning guidance method is provided, including:
s100, loading a three-dimensional ultrasonic model corresponding to a target organ to be scanned of a detection object, wherein the three-dimensional ultrasonic model at least comprises a standard scanning section with position information and angle information;
specifically, target organ information to be scanned of the detection object needs to be acquired when loading the three-dimensional ultrasound model corresponding to the target organ to be scanned of the detection object, and the target organ information may be an input target organ name or an indication icon of the target organ on the ultrasound device. Target organ information can be input through an input unit on the ultrasonic equipment, so that the ultrasonic equipment can acquire a target organ to be scanned of a detection object; the input unit can be a keyboard, a trackball, a mouse, a touch pad or the like or a combination thereof; the input unit may also be a voice recognition input unit, a gesture recognition input unit, or the like. It should be understood that the target organ to be scanned by the ultrasound probe can also be identified by machine vision or a trained identification network model.
The three-dimensional ultrasonic model is stored in a storage medium in advance, and the model of the relevant organ is loaded according to the target organ to be scanned. It should be understood that the three-dimensional ultrasonic model is reconstructed in advance by scanning the human body. Specifically, the tissue to be modeled is scanned ultrasonically along a preset direction with an ultrasonic probe to obtain an ultrasonic image of each section of the tissue; the six-degree-of-freedom parameters corresponding to the different sections scanned by the probe are acquired; and the ultrasonic image of each section together with its six-degree-of-freedom parameters is input into a trained deep neural network model to obtain the three-dimensional ultrasonic model of the tissue to be modeled.
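The patent attributes the reconstruction to a trained deep neural network without disclosing its structure, but the underlying geometric idea, placing tracked two-dimensional slices into a common volume using their six-degree-of-freedom parameters, can be illustrated with a simple nearest-voxel scatter. A minimal sketch, assuming millimetre units, an "xyz" Euler-angle convention and invented function and parameter names:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def insert_slice(volume, slice_img, pose, voxel_mm=1.0, pixel_mm=0.5):
    """Scatter one tracked 2D slice into a 3D voxel volume.

    pose = (x, y, z, ax, ay, az): position in mm and Euler angles in
    degrees from the magnetic locator (the axis convention is assumed).
    """
    h, w = slice_img.shape
    # Pixel grid of the slice in its own plane (local z = 0), in mm.
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    pts = np.stack([us.ravel() * pixel_mm,
                    vs.ravel() * pixel_mm,
                    np.zeros(h * w)], axis=1)
    # Rotate and translate the slice into the world coordinate system
    # defined by the magnetic field generator.
    rot = Rotation.from_euler("xyz", pose[3:], degrees=True)
    world = rot.apply(pts) + np.asarray(pose[:3])
    # Nearest-voxel assignment, discarding points outside the volume.
    idx = np.round(world / voxel_mm).astype(int)
    ok = np.all((idx >= 0) & (idx < np.array(volume.shape)), axis=1)
    volume[idx[ok, 0], idx[ok, 1], idx[ok, 2]] = slice_img.ravel()[ok]
    return volume
```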
Each slice image in the three-dimensional ultrasonic model carries position information and angle information. During scanning, a magnetic field generator generates a world coordinate system containing the probe and the tissue to be modeled; the six-degree-of-freedom parameters of the probe, comprising its position parameters and direction parameters, are obtained through a magnetic locator mounted on the probe. In actual ultrasonic diagnosis, different sections of an organ are often observed to assist the doctor, so the three-dimensional ultrasonic model contains at least one standard scanning section with position information and angle information.
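For concreteness, such a six-degree-of-freedom pose can be carried in a small value type; a minimal sketch (the class name Pose6D, the field order and the units are assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass
class Pose6D:
    """Six-degree-of-freedom pose (x, y, z, ax, ay, az) of a slice or probe.

    Units are assumed for illustration: millimetres for the position
    components and degrees for the rotations about the x, y and z axes.
    """
    x: float
    y: float
    z: float
    ax: float
    ay: float
    az: float
```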
Step S200, acquiring a current ultrasonic image scanned by the ultrasonic probe;
the ultrasound probe is used for transmitting and receiving ultrasound waves, and the ultrasound probe is excited by a transmission pulse, transmits the ultrasound waves to a target tissue (for example, an organ, a tissue, a blood vessel and the like in a human body or an animal body), receives an ultrasound echo with information of the target tissue reflected from a target area after a certain time delay, and converts the ultrasound echo into an electric signal again to obtain an ultrasound image of the target tissue. When a physician operates an ultrasound probe or operates the ultrasound probe through a mechanical arm, a current ultrasound image acquired by the ultrasound probe needs to be acquired first, and position information and angle information of the current ultrasound image need to be calculated.
Step S300, acquiring position information and angle information of the current ultrasonic image based on the current ultrasonic image;
It is to be understood that the position information and angle information together form six-degree-of-freedom coordinates (x, y, z, ax, ay, az), where ax, ay and az are the rotation angles about the x, y and z axes. The invention determines the position information and angle information of the current ultrasonic image through a trained indexing neural network model or a full convolution neural network model.
Specifically, the method comprises the following steps:
in an embodiment, the current ultrasound image and the three-dimensional ultrasound model are input into a trained index neural network model for processing, and position information and angle information of the current ultrasound image are determined. Wherein the indexing neural network model comprises at least: two-dimensional convolutional neural networks and three-dimensional convolutional neural networks. The two-dimensional convolutional neural network is used for processing the input current ultrasonic image and at least comprises a two-dimensional convolutional layer, a maximum pooling layer, an average pooling layer and an activation function layer. The three-dimensional convolution neural network is used for processing the input three-dimensional ultrasonic model. The three-dimensional convolutional neural network at least comprises a three-dimensional convolutional layer, a maximum pooling layer, an average pooling layer and an activation function layer.
In another embodiment, the current ultrasound image is input into a trained fully convolutional neural network model for processing, and the position information and the angle information of the current ultrasound image are determined.
The current ultrasonic image and the three-dimensional ultrasonic model are input into the trained indexing neural network model for processing; the specific steps comprise:
Step S310, extracting a first feature vector from the current ultrasonic image through a two-dimensional convolutional neural network;
The indexing neural network model comprises at least a two-dimensional convolutional neural network and a three-dimensional convolutional neural network. The current ultrasonic image is input into the corresponding two-dimensional convolutional neural network, which extracts the first feature vector, a one-dimensional feature vector. As shown in fig. 5, a represents the input current ultrasonic image.
Step S320, extracting a second feature vector from the three-dimensional ultrasonic model through a three-dimensional convolutional neural network;
The loaded three-dimensional ultrasonic model is input into the corresponding three-dimensional convolutional neural network for processing, which extracts the second feature vector. The three-dimensional convolutional neural network comprises at least a three-dimensional convolutional layer, a maximum pooling layer, an average pooling layer and an activation function layer; its output is averaged or summed over the channel dimension to yield a one-dimensional feature vector, i.e. the second feature vector is also one-dimensional. The convolution kernel of the three-dimensional convolutional layer may be 3 × 3 × 3. As shown in fig. 5, b represents the three-dimensional ultrasonic model.
Step S330, concatenating the first feature vector and the second feature vector along a dimension to obtain a first concatenated feature vector;
Step S340, inputting the first concatenated feature vector into a fully connected layer and outputting the position information and the angle information of the current ultrasonic image.
The number of neurons in the fully connected layer equals the number of position and angle components to be output; preferably this number is 6.
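Since the description names only the layer types and the six-output fully connected layer, the following PyTorch sketch of the indexing network is an assumption-laden illustration: the class name IndexingNet, the channel counts and the depths are all invented. A two-dimensional branch encodes the current image, a three-dimensional branch encodes the volume, the two one-dimensional feature vectors are concatenated, and a fully connected layer regresses (x, y, z, ax, ay, az):

```python
import torch
import torch.nn as nn

class IndexingNet(nn.Module):
    """Sketch of the two-branch indexing network (all sizes are assumptions)."""
    def __init__(self):
        super().__init__()
        # 2D branch for the current ultrasound image.
        self.enc2d = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),               # average pooling layer
        )
        # 3D branch for the loaded three-dimensional ultrasound model.
        self.enc3d = nn.Sequential(
            nn.Conv3d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.AdaptiveAvgPool3d(1),               # average over the volume
        )
        # Six outputs: x, y, z, ax, ay, az.
        self.fc = nn.Linear(32 + 32, 6)

    def forward(self, image, volume):
        f1 = self.enc2d(image).flatten(1)   # first feature vector (1D)
        f2 = self.enc3d(volume).flatten(1)  # second feature vector (1D)
        return self.fc(torch.cat([f1, f2], dim=1))  # concatenate, then FC
```

Feeding a (1, 1, 256, 256) image tensor and a (1, 1, 64, 64, 64) volume tensor through this sketch returns a (1, 6) tensor, one value per degree of freedom.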
In another embodiment, as shown in fig. 6, the current ultrasound image is input into a trained fully convolutional neural network model for processing, and the position information and the angle information of the current ultrasound image are determined.
The method comprises the following specific steps:
Step S350, inputting the current ultrasonic image into a full convolution neural network for processing to obtain a feature map of the current ultrasonic image;
Step S360, performing global maximum pooling on the feature map to obtain a third feature vector of the current ultrasonic image;
Step S370, performing global average pooling on the feature map to obtain a fourth feature vector of the current ultrasonic image;
Step S380, concatenating the third feature vector and the fourth feature vector to obtain a second concatenated feature vector;
Step S390, inputting the second concatenated feature vector into a fully connected layer and outputting the position information and the angle information of the current ultrasonic image.
As noted above, the various steps or actions in the method descriptions may be reordered in ways apparent to those skilled in the art. The full convolution neural network model is trained on the basis of a two-dimensional convolutional neural network, which comprises at least a two-dimensional convolutional layer, a maximum pooling layer, an average pooling layer and an activation function layer. It should be appreciated that although the full convolution neural network model lacks the three-dimensional convolutional network of the indexing neural network model, it has greater data processing capability than the two-dimensional convolutional neural network within the indexing model. The three-dimensional ultrasonic model consists of a number of section images scanned along given angles, each carrying its corresponding (x, y, z, ax, ay, az); b can therefore be regarded as the three-dimensional model of the organ.
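A corresponding PyTorch sketch of steps S350 to S390; the backbone and all sizes are invented, and only the pooling, concatenation and six-output regression pattern comes from the description above:

```python
import torch
import torch.nn as nn

class FullConvLocator(nn.Module):
    """Sketch of the fully convolutional locator (all sizes are assumptions)."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(           # S350: feature map
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        self.fc = nn.Linear(64 + 64, 6)          # S390: six pose outputs

    def forward(self, image):
        fmap = self.backbone(image)
        f3 = torch.amax(fmap, dim=(2, 3))        # S360: global max pooling
        f4 = torch.mean(fmap, dim=(2, 3))        # S370: global average pooling
        return self.fc(torch.cat([f3, f4], 1))   # S380-S390: concat + FC
```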
It should be understood that the full convolution neural network model is trained by scanning a given organ from many angles to obtain multi-angle, multi-section images, each with its corresponding (x, y, z, ax, ay, az); the purpose of the network is to establish a relation model between a section image of the organ and its corresponding pose, which is then used in the prediction stage. For example, if the same organ is sampled in several thousand different people (e.g. 5000), each organ is scanned from different angles (e.g. 360 angles), and 200 frames of ultrasound images are obtained in each angular direction, the number of training samples of the full convolution neural network model is 5000 × 360 × 200 = 360,000,000. Training on this huge set of sample ultrasound images updates the parameters of the full convolution neural network, yielding the full convolution neural network model. When a current ultrasonic image acquired by an ultrasonic probe is input into the model, the position information and angle information (x, y, z, ax, ay, az) of the current ultrasonic image are obtained. Training uses a regression approach with a mean square error loss function.
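The training itself is described only as regression with a mean square error loss; a hedged sketch of one training step, reusing the FullConvLocator sketch above (the optimiser choice and learning rate are assumptions):

```python
import torch

model = FullConvLocator()                       # sketch model from above
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # assumed optimiser
loss_fn = torch.nn.MSELoss()                    # mean square error, per patent

def train_step(images, poses):
    """images: (B,1,H,W) slices; poses: (B,6) ground-truth (x,y,z,ax,ay,az)."""
    optimizer.zero_grad()
    loss = loss_fn(model(images), poses)
    loss.backward()
    optimizer.step()
    return loss.item()
```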
Step S400, guiding the ultrasonic probe to move to the standard scanning section according to the position information and the angle information of the current ultrasonic image and the standard scanning section.
Specifically, the preceding steps determined the position information and angle information (x, y, z, ax, ay, az) of the current ultrasound image, as well as the position information and angle information of the standard scanning section preset in the three-dimensional ultrasound model. A guide path along which the ultrasonic probe moves to the standard scanning section is then planned from these two sets of six-degree-of-freedom coordinates.
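The patent does not disclose the planner itself; under the simplest reading, straight-line interpolation between the two six-degree-of-freedom coordinates, a sketch might look as follows (the function name, step count and naive per-component angle interpolation are assumptions):

```python
import numpy as np

def plan_guide_path(current_pose, target_pose, steps=50):
    """Linearly interpolate from the current pose to the standard-section pose.

    Both poses are (x, y, z, ax, ay, az); per-component interpolation is an
    assumption here, and a real planner would interpolate rotations properly
    (e.g. with quaternion slerp) and respect the body surface.
    """
    current = np.asarray(current_pose, dtype=float)
    target = np.asarray(target_pose, dtype=float)
    ts = np.linspace(0.0, 1.0, steps)[:, None]
    return (1.0 - ts) * current + ts * target    # (steps, 6) waypoints
```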
As shown in fig. 7, the scanning guidance area 1000 displayed on the display includes at least a first guidance area 1600 and a second guidance area 1700. The first guidance area 1600 displays at least the position information and angle information of the current ultrasonic probe, the position information and angle information of the standard scanning section, and operation prompt information. The operation prompt information comprises at least the translation distance and the rotation angle, and may also include the pressure to apply with the ultrasonic probe. The second guidance area includes the detection object 1100, the target organ 1500 highlighted on the detection object 1100, the current ultrasonic probe 1200, the guide path 1400 and the target virtual probe 1300; it should be understood that the highlighting may cover the entire target organ 1500 or only its outline. The current ultrasonic probe 1200 moves according to its real-time position, while the target virtual probe 1300 marks the pose the ultrasonic probe must reach to obtain the standard scanning section.
In another embodiment, as shown in fig. 8, a physician may need to examine a plurality of standard scanning sections when performing an ultrasound scan of a target organ; the invention plans the guide path 1400 according to the distances between the positions of the different standard scanning sections and the current ultrasound probe 1200. It should be understood that the guide path 1400 is also highlighted, for example by a distinctive color, by flashing, or the like.
In order to improve the accuracy of the guided scan, the invention also displays the guide path, the standard scanning section and the ultrasonic probe in real time. It will be appreciated that the physician can move the ultrasonic probe along the guide path to the standard scanning section. The guide path may be shown on a display or projected by a projection device at the corresponding position on the detection object.
To handle the case where the ultrasonic probe deviates from the guide path due to operator error during the moving scan, the invention guides the ultrasonic probe to move to the standard scanning section according to the position information and the angle information of the current ultrasonic image and the standard scanning section through the following steps, as shown in fig. 2:
Step S410, planning a guide path along which the ultrasonic probe moves to the standard scanning section according to the position information and the angle information;
Step S420, acquiring the real-time position of the ultrasonic probe;
in one embodiment, the real-time position information and the angle information of the current ultrasound image acquired by the ultrasound probe can be acquired by inputting the ultrasound image acquired by the ultrasound probe in real time into the indexing neural network model. The real-time position of the ultrasonic probe can be identified through a trained tracking neural network model, and the method specifically comprises the following steps: acquiring a model image of an ultrasonic probe; inputting the model image and the environment image into a shared full convolution neural network, wherein the shared full convolution neural network outputs a first feature corresponding to the model image and a second feature corresponding to the environment image; the first characteristic is convolution of a convolution kernel and the second characteristic to obtain a spatial response graph; and outputting the spatial response map to a linear interpolation layer to acquire the real-time position of the ultrasonic probe in the environment image.
It should be understood that the model image of the ultrasonic probe is preset in the ultrasonic equipment and can be called up through the input unit; the input unit can be a keyboard, a trackball, a mouse, a touch pad or the like, or a combination thereof, and may also be a voice recognition input unit, a gesture recognition input unit, or the like. It is also to be understood that the target organ information may be a name of the target organ or a target organ icon selected on the display through the input unit. The spatial response map contains the response intensity of the first feature at each position of the second feature; the response intensity takes values between 0 and 1 and represents the similarity between the model image and each position in the environment image.
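This tracking step reads like a Siamese correlation tracker: the feature of the probe model image is used as a convolution kernel over the feature of the environment image, and the response map is interpolated back to image resolution. A minimal sketch, assuming a shared backbone whose layers are invented; the sigmoid squashing to the stated 0-1 range is likewise an assumption:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

backbone = nn.Sequential(                        # assumed shared FCN
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
)

def locate_probe(model_img, env_img):
    """Cross-correlate template features over scene features (batch size 1).

    model_img: (1, 3, h, w) probe model image; env_img: (1, 3, H, W)
    environment image with H >= h and W >= w.
    """
    f_model = backbone(model_img)                # first feature (template)
    f_env = backbone(env_img)                    # second feature (scene)
    response = torch.sigmoid(F.conv2d(f_env, f_model))  # template as kernel
    # Resize to the environment image, playing the role of the
    # "linear interpolation layer"; the peak marks the probe position.
    return F.interpolate(response, size=env_img.shape[2:],
                         mode="bilinear", align_corners=False)
```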
Step S430, judging whether the ultrasonic probe deviates from the guide path according to the real-time position of the ultrasonic probe, if so, updating the guide path according to the real-time position;
When the ultrasonic probe deviates from the guide path within a preset distance range, a deviation prompt is issued; the deviation prompt comprises one or more of an indicator light, a voice prompt and a vibration prompt. A deviation correction prompt is also issued, which includes indicating on a display the direction and distance in which to move the ultrasonic probe; since the probe has deviated from the guide path only slightly, the path need not be re-planned, and it suffices to prompt the operator to steer the probe back onto the original guide path and continue moving. The display includes VR, AR and other display devices. Alternatively or additionally, the direction and distance to move can be shown on the surface of the detection object; specifically, the guide path and the probe operation prompts are projected at the body surface of the detection object by a projection device or a laser guide device. Once the ultrasonic probe deviates from the guide path beyond the preset range, the guide path is re-planned according to the real-time position of the probe; specifically, the shortest guide path is selected anew according to the real-time position of the ultrasonic probe and the position of the target organ at that time, and the direction and distance in which to move the probe are indicated on the display and/or on the surface of the detection object.
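A hedged sketch of this deviation test as a nearest-waypoint distance check; the distance metric, the threshold value and the function names are assumptions, while the decision to replan only beyond the preset range follows the description above:

```python
import numpy as np

def check_deviation(probe_pos, path_points, threshold_mm=10.0):
    """Return (deviated, nearest_index) for the probe against the guide path.

    probe_pos: (3,) real-time probe position; path_points: (N, 3) waypoint
    positions; threshold_mm stands in for the preset deviation range.
    """
    d = np.linalg.norm(path_points - np.asarray(probe_pos), axis=1)
    nearest = int(np.argmin(d))
    return d[nearest] > threshold_mm, nearest

# Within the threshold: prompt the operator back onto the original path;
# beyond it: replan, e.g. with the plan_guide_path sketch given earlier.
```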
And step S440, displaying the guide path, the standard scanning section and the ultrasonic probe in real time.
Specifically, the guide path, the standard scanning section and the ultrasonic probe are highlighted on the environment image and/or the body surface of the detection object, and can be distinguished from one another through different colors, shades or the like.
Further, in order to further indicate the position of the standard scanning section, a target virtual probe is displayed at the position of the detection object corresponding to the standard scanning section so as to guide the ultrasonic probe. It should be understood that this may be shown at the corresponding position of the detection object on the display, or a three-dimensional virtual ultrasonic probe may be projected at the corresponding position on the actual detection object.
In order to further improve the speed and accuracy of scanning, the invention also provides operation prompt information in the process of guiding the ultrasonic probe to move to the standard scanning section, the operation prompt information comprising one or more of voice, visual and tactile operation prompts. A visual operation prompt can indicate on the display the direction and angle in which to move the probe, or generate a movement indication icon at the corresponding position on the body surface of the detection object. A tactile operation prompt is a vibration of the ultrasonic probe when it deviates from the guide path.
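A small sketch of deriving the textual part of such a prompt from the pose difference (the wording, sign convention and units are assumptions):

```python
def operation_prompt(current_pose, target_pose):
    """Format a textual move/rotate hint from two (x,y,z,ax,ay,az) poses."""
    dx, dy, dz, dax, day, daz = (t - c for c, t in zip(current_pose, target_pose))
    return (f"translate ({dx:+.1f}, {dy:+.1f}, {dz:+.1f}) mm, "
            f"rotate ({dax:+.1f}, {day:+.1f}, {daz:+.1f}) deg")

# Example: operation_prompt((0,0,0,0,0,0), (10,0,-5,0,15,0))
# -> "translate (+10.0, +0.0, -5.0) mm, rotate (+0.0, +15.0, +0.0) deg"
```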
Through the indexing neural network model and the loaded three-dimensional ultrasonic model, the ultrasonic scanning guiding method can rapidly and accurately determine the position information and angle information of the current ultrasonic image acquired by the ultrasonic probe and of the standard scanning section, and can guide the ultrasonic probe to move to the standard scanning section according to the positional relationship between the two. The invention thus improves the speed and accuracy with which the ultrasonic probe finds the standard scanning section. Furthermore, the ultrasonic scanning guiding method can generate a visual guiding path and display the guiding path, the standard scanning section and the ultrasonic probe in real time, improving scanning accuracy.
As a second aspect of the present invention, there is also provided an ultrasonic device comprising at least a memory and a processor, the memory having a computer program stored thereon; when executing the computer program on the memory, the processor implements the steps of the ultrasonic scanning guidance method of any one of the above.
The memory may include volatile memory, such as random-access memory (RAM); it may also include non-volatile memory, such as flash memory, a hard disk drive (HDD) or a solid-state drive (SSD); the memory may also comprise a combination of the above types of memory.
The processor may be a central processing unit (CPU), a network processor (NP), or a combination of a CPU and an NP. The processor may further include a hardware chip, which may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof. The PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), generic array logic (GAL), or any combination thereof.
As a third aspect of the present invention, there is further provided a computer storage medium in which a computer program is stored; when executed by a processor, the computer program implements the steps of the ultrasonic scanning guidance method of any one of the above. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), a solid-state drive (SSD), or the like; the storage medium may also comprise a combination of the above types of memory.
Although the embodiments of the present invention have been described in conjunction with the accompanying drawings, those skilled in the art may make various modifications and variations without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope defined by the appended claims.

Claims (10)

1. An ultrasonic scanning guidance method is characterized by comprising the following steps:
loading a three-dimensional ultrasonic model corresponding to a target organ to be scanned of a detection object, wherein the three-dimensional ultrasonic model at least comprises a standard scanning section with position information and angle information;
acquiring a current ultrasonic image scanned by the ultrasonic probe;
acquiring position information and angle information of the current ultrasonic image based on the current ultrasonic image;
and guiding the ultrasonic probe to move to the standard scanning section according to the position information and the angle information of the current ultrasonic image and the standard scanning section.
2. The ultrasound scanning guidance method according to claim 1, wherein the obtaining of the position information and the angle information of the current ultrasound image based on the current ultrasound image comprises:
and inputting the current ultrasonic image and the three-dimensional ultrasonic model into a trained indexing neural network model for processing, and determining the position information and the angle information of the current ultrasonic image.
3. The ultrasound scanning guidance method according to claim 2, wherein the inputting of the current ultrasound image and the three-dimensional ultrasound model into the trained indexing neural network model for processing and determining the position information and the angle information of the current ultrasound image comprises:
extracting a first feature vector from the current ultrasound image through a two-dimensional convolutional neural network;
extracting a second feature vector in the three-dimensional ultrasonic model through a three-dimensional convolution neural network;
concatenating the first feature vector and the second feature vector along a dimension to obtain a first concatenated feature vector;
and inputting the first concatenated feature vector into a fully connected layer and outputting the position information and the angle information of the current ultrasound image.
4. The ultrasound scanning guidance method according to claim 1, wherein the obtaining of the position information and the angle information of the current ultrasound image based on the current ultrasound image comprises:
and inputting the current ultrasonic image into a trained full convolution neural network model for processing, and determining the position information and the angle information of the current ultrasonic image.
5. The ultrasound scanning guidance method according to claim 4, wherein the inputting of the current ultrasound image into the trained fully convolutional neural network model for processing and determining the position information and the angle information of the current ultrasound image comprises:
inputting the current ultrasound image into a full convolution neural network for processing to obtain a feature map of the current ultrasound image;
performing global maximum pooling on the feature map to obtain a third feature vector of the current ultrasonic image;
carrying out global average pooling on the feature map to obtain a fourth feature vector of the current ultrasonic image;
concatenating the third feature vector and the fourth feature vector to obtain a second concatenated feature vector;
and inputting the second concatenated feature vector into a fully connected layer and outputting the position information and the angle information of the current ultrasound image.
6. The ultrasound scanning guidance method according to any one of claims 1 to 5, wherein the guiding the ultrasound probe to move to the standard scanning section according to the position information and the angle information of the current ultrasound image and the standard scanning section comprises:
planning a guide path along which the ultrasound probe moves to the standard scanning section according to the position information and the angle information;
acquiring a real-time position of the ultrasonic probe;
judging whether the ultrasonic probe deviates from the guide path according to the real-time position of the ultrasonic probe, and if so, updating the guide path according to the real-time position;
and displaying the guide path, the standard scanning section and the ultrasonic probe in real time.
7. The ultrasound scanning guidance method of claim 6, wherein the displaying of the guide path, the standard scanning section and the ultrasound probe in real time comprises:
acquiring an environment image which is shot by a camera and at least comprises a detection object and an ultrasonic probe;
and highlighting the guide path, the standard scanning section and the ultrasonic probe on the environment image and/or the body surface of the detection object.
8. The ultrasound scanning guidance method of claim 1, wherein, in guiding the ultrasound probe to move to the standard scanning section, operation prompt information is provided, the operation prompt information comprising one or more of voice, visual and tactile operation prompts.
9. An ultrasound device comprising at least a memory, a processor, said memory having stored thereon a computer program, characterized in that,
the processor, when executing the computer program on the memory, implements the steps of the ultrasound scan guidance method of any of claims 1 to 8.
10. A computer storage medium, wherein the computer storage medium has stored therein a computer program which, when executed by a processor, implements the steps of the ultrasound scan guidance method of any one of claims 1 to 8.
CN201911413636.8A 2019-12-31 2019-12-31 Ultrasonic scanning guidance method, ultrasonic device and storage medium Pending CN113116384A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911413636.8A CN113116384A (en) 2019-12-31 2019-12-31 Ultrasonic scanning guidance method, ultrasonic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911413636.8A CN113116384A (en) 2019-12-31 2019-12-31 Ultrasonic scanning guidance method, ultrasonic device and storage medium

Publications (1)

Publication Number Publication Date
CN113116384A (en) 2021-07-16

Family

ID=76770344

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911413636.8A Pending CN113116384A (en) 2019-12-31 2019-12-31 Ultrasonic scanning guidance method, ultrasonic device and storage medium

Country Status (1)

Country Link
CN (1) CN113116384A (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080097165A1 (en) * 2006-10-19 2008-04-24 Abhishek Gattani System and method for determining an optimal surgical trajectory
CN108135529A (en) * 2015-09-10 2018-06-08 赞克特机器人有限公司 For guiding the system and method for the insertion of medical instrument
US20170086785A1 (en) * 2015-09-30 2017-03-30 General Electric Company System and method for providing tactile feedback via a probe of a medical imaging system
CN109044400A (en) * 2018-08-31 2018-12-21 上海联影医疗科技有限公司 Ultrasound image mask method, device, processor and readable storage medium storing program for executing
CN109410242A (en) * 2018-09-05 2019-03-01 华南理工大学 Method for tracking target, system, equipment and medium based on double-current convolutional neural networks
CN110084794A (en) * 2019-04-22 2019-08-02 华南理工大学 A kind of cutaneum carcinoma image identification method based on attention convolutional neural networks
CN110134964A (en) * 2019-05-20 2019-08-16 中国科学技术大学 A kind of text matching technique based on stratification convolutional neural networks and attention mechanism
CN110310292A (en) * 2019-06-28 2019-10-08 浙江工业大学 A kind of wrist portion reference bone dividing method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116342497A (en) * 2023-03-01 2023-06-27 天津市鹰泰利安康医疗科技有限责任公司 Three-dimensional mapping method and system for inner wall of human body cavity
CN116342497B (en) * 2023-03-01 2024-03-19 天津市鹰泰利安康医疗科技有限责任公司 Three-dimensional mapping method and system for inner wall of human body cavity

Similar Documents

Publication Publication Date Title
CN112288742B (en) Navigation method and device for ultrasonic probe, storage medium and electronic equipment
CN110090069B (en) Ultrasonic puncture guiding method, guiding device and storage medium
CN110584714A (en) Ultrasonic fusion imaging method, ultrasonic device, and storage medium
EP2514366A1 (en) Automatic ultrasonic scanning system and scanning method thereof
CN109584350A (en) Measurement point in diagnosis imaging determines
CN111629670B (en) Echo window artifact classification and visual indicator for ultrasound systems
KR20220020359A (en) Representation of target during ultrasound probe aiming
EP4014890A1 (en) Ultrasonic diagnostic apparatus and control method for ultrasonic diagnostic apparatus
CN113116386B (en) Ultrasound imaging guidance method, ultrasound apparatus, and storage medium
KR20150091748A (en) Scan position guide method of three dimentional ultrasound system
US20220160335A1 (en) Ultrasound diagnostic apparatus and method of controlling ultrasound diagnostic apparatus
EP4005494A1 (en) Ultrasonic diagnostic device and method for controlling ultrasonic diagnostic device
KR20160071889A (en) Apparatus and method for supporting on diagnosis using multi image
CN111145137B (en) Vein and artery identification method based on neural network
CN113116384A (en) Ultrasonic scanning guidance method, ultrasonic device and storage medium
JP5630967B2 (en) Image processing apparatus and control method thereof
KR20200096125A (en) Prescriptive guidance for ultrasound diagnostics
CN113129342A (en) Multi-modal fusion imaging method, device and storage medium
CN117257346A (en) Ultrasonic probe guiding method and device based on image recognition
EP2740408B1 (en) Ultrasound diagnostic method and ultrasound diagnostic apparatus using volume data
US20220087652A1 (en) Three-dimensional ultrasound imaging support apparatus, three-dimensional ultrasound imaging support method, and three-dimensional ultrasound imaging support program
CN113116378A (en) Multi-modal fusion imaging method, ultrasound apparatus, and storage medium
US20230137369A1 (en) Aiding a user to perform a medical ultrasound examination
JP2023551131A (en) Guided acquisition of 3D representations of anatomical structures
CN112689478B (en) Ultrasonic image acquisition method, system and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210716)