CN114052795B - Focus imaging and anti-false-prick therapeutic system combined with ultrasonic autonomous scanning - Google Patents

Focus imaging and anti-false-prick therapeutic system combined with ultrasonic autonomous scanning

Info

Publication number
CN114052795B
CN114052795B (application CN202111261523.8A)
Authority
CN
China
Prior art keywords
focus
ultrasonic
scanning
information
sequence
Prior art date
Legal status
Active
Application number
CN202111261523.8A
Other languages
Chinese (zh)
Other versions
CN114052795A (en)
Inventor
陈芳
叶浩然
张道强
Current Assignee
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics
Priority claimed from CN202111261523.8A
Publication of CN114052795A
Application granted
Publication of CN114052795B
Legal status: Active


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B 8/085 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/12 Surgical instruments, devices or methods, e.g. tourniquets for ligaturing or otherwise compressing tubular parts of the body, e.g. blood vessels, umbilical cord
    • A61B 17/12009 Implements for ligaturing other than by clamps or clips, e.g. using a loop with a slip knot
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B 8/5223 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/00017 Electrical control of surgical instruments
    • A61B 2017/00022 Sensing or detecting at the treatment site
    • A61B 2017/00075 Motion

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Vascular Medicine (AREA)
  • Physiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Reproductive Health (AREA)
  • Surgical Instruments (AREA)

Abstract

The invention discloses a focus imaging and anti-false-prick treatment system combined with ultrasonic autonomous scanning, comprising: an ultrasonic scanning probe, an externally-hung scanning control module, an intraoperative imaging guidance display module, a focus quantification module fusing sequence attention, and an anti-false-prick elastic treatment needle module. The system realizes autonomous intraoperative ultrasonic scanning and imaging to obtain comprehensive focus information without depending on an expensive computing server platform or mechanical-arm equipment, and its treatment end is equipped with an anti-false-prick elastic treatment needle, ensuring the accuracy and safety of surgical treatment.

Description

Focus imaging and anti-false-prick therapeutic system combined with ultrasonic autonomous scanning
Technical Field
The invention relates to the technical field of ultrasonic medicine, and in particular to a focus imaging and anti-false-prick treatment system combined with ultrasonic autonomous scanning.
Background
Compared with traditional surgery, ultrasound-guided minimally invasive surgery obtains structural information about the internal tissues and the focus of the human body through ultrasonic imaging equipment and presents it to the doctor, helping the doctor determine how to operate the treatment needle and other therapeutic instruments to treat the focus. Because ultrasound enables real-time, portable imaging, and because surgery under imaging guidance involves small trauma, few complications, and quick postoperative recovery, ultrasound-guided minimally invasive surgery has become a new breakthrough in modern treatment. In ultrasound-guided minimally invasive surgery, quickly and accurately obtaining focus information through ultrasonic imaging and ensuring safe positioning and treatment of the focus by the therapeutic instrument can improve guidance for the doctor's operation, reduce intraoperative injury, and improve surgical safety.
Traditional ultrasound-guided minimally invasive surgery requires the clinician to interpret the ultrasonic imaging information and to rely on clinical experience to determine the scan direction and scan angle of the ultrasonic probe. Likewise, after penetrating the treatment needle to the vicinity of the focus with the assistance of the ultrasonic image, the doctor performs treatment operations such as puncture and excision on the focus. These imaging and treatment modes therefore depend heavily on the doctor's clinical experience: complete ultrasonic imaging of the focus is difficult to guarantee, fine control of the treatment needle is difficult, and misoperation easily causes massive bleeding in the focus area. To address these problems, related research has proposed introducing intelligent image analysis into ultrasound-guided minimally invasive surgery, i.e., automatically segmenting and tracking the focus in ultrasonic images and feeding the related information back to the doctor. However, this approach can only extract focus information from already-acquired ultrasonic images; it cannot give accurate imaging guidance information, and it is difficult to ensure that the ultrasonic probe images the focus region completely. Moreover, current intelligent image analysis requires hospitals to purchase an expensive computing server platform, so clinical applicability is low. In addition, regarding safe operation of the therapeutic instrument, related research mainly uses a multi-degree-of-freedom mechanical arm to realize fine control, so as to reduce incorrect intraoperative operation and ensure safety.
However, this approach requires equipping the existing therapeutic instrument with an expensive mechanical arm, and the doctor needs long training and learning to integrate it effectively with the existing surgical procedure.
Therefore, in current ultrasound-guided minimally invasive surgery, existing ultrasonic image analysis cannot prompt a complete focus section scanning mode, places high demands on the doctor's technique, and cannot realize autonomous scanning imaging. Moreover, existing schemes for preventing misoperation of the therapeutic instrument mainly adopt mechanical-arm control, which requires the doctor to train for a long time before clinical use. There is thus currently no guidance system that can simultaneously provide intraoperative imaging guidance for the doctor and an effective anti-false-prick mechanism for treatment.
Disclosure of Invention
The technical problem the invention aims to solve is to provide a focus imaging and anti-false-prick treatment system combined with ultrasonic autonomous scanning, which can realize intraoperative autonomous ultrasonic scanning and imaging without depending on an expensive computing server platform or mechanical-arm equipment, so as to obtain comprehensive focus information, and whose treatment end is equipped with an anti-false-prick elastic treatment needle, ensuring the accuracy and safety of surgical treatment.
To solve this technical problem, the invention provides a focus imaging and anti-false-prick treatment system combined with ultrasonic autonomous scanning, comprising: an ultrasonic scanning probe, an externally-hung scanning control module, an intraoperative imaging guidance display module, a focus quantification module fusing sequence attention, and an anti-false-prick elastic treatment needle module. For the ultrasonic scanning probe, a probe outer pendant is obtained through three-dimensional printing, and an externally-hung scanning control module based on reinforcement learning is arranged on the pendant; it provides a deep-reinforcement-learning imaging guidance model and supplies guidance information for the doctor's focus scanning. The intraoperative imaging guidance display module intuitively displays the imaging guidance information to the doctor and ensures that the doctor can autonomously adjust the scanning direction of the ultrasonic probe, so that complete focus information is obtained. For the more complete focus information thus obtained, the focus quantification module fusing sequence attention calculates the size and shape information of the focus. Finally, the anti-false-prick elastic treatment needle module further adjusts the magnitude of the elastic control force according to the size and shape information of the focus, ensuring that the contact force between the elastic treatment needle and the focus stays within a safe force application range and preventing massive focus hemorrhage caused by incorrect force application.
Preferably, the externally-hung scanning control module takes the acquired ultrasonic sequence as a bounded environment and a prior image of the focus structural state as the standard acquisition surface; comparing the currently acquired image with the focus standard surface constitutes the target task. Under the guidance of the target task, the bounded environment serves as the carrier for the agent's interaction and is used for receiving actions, outputting states, and calculating the reward of the current action, where the agent is a deep convolutional network based on multi-scale feature extraction and fusion. In each iteration, the environment outputs the current state, i.e., the currently acquired ultrasonic image, which is compared with the focus standard acquisition surface to form the current state comparison quantity; the agent then receives the state comparison quantity and processes it to obtain a response, i.e., an action, which in turn changes the environment and yields a new state and the reward of the current action. Through this iterative process, the agent learns a preliminary ultrasonic scanning guidance action via a Q-value sequence learning algorithm. Next, the ultrasonic scanning guidance action is refined through an adaptive imaging pose search strategy: the preliminarily obtained guidance action is combined with the acquired ultrasonic sequence to obtain an ultrasonic three-dimensional volume, three-dimensional key points are then extracted, the structural key points are passed through a dynamic spatio-temporal modeling unit that aggregates their spatio-temporal information in the form of a time-recurrent neural network, and the final imaging guidance information, including the scanning direction, is output.
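The agent–environment loop described above can be sketched as a toy Q-learning problem. This is a minimal illustration, not the patent's deep convolutional agent: states are assumed to be discretized similarity bins between the current image and the standard acquisition surface, actions are probe tilts, and the environment dynamics, rewards, and hyperparameters are all illustrative assumptions.

```python
import numpy as np

# Toy sketch of the Q-value sequence learning loop: the environment rewards
# reaching the standard acquisition plane (highest similarity bin).
N_STATES, N_ACTIONS = 10, 3   # similarity bins; tilt back / hold / tilt forward
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

def step(state, action):
    """Hypothetical environment: tilting toward the lesion raises similarity."""
    move = action - 1                                   # -1, 0, +1
    nxt = int(np.clip(state + move, 0, N_STATES - 1))
    reward = 1.0 if nxt == N_STATES - 1 else -0.01      # small cost per step
    return nxt, reward

def train(episodes=500, seed=0):
    rng = np.random.default_rng(seed)
    q = np.zeros((N_STATES, N_ACTIONS))
    for _ in range(episodes):
        s = 0
        for _ in range(50):
            # epsilon-greedy action selection
            a = rng.integers(N_ACTIONS) if rng.random() < EPS else int(q[s].argmax())
            s2, r = step(s, a)
            # standard Q-learning temporal-difference update
            q[s, a] += ALPHA * (r + GAMMA * q[s2].max() - q[s, a])
            s = s2
            if s == N_STATES - 1:
                break
    return q

q = train()
policy = q[:-1].argmax(axis=1)   # greedy guidance action per non-terminal state
print(policy)
```

After training, the greedy policy tilts the probe toward the standard plane from every state; in the real system this tabular update would be replaced by the multi-scale deep convolutional agent operating on image comparisons.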
Preferably, the intraoperative imaging guidance display module completes an intuitive display through the relative position information of the scanning track and the focus together with an arrow indicating the current scanning direction, helping the doctor accurately adjust the ultrasonic scanning direction. The imaging guidance information output by the reinforcement-learning-based externally-hung scanning control module mainly consists of the guidance direction of the ultrasonic probe's next scan, i.e., the direction of the next scan relative to the current acquisition point. The intraoperative imaging guidance display module first imports a three-dimensional model of the focus and superimposes the direction guidance information for the probe's next scan in the focus three-dimensional model space through a direction indication arrow, so that the scanning direction is displayed intuitively. Then, the three-dimensional distance between the arrow tip and the center point of the three-dimensional model is shown to the doctor on a display, giving quantitative scanning direction information. Finally, the arrow-tip points of all scanning direction indication arrows during the ultrasonic scan are connected, and a three-dimensional display of the scanning track is obtained by three-dimensional point interpolation.
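The final step above, connecting arrow-tip points and densifying the track by three-dimensional point interpolation, can be sketched as follows. The chord-length parameterization and the example tip coordinates are assumptions for illustration; the patent does not specify the interpolation scheme.

```python
import numpy as np

def interpolate_track(tips, samples_per_segment=10):
    """Densify a 3D scan track given arrow-tip points of shape (N, 3)."""
    tips = np.asarray(tips, dtype=float)
    # Parameterize by cumulative chord length so sample spacing follows the track.
    seg = np.linalg.norm(np.diff(tips, axis=0), axis=1)
    t = np.concatenate([[0.0], np.cumsum(seg)])
    t_dense = np.linspace(t[0], t[-1], samples_per_segment * (len(tips) - 1) + 1)
    # Interpolate each coordinate axis independently.
    return np.stack([np.interp(t_dense, t, tips[:, k]) for k in range(3)], axis=1)

tips = [(0, 0, 0), (10, 0, 5), (10, 10, 5), (0, 10, 0)]  # example arrow tips (mm)
track = interpolate_track(tips)
print(track.shape)
```

The densified polyline passes exactly through every recorded arrow tip, so the displayed track never drifts from the guidance arrows the doctor has already seen.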
Preferably, the focus quantification module fusing sequence attention quantitatively solves the more complete focus information obtained by the ultrasonic probe scan. A focus region sequence change attention unit is introduced, which uses the focus continuity in a multi-frame ultrasonic sequence to assist the accurate acquisition of the focus structure in the current ultrasonic image. The input of the focus region sequence change attention unit is the acquired ultrasonic sequence X_M ∈ R^(T×3×H×W), where T is the number of frames in the ultrasonic sequence, H the length of the ultrasonic image, and W the width. First, considering that focus tissue often appears as a dark region with fuzzy boundaries, the unit uses local phase information and local phase filtering to complete focus boundary enhancement. Then, the features of each frame in the ultrasonic sequence are extracted with 5 image convolution operations to obtain focus region features. Because the focus region size of consecutive frames in the ultrasonic image sequence is continuous, while the gray-level information inside the focus region differs between consecutive frames, reflecting the tissue composition differences of the whole focus under different imaging sections, the unit obtains the correlation contrast information I_M ∈ R^(T×3×H×W) of tissue composition in the focus region by calculating the correlation of focus regions in consecutive frame images. Finally, a sigmoid nonlinear activation mapping is applied to the correlation contrast information I_M ∈ R^(T×3×H×W) to obtain the focus region sequence change attention weight W_M ∈ R^(T×3×H×W), which is used to assist the quantitative analysis of the focus in the currently acquired ultrasonic image.
In the focus quantification module fusing sequence attention, the current focus information is preliminarily extracted from the currently acquired ultrasonic image by a current-frame focus positioning and analysis unit. First, the focus position is preliminarily located through image pyramid and histogram equalization operations. Then, using the preliminary focus positioning result combined with a shape deformation model, optimized segmentation of the focus boundary is realized, yielding the focus boundary C_O ∈ R^(3×H×W) and the focus size S in the currently acquired ultrasonic image. Given the focus region sequence change attention weight W_M ∈ R^(T×3×H×W), the focus boundary C_O ∈ R^(3×H×W), and the focus size S, a weighted multiplication of W_M with C_O and S yields the change curves of the focus size and boundary under the different sections of the focus. Finally, curve features are extracted from the change curves as the quantification result of the focus size and boundary.
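The last step, extracting curve features from the size change curve, can be sketched as below. The feature definitions (peak, trapezoidal area under the curve, a relative fluctuation rate, and the third standardized moment as skewness) and the example section sizes are assumptions; the patent names the features but not their formulas.

```python
import numpy as np

def curve_features(curve):
    """Summary features of a lesion-size change curve across scan sections."""
    curve = np.asarray(curve, dtype=float)
    peak = float(curve.max())
    auc = float(((curve[:-1] + curve[1:]) / 2).sum())       # trapezoidal area
    fluctuation = float(np.abs(np.diff(curve)).mean()
                        / (curve.mean() + 1e-8))            # relative variability
    centered = curve - curve.mean()
    skewness = float((centered ** 3).mean()
                     / (curve.std() + 1e-8) ** 3)           # 3rd standardized moment
    return {"peak": peak, "auc": auc,
            "fluctuation": fluctuation, "skewness": skewness}

# Example: lesion cross-section size (mm^2) over 9 scan sections.
sizes = [2.0, 3.5, 5.0, 6.2, 6.8, 6.1, 4.9, 3.4, 2.1]
feats = curve_features(sizes)
print(feats["peak"])
```

The peak feeds the first-stage elastic control unit's threshold test, while the fluctuation rate and skewness feed the second stage, as described later in the treatment needle module.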
Preferably, the anti-false-prick elastic treatment needle module constrains the operating force of the treatment needle through a first-stage elastic control unit. The first-stage elastic control unit comprises an elastic piece and a conical body; its input signal is the peak value of the focus size change curve, and when the curve peak exceeds a set threshold α, the elastic piece of the control unit automatically connects to the resistance through-hole, so that the rotation and deformation of the conical body become small. When the rotation and deformation of the body are small, the transmission force generated by the force-drive control unit connected to the body is also small, ensuring that the operating force of the treatment needle is weakened after transmission. The operating force that has passed through the force-drive control unit is then transmitted via the elastic piece to a second-stage elastic control unit for secondary constraint, ensuring that the operating force stays within the safe force application range. The input signals of the second-stage elastic control unit are the fluctuation rate and skewness of the focus boundary change curve; when they exceed a threshold β, the resistance of the control unit's elastic piece becomes twice the focus elastic force, so that the rotating force of the cone connected to the control unit remains smaller than twice the focus elastic force. Because the rotational force of the conical body is small, the operating force at the tip of the treatment needle is small, effectively preventing massive focus hemorrhage caused by incorrect force.
The beneficial effects of the invention are as follows: using the externally-hung scanning control technique combined with the two-stage-controlled anti-false-prick elastic treatment needle module, the invention can perform ultrasonic autonomous scanning without depending on an expensive computing server platform or mechanical-arm equipment, realizes complete imaging of the focus, ensures that the focus treatment contact force stays within the safe force application range, prevents massive focus hemorrhage caused by incorrect force application, and ensures the accuracy and safety of surgical treatment.
Drawings
Fig. 1 is a schematic diagram of a system structure according to the present invention.
FIG. 2 is a schematic illustration of the connection of the present invention in use.
FIG. 3 is a schematic diagram of the reinforcement-learning-based externally-hung scan guidance control method of the present invention.
FIG. 4 is a schematic diagram of the focus quantification algorithm fusing sequence attention of the present invention.
FIG. 5 is a schematic view of an anti-false-puncture elastic treatment needle according to the present invention.
Detailed Description
As shown in fig. 1, a focus imaging and anti-false-prick treatment system combined with ultrasonic autonomous scanning comprises: an ultrasonic scanning probe, an externally-hung scanning control module, an intraoperative imaging guidance display module, a focus quantification module fusing sequence attention, and an anti-false-prick elastic treatment needle module. First, a probe outer pendant that closely matches the ultrasonic probe in use is printed through three-dimensional scanning and three-dimensional printing. A reinforcement-learning-based externally-hung scanning control module, implemented on a Field Programmable Gate Array (FPGA), is arranged on the probe outer pendant; this module provides a deep-reinforcement-learning imaging guidance model and can supply guidance information for the doctor's focus scanning. The resulting ultrasonic scanning imaging guidance information is displayed intuitively to the doctor through the intraoperative imaging guidance display module, helping the doctor autonomously adjust the scanning direction of the ultrasonic probe so as to obtain complete focus information. The intraoperative imaging guidance display module completes the visual display mainly through the relative position information of the scanning guidance track and the focus output by the reinforcement-learning-based externally-hung scanning control module, together with the current scanning direction indication arrow, to ensure complete focus scanning. For the more complete focus information thus obtained, the focus quantification module fusing sequence attention further calculates the size and shape information of the focus.
In the focus quantification module fusing sequence attention, the focus regions in the acquired ultrasonic sequence are analyzed to obtain the focus region sequence change attention weight, which is compared with the focus information in the ultrasonic image of the current scanning section; the size and boundary change curves of the focus under different sections are calculated, and the focus size and boundary quantification results are predicted. Finally, the anti-false-prick elastic treatment needle module adjusts the magnitude of the operating force through its elastic control units according to the quantified size and shape information of the focus, ensuring that the contact force between the elastic treatment needle and the focus stays within the safe force application range and preventing massive focus hemorrhage caused by incorrect force application. A schematic of the connections of the invention in use is shown in fig. 2.
The reinforcement-learning-based externally-hung scanning control module is designed as a convenient, detachable imaging scan guidance computation and control unit for ultrasonic autonomous scanning, so as to realize complete imaging of the focus. First, a probe outer pendant that closely matches the ultrasonic probe in use is printed through three-dimensional scanning and printing. A reinforcement-learning-based imaging scan guidance computation and control model, implemented on a Field Programmable Gate Array (FPGA), is arranged on the pendant. As shown in fig. 3, the model uses deep reinforcement learning as the basic module of the overall framework to implement imaging scan guidance computation. In this module, the acquired ultrasonic sequence is taken as a bounded environment, and the prior image of the focus structural state is taken as the standard acquisition surface. Comparing the currently acquired image with the focus standard constitutes the target task. Under the guidance of the target task, the bounded environment further serves as the main carrier of the agent's interaction, receiving actions, outputting states, and calculating the reward of the current action. Here, the agent is a deep convolutional network based on multi-scale feature extraction and fusion. In each iteration, the bounded environment outputs the current state, i.e., the currently acquired ultrasonic image, which is compared with the focus standard acquisition surface to form the current state comparison quantity. The agent then receives the state comparison quantity, processes the data, and produces a response, i.e., an action (ultrasonic scan guidance); the action further changes the environment, yielding a new state and the reward of the current action.
Through this iterative process, the agent learns a preliminary ultrasonic scanning guidance action via a Q-value sequence learning algorithm. Next, the ultrasonic scanning guidance action is further refined by an adaptive imaging pose search strategy. In the adaptive imaging pose search strategy, the preliminarily obtained guidance action is combined with the acquired ultrasonic sequence to obtain an ultrasonic three-dimensional volume. Three-dimensional key points are then extracted, the structural key points are passed through a dynamic spatio-temporal modeling unit, the spatio-temporal information of the three-dimensional key points is aggregated in the form of a time-recurrent neural network, and the final imaging guidance information, including the scanning direction, is output.
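The dynamic spatio-temporal modeling unit above can be sketched as a small recurrent network that consumes a sequence of 3D structural key points and emits a unit scan-direction vector. This is an illustrative stand-in with random placeholder weights; the key-point count, hidden size, and simple Elman-style recurrence are assumptions, not the patent's trained unit.

```python
import numpy as np

rng = np.random.default_rng(42)
N_KEYPOINTS, HIDDEN = 8, 16

W_in = rng.normal(0, 0.1, (HIDDEN, N_KEYPOINTS * 3))  # key points -> hidden
W_h = rng.normal(0, 0.1, (HIDDEN, HIDDEN))            # hidden-state recurrence
W_out = rng.normal(0, 0.1, (3, HIDDEN))               # hidden -> 3D direction

def guide_direction(keypoint_seq):
    """keypoint_seq: array (T, N_KEYPOINTS, 3) of 3D key points per frame."""
    h = np.zeros(HIDDEN)
    for frame in keypoint_seq:                 # recurrent aggregation over time
        x = frame.reshape(-1)
        h = np.tanh(W_in @ x + W_h @ h)
    d = W_out @ h                              # raw direction from final state
    return d / (np.linalg.norm(d) + 1e-8)      # unit scan-direction vector

seq = rng.normal(size=(5, N_KEYPOINTS, 3))     # 5 frames of example key points
direction = guide_direction(seq)
print(direction.shape)
```

The recurrence lets later sections refine the direction suggested by earlier ones, which is the role the patent assigns to the time-recurrent aggregation of key-point information.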
The intraoperative imaging guidance display module mainly completes an intuitive display through the relative position information of the scanning track and the focus together with the current scanning direction indication arrow, helping the doctor accurately adjust the ultrasonic scanning direction. The imaging guidance information output by the reinforcement-learning-based externally-hung scanning control module mainly consists of the guidance direction of the ultrasonic probe's next scan, i.e., the direction of the next scan relative to the current acquisition point. The module first imports a three-dimensional model of the focus and superimposes the direction guidance information for the probe's next scan in the focus three-dimensional model space through a direction indication arrow, displaying the scanning direction intuitively. Then, the three-dimensional distance between the tip of the indication arrow and the center point of the three-dimensional model is shown to the doctor on a display, giving quantitative scanning direction information. Finally, the arrow-tip points of all scanning direction indication arrows during the ultrasonic scan are connected, and a three-dimensional display of the scanning track is obtained by three-dimensional point interpolation.
The focus quantification module fusing sequence attention is mainly used to quantitatively solve the more complete focus information obtained by the ultrasonic probe scan. Considering the continuity of the focus structure in the acquired ultrasonic image sequence, accurately quantifying the focus information in the currently acquired ultrasonic image requires referring to the related information of previous ultrasonic frames; a focus region sequence change attention unit is therefore introduced, which uses the focus continuity in the multi-frame ultrasonic sequence to assist the accurate acquisition of the focus structure in the current ultrasonic image. As shown in fig. 4, the input of the focus region sequence change attention unit is the acquired ultrasonic sequence X_M ∈ R^(T×3×H×W) (where T is the number of frames in the ultrasonic sequence, H the length of the ultrasonic image, and W the width). First, considering that focus tissue often appears as a dark region with fuzzy boundaries, the unit uses local phase information and local phase filtering to complete focus boundary enhancement. Then, the features of each frame in the boundary-enhanced ultrasonic sequence are extracted with 5 image convolution operations to obtain focus region features. Furthermore, the focus region size of consecutive frames in the ultrasonic image sequence is continuous, while the image gray-level information inside the focus region differs between consecutive frames, reflecting the tissue composition differences of the whole focus under different imaging sections.
Therefore, the focus region sequence change attention unit obtains the correlation contrast information I_M ∈ R^(T×3×H×W) of tissue composition in the focus region by calculating the correlation of focus regions in consecutive frame images. Finally, a sigmoid nonlinear activation mapping is applied to the correlation contrast information I_M ∈ R^(T×3×H×W) to obtain the focus region sequence change attention weight W_M ∈ R^(T×3×H×W), which is used to assist the quantitative analysis of the focus in the currently acquired ultrasonic image.
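The correlate-then-sigmoid step above can be sketched as follows. The per-pixel contrast measure against neighboring frames is a simplified assumption (the patent does not give the correlation formula), and single-channel toy frames replace the real T×3×H×W sequence.

```python
import numpy as np

def sequence_attention(frames):
    """frames: ultrasound sequence (T, H, W), float. Returns weights (T, H, W)."""
    T = frames.shape[0]
    corr = np.zeros_like(frames)
    for t in range(T):
        prev = frames[max(t - 1, 0)]
        nxt = frames[min(t + 1, T - 1)]
        # Per-pixel contrast of each frame against its neighbors: large where
        # the lesion appearance changes between consecutive sections.
        corr[t] = frames[t] * prev + frames[t] * nxt - frames[t] ** 2
    return 1.0 / (1.0 + np.exp(-corr))        # sigmoid activation mapping

seq = np.random.default_rng(0).random((4, 8, 8))  # toy 4-frame sequence
w = sequence_attention(seq)
print(w.shape)
```

Because the sigmoid maps every correlation value into (0, 1), the result can be used directly as a multiplicative attention weight on the current frame's boundary and size estimates.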
In the focus quantification module fusing sequence attention, the current focus information is preliminarily extracted from the currently acquired ultrasonic image by the current-frame focus positioning and analysis unit. First, the focus position is preliminarily located through image pyramid and histogram equalization operations. Then, using the preliminary focus positioning result combined with a shape deformation model, optimized segmentation of the focus boundary is realized, yielding the focus boundary C_O ∈ R^(3×H×W) and the focus size S in the currently acquired ultrasonic image. Given the focus region sequence change attention weight W_M ∈ R^(T×3×H×W) and the focus-related information (the focus boundary C_O ∈ R^(3×H×W) and the focus size S), a weighted multiplication of W_M with C_O and S yields the change curves of the focus size and boundary under the different sections (i.e., the sequence) of the focus. Finally, curve features (such as fluctuation rate, skewness, area under the curve, and peak value) are extracted from the change curves as the quantification result of the focus size and boundary.
As shown in fig. 5, in the anti-false-prick elastic treatment needle module the elastic control force is further adjusted according to the quantitative information on the size and shape of the focus, so that the contact force between the elastic treatment needle and the focus stays within a safe force-application range, preventing massive focus hemorrhage caused by incorrect force application. The anti-false-prick elastic treatment needle module constrains the operating force of the treatment needle through a first-stage elastic control unit. The first-stage elastic control unit comprises an elastic piece and a conical body; its input signal is the peak value of the focus size change curve, and when the curve peak exceeds a set threshold alpha, the elastic piece of the control unit is automatically connected to the resistance through-hole, so that the rotation and deformation of the conical body remain small. When the rotation and deformation of the body are small, the transmission force generated by the force-driving control unit connected with the body is also small, ensuring that the operating force of the treatment needle is attenuated after transmission. The operating force output by the force-driving control unit is further transmitted through the elastic piece to a second-stage elastic control unit for a secondary constraint, keeping the operating force within the safe force-application range. The input signals of the second-stage elastic control unit are the fluctuation rate and skewness of the focus boundary change curve; when these exceed a threshold beta, the resistance of the elastic piece of this control unit becomes twice the focus elastic force, so that the rotating force of the cone connected with the unit is smaller than twice the focus elastic force.
Because the rotating force of the conical body is small, the operating force at the tip of the treatment needle is small, which effectively prevents massive focus hemorrhage caused by incorrect force application.
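The two-stage constraint can be summarized as a simple decision rule. The sketch below only mirrors the described behavior in software; the thresholds alpha and beta, the attenuation factor, and the cap at twice the lesion elastic force are placeholders for a mechanical mechanism, not a model of the actual hardware.

```python
def constrain_operating_force(force, size_peak, boundary_fluctuation,
                              boundary_skew, lesion_elastic_force,
                              alpha=1.0, beta=0.5, attenuation=0.5):
    """Sketch of the two-stage elastic constraint on the needle's
    operating force.

    Stage 1 attenuates the transmitted force when the lesion-size
    curve peak exceeds alpha (small cone rotation/deformation).
    Stage 2 caps the force at twice the lesion elastic force when the
    boundary-curve fluctuation rate or skewness exceeds beta.
    The attenuation factor 0.5 is an illustrative assumption.
    """
    # First-stage elastic control unit: size-curve peak check
    if size_peak > alpha:
        force *= attenuation
    # Second-stage elastic control unit: boundary-curve check
    if boundary_fluctuation > beta or boundary_skew > beta:
        force = min(force, 2.0 * lesion_elastic_force)
    return force

f = constrain_operating_force(10.0, size_peak=1.5,
                              boundary_fluctuation=0.8, boundary_skew=0.1,
                              lesion_elastic_force=2.0)
```

With both thresholds exceeded, a 10.0-unit input force is first attenuated and then capped, which is the intended "safe force application range" behavior.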

Claims (3)

1. A focus imaging and anti-false-prick treatment system combined with ultrasonic autonomous scanning, comprising: an ultrasonic scanning probe, an externally-hung scanning control module, an intraoperative imaging guidance display module, a focus quantification module fusing sequence attention, and an anti-false-prick elastic treatment needle module; for the ultrasonic scanning probe, a probe outer pendant is obtained by three-dimensional printing, the externally-hung scanning control module based on reinforcement learning is mounted on the probe outer pendant, and an imaging guidance model based on deep reinforcement learning provides guidance information for the doctor's focus scanning; the intraoperative imaging guidance display module intuitively displays the imaging guidance information to the doctor, ensuring that the doctor autonomously adjusts the scanning direction of the ultrasonic probe so as to obtain complete focus information; for the obtained more complete focus information, the focus quantification module fusing sequence attention calculates the size and shape information of the focus; finally, the anti-false-prick elastic treatment needle module further adjusts the magnitude of the elastic control force according to the size and shape information of the focus, ensuring that the contact force between the elastic treatment needle and the focus stays within a safe force-application range and preventing massive focus hemorrhage caused by incorrect force application;
the externally-hung scanning control module takes the acquired ultrasonic sequence as a bounded environment, a priori image of the focus structure state as the standard acquisition plane, and matching the currently acquired image with the focus standard plane as the target task; under the guidance of the target task, the bounded environment serves as the carrier with which the agent interacts, the agent being a deep convolutional network based on multi-scale feature extraction and fusion that receives actions, outputs states, and computes the reward of the current action; in each iteration, the current state, namely the currently acquired ultrasonic image, is output and compared with the focus standard acquisition plane to form the current state comparison quantity; the agent then receives the state comparison quantity and processes the data to obtain a responsive action, the action in turn changes the environment to yield a new state and the reward of the current action, and through a Q-value sequence learning algorithm in this iterative process the agent learns a preliminary ultrasonic scanning guidance action; next, the ultrasonic scanning guidance action is refined through an adaptive imaging posture searching strategy, in which the preliminarily obtained ultrasonic scanning guidance action is combined with the acquired ultrasonic sequence to form an ultrasonic three-dimensional volume, three-dimensional key points are extracted, the structural key points are passed through a dynamic spatio-temporal modeling unit, the spatio-temporal information of the three-dimensional key points is modeled with a time-recurrent neural network, and the final imaging guidance information, including the scanning direction, is output;
the focus quantification module fusing sequence attention performs quantitative solution on the more complete focus information obtained by scanning with the ultrasonic probe; a focus area sequence change attention unit is introduced, using the focus continuity in the multi-frame ultrasonic sequence to assist accurate extraction of the focus structure in the current ultrasonic image; the input of the focus area sequence change attention unit is an acquired ultrasound sequence X_M ∈ R^(T×3×H×W), where T represents the number of ultrasonic sequence frames, H the length of the ultrasonic image, and W the width of the ultrasonic image; first, considering that focus tissue appears as a dark area with blurred boundaries, the focus area sequence change attention unit uses local phase information and local phase filtering to complete focus boundary enhancement; then, the features of each frame in the ultrasonic sequence are extracted by 5 image convolution operations to obtain the focus-area features; because the size of the focus area is continuous across consecutive frames of the ultrasonic image sequence while the gray-level information within the focus area differs between consecutive frames, reflecting the tissue-composition differences of the whole focus under different imaging sections, the focus area sequence change attention unit obtains the related contrast information I_M ∈ R^(T×3×H×W) of the tissue components within the focus area by computing the focus-area correlation between consecutive frame images; finally, a sigmoid nonlinear activation mapping is applied to the related contrast information I_M ∈ R^(T×3×H×W) in the consecutive ultrasound images to obtain the focus area sequence change attention weight W_M ∈ R^(T×3×H×W), which is used to assist the quantitative analysis of the focus in the currently acquired ultrasonic image;
in the focus quantification module fusing sequence attention, current focus information is preliminarily extracted from the currently acquired ultrasonic image by a current-frame focus positioning and analyzing unit; first, the focus position is initially located through image pyramid and histogram equalization operations; then, using the preliminary focus positioning result combined with a shape deformation model, optimized segmentation of the focus boundary is achieved, yielding the focus boundary C_Q ∈ R^(3×H×W) and the focus size S information in the currently acquired ultrasonic image; after the focus area sequence change attention weight W_M ∈ R^(T×3×H×W), the focus boundary C_Q ∈ R^(3×H×W) in the currently acquired ultrasonic image, and the focus size S are obtained, the sequence change attention weight W_M ∈ R^(T×3×H×W) is multiplied, as a weight, with the focus boundary C_Q ∈ R^(3×H×W) and the focus size S to obtain change curves of the focus size and boundary under different tangent planes of the focus; finally, curve features are extracted from the change curves as the quantification result of the focus size and boundary.
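The Q-value sequence learning loop of claim 1 (state comparison against the standard acquisition plane, action, reward, update) can be sketched as tabular Q-learning on a toy discretized state space; the deep convolutional agent, the actual reward design, and the probe action set of the patent are abstracted away, so everything below is an illustrative stand-in.

```python
import random

def q_learning_guidance(env_step, n_states, actions,
                        episodes=200, lr=0.1, gamma=0.9, eps=0.2):
    """Toy tabular Q-learning loop for scan-guidance actions.

    env_step(state, action) -> (next_state, reward) stands in for the
    bounded ultrasound environment; the reward is assumed to reflect
    how well the new image matches the standard acquisition plane.
    """
    q = {(s, a): 0.0 for s in range(n_states) for a in actions}
    for _ in range(episodes):
        s = random.randrange(n_states)
        for _ in range(20):  # bounded interaction horizon
            a = (random.choice(actions) if random.random() < eps
                 else max(actions, key=lambda a_: q[(s, a_)]))
            s2, r = env_step(s, a)
            best_next = max(q[(s2, a_)] for a_ in actions)
            q[(s, a)] += lr * (r + gamma * best_next - q[(s, a)])
            s = s2
    return q

# Toy environment: states 0..4 are probe poses; state 4 is the
# standard acquisition plane and yields reward 1.
def env_step(s, a):
    s2 = min(4, max(0, s + a))
    return s2, (1.0 if s2 == 4 else 0.0)

random.seed(0)
q = q_learning_guidance(env_step, n_states=5, actions=[-1, 1])
```

After training, the greedy action at each pose points toward the standard plane, which is the "preliminary ultrasonic scanning guidance action" the claim describes.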
2. The focus imaging and anti-false-prick treatment system combined with ultrasonic autonomous scanning according to claim 1, wherein the intraoperative imaging guidance display module completes an intuitive display through the relative position information of the scanning track and the focus together with an indication arrow for the current scanning direction, helping the doctor accurately adjust the ultrasonic scanning direction; the imaging guidance information output by the reinforcement-learning-based externally-hung scanning control module mainly comprises the guidance direction of the next scan of the ultrasonic probe, namely the guidance direction of the next scan relative to the current acquisition point; the intraoperative imaging guidance display module first imports a three-dimensional model of the focus and superimposes the direction guidance information of the next scan of the ultrasonic probe in the focus three-dimensional model space through a direction indication arrow, so that the scanning direction is intuitively displayed; then, the three-dimensional distance information between the arrow tip and the center point of the three-dimensional model is shown to the doctor on a display, giving the doctor quantitative scanning direction information; finally, the arrow tip points of all the scanning direction indication arrows in the ultrasonic scanning process are connected, and a three-dimensional display of the scanning track is obtained by three-dimensional point interpolation.
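The track construction at the end of claim 2 (connect successive arrow-tip points, then interpolate) could be realized with simple piecewise-linear 3-D point interpolation; the point format and sampling density below are assumptions, and a spline could equally be used.

```python
import numpy as np

def interpolate_track(tips, samples_per_segment=10):
    """Piecewise-linear interpolation between consecutive arrow-tip
    points to obtain a dense 3-D scanning-track polyline.

    tips: (N, 3) array of arrow tip coordinates in model space.
    Returns an (M, 3) array of interpolated track points.
    """
    tips = np.asarray(tips, dtype=float)
    track = []
    for p, nxt in zip(tips[:-1], tips[1:]):
        # sample each segment, excluding its endpoint to avoid duplicates
        for t in np.linspace(0.0, 1.0, samples_per_segment, endpoint=False):
            track.append((1 - t) * p + t * nxt)
    track.append(tips[-1])
    return np.vstack(track)

pts = interpolate_track([[0, 0, 0], [1, 0, 0], [1, 1, 0]])
```

The dense polyline can then be rendered in the lesion model space as the scanning-track display.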
3. The focus imaging and anti-false-prick treatment system combined with ultrasonic autonomous scanning according to claim 1, wherein the anti-false-prick elastic treatment needle module constrains the operating force of the treatment needle through a first-stage elastic control unit; the first-stage elastic control unit comprises an elastic piece and a conical body, its input signal being the peak value of the focus size change curve, and when the curve peak exceeds a set threshold alpha, the elastic piece of the control unit is automatically connected to the resistance through-hole, so that the rotation and deformation of the conical body remain small; when the rotation and deformation of the body are small, the transmission force generated by the force-driving control unit connected with the body is also small, ensuring that the operating force of the treatment needle is attenuated after transmission; the operating force output by the force-driving control unit is further transmitted through the elastic piece to a second-stage elastic control unit for secondary constraint, keeping the operating force within the safe force-application range; the input signals of the second-stage elastic control unit are the fluctuation rate and skewness of the focus boundary change curve, and when these exceed a threshold beta, the resistance of the elastic piece of this control unit becomes twice the focus elastic force, so that the rotating force of the cone connected with the unit is smaller than twice the focus elastic force; because the rotating force of the conical body is small, the operating force at the tip of the treatment needle is small, effectively preventing massive focus hemorrhage caused by incorrect force application.
CN202111261523.8A 2021-10-28 2021-10-28 Focus imaging and anti-false-prick therapeutic system combined with ultrasonic autonomous scanning Active CN114052795B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111261523.8A CN114052795B (en) 2021-10-28 2021-10-28 Focus imaging and anti-false-prick therapeutic system combined with ultrasonic autonomous scanning


Publications (2)

Publication Number Publication Date
CN114052795A CN114052795A (en) 2022-02-18
CN114052795B true CN114052795B (en) 2023-11-07

Family

ID=80235681

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111261523.8A Active CN114052795B (en) 2021-10-28 2021-10-28 Focus imaging and anti-false-prick therapeutic system combined with ultrasonic autonomous scanning

Country Status (1)

Country Link
CN (1) CN114052795B (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101006933A (en) * 2006-01-23 2007-08-01 西门子公司 Method and device for displaying 3d objects
CN101053531A (en) * 2007-05-17 2007-10-17 上海交通大学 Early tumor positioning and tracking method based on multi-mold sensitivity intensifying and imaging fusion
CN105447872A (en) * 2015-12-03 2016-03-30 中山大学 Method for automatically identifying liver tumor type in ultrasonic image
CN206597027U (en) * 2016-10-09 2017-10-31 深圳华大智造科技有限公司 A kind of ultrasonic scanner accessory system
AU2017268489B1 (en) * 2016-12-02 2018-05-17 Avent, Inc. System and method for navigation to a target anatomical object in medical imaging-based procedures
CN108272502A (en) * 2017-12-29 2018-07-13 战跃福 A kind of ablation needle guiding operating method and system of CT three-dimensional imagings guiding
US10032281B1 (en) * 2017-05-03 2018-07-24 Siemens Healthcare Gmbh Multi-scale deep reinforcement machine learning for N-dimensional segmentation in medical imaging
CN109074665A (en) * 2016-12-02 2018-12-21 阿文特公司 System and method for navigating to targeted anatomic object in the program based on medical imaging
CN110347860A (en) * 2019-07-01 2019-10-18 南京航空航天大学 Depth image based on convolutional neural networks describes method
CN110381846A (en) * 2017-03-06 2019-10-25 辛可索诺有限责任公司 Angiemphraxis diagnostic method, equipment and system
CN111260786A (en) * 2020-01-06 2020-06-09 南京航空航天大学 Intelligent ultrasonic multi-mode navigation system and method
CN111477318A (en) * 2020-04-25 2020-07-31 华南理工大学 Virtual ultrasonic probe tracking method for remote control
CN112370161A (en) * 2020-10-12 2021-02-19 珠海横乐医学科技有限公司 Operation navigation method and medium based on ultrasonic image characteristic plane detection
CN112612274A (en) * 2020-12-22 2021-04-06 清华大学 Autonomous motion decision control method and system for ultrasonic inspection robot
CN113218400A (en) * 2021-05-17 2021-08-06 太原科技大学 Multi-agent navigation algorithm based on deep reinforcement learning
WO2021169126A1 (en) * 2020-02-25 2021-09-02 平安科技(深圳)有限公司 Lesion classification model training method and apparatus, computer device, and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI473598B (en) * 2012-05-18 2015-02-21 Univ Nat Taiwan Breast ultrasound image scanning and diagnostic assistance system
US10573031B2 (en) * 2017-12-06 2020-02-25 Siemens Healthcare Gmbh Magnetic resonance image reconstruction with deep reinforcement learning
WO2019219387A1 (en) * 2018-05-16 2019-11-21 Koninklijke Philips N.V. Automated tumor identification during surgery using machine-learning


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Clinical application of a three-dimensional ultrasound image navigation robot system; Liu Shaoli; Yang Xiangdong; Feng Tao; Chen Ken; Liang Ping; Chinese Journal of Biomedical Engineering (Issue 06); full text *


Similar Documents

Publication Publication Date Title
CN114901194B (en) Anatomical feature identification and targeting
AU2015264243B2 (en) Image guided autonomous needle insertion device for vascular access
JP2021527522A (en) Surgical Robot Artificial Intelligence for Surgical Surgery
CN104605917B (en) A kind of intelligence punctures control method and device
CN1989903A (en) Medical treatment device and associated method of operation
WO2023095492A1 (en) Surgery assisting system, surgery assisting method, and surgery assisting program
CN114224448B (en) Puncture path planning device, puncture path planning apparatus, and puncture path planning program
CN114052795B (en) Focus imaging and anti-false-prick therapeutic system combined with ultrasonic autonomous scanning
CN1836624A (en) Intelligent endoscope visual navigation system and method
Bi et al. Machine learning in robotic ultrasound imaging: Challenges and perspectives
CN116807577B (en) Full-automatic venipuncture equipment and full-automatic venipuncture method
CN109745074A (en) A kind of system and method for 3-D supersonic imaging
CN111466952B (en) Real-time conversion method and system for ultrasonic endoscope and CT three-dimensional image
Guo et al. Study on the automatic surgical method of the vascular interventional surgical robot based on deep learning
WO2020257533A1 (en) Devices and methods for targeting implant deployment in tissue
Mi et al. Detecting carotid intima-media from small-sample ultrasound images
Cao et al. Venibot: Towards autonomous venipuncture with automatic puncture area and angle regression from nir images
CN114430671A (en) Automatic closed-loop ultrasound plane steering for target localization in ultrasound imaging and associated devices, systems, and methods
JP7148193B1 (en) Surgery support system, surgery support method, and surgery support program
Deng et al. A Portable Robot-Assisted Device with Built-in Intelligence for Autonomous Ultrasound Acquisitions in Follow-Up Diagnosis
US20240299097A1 (en) Apparatus and method for matching an actual surgical image with a 3d-based virtual simulated surgical image
CN117994636B (en) Puncture target identification method, system and storage medium based on interactive learning
Li et al. Image processing and modeling for active needle steering in liver surgery
Kim et al. Micromanipulation in Surgery: Autonomous Needle Insertion Inside the Eye for Targeted Drug Delivery
US20240315779A1 (en) Systems and apparatuses for for navigation and procedural guidance of laser leaflet resection under intracardiac echocardiography

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant