CN113057734A - Surgical system - Google Patents

Surgical system

Info

Publication number
CN113057734A
Authority
CN
China
Prior art keywords
real
time
image
surgical
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110272335.9A
Other languages
Chinese (zh)
Inventor
常兆华
何超
陈浩
常新朝
何洪军
周佳音
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Microport Medbot Group Co Ltd
Original Assignee
Shanghai Microport Medbot Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Microport Medbot Group Co Ltd filed Critical Shanghai Microport Medbot Group Co Ltd
Priority to CN202110272335.9A priority Critical patent/CN113057734A/en
Publication of CN113057734A publication Critical patent/CN113057734A/en
Priority to PCT/CN2022/078302 priority patent/WO2022188651A1/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/25: User interfaces for surgical systems
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/102: Modelling of surgical devices, implants or prosthesis
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2065: Tracking using image or pattern recognition

Abstract

The invention provides a surgical system comprising a control device and a surgical device. The control device comprises a lesion identification module, an automatic planning module and a control module. The lesion identification module acquires real-time lesion information from the intraoperative real-time medical image and the preoperative medical image; the automatic planning module plans the surgical path according to the real-time lesion information to obtain a target surgical path; and the control module controls the surgical device to execute the surgery according to the acquired surgical operation parameters and the target surgical path. The invention is accurate, reliable, safe and efficient; it avoids the numerous risks that arise when the surgical process depends entirely on the doctor's clinical experience, effectively reduces the doctor's workload and improves efficiency, so that doctors can devote more energy to disease analysis and the optimization of treatment plans.

Description

Surgical system
Technical Field
The invention relates to the technical field of medicine, in particular to a surgical system.
Background
Cryoablation is mainly performed with a cryogenic instrument that subjects lesion tissue to controlled cooling, freezing and rewarming, causing irreversible cell damage and even necrosis. For example, the mechanisms by which cryoablation kills tumors include: cell dehydration and shrinkage; intracellular ice-crystal formation and the mechanical damage the crystals cause; toxic concentration of cell electrolytes and pH changes; denaturation of lipoprotein components of the cell membrane; blood stasis and microthrombosis; and immune effects. Cryoablation surgery causes only a small wound and has further advantages such as little anesthetic pain, few postoperative complications and prevention of tumor spread, and is therefore well received by doctors and patients.
At present, lesion identification in cryotherapy or puncture procedures mainly relies on judging the position and volume of the lesion from preoperative magnetic resonance (MR) and CT images. This approach suffers from several problems: a single imaging modality, incomplete display of the lesion, and the fact that imaging precedes the operation, so the current state of the lesion is not fully shown. The doctor judges the lesion from preoperative images, plans the puncture path from clinical experience, determines the freezing parameters and performs the cryoablation; the whole process cannot be monitored, and because it depends entirely on the doctor's clinical experience it carries many risks.
It is noted that the information disclosed in this background section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
Disclosure of Invention
The invention aims to provide a surgical system that solves the problems of the prior art, in which doctors judge lesions from preoperative images, plan surgical paths from clinical experience, determine surgical operation parameters and perform the surgery; the whole surgical process cannot be monitored in real time and depends entirely on the doctors' clinical experience, which entails numerous risks.
In order to solve the technical problem, the invention provides a surgical system, which comprises a control device and a surgical device, wherein the surgical device is in communication connection with the control device;
the control device comprises a lesion identification module, an automatic planning module and a control module which are communicatively connected;
the lesion identification module is used for acquiring real-time lesion information from the intraoperative real-time medical image and the preoperative medical image;
the automatic planning module is used for planning a surgical path according to the real-time lesion information so as to obtain a target surgical path;
the control module is used for controlling the surgical equipment to execute the surgery according to the acquired surgical operation parameters and the target surgery path.
Optionally, the surgical system further includes a first image acquisition device, and the first image acquisition device is in communication connection with the control device and is used for acquiring intraoperative real-time medical images.
Optionally, the lesion identification module includes an image acquisition unit, an image registration unit, and a lesion identification unit, which are in communication connection;
the image acquisition unit is used for acquiring preoperative medical images and intraoperative real-time medical images;
the image registration unit is used for registering the preoperative medical image and the intraoperative real-time medical image to obtain a real-time registration image;
the lesion identification unit is used for acquiring real-time lesion information according to the real-time registration image.
Optionally, if the preoperative medical image is a CT image or an MRI image, and the intraoperative real-time medical image is a CT image or an MRI image, the image registration unit acquires a real-time registration image, including:
performing three-dimensional modeling on the preoperative medical image to obtain a preoperative three-dimensional medical image;
performing three-dimensional modeling on the intraoperative real-time medical image to obtain an intraoperative real-time three-dimensional medical image;
registering the pre-operative three-dimensional medical image to the intra-operative real-time three-dimensional medical image to obtain a real-time registered image.
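The claims above leave the registration method open. As a minimal, hypothetical sketch of one standard building block, the Kabsch algorithm below estimates the rigid transform that aligns landmark points picked in the preoperative 3-D image with the corresponding points in the intraoperative one; all point values are synthetic and no specific registration library from the patent is implied.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid registration (Kabsch): estimate rotation R and
    translation t mapping src landmark points onto dst landmark points."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])               # guard against a reflection solution
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# synthetic landmark pairs picked in both 3-D images
pre = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)   # 90 deg about z
intra = pre @ Rz.T + np.array([5.0, 2.0, 1.0])
R, t = rigid_register(pre, intra)
```

In practice a deformable or intensity-based method would typically refine such a rigid initialization, but that is beyond what the text specifies.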
Optionally, if the preoperative medical image is a CT image or an MRI image, and the intraoperative real-time medical image is an ultrasound image, the image registration unit obtains a real-time registration image, including:
performing three-dimensional modeling on the preoperative medical image to obtain a preoperative three-dimensional medical image;
registering the preoperative three-dimensional medical image to the intraoperative real-time medical image to obtain a first real-time registered image;
registering the intraoperative real-time medical image to the preoperative three-dimensional medical image to obtain a second real-time registered image.
Optionally, the surgical system further includes a second image acquisition device in communication connection with the control device, where the second image acquisition device is used to acquire an intra-operative real-time patient skin image;
the image acquisition unit is also used for acquiring an intraoperative real-time patient skin image;
the image registration unit acquires a real-time registration image, including:
performing three-dimensional modeling on the preoperative medical image to obtain a preoperative three-dimensional medical image;
performing three-dimensional modeling on the intraoperative real-time medical image to obtain an intraoperative real-time three-dimensional medical image;
carrying out three-dimensional modeling on the intraoperative real-time patient skin image to obtain an intraoperative real-time human body model image;
registering and fusing the preoperative three-dimensional medical image and the intraoperative real-time three-dimensional medical image to obtain a first real-time fusion image;
registering the first real-time fusion image to the intraoperative real-time human body model image to obtain a real-time registration image.
Optionally, the acquiring, by the lesion identification unit, of real-time lesion information includes:
identifying the lesion in the real-time registration image by adopting a pre-trained deep neural network model so as to acquire real-time lesion information.
Optionally, the acquiring, by the lesion identification unit, of real-time lesion information includes:
segmenting the preoperative three-dimensional medical image by adopting a pre-trained deep neural network model to obtain a segmented image;
fusing the segmented image with the first real-time registration image to obtain a second real-time fused image;
fusing the segmented image with the second real-time registration image to obtain a third real-time fused image;
fusing the second real-time fused image with the third real-time fused image to obtain a fourth real-time fused image;
acquiring real-time lesion information according to the fourth real-time fused image.
Optionally, the real-time lesion information includes real-time lesion location information. Further optionally, the real-time lesion information further includes one or more of real-time lesion volume information, real-time lesion shape information, and real-time key organ tissue information.
Optionally, the obtaining of the target surgical path by the automatic planning module includes:
acquiring position information of at least one lesion target point and a plurality of puncture points according to the real-time lesion information;
evaluating each puncture point according to preset conditions to obtain a target puncture point;
connecting the corresponding lesion target point and the target puncture point to obtain a target puncture path.
Optionally, the acquiring of the position information of at least one lesion target point and a plurality of puncture points according to the real-time lesion information includes:
acquiring a spatial mapping relation between the image coordinate system and the surgical equipment coordinate system;
acquiring real-time position information of the lesion in the surgical equipment coordinate system according to the spatial mapping relation and the real-time lesion information;
acquiring the position information of at least one lesion target point and a plurality of puncture points in the surgical equipment coordinate system according to the real-time position information of the lesion in that coordinate system.
Optionally, the evaluating of each puncture point according to preset conditions to obtain a target puncture point includes:
step A, scoring each puncture point according to a preset scoring criterion and taking the puncture point with the highest score as a target puncture point;
step B, judging whether the target puncture point can cover all the lesions;
if not, executing step C;
step C, scoring each non-target puncture point according to the preset scoring criterion and taking the non-target puncture point with the highest score as a further target puncture point;
step D, judging whether all the target puncture points together can cover all the lesions;
if not, repeating steps C and D until the target puncture points together cover all the lesions.
Optionally, the scoring each puncture point according to a preset scoring criterion, and taking the puncture point with the highest score as a target puncture point, includes:
scoring each puncture point according to a preset multi-item scoring criterion so as to obtain each score of each puncture point;
respectively calculating the comprehensive scores of the puncture points according to the weights corresponding to the preset scoring criteria;
taking the puncture point with the highest comprehensive score as a target puncture point;
the scoring of each non-target puncture point according to a preset scoring criterion and the taking of the non-target puncture point with the highest score as the target puncture point comprise the following steps:
scoring each non-target puncture point according to a preset multi-item scoring criterion so as to obtain each score of each non-target puncture point;
calculating the comprehensive score of each non-target puncture point according to the weight corresponding to each preset score criterion;
and taking the non-target puncture point with the highest comprehensive score as a target puncture point.
Optionally, the control device further includes a functional safety module in communication connection with the lesion identification module, and the functional safety module is configured to monitor a real-time motion trajectory of the surgical device according to the registration image output by the image registration unit.
Optionally, the functional safety module is further configured to obtain safety operation boundary information according to the real-time lesion information, and determine whether a real-time motion trajectory of the surgical device exceeds a safety operation boundary region according to the safety operation boundary information.
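As an illustration of the boundary check, the sketch below models the safe-operation region as a sphere around the lesion; the patent does not fix the region's shape or the check's implementation, so this is an assumption for demonstration only.

```python
import numpy as np

def within_safety_boundary(tool_tip, boundary_center, boundary_radius_mm):
    """Monitor check: does the instrument tip stay inside the safe-operation
    region?  A sphere is used purely for illustration."""
    d = np.linalg.norm(np.asarray(tool_tip, float) - np.asarray(boundary_center, float))
    return bool(d <= boundary_radius_mm)

# hypothetical trajectory samples in device coordinates (mm)
center, radius = [0.0, 0.0, 0.0], 30.0
ok = within_safety_boundary([10.0, 10.0, 10.0], center, radius)    # inside
bad = within_safety_boundary([40.0, 0.0, 0.0], center, radius)     # outside
```

A real monitor would run this test against every sample of the real-time trajectory and raise an alarm or stop motion when it fails.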
Optionally, the automatic planning module is further configured to plan surgical operation parameters according to the real-time lesion information.
Optionally, the surgical device includes a driving unit and a surgical instrument, the surgical instrument is mounted on the driving unit, and the control module is configured to control the driving unit to drive the surgical instrument to perform a surgery according to the acquired surgical operation parameter and the target surgical path.
Optionally, the driving unit is a mechanical arm, and a holder for holding the surgical instrument is mounted at a tail end of the mechanical arm.
Optionally, the surgical instrument is an instrument for performing a puncture procedure;
the automatic planning module is used for planning a puncture path according to the real-time lesion information so as to obtain a target puncture path;
the control module is used for controlling the surgical equipment to execute the puncture operation according to the acquired surgical operation parameters and the target puncture path.
Optionally, the surgical instrument is a cryoablation needle, the surgical device further includes a refrigerating device, and the control module is configured to control the refrigerating device to provide a cold source for the cryoablation needle according to the surgical operation parameter.
Optionally, the surgical procedure parameters include a freezing time, a number of freezing cycles, and a freezing dose.
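The three surgical operation parameters named above can be grouped in a small record for passing from the planning module to the control module; the field names and units below are illustrative, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class CryoParams:
    """Cryoablation operation parameters; names and units are assumptions."""
    freezing_time_s: float   # duration of each freeze
    freeze_cycles: int       # number of freeze-thaw cycles
    freezing_dose: float     # e.g. a cooling-power or gas-flow setting

plan = CryoParams(freezing_time_s=600.0, freeze_cycles=2, freezing_dose=0.8)
```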
Optionally, the surgical system further includes a human-computer interaction module in communication connection with the control device, and the human-computer interaction module is used for displaying and interacting data.
Optionally, the control device further includes a data storage module, and the data storage module is configured to store and manage data.
Optionally, the first image acquisition device is an ultrasound apparatus, and the surgical system further includes a bracket. The bracket includes a base and a first fixing device and a second fixing device mounted on the base; the first fixing device is used for fixing the ultrasound probe of the ultrasound apparatus, the second fixing device is used for fixing the head cover of the ultrasound apparatus, and the first fixing device can move toward and away from the second fixing device.
Compared with the prior art, the surgical system provided by the invention has the following advantages:
(1) The surgical system provided by the invention ensures surgical accuracy and avoids the numerous risks that arise when the surgical process depends entirely on a doctor's clinical experience; it effectively reduces the doctor's workload and improves efficiency, so that doctors can devote more energy to disease analysis and the optimization of treatment plans.
(2) The lesion identification module acquires a preoperative medical image and an intraoperative real-time medical image and registers them to obtain a high-definition real-time registration image, from which real-time lesion information is then derived. This effectively improves lesion identification accuracy, reduces the doctor's workload and lays a good foundation for the subsequent path planning and surgical execution stages; it also ensures that the acquired lesion information is synchronous with the operation, solving the prior-art problem that the current lesion cannot be fully displayed because imaging precedes the operation, and thereby making the lesion easier to eliminate.
(3) The control device further includes a functional safety module, which monitors the real-time motion trajectory of the surgical equipment during execution according to the real-time registration image; because that image is high-definition and accurate, the equipment can be monitored quickly and precisely. The functional safety module can also derive safe-operation boundary information from the real-time lesion information, ensuring the accuracy of the safety boundary and greatly improving the safety of the surgical system during the procedure.
(4) The surgical system further includes a human-computer interaction module, through which the whole procedure can be displayed in real time. The entire procedure therefore takes place under the doctor's supervision, and by observing real-time images during the operation the doctor obtains more surgical information, further reducing surgical risk.
Drawings
FIG. 1 is a schematic view of an application scenario of a surgical system according to an embodiment of the present invention;
FIG. 2 is a block diagram of a surgical system in accordance with one embodiment of the present invention;
FIG. 3 is a schematic structural view of a bracket according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating image registration according to a first embodiment of the present invention;
FIG. 5 is a flowchart illustrating image registration according to a second embodiment of the present invention;
FIG. 6 is a flowchart illustrating image registration according to a third embodiment of the present invention;
FIG. 7 is a schematic flow chart illustrating lesion identification using a deep neural network model according to an embodiment of the present invention;
FIG. 8 is a schematic flow chart illustrating lesion identification using a deep neural network model according to another embodiment of the present invention;
FIG. 9 is a schematic diagram illustrating a training process of a deep neural network model according to an embodiment of the present invention;
FIG. 10 is a schematic flow chart illustrating a puncture path planning process according to an embodiment of the present invention;
FIG. 11 is a schematic view of a procedure for obtaining a target puncture site according to an embodiment of the present invention;
FIG. 12 is a block diagram of a surgical system according to another embodiment of the present invention;
FIG. 13 is a schematic workflow diagram of a functional safety module according to an embodiment of the present invention;
FIG. 14 is a partial schematic view of a surgical device according to an embodiment of the present invention;
FIG. 15 is a partial structural schematic diagram of a driving unit according to an embodiment of the present invention;
FIG. 16 is a schematic structural view of a bracket according to another embodiment of the present invention;
FIG. 17 is a schematic structural diagram of a refrigeration device according to an embodiment of the present invention;
FIG. 18 is a partial schematic structural view of a cryoablation needle according to an embodiment of the present invention;
FIG. 19 is a schematic flow chart of a cryoablation procedure according to an embodiment of the present invention.
Wherein the reference numbers are as follows:
a first image capture device-100; control device-200; surgical device-300; a lesion identification module-210; an image acquisition unit-211; an image registration unit-212; a lesion identification unit-213; an automatic planning module-220; a control module-230; a second image capture device-400; a drive unit-310; a surgical instrument-320; a holder-311; functional safety module-240; a human-computer interaction module-600; a data storage module-250; bracket-500; a base-510; a first fixing device-520; a second fixing device-530; an ultrasound probe-110; a head cover-120; scale-321; a thermal barrier coating-322; a display screen-323; a refrigeration device-330; a pre-cooling device-331; heat exchange device-332; a source of refrigerated gas-333; a valve-334.
Detailed Description
The surgical system of the present invention is described in further detail below with reference to FIGS. 1 to 19 and the detailed description; its advantages and features will become more apparent from this description. It should be noted that the drawings are in a very simplified form and use imprecise scales, solely to aid in describing the embodiments conveniently and clearly. The structures, proportions and sizes shown in the drawings and described herein are illustrative only and are not intended to limit the scope of the invention, which is defined by the appended claims; modifications, equivalents and alternatives apparent to those skilled in the art should be construed as falling within its spirit and scope. Specific design features disclosed herein, including, for example, dimensions, orientations, locations and configurations, will be determined in part by the intended application and use environment. In the embodiments described below, the same reference numerals denote the same parts, or parts with the same functions, across different drawings, and repeated description is omitted; once an item is defined in one drawing, it is not discussed further in subsequent drawings.
Additionally, if the method described herein comprises a series of steps, the order in which these steps are presented herein is not necessarily the only order in which these steps may be performed, and some of the described steps may be omitted and/or some other steps not described herein may be added to the method.
Moreover, it is noted that, in this document, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The main object of the invention is to provide a surgical system for performing automated surgery that is accurate, reliable, safe and efficient, and that solves the problems of the prior art, in which a doctor judges the lesion from preoperative images, plans the surgical path from clinical experience and determines the surgical operation parameters, so that the procedure depends entirely on the doctor's clinical experience and carries many risks.
To achieve the above object, the present invention provides a surgical system; please refer to FIGS. 1 and 2, in which FIG. 1 schematically illustrates an application scenario of the surgical system according to an embodiment of the present invention and FIG. 2 is a block diagram of the surgical system according to an embodiment of the present invention. As shown in FIGS. 1 and 2, the surgical system includes a control device 200 and a surgical device 300, and the surgical device 300 is communicatively connected to the control device 200. Optionally, the surgical system further comprises a first image acquisition device 100, which is communicatively connected to the control device 200 and is used for acquiring intraoperative real-time medical images. Specifically, the first image acquisition device 100 may be an ultrasound apparatus, a CT apparatus or an MR apparatus, through which an intraoperative real-time ultrasound, CT or MRI image, respectively, may be acquired. Of course, as will be understood by those skilled in the art, the first image acquisition device may also be any other imaging device capable of acquiring medical images, which the present invention does not limit.
Referring to FIG. 3, a schematic structural diagram of a bracket according to an embodiment of the present invention is shown. As shown in FIG. 3, when the first image capture device 100 is an ultrasound apparatus, the surgical system further includes a bracket 500. The bracket 500 includes a base 510, and a first fixing device 520 and a second fixing device 530 mounted on the base 510; the first fixing device 520 is used for fixing the ultrasound probe 110 of the ultrasound apparatus, the second fixing device 530 is used for fixing the head cover 120 of the ultrasound apparatus, and the first fixing device 520 can move toward and away from the second fixing device 530. Providing the bracket 500 thus makes placement of the ultrasound probe 110 and the head cover 120 more convenient. The head cover 120 wraps the ultrasound probe 110 during use, so it can effectively prevent cross-infection and further improve safety during the operation. When the ultrasound probe 110 is required to acquire an image, the first fixing device 520 is moved toward the second fixing device 530 so as to insert the ultrasound probe 110 into the head cover 120; finally, the ultrasound probe 110 with the head cover 120 sleeved on it is inserted into the patient to acquire a real-time medical image during the operation.
Specifically, a slide rail may be disposed on the base 510, and the first fixing device 520 may be connected to a driving device, such as a motor, and under the driving of the driving device, the first fixing device 520 may slide along the slide rail to approach and separate from the second fixing device 530. Of course, as will be appreciated by those skilled in the art, in other embodiments, the first fastening device 520 may be moved toward and away from the second fastening device 530 by other means known in the art.
Referring again to FIGS. 1 and 2, the control device 200 includes a lesion identification module 210, an automatic planning module 220 and a control module 230, which are communicatively connected. The lesion identification module 210 is configured to obtain real-time lesion information from an intraoperative real-time medical image and a preoperative medical image; the automatic planning module 220 is configured to plan a surgical path according to the real-time lesion information to obtain a target surgical path; and the control module 230 is configured to control the surgical device 300 to perform the procedure according to the acquired surgical operation parameters and the target surgical path. The surgical system provided by the invention can therefore automatically acquire real-time lesion information through the lesion identification module 210; the automatic planning module 220 can automatically plan a surgical path, such as a puncture path or a cutting path, according to that information; and the control module 230 can automatically control the surgical device 300 to perform the surgery along the planned target path so as to eliminate the lesion. Because the surgical path is planned from real-time lesion information, the planned target path is itself up to date, which makes the subsequent surgical execution stage more accurate and more likely to eliminate the lesion.
Preferably, the real-time lesion information may include real-time lesion position information, real-time lesion volume information, real-time lesion shape information, real-time critical organ and tissue information, and the like. It should be noted that, as will be appreciated by those skilled in the art, in some embodiments the real-time lesion information may include only real-time lesion position information; in some embodiments, only real-time lesion position information and real-time lesion volume information; in some embodiments, only real-time lesion position information and real-time lesion shape information; in some embodiments, real-time lesion position information, real-time lesion volume information, and real-time lesion shape information; and in some embodiments, only real-time lesion position information and real-time critical organ and tissue information. The real-time critical organ and tissue information includes information on the organ where the lesion is located, the peripheral organs at risk, and the like.
Further, as shown in fig. 1 and 2, the lesion identification module 210 includes an image acquisition unit 211, an image registration unit 212, and a lesion identification unit 213. The image acquisition unit 211 is configured to acquire a preoperative medical image and an intraoperative real-time medical image; the image registration unit 212 is configured to register the two images to obtain a real-time registration image; and the lesion identification unit 213 is configured to obtain real-time lesion information from the real-time registration image. By registering the high-definition preoperative medical image with the intraoperative real-time medical image, the lesion identification module 210 obtains a high-definition intraoperative real-time image and derives the real-time lesion information from it. This not only improves lesion identification accuracy, reduces the doctor's workload, and lays a good foundation for the subsequent surgical path planning and surgical execution stages, but also ensures that the acquired real-time lesion information is synchronized with the time of surgery, thereby solving the prior-art problem that the current lesion cannot be fully displayed because imaging precedes surgery, and further facilitating elimination of the lesion.
Continuing to refer to fig. 4, a schematic flow chart of image registration according to a first embodiment of the present invention is schematically shown. As shown in fig. 4, when the preoperative medical image is a CT image or an MRI image, and the intraoperative real-time medical image is a CT image or an MRI image, the image registration unit 212 registers the preoperative medical image and the intraoperative real-time medical image to obtain a real-time registration image by specifically:
performing three-dimensional modeling on the preoperative medical image to obtain a preoperative three-dimensional medical image;
performing three-dimensional modeling on the intraoperative real-time medical image to obtain an intraoperative real-time three-dimensional medical image;
registering the pre-operative three-dimensional medical image to the intra-operative real-time three-dimensional medical image to obtain a real-time registered image.
Image registration refers to finding a spatial transformation, or a series of spatial transformations, for one image so that it spatially matches corresponding points on another image; for medical images, matching means that the same anatomical point of the human body occupies the same spatial position in both images. Many registration methods exist for medical images, and they can be classified in different ways. According to the image feature used, they can be divided into methods based on internal image features (e.g., boundary-based methods and voxel-similarity-based methods) and methods based on external image features (e.g., the calibration-frame method and the external skin-marker method). According to the geometric transformation used, they can be divided into linear registration transformations (rigid, affine, and projective transformations) and non-linear registration transformations, commonly called elastic registration transformations. Specifically, in this embodiment, an elastic registration method may be adopted: an appropriate elastic transformation model is constructed, and the preoperative three-dimensional medical image is registered to the intraoperative real-time three-dimensional medical image based on that model, so as to obtain a registered intraoperative real-time three-dimensional medical image, i.e., the registration image.
Because elastic registration adapts better to local deformation, it can achieve more accurate registration. It should be noted that, as will be understood by those skilled in the art, in other embodiments image registration methods other than elastic registration may be used to register the preoperative three-dimensional medical image to the intraoperative real-time three-dimensional medical image, such as rigid, affine, or projective transformation registration; the present invention is not limited in this respect.
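As a concrete illustration of the rigid-transformation registration mentioned as an alternative above, the optimal rotation and translation between two sets of corresponding points can be computed in closed form with the Kabsch algorithm. This is a general sketch, not the embodiment's method; the fiducial coordinates below are hypothetical.

```python
import numpy as np

def rigid_register(src, dst):
    """Kabsch algorithm: find rotation R and translation t so that dst ~ R @ src + t."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)   # center both point clouds
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)             # SVD of the covariance
    d = np.sign(np.linalg.det(Vt.T @ U.T))                # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    return R, t

# Hypothetical fiducial markers before and after a known rigid motion
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
dst = src @ R_true.T + np.array([5.0, -2.0, 1.0])
R, t = rigid_register(src, dst)   # recovers R_true and the translation
```

Elastic registration generalizes this by replacing the single global transform with a spatially varying deformation field, which is why it handles local soft-tissue motion better.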
Since the preoperative CT image or MRI image acquired in the embodiment is a high-definition image, the preoperative three-dimensional medical image is registered to the intraoperative real-time three-dimensional medical image, so that the definition of the registered intraoperative real-time three-dimensional medical image can be effectively improved, a good foundation can be laid for subsequent lesion identification, and the accuracy of lesion identification can be improved. In addition, according to the embodiment, the preoperative medical image and the intraoperative real-time medical image are subjected to three-dimensional modeling, and then the preoperative three-dimensional medical image is registered to the intraoperative real-time three-dimensional medical image, so that a three-dimensional registration image can be obtained, and further the real-time lesion information can be displayed more comprehensively through the three-dimensional registration image.
Further, the real-time registration image is an image obtained by fusing the registered preoperative three-dimensional medical image with the intraoperative real-time three-dimensional medical image. Image fusion synthesizes two or more images into a new image by a specific algorithm; the fusion result exploits the temporal and spatial correlation of the images and the complementarity of their information, so the fused image describes the scene more comprehensively and clearly. By fusing the registered preoperative three-dimensional medical image with the intraoperative real-time three-dimensional medical image, the real-time registration image combines the information of both, so the lesion can be displayed more comprehensively and the accuracy of subsequent lesion identification can be further improved.
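The fusion step described above can be illustrated with the simplest such algorithm, a voxel-wise weighted average of two already-registered images. The weight value and the tiny arrays are illustrative only; real systems use more elaborate fusion rules.

```python
import numpy as np

def fuse(img_a, img_b, alpha=0.5):
    """Voxel-wise weighted-average fusion of two registered images.

    alpha weights img_a and (1 - alpha) weights img_b; the images are
    assumed to already lie on a common grid after registration.
    """
    a, b = np.asarray(img_a, float), np.asarray(img_b, float)
    if a.shape != b.shape:
        raise ValueError("fusion requires images on a common grid")
    return alpha * a + (1.0 - alpha) * b

pre_op   = np.array([[100.0, 0.0], [0.0, 100.0]])   # toy pre-op slice
intra_op = np.array([[0.0, 100.0], [100.0, 0.0]])   # toy intra-op slice
fused = fuse(pre_op, intra_op, alpha=0.5)           # equal-weight blend
```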
Continuing to refer to fig. 5, a schematic flow chart of image registration according to a second embodiment of the present invention is schematically shown. As shown in fig. 5, in the present embodiment, the preoperative medical image is a CT image or an MRI image, the intraoperative real-time medical image is an ultrasound image, and the image registration unit 212 registers the preoperative medical image and the intraoperative real-time medical image specifically through the following processes to obtain a real-time registration image:
performing three-dimensional modeling on the preoperative medical image to obtain a preoperative three-dimensional medical image;
registering the preoperative three-dimensional medical image to the intraoperative real-time medical image to obtain a first real-time registered image;
registering the intraoperative real-time medical image to the preoperative three-dimensional medical image to obtain a second real-time registered image.
Thus, by registering the preoperative three-dimensional medical image to the intraoperative real-time medical image, a local two-dimensional real-time image, i.e. a first real-time registered image, may be obtained; by registering the intraoperative real-time medical image to the preoperative three-dimensional medical image, a global three-dimensional real-time image, i.e. a second real-time registered image, may be obtained, whereby more information about the lesion may be obtained by the first real-time registered image and the second real-time registered image.
Further, in order to improve the registration efficiency, the registering the intraoperative real-time medical image to the preoperative three-dimensional medical image to acquire a second real-time registered image includes:
performing three-dimensional modeling on the intraoperative real-time medical image to obtain an intraoperative real-time three-dimensional medical image;
registering the intraoperative real-time three-dimensional medical image to the preoperative three-dimensional medical image to obtain a second real-time registered image.
Therefore, the intraoperative real-time three-dimensional medical image is obtained by performing three-dimensional modeling on the intraoperative real-time medical image, and then the intraoperative real-time three-dimensional medical image is registered to the preoperative three-dimensional medical image to obtain a second real-time registration image, so that the registration efficiency can be greatly improved. Specifically, the ultrasound probe may be used to perform continuous scanning to obtain ultrasound images of different scanning layers (i.e., intraoperative real-time medical images), and then the ultrasound images of different scanning layers are subjected to three-dimensional modeling to obtain intraoperative real-time three-dimensional medical images. It is noted that, as will be appreciated by those skilled in the art, in other embodiments, the ultrasound images obtained for the different scan layers may also be registered layer by layer to the preoperative three-dimensional medical image to acquire the second real-time registered image.
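In its simplest form, the three-dimensional modeling of the layer-by-layer ultrasound scans described above amounts to stacking the 2-D slices along the scan axis. The sketch below assumes equally spaced, identically sized slices; a real system would also resample for probe pose and non-uniform spacing.

```python
import numpy as np

def slices_to_volume(slices):
    """Stack equally spaced 2-D ultrasound scan layers into a 3-D volume.

    Assumes all slices share one in-plane grid (a simplification of the
    modeling step in the text).
    """
    slices = [np.asarray(s, float) for s in slices]
    if len({s.shape for s in slices}) != 1:
        raise ValueError("all scan layers must share the same shape")
    return np.stack(slices, axis=0)  # axis 0 = scan (depth) direction

layers = [np.full((4, 4), k, float) for k in range(3)]  # 3 toy scan layers
volume = slices_to_volume(layers)                       # shape (3, 4, 4)
```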
With continued reference to fig. 2, as shown in fig. 2, the surgical system further includes a second image capturing device 400 communicatively connected to the control device, wherein the second image capturing device 400 is configured to capture real-time intraoperative patient skin images in real time.
Referring to fig. 6, which schematically shows a flowchart of image registration according to a third embodiment of the present invention, as shown in fig. 6, the image registration unit 212 registers the preoperative medical image and the intraoperative real-time medical image to obtain a real-time registration image, specifically, through the following processes:
performing three-dimensional modeling on the preoperative medical image to obtain a preoperative three-dimensional medical image;
performing three-dimensional modeling on the intraoperative real-time medical image to obtain an intraoperative real-time three-dimensional medical image;
carrying out three-dimensional modeling on the intraoperative real-time patient skin image to obtain an intraoperative real-time human body model image;
registering and fusing the preoperative three-dimensional medical image and the intraoperative real-time three-dimensional medical image to obtain a first real-time fusion image;
registering the first real-time fusion image to the intra-operative real-time manikin image to obtain a real-time registration image.
Therefore, in this embodiment, the second image acquisition device 400 acquires the real-time intraoperative patient skin image, and the image registration unit 212 performs three-dimensional modeling on it to obtain an intraoperative real-time human body model image; the first real-time fusion image, obtained by registering and fusing the preoperative three-dimensional medical image with the intraoperative real-time three-dimensional medical image, is then registered to the intraoperative real-time human body model image, yielding a three-dimensional real-time registration image that includes the human body contour.
Specifically, in this embodiment, the preoperative medical image may be a CT image or an MRI image, and the intraoperative real-time medical image may be a CT image, an MRI image, or an ultrasound image. The second image capturing device 400 may be an optical monitor, and in this case, the second image capturing device 400 may capture real-time intraoperative patient skin images based on an optical tracking method. Of course, as will be understood by those skilled in the art, the second image capturing device 400 may also be a binocular camera, in which case the second image capturing device 400 may capture real-time intraoperative patient skin images based on binocular vision measurement principles.
Further, the real-time registration image is obtained by fusing the registered intraoperative real-time human body model image with the first real-time fusion image. The real-time registration image thus fuses information from the preoperative three-dimensional medical image, the intraoperative real-time three-dimensional medical image, and the intraoperative three-dimensional human body model image. The lesion in the real-time registration image can therefore be displayed more comprehensively, further improving the accuracy of subsequent lesion identification, laying a good foundation for the subsequent automatic surgical path planning and automatic surgical execution stages, and further improving the automatic operation of the surgical system provided by the invention.
In an exemplary embodiment, the lesion identification unit 213 performs lesion identification on the real-time registration image by using a pre-trained deep neural network model to obtain real-time lesion information.
The real-time registration image may be one obtained by the registration method of the first embodiment or the third embodiment. Performing lesion identification on the real-time registration image with a pre-trained deep neural network model allows the lesion to be identified automatically, laying a good foundation for the subsequent automatic surgical path planning and automatic surgical execution stages and further improving the automatic operation of the surgical system provided by the invention. At the same time, it effectively reduces the doctor's workload and improves working efficiency, so that doctors can devote more energy to analyzing the patient's condition and optimizing the treatment plan. In addition, using a pre-trained deep neural network model can effectively improve both the accuracy and the efficiency of lesion identification. Specifically, the identified lesion, the organ where it is located, and the peripheral organs at risk may be displayed in different colors and shades in the real-time registration image, so that the doctor can more conveniently observe the location of the lesion.
Referring to fig. 7, a schematic flow chart of lesion identification using a deep neural network model according to an embodiment of the present invention is shown. As shown in fig. 7, in the present embodiment, the lesion may be identified using a neural network model through the following process:
step one, inputting a real-time registration image;
step two, performing modular division according to the characteristics of the real-time registration image to obtain a plurality of image modules;
step three, identifying the characteristics of each image module, and reconstructing the image based on the identified characteristics to obtain a corresponding reconstructed image;
step four, obtaining real-time lesion information and accuracy information through deep learning based on the reconstructed image;
step five, judging whether the accuracy is greater than a preset threshold value;
if yes, outputting the real-time lesion information;
if not, returning to step two and re-identifying the lesion until the accuracy is greater than the preset threshold.
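Steps one to five above form a retry loop: identification is repeated until the reported accuracy clears the preset threshold. A minimal control-flow sketch follows, with a stand-in `identify_once` function (the real model, module division, and reconstruction are not specified here, and the round cap is an added safeguard, not part of the flow in the text):

```python
def identify_until_confident(image, identify_once, threshold=0.9, max_rounds=10):
    """Repeat lesion identification until accuracy exceeds the threshold.

    identify_once(image, round_idx) -> (lesion_info, accuracy). A round
    cap is assumed here so that a poor image cannot loop forever.
    """
    for round_idx in range(max_rounds):
        lesion_info, accuracy = identify_once(image, round_idx)
        if accuracy > threshold:            # step five: threshold check
            return lesion_info, accuracy    # output the real-time lesion info
    raise RuntimeError("accuracy never exceeded the preset threshold")

# Stand-in identifier whose accuracy improves each round (illustrative only)
def fake_identify(image, round_idx):
    return {"lesion": "demo"}, 0.7 + 0.1 * round_idx

info, acc = identify_until_confident("registered-image", fake_identify)
```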
In this embodiment, the real-time registered image may be a real-time registered image obtained by using the registration method provided in the first embodiment or the third embodiment.
Referring to fig. 8, a schematic flow chart of identifying a lesion using a deep neural network model according to another embodiment of the present invention is shown. As shown in fig. 8, in this embodiment, the lesion identification unit 213 acquires real-time lesion information by the following processes:
segmenting the preoperative three-dimensional medical image by adopting a pre-trained deep neural network model to obtain a segmented image;
fusing the segmented image with the first real-time registration image to obtain a second real-time fused image;
fusing the segmented image with the second real-time registration image to obtain a third real-time fused image;
fusing the second real-time fusion image with the third real-time fusion image to obtain a fourth real-time fusion image;
and acquiring real-time lesion information according to the fourth real-time fusion image.
Therefore, by segmenting the acquired preoperative three-dimensional medical image with a pre-trained deep neural network model, a segmented image containing the lesion, the organ where the lesion is located, and the peripheral organs at risk can be obtained. Specifically, the lesion, the organ where it is located, and the peripheral organs at risk may be displayed in different colors and shades; the segmented image is fused with the first real-time registration image and the second real-time registration image, respectively, to obtain a second and a third real-time fusion image, and finally these two are fused, so that a fourth real-time fusion image containing the real-time coordinates of the lesion, the organ where the lesion is located, and the peripheral organs at risk can be output.
Please continue to refer to fig. 9, which schematically shows a training process of the deep neural network model according to an embodiment of the present invention. As shown in fig. 9, the deep neural network model can be obtained by the following process:
obtaining training samples;
preprocessing the training samples, for example by region segmentation and labeling of lesions and critical organs;
constructing a three-dimensional segmentation deep neural network loss function based on the clustering idea, and designing a three-dimensional deep neural network structure;
inputting the preprocessed training samples into the three-dimensional deep neural network structure for training to obtain the deep neural network model.
The training samples come from existing case images. For how to construct a three-dimensional segmentation deep neural network loss function based on the clustering idea, how to design the three-dimensional deep neural network structure, and how to train that structure with the preprocessed training samples to obtain the deep neural network model, reference may be made to the prior art; details are not repeated herein.
Please refer to fig. 10, which schematically shows a flow chart of planning a puncture path according to an embodiment of the present invention. As shown in fig. 10, the automatic planning module 220 performs the planning of the puncture surgical path specifically through the following processes:
acquiring position information of at least one lesion target point and a plurality of puncture points according to the real-time lesion information;
evaluating each puncture point according to preset conditions to obtain a target puncture point;
and connecting the corresponding lesion target point and the target puncture point to obtain a target puncture path.
Therefore, the needle insertion angle and insertion depth of the puncture instrument can be obtained from the target puncture path, and the control module 230 can automatically control the surgical device 300 to reach the lesion position along the target puncture path so as to automatically perform the puncture operation, thereby automatically eliminating the lesion.
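The needle insertion depth and angle mentioned above follow directly from the entry (puncture) point and the lesion target point by vector geometry. In this sketch the angle is measured against the vertical axis, which is one possible convention rather than necessarily the device's; the coordinates are hypothetical.

```python
import math

def needle_params(puncture_pt, target_pt):
    """Insertion depth and tilt angle from the entry point to the lesion target.

    Depth is the Euclidean distance; the angle is taken here against the
    vertical (z) axis, an assumed convention.
    """
    v = [t - p for p, t in zip(puncture_pt, target_pt)]
    depth = math.sqrt(sum(c * c for c in v))
    angle = math.degrees(math.acos(abs(v[2]) / depth))  # tilt from vertical
    return depth, angle

# Hypothetical points in millimetres: entry on the skin, target in the lesion
depth, angle = needle_params((0.0, 0.0, 0.0), (30.0, 0.0, 40.0))
```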
Further, the obtaining of the position information of at least one lesion target point and a plurality of puncture points according to the real-time lesion information includes:
acquiring a spatial mapping relationship between an image coordinate system and a surgical device coordinate system;
acquiring real-time position information of the lesion in the surgical device coordinate system according to the spatial mapping relationship and the real-time lesion information;
and acquiring the position information of at least one lesion target point and a plurality of puncture points in the surgical device coordinate system according to the position information of the lesion in that coordinate system.
Specifically, the image coordinate system is the coordinate system of the real-time registration image, or of the fourth real-time fusion image, containing the real-time lesion information. Acquisition of the spatial mapping relationship between coordinate systems may follow existing methods, which are not repeated herein. By establishing a spatial mapping between the image coordinate system and the coordinate system of the surgical device 300, the position of the lesion in the image coordinate system is converted into its position in the surgical device coordinate system, from which the positions of at least one lesion target point and a plurality of puncture points in the surgical device coordinate system are obtained. This ensures that the resulting target puncture path is directly associated with the surgical device 300, so that the control module 230 can automatically control the surgical device 300 to perform the puncture operation along that path.
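The conversion from image coordinates to surgical-device coordinates described above is commonly expressed as a 4×4 homogeneous transform obtained from a prior calibration step. The matrix and point below are hypothetical; the text itself defers the calibration method to existing techniques.

```python
import numpy as np

def to_device_frame(T_device_from_image, p_image):
    """Map a lesion point from image coordinates to device coordinates.

    T_device_from_image is a 4x4 homogeneous transform (rotation plus
    translation), assumed to come from a calibration procedure.
    """
    p = np.append(np.asarray(p_image, float), 1.0)  # homogeneous coordinates
    return (T_device_from_image @ p)[:3]

# Hypothetical calibration: 90-degree rotation about z plus a translation
T = np.array([[0.0, -1.0, 0.0, 100.0],
              [1.0,  0.0, 0.0,  50.0],
              [0.0,  0.0, 1.0, -20.0],
              [0.0,  0.0, 0.0,   1.0]])
lesion_device = to_device_frame(T, [10.0, 5.0, 30.0])  # point in device frame
```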
The lesion target point is the end point of the planned puncture path, and its position is the end-point position of that path; the puncture point is the starting point of the planned puncture path, and its position is the start-point position of that path. In practice, the number and positions of lesion target points can be set according to the positions and number of lesions: when there are multiple lesions, there are correspondingly multiple lesion target points, i.e., each lesion corresponds to at least one target point, and several target points may be set for the same lesion according to its volume, type, and other actual conditions. When puncture points are obtained from the real-time lesion information, a plurality of puncture points uniformly covering the lesion can be set according to the lesion's position, volume, and other information, or they can be set by the doctor based on experience. In one embodiment of the invention, the puncture points may be selected based on pre-acquired body surface data of the patient. Each puncture point is then evaluated against preset conditions to obtain the target puncture points, and finally each target puncture point is connected to its corresponding lesion target point to obtain the target puncture path.
Please continue to refer to fig. 11, which schematically shows a schematic flow chart of acquiring a target puncture point according to an embodiment of the present invention. As shown in fig. 11, the evaluating each of the puncture points according to the preset conditions to obtain the target puncture point specifically includes the following steps:
step A, scoring each puncture point according to a preset scoring criterion, and taking the puncture point with the highest score as a target puncture point;
step B, judging whether the target puncture point can cover all lesions;
if not, executing step C;
step C, scoring each non-target puncture point according to the preset scoring criterion, and taking the non-target puncture point with the highest score as an additional target puncture point;
step D, judging whether all the target puncture points can together cover all the lesions;
if not, repeating steps C and D until all the target puncture points together cover all the lesions.
Specifically, if the result of step B is that the target puncture point can cover all lesions, the selection of target puncture points ends, and the corresponding lesion target points and the target puncture point are directly connected to obtain the target puncture path. Judging whether a target puncture point covers all lesions means judging whether the puncturing surgical instrument can reach every lesion through that puncture point without touching a peripheral organ at risk; likewise, judging whether all target puncture points can together cover all lesions means judging whether the instrument can reach every lesion through the set of target puncture points without touching a peripheral organ at risk.
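Steps A to D above amount to a greedy covering loop: the highest-scoring remaining puncture point is promoted repeatedly until the chosen set reaches every lesion. The sketch below uses hypothetical pre-computed composite scores and static safe-reachability sets for brevity; the text re-scores the non-target points against the remaining uncovered lesions in each round.

```python
def select_target_points(candidates, lesions):
    """Greedy selection per steps A-D: take the highest-scoring point, then
    keep adding the best remaining point until all lesions are covered.

    candidates: {name: (score, reachable_lesions)} -- hypothetical composite
    scores and the lesions each point can reach without touching an organ
    at risk.
    """
    chosen, covered = [], set()
    remaining = dict(candidates)
    while covered < set(lesions):
        if not remaining:
            raise RuntimeError("no puncture-point set covers every lesion")
        best = max(remaining, key=lambda n: remaining[n][0])  # highest score
        chosen.append(best)
        covered |= remaining.pop(best)[1]
    return chosen

candidates = {
    "P1": (0.9, {"L1"}),          # best score, but reaches only lesion L1
    "P2": (0.7, {"L2", "L3"}),
    "P3": (0.6, {"L3"}),
}
targets = select_target_points(candidates, ["L1", "L2", "L3"])
```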
Further, the scoring of each puncture point according to a preset scoring criterion and taking the puncture point with the highest score as the target puncture point includes:
scoring each puncture point according to a plurality of preset scoring criteria so as to obtain each score of each puncture point;
calculating the composite score of each puncture point according to the weight corresponding to each preset scoring criterion;
and taking the puncture point with the highest composite score as the target puncture point.
The scoring of each non-target puncture point according to a preset scoring criterion and taking the non-target puncture point with the highest score as a target puncture point includes:
scoring each non-target puncture point according to the plurality of preset scoring criteria so as to obtain each score of each non-target puncture point;
calculating the composite score of each non-target puncture point according to the weight corresponding to each preset scoring criterion;
and taking the non-target puncture point with the highest composite score as the target puncture point.
Specifically, the scoring criteria include: the puncture distance (i.e., the distance between the puncture point and the lesion target point), whether the puncture path touches a peripheral organ at risk, the number of lesion target points the puncture point can reach, and so on. In operation, the puncture points are connected one by one to the lesion target points to obtain a plurality of puncture paths; for example, with N puncture points and M lesion target points, N × M puncture paths are obtained, i.e., each puncture point corresponds to M puncture paths. A first score is given to each puncture point according to the puncture-distance criterion: the more of its M puncture paths have the shortest puncture distance, the higher its first score. A second score is given according to whether each puncture path touches a peripheral organ at risk: the more of its M puncture paths avoid the peripheral organs at risk, the higher its second score. A third score is given according to the number of lesion target points the puncture point can reach: the more target points reachable, the higher its third score. The composite score of each puncture point is then calculated according to the weight assigned to each criterion, and the puncture point with the highest composite score is taken as the target puncture point.
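The weighted composite score described above can be sketched as a dot product of per-criterion scores and criterion weights. The three criteria follow the text (puncture distance, organ-at-risk avoidance, reachable target count), while the numeric scores and weights are hypothetical.

```python
def composite_score(scores, weights):
    """Weighted sum of per-criterion scores for one puncture point."""
    if set(scores) != set(weights):
        raise ValueError("each criterion needs both a score and a weight")
    return sum(scores[c] * weights[c] for c in scores)

# Hypothetical weights for the three criteria named in the text
weights = {"distance": 0.4, "risk_avoidance": 0.4, "reachable_targets": 0.2}
points = {
    "P1": {"distance": 0.8, "risk_avoidance": 1.0, "reachable_targets": 0.5},
    "P2": {"distance": 0.6, "risk_avoidance": 0.5, "reachable_targets": 1.0},
}
best = max(points, key=lambda p: composite_score(points[p], weights))
```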
When there are multiple lesions, the first target puncture point obtained may not cover them all, so additional target puncture points must be selected for the uncovered lesions. In operation, the remaining puncture points (i.e., the non-target puncture points) are connected one by one to the lesion target points of the uncovered lesions to obtain a plurality of puncture paths; each non-target puncture point is scored against each criterion, its composite score is obtained, and the one with the highest score is taken as an additional target puncture point. It is then judged whether the target puncture points obtained so far can jointly cover all lesions. If not, the non-target puncture points other than all current target puncture points are again connected to the target points of the still-uncovered lesions, scored against each criterion, and the highest-scoring one is added as a further target puncture point. This judgment and selection is repeated until all selected target puncture points together cover every lesion.
In order to further improve the operation effect and efficiency of the surgical system provided by the present invention, in an exemplary embodiment the automatic planning module 220 is further configured to plan the surgical operation parameters according to the real-time lesion information. For example, for a cryoablation operation, information such as the volume and shape of a lesion may be obtained from the real-time lesion information, and parameters such as the ablation volume, freezing time, number of freezing cycles, and freezing dose may be set automatically on that basis. Automatically planning the surgical operation parameters from the acquired real-time lesion information further reduces the physician's workload and improves surgical efficiency; compared with the prior-art practice of determining the operation parameters from the physician's experience, it also further reduces surgical risk. It should be noted that, as will be appreciated by those skilled in the art, in other embodiments the surgical operation parameters may instead be determined manually by the physician based on the acquired real-time lesion information.
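The idea of deriving operation parameters from lesion measurements could be illustrated as below. The thresholds and parameter values are invented placeholders chosen only to make the mapping concrete; they are not part of the disclosure and are not clinical guidance.

```python
from typing import Dict

def plan_cryo_parameters(lesion_volume_cm3: float) -> Dict[str, object]:
    """Toy mapping from lesion volume to cryoablation parameters.

    All numbers are illustrative placeholders, not clinical values."""
    if lesion_volume_cm3 < 5.0:
        return {"freeze_time_min": 10, "cycles": 2, "dose": "low"}
    # Larger lesions get a longer freeze, more cycles, and a higher dose.
    return {"freeze_time_min": 15, "cycles": 3, "dose": "high"}
```

A real planner would also use lesion shape and proximity to critical organ tissue, as the passage above indicates.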
Referring to fig. 12, a block diagram of a surgical system according to another embodiment of the present invention is schematically shown. As shown in fig. 12, in this embodiment the surgical system further includes a human-machine interaction module 600 communicatively connected to the control device 200 and configured for data display and interaction. The human-machine interaction module 600 may include a display device and interaction software, so that the physician can view the planned surgical path through the display device and adjust it in real time according to actual conditions. In addition, the surgical procedure can be displayed stereoscopically in real time through the display device, and the interaction software can receive control input from the physician in real time, for example to pause or otherwise control the procedure. The human-machine interaction module 600 can also record and display surgery-related information. With the human-machine interaction module 600, the entire procedure is carried out under the physician's supervision: the physician retains full control and can confirm, interrupt, or modify the procedure at any time, and can observe real-time intraoperative images through the module, thereby obtaining more surgical information and further reducing surgical risk.
Further, as shown in fig. 2 and fig. 12, the control device 200 further includes a data storage module 250 configured to store and manage data. Thus, image data, patient data, and surgery-related data may be stored, and data management functions provided, by the data storage module 250.
Further, as shown in fig. 2 and 12, the control device 200 further includes a functional safety module 240 communicatively connected to the lesion identification module 210. The functional safety module 240 is configured to monitor the real-time motion trajectory of the surgical device 300 according to the real-time registration image output by the image registration unit 212, and to output alarm information when the deviation between the real-time motion trajectory of the surgical device 300 and the planned path becomes too large. Specifically, a deviation threshold is set, and alarm information is output when the monitored deviation between the real-time motion trajectory and the planned path exceeds that threshold. In this way, the functional safety module 240 monitors the real-time motion trajectory of the surgical device 300 during the operation to prevent the actual surgical path from deviating, so that the surgical device 300 can be stopped in time if an accident occurs, protecting the patient from injury and ensuring the safety of the surgical device 300 during the procedure.
Further, the functional safety module 240 also obtains the real-time lesion information from the lesion identification unit 213 to generate safe-operation boundary information, for example by setting, according to the critical organ tissue information, a region within which other tissue will not be damaged (i.e., a safe-operation boundary region), and determines from this boundary information whether the real-time motion trajectory of the surgical device 300 exceeds the safe-operation boundary region. When the real-time motion trajectory of the surgical device 300 touches or exceeds the safe-operation boundary region, alarm information is output.
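The two safety checks above (path-deviation threshold and boundary containment) can be sketched as follows. This is a simplified illustration, not the disclosed implementation: the deviation is reduced to the distance from the instrument tip to the nearest planned-path waypoint, and the boundary test is abstracted into a caller-supplied predicate.

```python
import math
from typing import Callable, Sequence, Tuple

Point = Tuple[float, float, float]

def check_motion_safety(
    tip: Point,                               # current instrument-tip position
    planned_path: Sequence[Point],            # waypoints of the target surgical path
    boundary_contains: Callable[[Point], bool],  # True if position is inside the safe boundary
    deviation_threshold: float,               # maximum tolerated path deviation
) -> str:
    # Deviation check: distance from the tip to the nearest planned waypoint
    # (a real system would measure distance to the path segment itself).
    deviation = min(math.dist(tip, p) for p in planned_path)
    if deviation > deviation_threshold:
        return "alarm: path deviation"
    # Boundary check: the tip must stay inside the safe-operation boundary region.
    if not boundary_contains(tip):
        return "alarm: boundary exceeded"
    return "ok"
```

On an alarm result, the system described above would stop the surgical device and report through the human-machine interaction module.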
Therefore, the control module 230 may control the surgical device 300 to perform the surgery according to the acquired surgical operation parameters, the target surgical path, and the safe-operation boundary information, while the functional safety module 240 monitors the real-time motion trajectory of the surgical device 300; when the functional safety module 240 detects that the surgical device 300 has touched the operation boundary, the system automatically stops operating and gives an alarm, further improving safety during the surgical procedure. For example, in a puncture operation, when the puncture needle touches the operation boundary, the system automatically stops the operation and retracts the needle to ensure safety.
Please continue to refer to fig. 13, which schematically illustrates a workflow of the functional safety module according to an embodiment of the present invention. As shown in fig. 13, an intraoperative real-time medical image is acquired by the first image acquisition apparatus 100 and obtained by the image acquisition unit 211. The image registration unit 212 registers the preoperative medical image with the intraoperative real-time medical image in real time to obtain a real-time registered image, which is displayed by the human-machine interaction module 600 so that the motion trajectory (e.g., the puncture trajectory) of the surgical device 300 can be shown in real time. The functional safety module 240 judges in real time whether this motion trajectory deviates from the target surgical path and whether it exceeds the safe operation boundary. If the trajectory deviates too far from the target surgical path, or exceeds the safe operation boundary, the system automatically stops and outputs alarm information, which may be delivered by sound, by light, by displaying the alarm on the human-machine interaction interface, or the like.
Continuing to refer to fig. 14, a schematic partial structural view of a surgical device 300 according to an embodiment of the present invention is shown. As shown in fig. 14, the surgical device 300 includes a driving unit 310 and a surgical instrument 320, the surgical instrument 320 is mounted on the driving unit 310, and the control module 230 is configured to control the driving unit 310 to drive the surgical instrument 320 to perform a surgery according to the acquired surgical operation parameters and the target surgical path.
Specifically, please refer to fig. 15, which schematically shows a partial structural diagram of a driving unit according to an embodiment of the present invention. As shown in figs. 14 and 15, the driving unit 310 may be a robot arm whose distal end carries a holder 311 for holding the surgical instrument 320; for example, when the surgical instrument 320 is a puncture instrument, the holder 311 is a puncture instrument holder by which the puncture instrument is held. Driving the surgical instrument 320 with the robot arm further reduces surgical risk and improves safety during the procedure. It should be noted that, as will be understood by those skilled in the art, in other embodiments the driving unit 310 may be automatic equipment other than a robot arm, and the invention is not limited in this regard.
Please continue to refer to fig. 16, which schematically shows a structural diagram of a bracket according to another embodiment of the present invention. As shown in fig. 16, in this embodiment the bracket 500 is provided not only with a first fixing device 520 for fixing the ultrasonic probe 110 of the ultrasonic instrument and a second fixing device 530 for fixing the head cover 120 of the ultrasonic instrument, but also with a holder 311 for fixing the surgical instrument 320, for example a puncture instrument holder for fixing a puncture instrument. In use, the ultrasonic probe 110 and the surgical instrument 320 can be fixed together on the bracket 500, and the bracket 500 can be mounted on the driving unit 310, such as a robot arm, so that the ultrasonic probe 110 and the surgical instrument 320 are controlled simultaneously by the driving unit 310, which makes operation more convenient.
Further, the surgical instrument 320 may be an instrument for performing a puncture procedure, such as a biopsy needle for a biopsy puncture or a cryoablation needle for an ablation procedure. Of course, as will be appreciated by those skilled in the art, the surgical instrument 320 may also be an instrument other than one used for a puncture procedure, and the present invention is not limited in this regard.
When the surgical instrument 320 is an instrument for performing a puncture operation, that is, when the surgical device 300 is used for performing a puncture operation, the automatic planning module 220 is configured to plan a puncture path according to the real-time lesion information to obtain a target puncture path; the control module 230 is configured to control the surgical device 300 to perform a puncturing operation according to the acquired surgical operation parameters and the target puncturing path. As to how the automatic planning module 220 plans the puncture path according to the real-time lesion information, reference may be made to the relevant contents in the above surgical path planning, and therefore, the details thereof are not repeated.
When the surgical instrument 320 is a cryoablation needle, as shown in figs. 2 and 12, the surgical device 300 further includes a cooling device 330. The control module 230 is configured to control the cooling device 330 to supply a cold source to the cryoablation needle according to the surgical operation parameters, so that the cryoablation needle can perform the cryoablation operation with the required parameters, such as the freezing time, the number of freezing cycles, and the freezing dose. By having the control module 230 control the cooling device 330 according to the surgical operation parameters, the lesion can be eliminated autonomously.
Referring to fig. 17, a schematic structural diagram of a cooling device according to an embodiment of the invention is shown. As shown in fig. 17, the cooling device 330 includes a pre-cooling device 331, a heat exchanging device 332, and a freezing gas source 333, the pre-cooling device 331 and the heat exchanging device 332 both being connected to the freezing gas source 333 so that the temperature and flow rate of the gas entering the cryoablation needle can be better controlled. A valve 334 is provided on the pipeline connecting the heat exchanging device 332 and the freezing gas source 333, so that the flow rate of the gas through the heat exchanging device 332 can be controlled by the valve 334. The operating principles of the pre-cooling device 331 and the heat exchanging device 332 are known in the prior art and are not described in detail here.
Continuing to refer to fig. 18, a schematic partial structure diagram of a cryoablation needle according to an embodiment of the present invention is shown. As shown in fig. 18, the cryoablation needle is provided with a scale 321, a thermal insulation coating 322, a temperature sensor (not shown), and a display screen 323. Specifically, the scale 321 is provided on the outer surface of the cryoablation needle so that the insertion depth of the needle can be adjusted precisely. The thermal insulation coating 322 is provided on the inner surface of the cryoablation needle except at the tip (the end close to the patient); it effectively prevents the needle from causing frostbite to normal tissue outside the lesion during treatment, improving safety. The temperature sensor is arranged at the tip of the cryoablation needle so that the current tip temperature can be monitored in real time. The display screen 323 is arranged at the needle-holding end of the cryoablation needle (the end close to the operator, i.e., the physician) so that the temperature sensor's readings can be displayed in real time, allowing the physician to monitor the current tip temperature and further improving the cryoablation effect.
Continuing to refer to fig. 19, a schematic flow chart of a cryoablation procedure according to an embodiment of the present invention is shown. As shown in fig. 19, the cryoablation process includes the following processes:
the cryoablation needle is inserted into the lesion along the target puncture path;
the freezing mode is started, and the ice ball generated at the needle tip gradually grows;
freezing and thawing continue, and the ice ball enlarges until it covers the lesion;
the rewarming mode is started, and the ice ball melts;
it is judged whether the planned number of freezing cycles has been reached;
if so, the cryoablation needle is withdrawn and the operation is complete;
if not, the freezing mode is restarted.
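The freeze/rewarm workflow above can be sketched as a simple control loop. The callback names are hypothetical stand-ins for the actual device commands; the real system would also monitor ice-ball coverage and tip temperature during each phase.

```python
from typing import Callable

def cryoablation_cycle(
    insert: Callable[[], None],    # advance needle along the target puncture path
    freeze: Callable[[], None],    # freezing mode: ice ball grows and covers the lesion
    rewarm: Callable[[], None],    # rewarming mode: ice ball melts
    withdraw: Callable[[], None],  # retract the needle once cycles are complete
    n_cycles: int,                 # planned number of freezing cycles
) -> None:
    insert()
    for _ in range(n_cycles):      # repeat until the planned number of cycles is reached
        freeze()
        rewarm()
    withdraw()                     # procedure complete
```

Each callback would be bound to the corresponding surgical-device and cooling-device commands in a real controller.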
In summary, compared with the prior art, the surgical system provided by the invention has the following advantages:
(1) In the surgical system provided by the invention, the lesion identification module can automatically acquire real-time lesion information from the intraoperative real-time medical image and the preoperative medical image; the automatic planning module can automatically plan the surgical path in real time according to that information; and the control module can automatically control the surgical device to perform the surgery along the planned real-time surgical path (i.e., the target path) so as to eliminate the lesion. Because the surgical path is planned from the acquired real-time lesion information, the target surgical path is guaranteed to be current, which allows the lesion to be eliminated more accurately in the subsequent surgical execution stage.
(2) The lesion identification module acquires a preoperative medical image and an intraoperative real-time medical image and registers them to obtain a real-time registration image, providing a high-definition real-time intraoperative image. Real-time lesion information is then acquired from the real-time registration image, which effectively improves the accuracy of lesion identification, reduces the physician's workload, and lays a good foundation for the subsequent path-planning and surgical execution stages. It also guarantees that the acquired lesion information is synchronized with the time of surgery, overcoming the prior-art problem that imaging performed before the operation cannot fully display the current lesion, and thus favoring complete elimination of the lesion.
(3) The control device of the invention further includes a functional safety module that can monitor the real-time motion trajectory of the surgical device during execution according to the real-time registration image; because the real-time registration image is high-definition and accurate, the surgical device can be monitored quickly and precisely. The functional safety module can also derive safe-operation boundary information from the real-time lesion information, ensuring the accuracy of the safety boundary and greatly improving the safety of the surgical system during the operation.
(4) The surgical system provided by the invention further includes the human-machine interaction module, through which the entire surgical process can be displayed in real time, so that the whole procedure is carried out under the physician's supervision and the physician can observe real-time intraoperative images, thereby obtaining more surgical information and further reducing surgical risk.
It should be noted that the apparatuses and methods disclosed in the embodiments herein can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments herein. In this regard, each block in the flowchart or block diagrams may represent a module, a program, or a portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments herein may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
Furthermore, in the description of the present specification, reference to the terms "one embodiment," "some embodiments," "an example," "a specific example," "some examples," and the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, the schematic representations of these terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples, and those skilled in the art can combine the various embodiments or examples and their features described in this specification provided they do not contradict each other.
In summary, the above embodiments have described the lesion identification method, the surgical path planning method, the storage medium, and the surgical system in detail. It should be understood, however, that the above description covers only preferred embodiments of the present invention and is not intended to limit its scope; changes and modifications made by those skilled in the art in light of the above disclosure fall within the protection scope of the present invention. It will be apparent to those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention, and it is intended that the present invention cover such modifications and variations and their equivalents.

Claims (25)

1. A surgical system comprising a control device and a surgical device communicatively coupled to the control device;
the control device comprises a focus identification module, an automatic planning module and a control module which are in communication connection;
the focus identification module is used for acquiring real-time focus information according to the intraoperative real-time medical image and the preoperative medical image;
the automatic planning module is used for planning a surgical path according to the real-time lesion information so as to obtain a target surgical path;
the control module is used for controlling the surgical equipment to execute the surgery according to the acquired surgical operation parameters and the target surgery path.
2. The surgical system according to claim 1, further comprising a first image acquisition device communicatively coupled to the control device for acquiring intraoperative real-time medical images.
3. The surgical system of claim 1, wherein the lesion identification module includes an image acquisition unit, an image registration unit, and a lesion identification unit communicatively coupled;
the image acquisition unit is used for acquiring preoperative medical images and intraoperative real-time medical images;
the image registration unit is used for registering the preoperative medical image and the intraoperative real-time medical image to obtain a real-time registration image;
the focus identification unit is used for acquiring real-time focus information according to the real-time registration image.
4. The surgical system according to claim 3, wherein if the pre-operative medical image is a CT image or an MRI image and the intra-operative real-time medical image is a CT image or an MRI image, the image registration unit acquires a real-time registration image, including:
performing three-dimensional modeling on the preoperative medical image to obtain a preoperative three-dimensional medical image;
performing three-dimensional modeling on the intraoperative real-time medical image to obtain an intraoperative real-time three-dimensional medical image;
registering the pre-operative three-dimensional medical image to the intra-operative real-time three-dimensional medical image to obtain a real-time registered image.
5. The surgical system according to claim 3, wherein if the pre-operative medical image is a CT image or an MRI image and the intra-operative real-time medical image is an ultrasound image, the image registration unit acquires a real-time registration image, including:
performing three-dimensional modeling on the preoperative medical image to obtain a preoperative three-dimensional medical image;
registering the preoperative three-dimensional medical image to the intraoperative real-time medical image to obtain a first real-time registered image;
registering the intraoperative real-time medical image to the preoperative three-dimensional medical image to obtain a second real-time registered image.
6. The surgical system according to claim 3, further comprising a second image capturing device communicatively coupled to the control device, the second image capturing device configured to capture intraoperative real-time patient skin images;
the image acquisition unit is also used for acquiring an intraoperative real-time patient skin image;
the image registration unit acquires a real-time registration image, including:
performing three-dimensional modeling on the preoperative medical image to obtain a preoperative three-dimensional medical image;
performing three-dimensional modeling on the intraoperative real-time medical image to obtain an intraoperative real-time three-dimensional medical image;
carrying out three-dimensional modeling on the intraoperative real-time patient skin image to obtain an intraoperative real-time human body model image;
registering and fusing the preoperative three-dimensional medical image and the intraoperative real-time three-dimensional medical image to obtain a first real-time fusion image;
registering the first real-time fusion image to the intra-operative real-time manikin image to obtain a real-time registration image.
7. The surgical system according to claim 4 or 6, wherein the lesion recognition unit acquires real-time lesion information, including:
and identifying the focus of the real-time registration image by adopting a pre-trained deep neural network model so as to acquire real-time focus information.
8. The surgical system of claim 5, wherein the lesion recognition unit obtains real-time lesion information, comprising:
segmenting the preoperative three-dimensional medical image by adopting a pre-trained deep neural network model to obtain a segmented image;
fusing the segmented image with the first real-time registration image to obtain a second real-time fused image;
fusing the segmented image with the second real-time registration image to obtain a third real-time fused image;
fusing the second real-time fusion image with the third real-time fusion image to obtain a fourth real-time fusion image;
and acquiring real-time focus information according to the fourth real-time fusion image.
9. The surgical system of claim 1, wherein the real-time lesion information includes real-time lesion location information.
10. The surgical system of claim 9, wherein the real-time lesion information further comprises one or more of real-time lesion volume information, real-time lesion shape information, and real-time critical organ tissue information.
11. The surgical system of claim 1, wherein the automated planning module acquires a target surgical path, comprising:
acquiring position information of at least one focus target point and a plurality of puncture points according to the real-time focus information;
evaluating each puncture point according to preset conditions to obtain a target puncture point;
and connecting the corresponding focus target point and the target puncture point to obtain a target puncture path.
12. The surgical system according to claim 11, wherein the obtaining the location information of the at least one lesion target point and the plurality of puncture points based on the real-time lesion information comprises:
acquiring a space mapping relation between an image coordinate system and a surgical equipment coordinate system;
acquiring real-time position information of the focus under the coordinate system of the surgical equipment according to the space mapping relation and the real-time focus information;
and acquiring the position information of at least one focus target point and a plurality of puncture points in the coordinate system of the surgical equipment according to the real-time position information of the focus in the coordinate system of the surgical equipment.
13. The surgical system according to claim 11, wherein the evaluating each of the puncture points to obtain a target puncture point according to a predetermined condition comprises:
a, scoring each puncture point according to a preset scoring criterion, and taking the puncture point with the highest score as a target puncture point;
step B, judging whether the target puncture point can cover all focuses;
if not, executing the step C;
c, scoring each non-target puncture point according to a preset scoring criterion, and taking the non-target puncture point with the highest score as a target puncture point;
d, judging whether all the target puncture points can cover all the focuses together;
if not, repeating the steps C and D until all the target puncture points can cover the focus together.
14. The surgical system according to claim 13, wherein the scoring each of the puncture points according to a predetermined scoring criterion, the scoring a highest scoring puncture point as a target puncture point, comprises:
scoring each puncture point according to a preset multi-item scoring criterion so as to obtain each score of each puncture point;
respectively calculating the comprehensive scores of the puncture points according to the weights corresponding to the preset scoring criteria;
taking the puncture point with the highest comprehensive score as a target puncture point;
the scoring of each non-target puncture point according to a preset scoring criterion and the taking of the non-target puncture point with the highest score as the target puncture point comprise the following steps:
scoring each non-target puncture point according to a preset multi-item scoring criterion so as to obtain each score of each non-target puncture point;
calculating the comprehensive score of each non-target puncture point according to the weight corresponding to each preset score criterion;
and taking the non-target puncture point with the highest comprehensive score as a target puncture point.
15. The surgical system according to claim 3, wherein the control device further comprises a functional safety module communicatively connected to the lesion identification module, the functional safety module being configured to monitor a real-time motion trajectory of the surgical device based on the real-time registered image output by the image registration unit.
16. The surgical system of claim 15, wherein the functional safety module is further configured to obtain safety operation boundary information according to the real-time lesion information, and determine whether a real-time motion trajectory of the surgical device exceeds a safety operation boundary region according to the safety operation boundary information.
17. The surgical system of claim 1, wherein the automated planning module is further configured to plan surgical operating parameters based on the real-time lesion information.
18. The surgical system of claim 1, wherein the surgical device includes a drive unit and a surgical instrument mounted on the drive unit, and the control module is configured to control the drive unit to drive the surgical instrument to perform a surgical procedure according to the acquired surgical operation parameter and the target surgical path.
19. The surgical system according to claim 18, wherein the driving unit is a robot arm having a distal end mounted with a holder for holding the surgical instrument.
20. The surgical system of claim 19, wherein the surgical instrument is an instrument for performing a puncture procedure;
the automatic planning module is configured to plan a puncture path according to the real-time lesion information so as to obtain a target puncture path; and
the control module is configured to control the surgical device to perform the puncture procedure according to the acquired surgical operation parameters and the target puncture path.
21. The surgical system of claim 20, wherein the surgical instrument is a cryoablation needle, the surgical device further comprises a cooling device, and the control module is configured to control the cooling device to provide a cooling source to the cryoablation needle based on the surgical operation parameters.
22. The surgical system of claim 21, wherein the surgical operation parameters include a freezing time, a number of freezing cycles, and a freezing dose.
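The parameter set listed in this claim can be represented as a simple value object. This is an illustrative sketch only; the field names, units, and the derived-total helper are assumptions, not part of the patent.

```python
from dataclasses import dataclass

# Hypothetical container for the surgical operation parameters named in
# the claim: freezing time, number of freezing cycles, and freezing dose.

@dataclass(frozen=True)
class CryoParams:
    freeze_time_s: int      # duration of each freezing phase, seconds (assumed unit)
    freeze_cycles: int      # number of freeze(-thaw) cycles
    freeze_dose: float      # delivered cooling dose, arbitrary units

    def total_freeze_time_s(self) -> int:
        """Total freezing time across all cycles."""
        return self.freeze_time_s * self.freeze_cycles

params = CryoParams(freeze_time_s=600, freeze_cycles=2, freeze_dose=1.5)
```

A control module could pass such an object to the cooling device to schedule each freezing cycle.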
23. The surgical system of claim 1, further comprising a human-machine interaction module communicatively coupled to the control device, the human-machine interaction module being configured to display data and support user interaction.
24. The surgical system of claim 1, wherein the control device further comprises a data storage module for storing and managing data.
25. The surgical system according to claim 2, wherein the first image capturing device is an ultrasound apparatus, and the surgical system further comprises a support, the support comprising a base and a first fixing device and a second fixing device mounted on the base; the first fixing device is used for fixing an ultrasound probe of the ultrasound apparatus, the second fixing device is used for fixing a head cover of the ultrasound apparatus, and the first fixing device is movable toward and away from the second fixing device.
CN202110272335.9A 2021-03-12 2021-03-12 Surgical system Pending CN113057734A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110272335.9A CN113057734A (en) 2021-03-12 2021-03-12 Surgical system
PCT/CN2022/078302 WO2022188651A1 (en) 2021-03-12 2022-02-28 Surgical system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110272335.9A CN113057734A (en) 2021-03-12 2021-03-12 Surgical system

Publications (1)

Publication Number Publication Date
CN113057734A true CN113057734A (en) 2021-07-02

Family

ID=76560255

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110272335.9A Pending CN113057734A (en) 2021-03-12 2021-03-12 Surgical system

Country Status (2)

Country Link
CN (1) CN113057734A (en)
WO (1) WO2022188651A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115294124B (en) * 2022-10-08 2023-01-06 卡本(深圳)医疗器械有限公司 Ultrasonic puncture guiding planning system based on multi-mode medical image registration

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103356284A (en) * 2012-04-01 2013-10-23 中国科学院深圳先进技术研究院 Surgical navigation method and system
CN103886581A (en) * 2012-11-15 2014-06-25 西门子公司 System And Method For Registering Pre-operative And Intra-operative Images Using Biomechanical Model Simulations
CN107580716A * 2015-05-11 2018-01-12 西门子公司 Method and system for registering 2D/2.5D laparoscopic and endoscopic image data with 3D volumetric image data
CN109688934A * 2016-08-01 2019-04-26 戈尔丹斯医疗公司 Ultrasound-guided opening of the blood-brain barrier
CN110151309A * 2018-02-14 2019-08-23 上海交通大学 Multi-modal ablation preoperative planning method and device
CN110236674A * 2019-05-09 2019-09-17 苏州大学 Liver surgery navigation method and system based on structured-light scanning
CN110537960A (en) * 2018-05-29 2019-12-06 上海联影医疗科技有限公司 Puncture path determination method, storage device and robot-assisted surgery system
CN110946654A (en) * 2019-12-23 2020-04-03 中国科学院合肥物质科学研究院 Bone surgery navigation system based on multimode image fusion
CN112043383A (en) * 2020-09-30 2020-12-08 复旦大学附属眼耳鼻喉科医院 Ophthalmic surgery navigation system and electronic equipment
CN112245004A (en) * 2020-10-20 2021-01-22 哈尔滨医科大学 Ablation planning inspection method based on preoperative model and intraoperative ultrasonic image

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102512246B (en) * 2011-12-22 2014-03-26 中国科学院深圳先进技术研究院 Surgery guiding system and method
US20140142591A1 (en) * 2012-04-24 2014-05-22 Auris Surgical Robotics, Inc. Method, apparatus and a system for robotic assisted surgery
CN111093516B (en) * 2017-11-21 2023-01-10 深圳迈瑞生物医疗电子股份有限公司 Ultrasound system and method for planning ablation
CN108272508A * 2018-01-19 2018-07-13 北京工业大学 Puncture path planning method for CT-guided radiofrequency ablation of liver tumors
CN111743626B (en) * 2020-07-03 2021-09-10 海杰亚(北京)医疗器械有限公司 Tumor puncture path acquisition device and storage medium
CN113057734A (en) * 2021-03-12 2021-07-02 上海微创医疗机器人(集团)股份有限公司 Surgical system

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022188651A1 (en) * 2021-03-12 2022-09-15 上海微创医疗机器人(集团)股份有限公司 Surgical system
CN113856067A (en) * 2021-09-08 2021-12-31 中山大学 Multi-mode data fusion radiotherapy position determination method and auxiliary robot system
CN113952029A (en) * 2021-09-14 2022-01-21 杭州微引科技有限公司 Real-time stepping type percutaneous puncture planning system
CN113907883A * 2021-09-23 2022-01-11 佛山市第一人民医院(中山大学附属佛山医院) 3D visualization surgical navigation system and method for lateral skull base surgery
CN113729940A (en) * 2021-09-23 2021-12-03 上海卓昕医疗科技有限公司 Operation auxiliary positioning system and control method thereof
CN113729940B (en) * 2021-09-23 2023-05-23 上海卓昕医疗科技有限公司 Operation auxiliary positioning system and control method thereof
CN114305690A (en) * 2021-12-31 2022-04-12 杭州三坛医疗科技有限公司 Surgical navigation positioning method and device
CN114305690B (en) * 2021-12-31 2023-12-26 杭州三坛医疗科技有限公司 Surgical navigation positioning method and device
WO2023216947A1 (en) * 2022-05-07 2023-11-16 武汉联影智融医疗科技有限公司 Medical image processing system and method for interventional operation
CN116138715A * 2023-04-13 2023-05-23 南京诺源医疗器械有限公司 Endoscope system with adjustable fluorescence imaging angle
CN116473673A (en) * 2023-06-20 2023-07-25 浙江华诺康科技有限公司 Path planning method, device, system and storage medium for endoscope
CN116473673B (en) * 2023-06-20 2024-02-27 浙江华诺康科技有限公司 Path planning method, device, system and storage medium for endoscope
CN116563379B (en) * 2023-07-06 2023-09-29 湖南卓世创思科技有限公司 Marker positioning method, device and system based on model fusion
CN116563379A (en) * 2023-07-06 2023-08-08 湖南卓世创思科技有限公司 Marker positioning method, device and system based on model fusion

Also Published As

Publication number Publication date
WO2022188651A1 (en) 2022-09-15

Similar Documents

Publication Publication Date Title
CN113057734A (en) Surgical system
CA2948257C (en) Operating room safety zone
CA2958494C (en) Feedback for providing artificial bone flap
JP6483133B2 (en) System and method for tracking an insertion device
US20170245940A1 (en) Intermodal synchronization of surgical data
KR101531620B1 (en) Method of and system for overlaying NBS functional data on a live image of a brain
CN103735312B (en) Multimodal image navigation system for ultrasound-guided surgery
EP2720636B1 (en) System for guided injection during endoscopic surgery
US20140078138A1 (en) Apparatus and methods for localization and relative positioning of a surgical instrument
US20140276001A1 (en) Device and Method for Image-Guided Surgery
US20210056699A1 (en) Patient registration systems, devices, and methods for a medical procedure
CA2968879C (en) Hand guided automated positioning device controller
US11191595B2 (en) Method for recovering patient registration
CA2976573C (en) Methods for improving patient registration
CA2963865C (en) Phantom to determine positional and angular navigation system error
Shahin et al. Ultrasound-based tumor movement compensation during navigated laparoscopic liver interventions
KR20210086871A (en) System and method of interventional procedure using medical images
US20220414914A1 (en) Systems and methods for determining a volume of resected tissue during a surgical procedure
WO2023161775A1 (en) Mri based navigation
CN116236280A (en) Interventional therapy guiding method and system based on multi-mode image fusion
CN113940756A (en) Operation navigation system based on mobile DR image
CN115607275A (en) Image display mode, device, storage medium and electronic equipment
Sarkarian Image Guided Surgery: Navigation and Surgical Technologies

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination