CN114948199B - Surgical operation auxiliary system and operation path planning method

Info

Publication number: CN114948199B (granted publication of CN114948199A)
Application number: CN202210534426.XA
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: path, surgical, planning, dimensional, module
Inventors: 顾佩华, 陈光耀, 韩磊, 王慧聪, 胡顺顺, 王凯峰
Assignees: Zhejiang International Institute Of Innovative Design And Intelligent Manufacturing Tianjin University; Tianjin University
Application filed by Zhejiang International Institute Of Innovative Design And Intelligent Manufacturing Tianjin University and Tianjin University
Priority claimed from CN202210534426.XA
Legal status: Active (granted)

Classifications

    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/108 Computer aided selection or customisation of medical implants or cutting guides
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems


Abstract

The invention discloses a surgical operation auxiliary system comprising a preoperative surgical planning system and an intraoperative surgical navigation system. The preoperative surgical planning system includes an image processing module for reconstructing a three-dimensional model, a database for storing patient medical information, and a surgical planning module for planning the movement path of a surgical instrument. The intraoperative surgical navigation system includes a positioning module for measuring the spatial positions of the surgical instrument and the site to be operated on, a surgical navigation module for recording the movement path of the surgical instrument, and a path indicating device for prompting the doctor. The surgical planning module includes a surgical path optimization module and/or a surgical path planning model built on a neural network. The surgical path optimization module finds an optimal surgical path within the surgical constraint region; the surgical path planning model outputs an auxiliary surgical path that assists the doctor in decision-making. The invention helps the doctor plan the surgical path and prompts the doctor to operate along the planned path.

Description

Surgical operation auxiliary system and operation path planning method
Technical Field
The invention relates to the field of medical treatment, in particular to a surgical operation auxiliary system and a surgical path planning method.
Background
At present, doctors generally operate on patients mainly on the basis of past experience. Taking cosmetic surgery as an example, doctors usually record the patient's condition by observation and photography, so the actual surgical outcome depends heavily on the doctor's clinical experience and surgical skill. If the doctor's operation is not standardized or the doctor lacks surgical experience, unnecessary skin and tissue injuries may result, and the operation may even fail. A person's face has an important influence on personal development, job hunting, starting a family and other aspects of life, and even minor local defects can affect the overall image. Most people currently take a cautious attitude toward cosmetic surgery, mainly because the surgical outcome is difficult to guarantee, so improving surgical quality is an urgent problem to be solved.
With the development of computing technology, computer-aided surgery is becoming a new direction in clinical surgery. A computer can obtain a three-dimensional model from the patient's raw data through three-dimensional reconstruction, which makes it convenient for the doctor to formulate a surgical plan, overcomes the visual limitations of the surgeon, and makes data measurement and diagnosis more accurate. However, medical three-dimensional reconstruction is mainly used to improve the doctor's understanding of the patient's condition; an auxiliary tool that helps the doctor formulate a surgical path plan is still lacking. Some medical robots assist operations during surgery; among them, the da Vinci surgical robot is widely used in abdominal surgery, urological surgery and the like. Although such surgical assistance robots are convenient for the surgeon, they are generally expensive and require additional training in machine operation. Therefore, a surgical assistance system that can assist the doctor in preoperative diagnosis and surgical planning and that is easy to operate during surgery is particularly necessary.
Disclosure of Invention
The invention provides a surgical operation auxiliary system and a surgical path planning method for solving the technical problems in the prior art.
The technical solution adopted by the invention to solve the technical problems in the prior art is: a surgical operation auxiliary system comprising a preoperative surgical planning system and an intraoperative surgical navigation system. The preoperative surgical planning system includes: an image processing module for reconstructing a three-dimensional model, a database for storing patient medical information, and a surgical planning module for planning the movement path of a surgical instrument. The intraoperative surgical navigation system includes: a positioning module for measuring the spatial positions of the surgical instrument and the site to be operated on, a surgical navigation module for recording the movement path of the surgical instrument, and a path indicating device for prompting the doctor. The surgical planning module includes a surgical path optimization module and/or a surgical path planning model built on a neural network. The surgical path optimization module finds an optimal surgical path within the surgical constraint region based on a spatial trajectory planning algorithm. The surgical path planning model is trained on a training set constructed from the preoperative medical information, surgical paths and postoperative result data of patients who have already been operated on; it takes as input the medical information and postoperative expectation data of the patient to be operated on and outputs an auxiliary surgical path that assists the doctor in decision-making.
Further, the image processing module includes a point cloud data three-dimensional reconstruction module and a medical image three-dimensional reconstruction module. The point cloud data three-dimensional reconstruction module includes a three-dimensional scanning device that scans the surface of the site to be operated on and obtains point cloud data, and a three-dimensional model reconstruction module that performs three-dimensional reconstruction on the point cloud data with a neural network to obtain a three-dimensional model of the site to be operated on. The medical image three-dimensional reconstruction module obtains a three-dimensional model of the site to be operated on and its surrounding nerves and blood vessels through a three-dimensional reconstruction algorithm based on slice images of the site.
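As a hedged illustration of slice-based three-dimensional reconstruction (not part of the patent text), the following Python sketch extracts a surface mesh from a stack of CT/MRI slices with the marching cubes algorithm; the iso-surface threshold and the hypothetical volume loader are assumptions.

```python
import numpy as np
from skimage import measure

def reconstruct_surface(volume: np.ndarray, iso_level: float = 300.0):
    """Reconstruct a triangle mesh from a (depth, height, width) slice stack.

    `iso_level` is an assumed intensity threshold separating the tissue of
    interest from background; it would be tuned per imaging modality.
    """
    # marching_cubes returns vertices, faces, vertex normals and values
    verts, faces, normals, values = measure.marching_cubes(volume, level=iso_level)
    return verts, faces

# usage sketch: `volume` would come from a DICOM/NIfTI loader
# volume = load_ct_volume("patient_001")   # hypothetical helper, not a real API
# verts, faces = reconstruct_surface(volume)
```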
Further, the positioning module includes an electromagnetic positioning module and/or an optical positioning module. The electromagnetic positioning module includes an electromagnetic position locator. The optical positioning module includes a binocular vision positioning system, which locates the three-dimensional coordinates of spatial points.
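The binocular vision positioning mentioned above can be illustrated with a short, non-authoritative sketch: given a calibrated stereo camera pair, a matched pixel pair is triangulated into a 3D coordinate. The projection matrices and pixel coordinates below are placeholders; the patent does not prescribe a specific implementation.

```python
import numpy as np
import cv2

def triangulate_point(P_left: np.ndarray, P_right: np.ndarray,
                      uv_left: np.ndarray, uv_right: np.ndarray) -> np.ndarray:
    """Triangulate one matched pixel pair into a 3D point.

    P_left / P_right are 3x4 camera projection matrices from stereo calibration;
    uv_left / uv_right are the pixel coordinates of the same marker in each view.
    """
    point_h = cv2.triangulatePoints(P_left, P_right,
                                    uv_left.reshape(2, 1).astype(np.float64),
                                    uv_right.reshape(2, 1).astype(np.float64))
    point_h = point_h.flatten()
    return point_h[:3] / point_h[3]   # dehomogenize to (x, y, z)
```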
Further, the surgical navigation module records the spatial position of the surgical instrument in real time and compares the actual surgical path with the preoperatively planned path. On the one hand, it sends the next surgical instrument path information to the path indicating device; on the other hand, when the actual surgical path deviates from the preoperatively planned path, it sends out an early warning signal of a level corresponding to the degree of deviation.
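A minimal sketch of the deviation check described above, assuming the planned path is a polyline of 3D waypoints; the warning thresholds (in millimetres) are illustrative values, not values stated in the patent.

```python
import numpy as np

def point_to_polyline_distance(p: np.ndarray, path: np.ndarray) -> float:
    """Shortest distance from instrument tip `p` to the planned polyline `path` (Nx3)."""
    best = np.inf
    for a, b in zip(path[:-1], path[1:]):
        ab = b - a
        t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        best = min(best, np.linalg.norm(p - (a + t * ab)))
    return best

def warning_level(deviation_mm: float) -> str:
    # assumed grading; the patent only states that warning levels correspond to deviation
    if deviation_mm < 1.0:
        return "on path"
    if deviation_mm < 3.0:
        return "minor deviation"
    return "major deviation"
```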
Further, the path indicating device includes a light path indicating device or an augmented reality path indicating device. The light path indicating device includes a laser lamp with an adjustable indicating angle that receives signals from the surgical navigation module; the on/off state of the indicator lamp indicates the movement path of the surgical instrument, and the colour of the indicator lamp indicates the degree of path deviation. The augmented reality path indicating device includes wearable augmented reality glasses or a wearable augmented reality helmet; its display screen is used to display the preoperatively planned path and the actual degree of deviation.
The invention also provides a surgical path planning method in which a preoperative surgical planning system and an intraoperative surgical navigation system are provided. The preoperative surgical planning system is provided with: an image processing module for reconstructing a three-dimensional model, a database for storing patient medical information, and a surgical planning module for planning the movement path of a surgical instrument. The intraoperative surgical navigation system is provided with: a positioning module for measuring the spatial positions of the surgical instrument and the site to be operated on, a surgical navigation module for recording the movement path of the surgical instrument, and a path indicating device for prompting the doctor. The surgical planning module is provided with a surgical path optimization module and/or a surgical path planning model built on a neural network. The surgical path optimization module finds an optimal surgical path within the surgical constraint region based on a spatial trajectory planning algorithm. The surgical path planning model is trained on a training set constructed from the preoperative medical information, surgical paths and postoperative result data of patients who have already been operated on; it takes as input the medical information and postoperative expectation data of the patient to be operated on and outputs an auxiliary surgical path that assists the doctor in decision-making.
Further, the method by which the surgical planning module plans the movement path of the surgical instrument comprises the following steps:
Step one, establish a medical image data set, label the important blood vessels, nerves and key tissues and organs in the images of the data set, and train a neural network model for medical image segmentation by a deep learning method;
step two, after obtaining image data of the patient's lesion, automatically identify and segment the important blood vessels, nerves and key tissues and organs of the lesion area with the neural network model for medical image segmentation, and construct a three-dimensional model of the patient's lesion area through medical image three-dimensional reconstruction;
step three, obtain a three-dimensional model of the body surface of the patient's lesion area with a three-dimensional scanning device, and register the three-dimensional model of the lesion area with the three-dimensional model of the body surface;
step four, define the surgical constraint region, taking as a constraint condition that a medically required safe distance is kept from the blood vessels, nerves, key tissues and organs to be avoided (a sketch of this constraint region construction follows these steps);
step five, obtain an optimal surgical path within the surgical constraint region through a spatial trajectory planning algorithm, realizing automatic planning of the surgical path.
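As a hedged sketch of step four only, the constraint region can be expressed as the set of voxels whose distance to any structure to avoid exceeds the required safe distance. The voxel spacing, safe distance and mask names below are assumptions, not values given in the patent.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def constraint_region(avoid_mask: np.ndarray,
                      safe_distance_mm: float,
                      voxel_size_mm: float = 1.0) -> np.ndarray:
    """Boolean volume of voxels far enough from vessels, nerves and key organs.

    `avoid_mask` is True where segmented structures to avoid lie; the safe
    distance is a medical requirement supplied by the doctor.
    """
    # distance from every voxel to the nearest structure to avoid, in millimetres
    dist = distance_transform_edt(~avoid_mask) * voxel_size_mm
    return dist >= safe_distance_mm
```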
Further, step five comprises the following sub-steps:
Step C1, create n mutually parallel two-dimensional sections within the surgical constraint region, select one of the two-dimensional sections, and obtain an ideal in-plane surgical path on that plane using a spatial path planning algorithm;
step C2, obtain the in-plane ideal surgical paths for the n two-dimensional sections in turn, and stack the n planar paths in the direction perpendicular to the planes to obtain a three-dimensional curved surface in space (see the sketch after these sub-steps);
step C3, determine a safe movement interval for the surgical instrument according to medical prior conditions and medical requirements, and within this interval select a smooth three-dimensional curved surface as the optimal surgical path according to the doctor's operating difficulty.
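The following non-authoritative sketch illustrates sub-steps C1 and C2 only: each parallel section yields a 2D path, and the paths are stacked along the perpendicular axis to form a surface of 3D points. The per-plane planner is left abstract, and the section spacing is an assumed parameter.

```python
import numpy as np
from typing import Callable, List

def stack_planar_paths(sections: List[np.ndarray],
                       plan_in_plane: Callable[[np.ndarray], np.ndarray],
                       section_spacing: float = 1.0) -> np.ndarray:
    """Stack per-section 2D paths into a 3D point set approximating a surface.

    `sections` are the n parallel 2D constraint maps; `plan_in_plane` is any
    planar planner (for example the Q-learning planner sketched later in this
    document) returning an (M, 2) array of in-plane waypoints.
    """
    surface_points = []
    for k, section in enumerate(sections):
        path_2d = plan_in_plane(section)                 # (M, 2) in-plane waypoints
        z = np.full((path_2d.shape[0], 1), k * section_spacing)
        surface_points.append(np.hstack([path_2d, z]))   # lift each waypoint to 3D
    return np.vstack(surface_points)
```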
Further, the spatial trajectory planning algorithm iteratively calculates the optimal surgical path with a Q-learning algorithm; the specific steps are as follows:
Step A1, initialize the parameters: establish a Q-value table; denote the current iteration number by i and the maximum iteration number by I, and initialize i = 0; denote the current state by s_t and the action taken in this state by a_t, and set Q(s_i, a_i) = 0 in the Q-value table; initialize the learning rate α = 0.2 and the discount rate γ = 0.8, and denote the reward obtained after each action by r_t; set the upper boundary point of the surgical region as the start point and the lower boundary point as the end point; take the surgical constraint region of the three-dimensional model of the patient's lesion area as the environment space E, discretize E into n effective two-dimensional sections, select one of the two-dimensional sections, establish a coordinate system for it, and discretize the section to obtain t selectable states; denote the selectable action set by A;
Step A2, set the start point as the initial state and the end point as the terminal state, and select the optimal action a_t according to the ε-greedy strategy; the probability of selecting action a_t is

    Prob(a_t) = 1 - ε, if a_t = argmax_{a ∈ A} Q(s_t, a);  Prob(a_t) = ε, otherwise,

where ε is the greedy value, 1 - ε is the probability of selecting the optimal action a_t, max_{a ∈ A} Q(s_t, a) is the estimate of the maximum reward obtainable in the future, a denotes an action, s denotes a state, and Prob(a_t) is the probability of selecting action a_t;
Step A3, after selecting action a_t, obtain the reward r_t, which is used to determine the next action, and obtain the next state s_{t+1};
Step A4, update the Q-value table and the greedy value ε; the Q-value update formula is

    Q(s_t, a_t) ← Q(s_t, a_t) + α [ r_t + γ max_{a ∈ A} Q(s_{t+1}, a) - Q(s_t, a_t) ],

and the greedy value ε is updated adaptively;
Step A5, let s_t = s_{t+1}; judge whether the terminal state has been reached and whether Q(s_t, a_t) has converged; if the condition is not met, return to step A2; when the iteration condition is met, the optimal planned path is obtained after T iterations of training.
Further, the path indicating device adopts an augmented reality path indicating device; the display screen of the augmented reality path indicating device is used to display the preoperatively planned path and the actual degree of deviation; surgical path prompt information is displayed on the display screen as images, the prompt information being the feed direction and feed depth.
The advantages and positive effects of the invention are as follows. A surgical path planning model built on a neural network is provided; the model takes as input the medical information and postoperative expectation data of the patient to be operated on and can output a surgical path that assists the doctor in decision-making, providing a reference for the doctor when formulating a surgical path. A spatial trajectory planning algorithm module is provided, which can help the doctor find the optimal surgical path within the surgical constraint region. The path indicating device of the surgical navigation system can provide path indication to the doctor in real time during the operation and can issue an early warning signal when the actual surgical path deviates, improving the accuracy of the doctor's operation. With the path indicating device, the doctor can perform the operation along the planned path according to prompt information such as sound and light signals.
Drawings
Fig. 1 is a schematic view of the structure of a surgical assistance system of the present invention.
Detailed Description
For a further understanding of the invention, its features and advantages, reference is now made to the following examples, which are illustrated in the accompanying drawings in which:
Referring to fig. 1, a surgical operation auxiliary system includes a preoperative surgical planning system and an intraoperative surgical navigation system. The preoperative surgical planning system includes: an image processing module for reconstructing a three-dimensional model, a database for storing patient medical information, and a surgical planning module for planning the movement path of a surgical instrument. The intraoperative surgical navigation system includes: a positioning module for measuring the spatial positions of the surgical instrument and the site to be operated on, a surgical navigation module for recording the movement path of the surgical instrument, and a path indicating device for prompting the doctor. The surgical planning module includes a surgical path optimization module and/or a surgical path planning model built on a neural network. The surgical path optimization module finds an optimal surgical path within the surgical constraint region based on a spatial trajectory planning algorithm. The surgical path planning model is trained on a training set constructed from the preoperative medical information, surgical paths and postoperative result data of patients who have already been operated on; it takes as input the medical information and postoperative expectation data of the patient to be operated on and outputs an auxiliary surgical path that assists the doctor in decision-making.
Preferably, the image processing module can include a point cloud data three-dimensional reconstruction module and a medical image three-dimensional reconstruction module. The point cloud data three-dimensional reconstruction module can include a three-dimensional scanning device that scans the surface of the site to be operated on and obtains point cloud data, and a three-dimensional model reconstruction module that performs three-dimensional reconstruction on the point cloud data with a neural network to obtain a three-dimensional model of the site to be operated on. The medical image three-dimensional reconstruction module can obtain a three-dimensional model of the site to be operated on and its surrounding nerves and blood vessels through a three-dimensional reconstruction algorithm based on slice images of the site.
Preferably, the positioning module may include an electromagnetic positioning module and/or an optical positioning module; the electromagnetic positioning module may include an electromagnetic position locator; the optical positioning module may include a binocular vision positioning system for locating the three-dimensional coordinates of spatial points.
Preferably, the surgical navigation module can record the spatial position of the surgical instrument in real time and compare the actual surgical path with the preoperatively planned path; on the one hand it can send the next surgical instrument path information to the path indicating device, and on the other hand, when the actual surgical path deviates from the preoperatively planned path, it can send out an early warning signal of a level corresponding to the degree of deviation.
Preferably, the path indicating device may include a light path indicating device or an augmented reality path indicating device. The light path indicating device may include a laser lamp with an adjustable indicating angle that receives signals from the surgical navigation module; the on/off state of the indicator lamp indicates the movement path of the surgical instrument, and the colour of the indicator lamp indicates the degree of path deviation. The augmented reality path indicating device may include wearable augmented reality glasses or a wearable augmented reality helmet; its display screen can be used to display the preoperatively planned path and the actual degree of deviation.
The invention also provides a surgical path planning method in which a preoperative surgical planning system and an intraoperative surgical navigation system are provided. The preoperative surgical planning system is provided with: an image processing module for reconstructing a three-dimensional model, a database for storing patient medical information, and a surgical planning module for planning the movement path of a surgical instrument. The intraoperative surgical navigation system is provided with: a positioning module for measuring the spatial positions of the surgical instrument and the site to be operated on, a surgical navigation module for recording the movement path of the surgical instrument, and a path indicating device for prompting the doctor. The surgical planning module is provided with a surgical path optimization module and/or a surgical path planning model built on a neural network. The surgical path optimization module finds an optimal surgical path within the surgical constraint region based on a spatial trajectory planning algorithm. The surgical path planning model is trained on a training set constructed from the preoperative medical information, surgical paths and postoperative result data of patients who have already been operated on; it takes as input the medical information and postoperative expectation data of the patient to be operated on and outputs an auxiliary surgical path that assists the doctor in decision-making.
Preferably, the method by which the surgical planning module plans the movement path of the surgical instrument may comprise the following steps:
Step one, a medical image data set can be established, the important blood vessels, nerves and key tissues and organs in the data set images are labelled, and a neural network model for medical image segmentation is trained by a deep learning method;
step two, after the image data of the patient's lesion is obtained, the important blood vessels, nerves and key tissues and organs of the lesion area are automatically identified and segmented with the neural network model for medical image segmentation, and a three-dimensional model of the patient's lesion area is constructed through medical image three-dimensional reconstruction;
step three, a three-dimensional model of the body surface of the patient's lesion area can be obtained with a three-dimensional scanning device, and the three-dimensional model of the lesion area is registered with the three-dimensional model of the body surface;
step four, the surgical constraint region can be defined, taking as a constraint condition that a medically required safe distance is kept from the blood vessels, nerves, key tissues and organs to be avoided;
step five, an optimal surgical path is obtained through a spatial trajectory planning algorithm within the surgical constraint region, realizing automatic planning of the surgical path.
Preferably, step five may comprise the following sub-steps:
Step C1, within the surgical constraint region, n mutually parallel two-dimensional sections can be created; one of the two-dimensional sections is selected, and an ideal in-plane surgical path is obtained on that plane using a spatial path planning algorithm;
step C2, the in-plane ideal surgical paths are obtained for the n two-dimensional sections in turn, and the n planar paths are stacked in the direction perpendicular to the planes to obtain a three-dimensional curved surface in space;
step C3, a safe movement interval for the surgical instrument is determined according to medical prior conditions and medical requirements, and within this interval a smooth three-dimensional curved surface is selected as the optimal surgical path according to the doctor's operating difficulty.
Preferably, the spatial trajectory planning algorithm iteratively calculates the optimal surgical path using a Q-learning algorithm. Q-learning is a value-based reinforcement learning algorithm: Q stands for Q(s, a), the expected benefit of taking action a (a ∈ A) in state s (s ∈ S) at a given moment, where S is the set of states, A is the set of actions, s is an element of the state set and a is an element of the action set. The environment feeds back a corresponding benefit r according to the action a of the agent; the states and actions are organised into a Q-value table that stores the Q values, and the action a that can obtain the maximum benefit r is then selected according to the Q values. The benefit r is also called the reward r. The specific steps of the method can be as follows:
Step A1, initialize the parameters: establish a Q-value table; denote the current iteration number by i and the maximum iteration number by I, and initialize i = 0; denote the current state by s_t and the action taken in this state by a_t, and set Q(s_i, a_i) = 0 in the Q-value table; initialize the learning rate α = 0.2 and the discount rate γ = 0.8, and denote the reward obtained after each action by r_t; set the upper boundary point of the surgical region as the start point and the lower boundary point as the end point; take the surgical constraint region of the three-dimensional model of the patient's lesion area as the environment space E, discretize E into n effective two-dimensional sections, select one of the two-dimensional sections, establish a coordinate system for it, and discretize the section to obtain t selectable states; denote the selectable action set by A;
Step A2, set the start point as the initial state and the end point as the terminal state. In order to obtain the maximum reward under different behaviours as far as possible during the path search, the greedy value ε needs to be changed adaptively to prevent the algorithm from falling into a local optimum. Therefore, to ensure that the algorithm converges quickly to the optimal Q value, the optimal action a_t can be selected according to the ε-greedy strategy, the probability of selecting action a_t being

    Prob(a_t) = 1 - ε, if a_t = argmax_{a ∈ A} Q(s_t, a);  Prob(a_t) = ε, otherwise,

where ε is the greedy value, 1 - ε is the probability of selecting the optimal action a_t, max_{a ∈ A} Q(s_t, a) is the estimate of the maximum reward obtainable in the future, a denotes an action, s denotes a state, and Prob(a_t) is the probability of selecting action a_t.
Step A3, after selecting action a_t, obtain the reward r_t, which is used to determine the next action, and obtain the next state s_{t+1};
Step A4, update the Q-value table and the greedy value ε; the Q-value update formula is

    Q(s_t, a_t) ← Q(s_t, a_t) + α [ r_t + γ max_{a ∈ A} Q(s_{t+1}, a) - Q(s_t, a_t) ],

and the greedy value ε is decreased adaptively as the iterations proceed;
Step A5, let s_t = s_{t+1}; judge whether the terminal state has been reached and whether Q(s_t, a_t) has converged; if the condition is not met, return to step A2; when the iteration condition is met, the optimal planned path is obtained after T iterations of training.
The ε-greedy strategy is based on the greedy strategy, which at each step selects the best choice (the locally optimal solution) in the current state, in the hope of arriving at a globally optimal solution; with probability ε a random action is explored instead.
argmax is the function that returns the argument at which a function attains its maximum. Given a function y = f(x), x0 = argmax(f(x)) means that f(x) attains the maximum of its range when x = x0; if several points give f(x) the same maximum value, the result of argmax(f(x)) is the set of those points. In other words, argmax(f(x)) is the variable value x (or set of values) at which f(x) is maximised; "arg" here stands for "argument".
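A compact, non-authoritative Python sketch of the Q-learning planner described in steps A1 to A5, applied to one discretized two-dimensional section. The grid actions, reward values, episode count and ε-decay schedule are illustrative assumptions rather than values stated in the patent.

```python
import numpy as np

def q_learning_plan(section, start, goal, alpha=0.2, gamma=0.8,
                    eps=1.0, eps_min=0.05, eps_decay=0.995, episodes=2000):
    """Plan a path on one 2D section; `section` is True where movement is allowed."""
    h, w = section.shape
    actions = [(-1, 0), (1, 0), (0, -1), (0, 1)]           # the selectable action set A
    Q = np.zeros((h, w, len(actions)))                     # the Q-value table

    def step(state, a_idx):
        r, c = state
        dr, dc = actions[a_idx]
        nr, nc = r + dr, c + dc
        if not (0 <= nr < h and 0 <= nc < w) or not section[nr, nc]:
            return state, -10.0, False                     # leaving the constraint region is penalised
        if (nr, nc) == goal:
            return (nr, nc), 100.0, True                   # reward for reaching the end point
        return (nr, nc), -1.0, False                       # small step cost favours short paths

    for _ in range(episodes):
        state, done, steps = start, False, 0
        while not done and steps < 4 * h * w:
            if np.random.rand() < eps:                     # explore with probability eps
                a_idx = np.random.randint(len(actions))
            else:                                          # exploit: a_t = argmax_a Q(s_t, a)
                a_idx = int(np.argmax(Q[state]))
            nxt, reward, done = step(state, a_idx)
            # Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a Q(s',a) - Q(s,a))
            Q[state][a_idx] += alpha * (reward + gamma * np.max(Q[nxt]) - Q[state][a_idx])
            state, steps = nxt, steps + 1
        eps = max(eps_min, eps * eps_decay)                # adaptively decrease the greedy value

    path, state = [start], start                           # greedy rollout of the learned policy
    while state != goal and len(path) < h * w:
        state, _, _ = step(state, int(np.argmax(Q[state])))
        path.append(state)
    return path
```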
Preferably, the path indicating device adopts an augmented reality path indicating device; the display screen of the augmented reality path indicating device is used to display the preoperatively planned path and the actual degree of deviation; surgical path prompt information is displayed on the display screen as images, the prompt information being the feed direction and feed depth.
Functional modules and devices such as the image processing module, the database, the surgical planning module, the positioning module, the surgical navigation module, the path indicating device, the surgical path optimization module, the surgical path planning model, the point cloud data three-dimensional reconstruction module, the medical image three-dimensional reconstruction module, the electromagnetic positioning module, the optical positioning module, the electromagnetic position locator, the binocular vision positioning system, the light path indicating device, the augmented reality path indicating device, the laser lamp, the wearable augmented reality glasses and the wearable augmented reality helmet can all be implemented with functional modules and devices available in the prior art, or constructed from prior-art functional modules and devices using conventional technical means.
The construction and operation of the present invention will be further described with reference to a preferred embodiment thereof.
As shown in fig. 1, a surgical operation auxiliary system includes a preoperative surgical planning system and an intraoperative surgical navigation system. The preoperative surgical planning system includes: an image processing module for reconstructing a three-dimensional model, a database for storing patient medical information, and a surgical planning module for planning the movement path of a surgical instrument. The intraoperative surgical navigation system includes: a positioning module for measuring the spatial positions of the surgical instrument and the site to be operated on, a surgical navigation module for recording the movement path of the surgical instrument, and a path indicating device for prompting the doctor. The surgical planning module includes a surgical path planning model built on a neural network; the model is trained on a training set constructed from the preoperative medical information, surgical paths and postoperative result data of patients who have already been operated on, takes as input the medical information and postoperative expectation data of the patient to be operated on, and outputs an auxiliary surgical path that assists the doctor in decision-making.
The preoperative surgical planning system comprises: an image processing module for reconstructing a three-dimensional model, a database of acquired surgical information, and a surgical planning module for formulating the surgical plan.
The image processing module comprises a point cloud data three-dimensional reconstruction module and a medical image three-dimensional reconstruction module. The point cloud data three-dimensional reconstruction module comprises a three-dimensional scanning device that scans the surface of the site to be operated on and obtains point cloud data, and a three-dimensional model reconstruction module that performs three-dimensional reconstruction on the point cloud data with a neural network to obtain a three-dimensional model of the site to be operated on. The medical image three-dimensional reconstruction module obtains a three-dimensional model of the site to be operated on and its surrounding nerves and blood vessels through a three-dimensional reconstruction algorithm based on slice images of the site.
The image processing module is used for three-dimensional scanning reconstruction of the patient's body surface and for medical image three-dimensional reconstruction. Three-dimensional scanning reconstruction means scanning the surface of the patient's lesion area with a high-precision three-dimensional scanning device and obtaining a three-dimensional model of the patient's body surface by three-dimensional reconstruction from the point cloud data. Medical image three-dimensional reconstruction means obtaining a three-dimensional model of the patient's lesion area and its surrounding nerves and blood vessels through a three-dimensional reconstruction algorithm based on slice images of the patient, such as computed tomography (CT) images and magnetic resonance imaging (MRI) images.
The surgical information database stores the information of patients who have already been operated on (including basic personal information, diagnosis and treatment information, and the like), the preoperative surgical plan and planned surgical path, the surgical plan and surgical path actually carried out during the operation, and the postoperative evaluation of the surgical result, providing references for formulating the surgical plan and planning the surgical path for the current patient.
The surgical plan and surgical path in the surgical planning module are formulated with computer assistance: the computer simulates the expected postoperative result according to the patient's condition, automatically completes path planning through the surgical path planning method, and assists the doctor in formulating the surgical plan. It should be noted that the results produced by the computer may serve as a reference for the doctor, or may be used for clinical surgery after the doctor has confirmed them; they are not the only basis for formulating the plan.
The surgical planning module comprises a surgical path planning model built on a neural network. The surgical path planning model may be trained on a training set constructed from the preoperative medical information, surgical paths and postoperative result data of patients who have already been operated on; it takes as input the medical information and postoperative expectation data of the patient to be operated on and outputs an auxiliary surgical path that assists the doctor in decision-making.
The surgical planning module further comprises a spatial trajectory planning algorithm module, which is used to find the optimal surgical path within the surgical constraint region.
The surgical plan and surgical path plan may be generated automatically by the surgical path planning model built on a neural network. Alternatively, the surgical path plan can be obtained by analysing the existing patient information, preparing an ideal surgical plan within the surgical constraint region, and obtaining the optimal surgical path through the spatial trajectory planning algorithm. The surgical path planning apparatus and method can be used as part of the surgical assistance system or used alone to assist the surgeon in formulating the surgical plan and planning the surgical path.
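A minimal PyTorch sketch of what such a learned planner could look like. The feature sizes, number of waypoints and network depth are assumptions made for illustration; the patent does not specify an architecture.

```python
import torch
import torch.nn as nn

class SurgicalPathPlanner(nn.Module):
    """Maps patient features plus postoperative expectation features to path waypoints."""

    def __init__(self, patient_dim=64, expectation_dim=16, n_waypoints=32):
        super().__init__()
        self.n_waypoints = n_waypoints
        self.net = nn.Sequential(
            nn.Linear(patient_dim + expectation_dim, 256),
            nn.ReLU(),
            nn.Linear(256, 256),
            nn.ReLU(),
            nn.Linear(256, n_waypoints * 3),   # (x, y, z) per waypoint
        )

    def forward(self, patient_feat, expectation_feat):
        x = torch.cat([patient_feat, expectation_feat], dim=-1)
        return self.net(x).view(-1, self.n_waypoints, 3)

# training would regress against the recorded surgical paths of past patients,
# for example with nn.MSELoss() over the waypoint coordinates
```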
The surgical navigation system comprises a positioning module for measuring the spatial positions of a surgical instrument and a patient, a surgical navigation module for recording a surgical path and a path indicating device for prompting a doctor.
The positioning module comprises an electromagnetic positioning module and/or an optical positioning module. The electromagnetic positioning module comprises an electromagnetic position locator; the optical positioning module comprises a binocular vision positioning system, which locates the three-dimensional coordinates of spatial points. The electromagnetic position locator emits a magnetic field near the patient's surgical site through a magnetic field generator and calculates spatial positions from the signal feedback of the positioning sensor and the electromagnetic positioning probe in the magnetic field.
Positioning means of the positioning module include, but are not limited to, electromagnetic positioning and optical positioning. When the interference with electromagnetic positioning signals in the operating environment is small, electromagnetic positioning is preferred; otherwise, optical positioning is preferred.
The surgical instrument may be a scalpel, and the positioning mode may be electromagnetic positioning. To reduce the interference of metallic materials with electromagnetic positioning accuracy, the scalpel handle is made of a non-metallic material such as plastic. An electromagnetic positioning sensor is built into the scalpel handle, and each surgical instrument has a unique number that the doctor can call up at any time during surgical planning.
The surgical navigation module can record the spatial position of the surgical instrument in real time and compare the actual surgical path with the preoperatively planned path; on the one hand it sends the next surgical instrument path information to the path indicating device, and on the other hand, when the actual surgical path deviates from the preoperatively planned path, it sends out an early warning signal of a level corresponding to the degree of deviation.
Implementations of the path indicating device include, but are not limited to, light indication and augmented reality technology. The path indicating device based on light indication is a laser lamp with an adjustable indicating angle; it indicates the surgical path to the doctor in real time and can indicate the degree of path deviation through colour differences, so the doctor does not need to wear any additional auxiliary equipment. The path indicating device based on augmented reality technology may be wearable augmented reality glasses or a wearable augmented reality helmet; the doctor wears the augmented reality device during the operation, the preoperatively planned path and the actual degree of deviation are displayed on the electronic screen, and the picture quality depends on the state of development of augmented reality devices.
The path indicating device may be a laser lamp with an adjustable indicating angle. After the path planning information is received, a rotating mechanism provides steering; the laser lamp is adjusted to a suitable angle and emits a low-brightness indication light, so the doctor can operate according to the position of the light spot, and the degree of path deviation is indicated by the colour of the laser lamp.
The path indicating device can also be wearable augmented reality glasses. Through a device such as Google Glass, the doctor can observe during the operation how well the preset surgical plan model matches the model of the operation actually completed, together with real-time surgical path prompt information, and can thus complete the operation without shifting the line of sight. The surgical path prompt information can be displayed on the Google Glass as images, and the prompt information can be the feed direction and feed depth, realizing accurate surgical guidance. During the operation, the surgical navigation module detects and tracks pose changes of the operated site with the electromagnetic positioning probe; when the pose change of the operated site exceeds a set threshold, the surgical navigation module can correct the path planning prompt information in the Google Glass display interface in real time, ensuring the accuracy of the relative pose between the preoperatively planned path and the operated site during the operation. When the doctor encounters a problem that is difficult to solve during the operation, the surgical picture can be transmitted in real time to members of an online expert group through the camera on the Google Glass, and the expert group members can provide real-time operating guidance or advice to the doctor through online audio. In addition, the whole surgical process can be recorded from the doctor's point of view and, combined with database technology, used for postoperative evaluation and surgical training.
To further illustrate the present disclosure, a nasal plastic surgery embodiment in facial cosmetic surgery is described.
In facial cosmetic surgery practice, the present invention can improve the rationality of the surgical plan and the accuracy of the surgical procedure. As shown in the figure, the preferred surgical planning implementation method of the invention comprises steps S101 to S104.
Step S101, select and label body surface marker points on the patient's face. A certain number of key points on the patient's face are selected as body surface marker points, and markers are attached to them. It should be noted that the markers are attached at positions that avoid the patient's nasal surgical area.
Step S102, scan the patient's face with the three-dimensional scanning device and acquire the patient's preoperative three-dimensional facial data.
Step S103, extract the position information of the characteristic marker points of the patient's three-dimensional facial model in image space.
Step S104, computer-assisted planning of the surgical path. After receiving the patient's image data, the surgical planning module simulates the optimal result after nasal plastic surgery according to the personalised characteristics of the patient's face and automatically generates a surgical path according to the simulation result.
The doctor can refer to the plan from the surgical planning module and, combining it with his or her own pathological knowledge and surgical experience, complete the surgical path planning. The specific implementation steps of the surgical planning are as follows:
Step S201, acquire CT and MRI image data of the nasal region of n patients (n = 200 in this embodiment) and establish a nasal medical image dataset; important blood vessels, nerves and key tissues in the images are manually labelled by experienced doctors as sample images. A three-dimensional convolutional neural network model for medical image segmentation is then trained by a deep learning method (a sketch of such a segmentation model follows these implementation steps).
Step S202, acquire the image data of the patient's nose, and automatically identify and segment the important blood vessels, nerves and key tissues of the nasal shaping region with the three-dimensional convolutional neural network model for medical image segmentation. A three-dimensional model of the important blood vessels, nerves, key tissues and the like inside the patient's nose is then constructed through medical image three-dimensional reconstruction.
Step S203, obtain a three-dimensional model of the patient's nose with the three-dimensional scanning device; taking this model as the reference, register the internal nasal three-dimensional model reconstructed in the previous step with the three-dimensional model of the nasal body surface, obtaining the transformation matrices between the different models.
Step S204, according to medical prior information and the medically acceptable error, and combining the simulation of the expected optimal result, set the constraint region of the nasal plastic surgery, taking as a constraint condition that a medically required safe distance is kept from the blood vessels, nerves, key tissues and organs to be avoided.
Step S205, within the surgical constraint region, take the blood vessels, nerves, tissues and organs to be avoided as obstacles, take the medically required safe distance from these obstacles as the constraint condition, and take the shortest path or the smallest wound as the objective; the optimal surgical path is obtained through the spatial trajectory planning algorithm, realizing automatic planning of the surgical path.
Preferably, the surgical path generation in step S205 may include the following steps:
Step C1, create n mutually parallel two-dimensional sections within the constraint region of the nasal plastic surgery, select one of the two-dimensional sections, and obtain an ideal in-plane surgical path on that plane using a spatial path planning algorithm;
step C2, obtain the in-plane ideal surgical paths for the n two-dimensional sections in turn, and stack the n planar paths in the direction perpendicular to the planes, thereby obtaining a three-dimensional curved surface in space;
step C3, determine a safe movement interval for the surgical instrument according to medical prior conditions and medical requirements, and within this interval select a smooth three-dimensional curved surface as the optimal surgical path according to the doctor's operating difficulty, that is, determine the cutting direction and depth of the scalpel in the nasal plastic surgery, realizing the planning of the surgical path.
Preferably, the surgical path generation in step S205 may instead be a path planning method based on path learning, comprising the following steps:
Step D1, record the actual scalpel path when experienced doctors perform nasal plastic surgery, establish a nasal plastic surgery path sample database, extract path information features, and prepare a training data set and a test data set; a three-dimensional convolutional neural network model for path planning is trained by a deep learning method;
step D2, acquire real-time image data of the patient's nose and obtain the correspondence between the patient's nose and the surgical path from the three-dimensional convolutional neural network model for path planning, thereby obtaining the planned surgical path for shaping the patient's nose.
Step S206, simulate the nasal plastic surgery process in three dimensions, verify the safety of the planned surgical path, and check whether the surgical instrument would damage important blood vessels, nerves or other structures. After the safety check, the planning of the surgical path is complete.
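A hedged sketch of what the three-dimensional convolutional segmentation model of step S201 and one training step could look like in PyTorch; the channel counts, class list and loss function are illustrative assumptions, since the patent does not specify the architecture.

```python
import torch
import torch.nn as nn

class Seg3DNet(nn.Module):
    """Tiny 3D CNN labelling each voxel as background / vessel / nerve / key tissue."""

    def __init__(self, in_channels=1, n_classes=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(32, n_classes, kernel_size=1),       # per-voxel class scores
        )

    def forward(self, x):
        return self.net(x)

def train_step(model, volumes, labels, optimizer):
    """One optimisation step on a batch of (B, 1, D, H, W) volumes with voxel labels."""
    criterion = nn.CrossEntropyLoss()
    optimizer.zero_grad()
    logits = model(volumes)                   # (B, n_classes, D, H, W)
    loss = criterion(logits, labels)          # labels: (B, D, H, W) integer classes
    loss.backward()
    optimizer.step()
    return loss.item()
```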
The spatial trajectory planning algorithm iteratively calculates the optimal surgical path with a reinforcement learning algorithm; the Q-learning algorithm is selected for path planning, with the following specific steps:
Step S301, initialize the parameters: establish a Q-value table; denote the current iteration number by i and the maximum iteration number by I, and initialize i = 0; denote the current state by s_t and the action taken in this state by a_t, and set Q(s_i, a_i) = 0 in the Q-value table; initialize the learning rate α = 0.2 and the discount rate γ = 0.8, and denote the reward obtained after each action by r_t; set the upper boundary point of the surgical region as the start point and the lower boundary point as the end point; take the surgical constraint region of the three-dimensional model of the patient's lesion area as the environment space E, discretize E into n effective two-dimensional sections, select one of the two-dimensional sections, establish a coordinate system for it, and discretize the section to obtain t selectable states; denote the selectable action set by A, the actions in this example being forward, backward, left and right;
Step S302, set the start point as the initial state s_0 and the end point as the terminal state, and select the optimal action a_t according to the ε-greedy strategy; the probability of selecting action a_t is

    Prob(a_t) = 1 - ε, if a_t = argmax_{a ∈ A} Q(s_t, a);  Prob(a_t) = ε, otherwise,

where 1 - ε is the probability of selecting the optimal action a_t;
Step S303, after selecting action a_t, obtain the reward r_t, which is used to determine the next action, and obtain the next state s_{t+1};
Step S304, update the Q-value table and the greedy value ε. The Q-value update formula is

    Q(s_t, a_t) ← Q(s_t, a_t) + α [ r_t + γ max_{a ∈ A} Q(s_{t+1}, a) - Q(s_t, a_t) ],

and the greedy value ε is decreased adaptively as training proceeds;
Step S305, let s_t = s_{t+1}; judge whether the terminal state has been reached and whether Q(s_t, a_t) has converged; if the condition is not met, continue from step S302 until the iteration condition is met, and the optimal planned path is obtained after T iterations of training.
The preferred surgical navigation implementation of the invention is as follows: register the three-dimensional image of the patient's face with the actual facial image to realize the transformation between image space and real space. The magnetic field generator of the positioning module is placed near the patient's operated site to emit a magnetic field, and the electromagnetic positioning probe measures the position information of the facial marker points to complete intraoperative registration.
In this embodiment, the mathematical model of the registration algorithm can be expressed as follows: a set of facial marker points of the patient is known, with coordinates T in real space and coordinates M in image space; in homogeneous coordinates the two are related by M = H · T, where H is the transformation matrix from T to M.
By computing the transformation matrix H from T to M, the positional relation between the two different spaces is obtained and registration is realized.
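A non-authoritative sketch of how such a point-based rigid registration could be computed with the Kabsch/SVD method; this is a standard technique substituted for illustration, as the patent does not state which registration algorithm is used.

```python
import numpy as np

def rigid_registration(T_pts: np.ndarray, M_pts: np.ndarray) -> np.ndarray:
    """Return a 4x4 homogeneous matrix H mapping real-space points T to image-space points M.

    T_pts and M_pts are corresponding (N, 3) marker coordinates.
    """
    cT, cM = T_pts.mean(axis=0), M_pts.mean(axis=0)
    # covariance of the centred point sets
    Hcov = (T_pts - cT).T @ (M_pts - cM)
    U, _, Vt = np.linalg.svd(Hcov)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cM - R @ cT
    H = np.eye(4)
    H[:3, :3], H[:3, 3] = R, t
    return H

# usage sketch: point_in_image_space = H @ np.append(point_in_real_space, 1.0)
```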
A set of surgical instruments with positioning functions is calibrated. In this embodiment, the optical positioning method is used for calibration; the calibration method comprises the following steps:
Step E1, measure the displacement of the tip of the surgical instrument simultaneously in the optical positioning mode and the electromagnetic positioning mode, and record the displacement data. The amount of data acquired and the acquisition method can be adjusted according to the required calibration accuracy; preferably, several groups of acquired data are compared.
Step E2, calculate the calibration coefficients x_c, y_c, z_c of the coordinates using the calibration coefficient calculation formula, and then obtain the compensated position coordinates, improving the accuracy of surgical instrument positioning.
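One possible reading of step E2, sketched below purely for illustration: the patent's calibration coefficient formula is not reproduced here, and a per-axis least-squares scale factor between the two displacement measurements is assumed instead.

```python
import numpy as np

def axis_calibration_coefficients(optical_disp: np.ndarray,
                                  em_disp: np.ndarray) -> np.ndarray:
    """Least-squares per-axis coefficients (x_c, y_c, z_c) relating the two modalities.

    optical_disp and em_disp are (N, 3) arrays of simultaneously recorded
    instrument-tip displacements; the optical measurement is taken as reference.
    """
    coeffs = []
    for axis in range(3):
        em, opt = em_disp[:, axis], optical_disp[:, axis]
        coeffs.append(np.dot(em, opt) / np.dot(em, em))   # 1-D least-squares fit
    return np.asarray(coeffs)

def compensate(em_position: np.ndarray, coeffs: np.ndarray) -> np.ndarray:
    """Apply the coefficients to an electromagnetic position reading."""
    return em_position * coeffs
```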
The doctor performs the operation along the planned path according to prompt information such as sound and light signals. The prompt information is transmitted to the doctor through the path indicating device; the means of transmission can be, but is not limited to, voice prompts or indicator-lamp prompts.
The above-described embodiments are only for illustrating the technical spirit and features of the present invention, and it is intended to enable those skilled in the art to understand the content of the present invention and to implement it accordingly, and the scope of the present invention is not limited to the embodiments, i.e. equivalent changes or modifications to the spirit of the present invention are still within the scope of the present invention.

Claims (5)

1. A surgical operation auxiliary system, characterized by comprising a preoperative surgical planning system and an intraoperative surgical navigation system; the preoperative surgical planning system comprises: an image processing module for reconstructing a three-dimensional model, a database for storing patient medical information, and a surgical planning module for planning the movement path of a surgical instrument; the intraoperative surgical navigation system comprises: a positioning module for measuring the spatial positions of the surgical instrument and the site to be operated on, a surgical navigation module for recording the movement path of the surgical instrument, and a path indicating device for prompting the doctor; the surgical planning module comprises a surgical path optimization module and/or a surgical path planning model built on a neural network; the surgical path optimization module finds an optimal surgical path within the surgical constraint region based on a spatial trajectory planning algorithm; the surgical path planning model is trained on a training set constructed from the preoperative medical information, surgical paths and postoperative result data of patients who have already been operated on, takes as input the medical information and postoperative expectation data of the patient to be operated on, and outputs an auxiliary surgical path that assists the doctor in decision-making; wherein:
The operation planning module realizes planning of the moving path of the operation instrument through the following method steps:
step one, establishing a medical image data set, labeling the important blood vessels, nerves and key tissues and organs in the images of the data set, and training a neural network model for medical image segmentation by a deep learning method;
step two, after obtaining the image data of the focus of the patient, automatically identifying and segmenting important blood vessels, nerves and key tissues and organs of a focus area according to a neural network model for medical image segmentation, and constructing a three-dimensional model of the focus area of the patient through a medical image three-dimensional reconstruction technology;
step three, a three-dimensional model of the body surface of the focus area of the patient is obtained through a three-dimensional scanning device, and the three-dimensional model of the focus area of the patient is registered with the three-dimensional model of the body surface;
step four, defining the range of the operation constraint area by taking as a constraint condition that a safe distance meeting medical requirements is kept from the blood vessels, nerves and key tissues and organs to be avoided;
step five, obtaining an optimal operation path through a space track planning algorithm in the range of the operation constraint area, and realizing automatic planning of the operation path;
the fifth step comprises the following sub-steps:
Step C1, creating n mutually parallel two-dimensional sections in the range of an operation constraint area, selecting one of the two-dimensional sections, and obtaining an ideal operation path in the plane by using a space path planning algorithm on the plane;
step C2, obtaining the in-plane ideal operation paths for the n two-dimensional sections in turn, and superposing the n obtained planar paths along the direction perpendicular to the planes to obtain a three-dimensional curved surface in space;
step C3, determining a safe activity interval of the surgical instrument according to the medical priori conditions and medical requirements, and selecting a smooth three-dimensional curved surface in the interval as an optimal surgical path according to the operation difficulty of a doctor;
the surgical path optimization module is based on a space track planning algorithm adopting a Q-learning algorithm, and the optimal surgical path is calculated through the following specific method steps:
step A1, initializing various parameters: establishing a Q-value table, setting the current iteration number as I, setting the maximum iteration number as I, and initializing i=0; define the current state as s t The action in this state is a t Setting Q(s) in the Q-value table i ,a i ) =0; initializing learning rate α=0.2, discount rate γ=0.8, and setting prize obtained after each action as r t The method comprises the steps of carrying out a first treatment on the surface of the Setting an upper boundary point of an operation area range as a starting point and a lower boundary point as an end point; setting the range of a surgical restriction area of a three-dimensional model of a focus area of a patient as an environmental space E, discretizing the environment space E into n effective two-dimensional sections, selecting one of the two-dimensional sections, establishing a coordinate system for the two-dimensional section, discretizing the two-dimensional section to obtain t optional states, and setting an optional action set as A;
step A2, setting the starting point as the initial state and the end point as the terminal state, and selecting an action a_t according to the ε-greedy strategy; the probability of selecting action a_t is:
Prob(a_t) = 1 − ε, if a_t = argmax_{a∈A} Q(s_t, a); Prob(a_t) = ε, otherwise (a random exploratory action);
wherein ε is the greedy value; 1 − ε is the probability of selecting the optimal action a_t; argmax_{a∈A} Q(s_t, a) estimates the action with the maximum reward obtainable in the future; a represents an action; s represents a state; Prob(a_t) is the probability of selecting action a_t;
step A3, after selecting action a_t, obtaining the reward r_t, which is used to determine the next action, and obtaining the next state s_{t+1};
Step A4, updating the Q-value table and the greedy value ε;
the Q-value updating formula is:
Q(s_t, a_t) ← Q(s_t, a_t) + α · [ r_t + γ · max_{a} Q(s_{t+1}, a) − Q(s_t, a_t) ];
the greedy value ε is updated according to its update formula;
step A5, letting s_t = s_{t+1}, and judging whether the terminal state has been reached and whether Q(s_t, a_t) has converged; if not, returning to step A2; once the iteration condition is satisfied through iterative training, the optimal planned path is obtained.
2. The surgical assist system of claim 1 wherein the image processing module comprises a point cloud data three-dimensional reconstruction module and a medical image three-dimensional reconstruction module; the point cloud data three-dimensional reconstruction module comprises a three-dimensional scanning device for carrying out three-dimensional scanning on the surface of the part to be operated and obtaining point cloud data, and a three-dimensional model reconstruction module for carrying out three-dimensional reconstruction on the point cloud data by adopting a neural network to obtain a three-dimensional model of the part to be operated; the medical image three-dimensional reconstruction module is used for obtaining a three-dimensional model of the region of the part to be operated and peripheral nerves and blood vessels thereof through a three-dimensional reconstruction algorithm based on the slice image of the part to be operated.
3. A surgical assistance system as claimed in claim 1 wherein the positioning module comprises an electromagnetic positioning module and/or an optical positioning module; the electromagnetic positioning module comprises an electromagnetic position positioner; the optical positioning module comprises a binocular vision positioning system, and the binocular vision positioning system is used for positioning three-dimensional coordinates of the space points.
4. The surgical assistance system according to claim 1, wherein the surgical navigation module records the spatial position of the surgical instrument in real time and compares and analyzes the actual surgical path against the preoperative planned path; on the one hand, it transmits the next surgical-instrument path information to the path indication device, and on the other hand, when the actual surgical path deviates from the preoperative planned path, it transmits an early-warning signal of the corresponding level according to the degree of deviation.
5. The surgical assist system of claim 4 wherein the path indication device comprises a light path indication device or an augmented reality path indication device; the light path indication device comprises a laser lamp with an adjustable indication angle that receives signals from the surgical navigation module; the indicator lamp is switched on or off to indicate the moving path of the surgical instrument, and the colour of the indicator lamp indicates the degree of path deviation; the augmented reality path indication device comprises wearable augmented reality glasses or a wearable augmented reality helmet; the display screen of the augmented reality path indication device is used for displaying the preoperative planned path and the actual degree of deviation.
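The Q-learning planning procedure of claim 1 (steps A1-A5), applied per discretized two-dimensional section, together with the slice-stacking of steps C1-C3, can be illustrated with the following sketch. Only the learning rate (0.2), the discount rate (0.8), the ε-greedy selection and the standard Q-value update are taken from the claim; the grid discretization, reward values, episode count, 4-connected action set and multiplicative ε decay are illustrative assumptions.

import numpy as np

def plan_slice_q_learning(free_mask, start, goal, episodes=500,
                          alpha=0.2, gamma=0.8, eps=1.0, eps_decay=0.995):
    """Q-learning path planning on one discretized 2-D section.

    free_mask: 2-D boolean array, True where the cell lies inside the
               operation constraint area (safe to traverse).
    start, goal: (row, col) cells for the upper and lower boundary points.
    Returns the greedy path as a list of (row, col) cells.
    """
    rows, cols = free_mask.shape
    actions = [(-1, 0), (1, 0), (0, -1), (0, 1)]            # up, down, left, right
    Q = np.zeros((rows, cols, len(actions)))
    rng = np.random.default_rng(0)

    def step(state, a):
        r, c = state[0] + actions[a][0], state[1] + actions[a][1]
        if not (0 <= r < rows and 0 <= c < cols) or not free_mask[r, c]:
            return state, -10.0, False                       # blocked: stay put, penalty
        if (r, c) == goal:
            return (r, c), 100.0, True                       # terminal state reached
        return (r, c), -1.0, False                           # ordinary step cost

    for _ in range(episodes):
        s = start
        for _ in range(rows * cols):                         # cap episode length
            if rng.random() < eps:
                a = int(rng.integers(len(actions)))          # explore
            else:
                a = int(np.argmax(Q[s[0], s[1]]))            # exploit
            s2, reward, done = step(s, a)
            # Q-learning update: Q <- Q + alpha * (r + gamma * max_a' Q' - Q)
            best_next = np.max(Q[s2[0], s2[1]])
            Q[s[0], s[1], a] += alpha * (reward + gamma * best_next - Q[s[0], s[1], a])
            s = s2
            if done:
                break
        eps *= eps_decay                                     # decay exploration rate

    # Extract the greedy path from the learned Q-value table.
    path, s, visited = [start], start, {start}
    while s != goal and len(path) < rows * cols:
        s2, _, _ = step(s, int(np.argmax(Q[s[0], s[1]])))
        if s2 in visited:
            break                                            # guard against loops
        path.append(s2)
        visited.add(s2)
        s = s2
    return path

def stack_slices(slice_paths, slice_spacing=1.0):
    """Lift n planar paths into 3-D by offsetting each along the slice normal."""
    points = []
    for k, path in enumerate(slice_paths):
        for r, c in path:
            points.append((float(c), float(r), k * slice_spacing))
    return np.array(points)                                  # candidate 3-D surface samples

In use, plan_slice_q_learning would be run once per parallel section inside the operation constraint area, and stack_slices would lift the resulting planar paths into candidate three-dimensional surface samples, from which a smooth surface satisfying the safety interval of step C3 could be selected.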
CN202210534426.XA 2022-05-17 2022-05-17 Surgical operation auxiliary system and operation path planning method Active CN114948199B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210534426.XA CN114948199B (en) 2022-05-17 2022-05-17 Surgical operation auxiliary system and operation path planning method


Publications (2)

Publication Number Publication Date
CN114948199A CN114948199A (en) 2022-08-30
CN114948199B true CN114948199B (en) 2023-08-18

Family

ID=82983069

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210534426.XA Active CN114948199B (en) 2022-05-17 2022-05-17 Surgical operation auxiliary system and operation path planning method

Country Status (1)

Country Link
CN (1) CN114948199B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115359896B (en) * 2022-10-20 2023-01-24 山东曲阜康尔健医疗科技有限公司 Operation and monitoring analysis system based on data analysis and remote control
CN116473673B (en) * 2023-06-20 2024-02-27 浙江华诺康科技有限公司 Path planning method, device, system and storage medium for endoscope
CN116919599B (en) * 2023-09-19 2024-01-09 中南大学 Haptic visual operation navigation system based on augmented reality
CN116935009B (en) * 2023-09-19 2023-12-22 中南大学 Operation navigation system for prediction based on historical data analysis
CN117274506B (en) * 2023-11-20 2024-02-02 华中科技大学同济医学院附属协和医院 Three-dimensional reconstruction method and system for interventional target scene under catheter
CN117393107B (en) * 2023-12-12 2024-03-15 北京唯迈医疗设备有限公司 Iterative learning method and system for automatic surgical intervention robot and storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004223128A (en) * 2003-01-27 2004-08-12 Hitachi Ltd Medical practice supporting apparatus and method
EP2277441A1 (en) * 2009-07-22 2011-01-26 Surgica Robotica S.p.A. Method for generating images of a human body zone undergoing a surgical operation by means of an apparatus for minimally invasive surgical procedures
CN103479430A (en) * 2013-09-22 2014-01-01 江苏美伦影像系统有限公司 Image guiding intervention operation navigation system
CN106890025A (en) * 2017-03-03 2017-06-27 浙江大学 A kind of minimally invasive operation navigating system and air navigation aid
CN106901834A (en) * 2016-12-29 2017-06-30 陕西联邦义齿有限公司 The preoperative planning of minimally invasive cardiac surgery and operation virtual reality simulation method
CN110464459A (en) * 2019-07-10 2019-11-19 丽水市中心医院 Intervention plan navigation system and its air navigation aid based on CT-MRI fusion
CN112155729A (en) * 2020-10-15 2021-01-01 中国科学院合肥物质科学研究院 Intelligent automatic planning method and system for surgical puncture path and medical system
WO2021114226A1 (en) * 2019-12-12 2021-06-17 珠海横乐医学科技有限公司 Surgical navigation system employing intrahepatic blood vessel registration
CN113081257A (en) * 2019-12-23 2021-07-09 四川医枢科技股份有限公司 Automatic planning method for operation path
CN113693725A (en) * 2021-10-22 2021-11-26 杭州维纳安可医疗科技有限责任公司 Needle insertion path planning method, device, equipment and storage medium
CN113940755A (en) * 2021-09-30 2022-01-18 南开大学 Surgical operation planning and navigation method integrating operation and image

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030220557A1 (en) * 2002-03-01 2003-11-27 Kevin Cleary Image guided liver interventions based on magnetic tracking of internal organ motion
US20070238981A1 (en) * 2006-03-13 2007-10-11 Bracco Imaging Spa Methods and apparatuses for recording and reviewing surgical navigation processes
US9925009B2 (en) * 2013-03-15 2018-03-27 Covidien Lp Pathway planning system and method
WO2014139024A1 (en) * 2013-03-15 2014-09-18 Synaptive Medical (Barbados) Inc. Planning, navigation and simulation systems and methods for minimally invasive therapy
TWI772917B (en) * 2020-10-08 2022-08-01 國立中央大學 Computer-implemented method, computer-assisted processing device and computer program product for computer-assisted planning of surgical path


Also Published As

Publication number Publication date
CN114948199A (en) 2022-08-30

Similar Documents

Publication Publication Date Title
CN114948199B (en) Surgical operation auxiliary system and operation path planning method
US20230390002A1 (en) Path-based navigation of tubular networks
KR101886990B1 (en) System and method for navigation from a medical imaging-based procedure to a target dissection target
JP5702861B2 (en) Assisted automatic data collection method for anatomical surfaces
CN106714656B System and method for intraoperative segmentation
US20180153626A1 System and methods for intraoperative guidance feedback
US8116847B2 (en) System and method for determining an optimal surgical trajectory
EP2277441A1 (en) Method for generating images of a human body zone undergoing a surgical operation by means of an apparatus for minimally invasive surgical procedures
CN109549689A (en) A kind of puncture auxiliary guide device, system and method
EP3282994B1 (en) Method and apparatus to provide updated patient images during robotic surgery
JP7418352B2 (en) Automatic tumor identification during surgery using machine learning
JP2016512973A (en) Tracking device for tracking an object relative to the body
EP3165192B1 (en) Updating a volumetric map
CN116650111A (en) Simulation and navigation method and system for bronchus foreign body removal operation
Liu et al. Study on robot-assisted minimally invasive neurosurgery and its clinical application
EP4329580A1 (en) Method and device for generating an uncertainty map for guided percutaneous procedures
US20220354579A1 (en) Systems and methods for planning and simulation of minimally invasive therapy
US20230240750A1 (en) Systems for evaluating registerability of anatomic models and associated methods
US20240099776A1 (en) Systems and methods for integrating intraoperative image data with minimally invasive medical techniques
CN117796927A (en) Mechanical arm auxiliary guiding combined AI ultrasonic image recognition system
CN118695821A (en) Systems and methods for integrating intraoperative image data with minimally invasive medical techniques
CN113643433A (en) Form and attitude estimation method, device, equipment and storage medium
US10376335B2 (en) Method and apparatus to provide updated patient images during robotic surgery
CN116419726A (en) Surgical visualization and guidance
CN118453115A (en) Real-time image guidance system based on surgery

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant