CN110338907A - Haptic navigation system for medical image-guided surgery - Google Patents

Haptic navigation system for medical image-guided surgery

Info

Publication number
CN110338907A
Authority
CN
China
Prior art keywords
module
haptic
deep learning
navigation
prediction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810302520.6A
Other languages
Chinese (zh)
Inventor
邰永航
石俊生
李琼
魏磊
陈载清
黄小乔
柳昱辰
秦芝宝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yunnan University YNU
Yunnan Normal University
Original Assignee
Yunnan Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yunnan Normal University
Priority to CN201810302520.6A
Publication of CN110338907A
Legal status: Pending (current)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25 User interfaces for surgical systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/102 Modelling of surgical devices, implants or prosthesis
    • A61B2034/104 Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2065 Tracking using image or pattern recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25 User interfaces for surgical systems
    • A61B2034/256 User interfaces for surgical systems having a database of accessory information, e.g. including context sensitive help or scientific articles

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The invention discloses a haptic navigation system for minimally invasive surgery, intended to improve the accuracy of minimally invasive surgery. It specifically comprises: a haptic data acquisition module (1), a deep learning training module (2), a surgical prediction module (3), a surgical classification module (4), a haptic visualization module (5), and a visual-haptic fusion module (6). The haptic navigation built by the present invention from force data acquired intraoperatively with real minimally invasive surgical instruments and from trained models, combined with deep learning algorithms, an artificial intelligence decision system, and haptic visualization algorithms, has high accuracy and good real-time performance. It compensates for the shortcomings of existing minimally invasive surgery guided purely by imaging, such as poor imaging quality, unclear tissue boundaries, and lack of tactile perception; it substantially improves the accuracy of difficult and complex minimally invasive operations, saves the surgeon's operating time, and reduces the pain the operation brings to the patient.

Description

Haptic navigation system for medical image-guided surgery
Technical field:
The invention belongs to the field of artificial-intelligence-assisted surgical medicine, and in particular relates to a haptic navigation method for medical image-guided surgery.
Background art:
Surgical operation remains an essential means of treating many complex diseases. The concept of surgical treatment, however, is no longer limited to traditional large-area, open procedures; with the spread of individualized, precise minimally invasive surgery, "precision medicine" is gradually being applied in clinical surgical practice. Image-guided surgery (IGS) belongs to the surgical navigation branch of computer-assisted surgery. By introducing the patient's real-time medical image data (CT/MRI/X-ray/ultrasound) during the operation and relying on virtual reality technology and optical/magnetic/electrical positioning and tracking systems, it displays in real time the spatial relationships among the surgical instruments, the patient, and the pathological tissue together with dynamic medical image information, thereby providing real-time guidance for the surgical procedure. The technique was first applied in neurosurgery and has since been widely used in spinal surgery, plastic surgery, knee joint procedures, thoracoabdominal surgery, and needle biopsy of tumors.
Existing medical image navigation systems, however, face difficulties in two main respects. On the image-guidance side: first, registration of preoperative images with intraoperative images; because organ positions differ at different moments of the operation, the relevant valid data must be accurately extracted from the preoperative data and added to the intraoperative state. Second, the real-time performance of the navigation system; during the operation, the surgeon's manipulations and the patient's breathing and positional changes alter spatial positions and cause occlusion, requiring real-time correction and re-registration of marker sensors and navigation equipment, which seriously affects the accuracy of the navigation system. Third, registration between multi-modal images; because CT, MRI, X-ray, ultrasound, and other image data arise from different imaging mechanisms, the differences between images are large, and existing multi-modal registration algorithms based on geometric features or on image density features still cannot achieve both high precision and good real-time performance. From the standpoint of surgical operation, surgical skill is fundamentally a kind of muscle memory; the haptic feedback experienced in each operation is the most direct feedback the surgeon receives about the patient's real-time state, for example whether the target tissue has been penetrated, whether the surgical operating region has been entered, and the threshold ranges of force, angle, and depth during the procedure. Yet essentially all existing navigation equipment ignores the important role of haptic navigation during surgery.
Summary of the invention:
The purpose of the present invention is to provide a haptic navigation system for medical image-guided surgery, so as to improve the accuracy of image-guided minimally invasive surgery. The system specifically comprises: a haptic data acquisition module (1), a deep learning training module (2), a surgical prediction module (3), a surgical classification module (4), a haptic visualization module (5), and a visual-haptic fusion module (6). The haptic navigation built by the present invention from force data acquired intraoperatively with real minimally invasive surgical instruments and from trained models, combined with deep learning algorithms, an artificial intelligence decision system, and haptic visualization algorithms, has high accuracy and good real-time performance. It compensates for the shortcomings of existing minimally invasive surgery guided purely by imaging, such as poor imaging quality, unclear tissue boundaries, and lack of tactile perception; it greatly improves surgical precision, shortens operating time, and reduces the pain the operation brings to the patient.
The haptic data acquisition module comprises a six-degree-of-freedom (6DOF) force sensor module, a surgical instrument module, and a custom connector module. The 6DOF force sensor records in real time the force and torque characteristics of surgical instrument operations (such as puncturing, lifting, and tearing in image-guided procedures) during surgery. The surgical instrument module is the surgical instrument used in the actual procedure (such as a puncture needle, interventional catheter, endoscope, or other image-guided surgical instrument). The custom connector (Connector) is a custom-designed part that connects the force sensor to the actual surgical instrument; preferably, the custom connector is produced by 3D printing.
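By way of illustration only, a minimal sketch of how the 6DOF force/torque samples produced by such a module might be represented and read back; the CSV layout and field names are assumptions for illustration, not part of the invention:

```python
# Hypothetical sketch: one 6DOF force/torque sample as the acquisition
# module might log it. The CSV format (t, Fx, Fy, Fz, Tx, Ty, Tz per row)
# and field names are assumptions, not taken from the patent.
from dataclasses import dataclass
import csv

@dataclass
class FTSample:
    t: float                          # timestamp (s)
    fx: float; fy: float; fz: float   # force components (N)
    tx: float; ty: float; tz: float   # torque components (N*mm)

def read_ft_log(path: str) -> list["FTSample"]:
    """Read a CSV log of t,Fx,Fy,Fz,Tx,Ty,Tz rows into samples."""
    with open(path, newline="") as f:
        return [FTSample(*map(float, row)) for row in csv.reader(f)]
```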
The deep learning training module processes the acquired haptic big data through deep neural network learning for use in haptic navigation during surgery; it mainly comprises two sub-modules: surgical haptic data prediction and surgical haptic data classification.
The surgical prediction module obtains a force-profile prediction model of the surgical procedure from the haptic data acquired in training experiments, using a prediction algorithm based on recurrent neural networks (RNN), and applies it to real-time haptic navigation during surgery.
Preferably, a long short-term memory (LSTM) network, a variant of the recurrent neural network, is used as the prediction model for haptic navigation, to prevent the vanishing-gradients phenomenon present in conventional recurrent neural networks. Its basic principle is as follows: given an input sequence x = (x_1, x_2, x_3, ..., x_T) and an output sequence h = (h_1, h_2, h_3, ..., h_T), the training output after iterating from t = 1 to t = T can be expressed as:

f_t = σ(W_f · [h_(t-1), x_t] + b_f)
i_t = σ(W_i · [h_(t-1), x_t] + b_i)
C̃_t = tanh(W_C · [h_(t-1), x_t] + b_C)
C_t = f_t ⊙ C_(t-1) + i_t ⊙ C̃_t
o_t = σ(W_o · [h_(t-1), x_t] + b_o)
h_t = o_t ⊙ tanh(C_t)

where the W are weight matrices, the b are bias vectors, σ and tanh are activation functions, C_t is the cell state, C̃_t is the candidate cell state, and f_t, i_t, o_t denote the forget, input, and output gates respectively.
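For concreteness, a minimal numpy sketch of a single LSTM iteration implementing the equations above (the dictionary-of-parameters layout is an illustrative assumption):

```python
# Minimal numpy sketch of one LSTM step, directly implementing the
# equations above; W and b hold the trained parameters.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, C_prev, W, b):
    """One LSTM iteration. W and b are dicts keyed by 'f','i','C','o';
    each W[k] has shape (hidden, hidden + inputs)."""
    z = np.concatenate([h_prev, x_t])        # [h_(t-1), x_t]
    f_t = sigmoid(W["f"] @ z + b["f"])       # forget gate
    i_t = sigmoid(W["i"] @ z + b["i"])       # input gate
    C_tilde = np.tanh(W["C"] @ z + b["C"])   # candidate cell state
    C_t = f_t * C_prev + i_t * C_tilde       # new cell state
    o_t = sigmoid(W["o"] @ z + b["o"])       # output gate
    h_t = o_t * np.tanh(C_t)                 # new hidden state
    return h_t, C_t
```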
The haptic prediction model based on the long short-term memory recurrent network comprises two parts: preoperative training and intraoperative navigation. The preoperative training module includes data augmentation, LSTM deep learning, and recurrent prediction; the intraoperative haptic navigation module includes real-time data import, activation-function parameter prediction, and surgical-procedure prediction.
The surgical classification module comprises two sub-modules: preoperative training and intraoperative navigation. The preoperative training module acquires haptic models of different human tissues and organs through the 6DOF force sensor before surgery; after deep learning training, it provides a haptic classification model library for different organs, different depths, and different layered structures of the same organ. The intraoperative navigation module uses the haptic classification model library obtained from deep learning during the actual operation to provide intraoperative haptic early warning of erroneous tissue/organ operations, and haptic navigation between organs and within multilayered organ structures.
The deep learning algorithms in the surgical classification module include deep convolutional neural networks (CNN), recurrent neural networks (RNN), and deep residual neural networks (ResNet).
The haptic visualization module includes an intraoperative real-time haptic feedback display module, a preoperative predicted haptic model display module, a multilayer tissue structure display module, and an erroneous-operation warning module.
The real-time haptic feedback display module: during medical image-guided surgery, the haptic feedback data curves of the procedure are displayed in real time on the video display.
The preoperative predicted haptic model display module: during medical image-guided surgery, based on the judgment of the deep learning prediction module, the video display shows in real time the haptic data curve predicted by deep learning.
The multilayer tissue structure display module: during medical image-guided surgery, based on the judgment of the deep learning classification module, the video display shows in real time the puncture or dissection of multilayer tissue (for example, the display after each layer of skin, fat, and muscle is punctured).
The erroneous-operation warning module: during medical image-guided surgery, based on the judgment of the deep learning classification module, the video display shows in real time an early warning when a non-target tissue or organ is punctured or dissected.
The visual-haptic fusion module superimposes the haptic navigation visualization model of module (5) onto the existing medical image-guided surgery visualization interface; it mainly comprises the superposition of the haptic navigation interface on the CT/MRI/X-ray/ultrasound navigation image interface.
Brief description of the drawings:
Fig. 1 is a schematic diagram of the structure of the system of the present invention.
Fig. 2 shows the customized haptic data acquisition device for lung tumor needle biopsy surgery.
Fig. 3 shows the haptic data collection process.
Fig. 4 is the flow chart of deep learning for haptic classification.
Fig. 5 is the flow chart of deep learning for haptic prediction.
Fig. 6 shows the intraoperative visual-haptic navigation interface.
Specific embodiment:
The present invention is described in detail below with reference to the accompanying drawings. Based on the structure shown in the schematic diagram of Fig. 1, medical image-guided lung tumor puncture surgery (PBNL) is described below as a specific embodiment, without thereby limiting the invention:
(1) Data acquisition module: as shown in Fig. 2, the surgical instrument used for medical image-guided lung tumor puncture surgery is an 18G puncture trocar, and the navigation imaging is real-time CT. The sensor used by the data acquisition module is an ATI NANO17 six-degree-of-freedom sensor, and the custom connector (Connector) is produced by 3D printing. The component diagram is shown in the left panel of Fig. 2, where component 1 is the 18G puncture trocar, component 2 is the ATI force sensor, component 3 is an M5 locking screw, component 4 is the data output line, and component 5 is the customized 3D-printed connector part. The right panel of Fig. 2 shows the assembled data acquisition module of this embodiment. Fig. 3 shows the data acquisition system setup during the specific implementation of medical image-guided lung tumor puncture surgery; all experimental setups and procedures were completed by professional cardiothoracic surgeons in a standardized operating room, on the premise of meeting ethical standards and obtaining patient consent. Component 1 is the instrument required for the operation, component 2 is the customized data acquisition module, component 3 is the data transmission module, component 4 is the equipment disinfection module, component 5 is the data acquisition and recording equipment, and component 6 is the tumor patient.
(2) Deep learning training module: this module processes the acquired haptic big data through deep neural network learning for use in haptic navigation during surgery; it mainly comprises two sub-modules: surgical haptic data prediction and surgical haptic data classification. The specific training implementation flows are shown in Fig. 4 and Fig. 5.
(3) Surgical prediction module: as shown in Fig. 4, the deep neural network model used in the specific implementation is the long short-term memory network; the selected optimal algorithm framework is LSTM (long short-term memory) + Dropout + RMSprop (gradient optimization). The specific steps are as follows:
Step 1: the collected raw force data are first processed, and erroneous data and invalid negative data are removed. We set the sequence length to 100, predicting the next group from the preceding 99 groups of data; the overall sampled sequence has size 11330 × 100, of which 80% of the data is used as training data and the remainder as validation data.
Step 2: we expand the data volume by a double-sampling algorithm to obtain more accurate training values; the data volume after expansion is 34020 × 100.
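Purely as an illustration of steps 1 and 2, a sketch of the windowing and a resampling-based augmentation; the patent does not specify the double-sampling algorithm, so interpolation at half-sample offsets is an assumption used here:

```python
# Illustrative sketch of sequence windowing (step 1) and a simple
# resampling-based augmentation (step 2). The exact double-sampling
# scheme is not disclosed; half-offset interpolation is assumed.
import numpy as np

def make_windows(signal: np.ndarray, seq_len: int = 100):
    """Slice a (T, channels) signal into windows of seq_len samples;
    the first seq_len-1 samples are the input, the last is the target."""
    X, y = [], []
    for i in range(len(signal) - seq_len + 1):
        w = signal[i:i + seq_len]
        X.append(w[:-1])
        y.append(w[-1])
    return np.array(X), np.array(y)

def augment_by_resampling(signal: np.ndarray) -> np.ndarray:
    """Create an extra signal by interpolating at half-sample offsets."""
    t = np.arange(len(signal))
    t_half = t[:-1] + 0.5
    cols = [np.interp(t_half, t, signal[:, c]) for c in range(signal.shape[1])]
    return np.stack(cols, axis=1)
```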
Step 3: to prevent overfitting of the model, we apply 20% drop-out in the LSTM and dense layers and train for 30 epochs (sufficient for convergence) with a standard mean-squared-error loss function. Validation results: MSE 0.0005 ± 0.0001, MAE 0.014 ± 0.002, R²: 0.999 ± 9.12E-5, reaching high prediction precision.
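As an illustration only, a minimal Keras sketch of the LSTM + Dropout + RMSprop setup described in steps 1-3; the sequence length, drop-out rate, epoch count, split, and MSE loss follow the text, while the hidden size (64) and the 6-channel input are assumptions:

```python
# Illustrative sketch of the described LSTM + Dropout + RMSprop model.
# Hidden size and channel count are assumptions, not from the patent.
from tensorflow import keras
from tensorflow.keras import layers

SEQ_LEN, N_FEATURES = 100, 6     # 6DOF force/torque channels assumed

model = keras.Sequential([
    layers.LSTM(64, input_shape=(SEQ_LEN - 1, N_FEATURES)),
    layers.Dropout(0.2),          # 20% drop-out, as in step 3
    layers.Dense(N_FEATURES),     # predict the next force/torque sample
])
model.compile(optimizer=keras.optimizers.RMSprop(), loss="mse",
              metrics=["mae"])
# With X, y from the windowing sketch above:
# model.fit(X, y, epochs=30, validation_split=0.2)
```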
Step 4: the prediction parameters obtained from the trained model are imported into the activation function to give real-time navigation during the actual medical image-guided lung tumor puncture procedure; the prediction response time is only 0.7 s, meeting the requirement of real-time prediction.
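Continuing the sketch above, a minimal real-time prediction loop of the kind step 4 describes might look as follows; read_next_sample is a hypothetical stand-in for the live sensor feed:

```python
# Illustrative real-time inference loop for the sketched model above;
# read_next_sample() is a hypothetical stand-in for the live 6DOF feed.
from collections import deque
import numpy as np

window = deque(maxlen=SEQ_LEN - 1)   # rolling buffer of the last 99 samples

def navigate_step(read_next_sample, model):
    window.append(read_next_sample())        # append the latest 6DOF sample
    if len(window) < SEQ_LEN - 1:
        return None                          # buffer still filling
    x = np.asarray(window)[np.newaxis, ...]  # shape (1, 99, N_FEATURES)
    return model.predict(x, verbose=0)[0]    # predicted next force sample
```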
(4) Surgical classification module: as shown in Fig. 5, the deep neural network model used in the specific implementation is a deep residual training model, and the Precision, Recall, and F-measure after training are verified. The specific steps are as follows:
Step 1: we selected the typical force-characteristic change levels involved in the operating process of medical image-guided lung tumor puncture surgery and carried out preoperative puncture experiments on cadaver tissue samples, obtaining the force characteristic curves of each organ and tissue, specifically including the haptic characteristics of the skin layer, dorsal muscle layer, intercostal muscle layer, lung, and punctured target tumor; the data volume is 3363 × 2000 groups.
Step 2: the measured haptic data of the different layers are trained for classification by the deep residual training model. By setting skip links at different levels, the data of each layer can link directly to the bottom layer; each residual module (Residual Block) is set as follows:

h_1 = block_1(x)
h_2 = block_2(h_1)
h_3 = block_3(h_2) + x

where x is the input of the module.
We set 3 convolution modules, as shown in Fig. 5, with 64, 128, and 256 filters respectively, and convolution kernel sizes of 8, 5, and 3 respectively. The Precision obtained after training is 0.900 ± 0.079, the Recall is 0.993 ± 0.009, and the F-measure value is 0.942 ± 0.039.
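As an illustrative sketch only: a 1-D residual classifier with the three convolution modules described (64/128/256 filters, kernel sizes 8/5/3, taken from the text); the activations, normalization, pooling, and the five tissue classes are assumptions:

```python
# Illustrative Keras sketch of a 1-D residual classifier of the kind
# described. Filter counts and kernel sizes follow the text; the rest
# (ReLU, batch norm, pooling, class count) are assumptions.
from tensorflow import keras
from tensorflow.keras import layers

N_CLASSES = 5  # skin, dorsal muscle, intercostal muscle, lung, tumor

def residual_block(x, filters, kernel_size):
    shortcut = layers.Conv1D(filters, 1, padding="same")(x)  # match channels
    y = layers.Conv1D(filters, kernel_size, padding="same")(x)
    y = layers.BatchNormalization()(y)
    y = layers.Activation("relu")(y)
    return layers.Add()([y, shortcut])       # skip link to the input

inputs = keras.Input(shape=(None, 6))        # 6DOF force/torque sequence
h = residual_block(inputs, 64, 8)
h = residual_block(h, 128, 5)
h = residual_block(h, 256, 3)
h = layers.GlobalAveragePooling1D()(h)
outputs = layers.Dense(N_CLASSES, activation="softmax")(h)

classifier = keras.Model(inputs, outputs)
classifier.compile(optimizer="adam",
                   loss="sparse_categorical_crossentropy",
                   metrics=["accuracy"])
```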
Step 3: the classification parameters obtained from training are imported into the optimized trained residual network to give real-time navigation during the actual operating process of medical image-guided lung tumor puncture surgery.
(5) Haptic visualization module: during the operating process of medical image-guided lung tumor puncture surgery, the haptic data curve predicted by the deep learning prediction module is displayed in real time on the CT image display, based on the judgment of that module. As shown in Fig. 6, A, B, and C are the CT images and corresponding haptic navigation visualization interfaces at the three moments when the puncture needle pierces the skin, the lung, and the outer rim of the tumor. As the figure shows, when the navigation image is blurred, the haptic navigation system of the invention gives a clear, accurate, anticipatory indication of the surgical process, and received consistently favorable evaluations from surgeons in actual trials. At the same time, during the medical image-guided lung tumor puncture procedure, based on the judgment of the deep learning classification module, the CT image display shows in real time the process of puncturing the skin layer, dorsal muscle layer, intercostal muscle layer, lung, and tumor, providing accurate haptic visualization navigation for the surgical procedure.
(6) Visual-haptic fusion module: during the operating process of medical image-guided lung tumor puncture surgery, we superimposed, through GUI programming, the haptic navigation information of module (5) onto the display of the CT-guided lung tumor puncture operation, realizing the superposition of the haptic navigation interface on the image navigation interface.
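Purely as an illustration of such an interface superposition (the patent's GUI implementation is not disclosed; matplotlib and the ct_slice/force_trace inputs are assumptions):

```python
# Illustrative sketch of a visual-haptic display: a CT slice alongside
# measured and predicted force curves. ct_slice, force_trace, and
# predicted_trace are assumed inputs for illustration.
import numpy as np
import matplotlib.pyplot as plt

def show_visual_haptic(ct_slice: np.ndarray,
                       force_trace: np.ndarray,
                       predicted_trace: np.ndarray,
                       dt: float = 0.01):
    t = np.arange(len(force_trace)) * dt
    fig, (ax_ct, ax_f) = plt.subplots(1, 2, figsize=(10, 4))
    ax_ct.imshow(ct_slice, cmap="gray")
    ax_ct.set_title("CT navigation image")
    ax_ct.axis("off")
    ax_f.plot(t, force_trace, label="measured force")
    ax_f.plot(t[:len(predicted_trace)], predicted_trace, "--",
              label="LSTM prediction")
    ax_f.set_xlabel("time (s)")
    ax_f.set_ylabel("axial force (N)")
    ax_f.set_title("Haptic navigation")
    ax_f.legend()
    plt.tight_layout()
    plt.show()
```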
The above is only one embodiment of the present invention. It should be noted that improvements and refinements made by those of ordinary skill in the art without departing from the principles of the present invention are all considered to fall within the protection scope of the present invention.

Claims (7)

1. A haptic navigation system for medical image-guided surgery, characterized by comprising: a haptic data acquisition module (1), a deep learning training module (2), a surgical prediction module (3), a surgical classification module (4), a haptic visualization module (5), and a visual-haptic fusion module (6).
2. The system according to claim 1, wherein the haptic data acquisition module (1) comprises a six-degree-of-freedom (6DOF) force sensor module, a surgical instrument module, and a custom connector module, in which:
The 6DOF force sensor records in real time the force and torque characteristics of surgical instrument operations (such as puncturing, lifting, and tearing in image-guided procedures) during surgery;
The surgical instrument module is the surgical instrument used in the actual procedure (such as a puncture needle, interventional catheter, endoscope, or other image-guided surgical instrument);
The custom connector (Connector) is a custom-designed part that connects the force sensor to the actual surgical instrument; preferably, the custom connector is produced by 3D printing.
3. The system according to claim 1, wherein the deep learning training module processes the acquired haptic big data through deep neural network learning for use in haptic navigation during surgery, and mainly comprises two sub-modules: surgical haptic data prediction and surgical haptic data classification.
4. The system according to claims 1 and 3, wherein the surgical prediction module obtains a force-profile prediction model of the surgical procedure from the haptic data acquired in training experiments, using a prediction algorithm based on recurrent neural networks (RNN), and applies it to real-time surgical haptic navigation; it comprises two parts, preoperative training and intraoperative navigation, in which:
The preoperative training module includes data augmentation, LSTM deep learning, and recurrent prediction;
The intraoperative haptic navigation module includes real-time data import, activation-function parameter prediction, and surgical-procedure prediction.
5. The system according to claims 1 and 3, wherein the surgical classification module comprises two sub-modules, preoperative training and intraoperative navigation, in which:
The preoperative training module acquires haptic models of different human tissues and organs through the 6DOF force sensor before surgery and, after deep learning training, provides a haptic classification model library for different organs, different depths, and different layered structures of the same organ;
The intraoperative navigation module uses the haptic classification model library obtained after deep learning during the actual operation to provide intraoperative haptic early warning of erroneous tissue/organ operations, and haptic navigation between organs and within multilayered organ structures;
The deep learning algorithms in the surgical classification module include deep convolutional neural networks (CNN), recurrent neural networks (RNN), and deep residual neural networks (ResNet).
6. The system according to claim 1, wherein the haptic visualization module comprises: an intraoperative real-time haptic feedback display module, a preoperative predicted haptic model display module, a multilayer tissue structure display module, and an erroneous-operation warning module, in which:
The real-time haptic feedback display module: during medical image-guided surgery, the haptic feedback data curves of the procedure are displayed in real time on the video display;
The preoperative predicted haptic model display module: during medical image-guided surgery, based on the judgment of the deep learning prediction module, the video display shows in real time the haptic data curve predicted by deep learning;
The multilayer tissue structure display module: during medical image-guided surgery, based on the judgment of the deep learning classification module, the video display shows in real time the puncture or dissection of multilayer tissue (for example, the display after each layer of skin, fat, and muscle is punctured);
The erroneous-operation warning module: during medical image-guided surgery, based on the judgment of the deep learning classification module, the video display shows in real time an early warning when a non-target tissue or organ is punctured or dissected.
7. The system according to claim 1, wherein the visual-haptic fusion module superimposes the haptic navigation visualization model onto the existing medical image-guided surgery visualization interface, comprising: the superposition of the haptic navigation interface on the CT/MRI/X-ray/ultrasound navigation image interface.
CN201810302520.6A 2018-04-05 2018-04-05 Haptic navigation system for medical image-guided surgery Pending CN110338907A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810302520.6A CN110338907A (en) 2018-04-05 2018-04-05 Haptic navigation system for medical image-guided surgery

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810302520.6A CN110338907A (en) 2018-04-05 2018-04-05 Haptic navigation system for medical image-guided surgery

Publications (1)

Publication Number Publication Date
CN110338907A true CN110338907A (en) 2019-10-18

Family

ID=68172888

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810302520.6A Pending CN110338907A (en) Haptic navigation system for medical image-guided surgery

Country Status (1)

Country Link
CN (1) CN110338907A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111714164A (en) * 2020-06-19 2020-09-29 上海交通大学 Tactile sensing device for minimally invasive surgery and use method thereof
CN111832656A (en) * 2020-07-17 2020-10-27 复旦大学 Medical human-computer interaction assistance system and computer-readable storage medium containing the same
CN111951946A (en) * 2020-07-17 2020-11-17 合肥森亿智能科技有限公司 Operation scheduling system, method, storage medium and terminal based on deep learning
CN113786239A (en) * 2021-08-26 2021-12-14 哈尔滨工业大学(深圳) Method and system for tracking and real-time early warning of surgical instruments under stomach and digestive tract
CN115607297A (en) * 2022-10-19 2023-01-17 山东大学 Tremor-suppression master-slave surgical robot control system and method
CN117017488A (en) * 2023-10-10 2023-11-10 华中科技大学同济医学院附属协和医院 Puncture arm path planning method comprising non-autonomous motion compensation

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101160104A (en) * 2005-02-22 2008-04-09 马科外科公司 Haptic guidance system and method
CN101522134A (en) * 2006-06-05 2009-09-02 泰克尼恩研究和发展基金有限公司 Controlled steering of a flexible needle
CN103458810A (en) * 2011-02-10 2013-12-18 促动医疗股份有限公司 Medical tool with electromechanical control and feedback
CN104739519A (en) * 2015-04-17 2015-07-01 中国科学院重庆绿色智能技术研究院 Force feedback surgical robot control system based on augmented reality
CN106473804A (en) * 2015-08-25 2017-03-08 韦伯斯特生物官能(以色列)有限公司 The system and method that conduit power is controlled based on contact force
WO2017042823A1 (en) * 2015-09-10 2017-03-16 Xact Robotics Ltd. Systems and methods for guiding the insertion of a medical tool
CN107582193A (en) * 2017-09-15 2018-01-16 中国人民解放军第四军医大学 A kind of intelligent robot system for tooth-planting operation

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101160104A (en) * 2005-02-22 2008-04-09 马科外科公司 Haptic guidance system and method
CN101522134A (en) * 2006-06-05 2009-09-02 泰克尼恩研究和发展基金有限公司 Controlled steering of a flexible needle
CN103458810A (en) * 2011-02-10 2013-12-18 促动医疗股份有限公司 Medical tool with electromechanical control and feedback
CN104739519A (en) * 2015-04-17 2015-07-01 中国科学院重庆绿色智能技术研究院 Force feedback surgical robot control system based on augmented reality
CN106473804A (en) * 2015-08-25 2017-03-08 韦伯斯特生物官能(以色列)有限公司 The system and method that conduit power is controlled based on contact force
WO2017042823A1 (en) * 2015-09-10 2017-03-16 Xact Robotics Ltd. Systems and methods for guiding the insertion of a medical tool
CN107582193A (en) * 2017-09-15 2018-01-16 中国人民解放军第四军医大学 A kind of intelligent robot system for tooth-planting operation

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111714164A (en) * 2020-06-19 2020-09-29 上海交通大学 Tactile sensing device for minimally invasive surgery and use method thereof
CN111714164B (en) * 2020-06-19 2022-03-01 上海交通大学 Tactile sensing device for minimally invasive surgery and use method thereof
CN111832656A (en) * 2020-07-17 2020-10-27 复旦大学 Medical human-computer interaction assistance system and computer-readable storage medium containing the same
CN111951946A (en) * 2020-07-17 2020-11-17 合肥森亿智能科技有限公司 Operation scheduling system, method, storage medium and terminal based on deep learning
CN111951946B (en) * 2020-07-17 2023-11-07 合肥森亿智能科技有限公司 Deep learning-based operation scheduling system, method, storage medium and terminal
CN113786239A (en) * 2021-08-26 2021-12-14 哈尔滨工业大学(深圳) Method and system for tracking and real-time early warning of surgical instruments under stomach and digestive tract
CN115607297A (en) * 2022-10-19 2023-01-17 山东大学 Tremor-suppression master-slave surgical robot control system and method
CN115607297B (en) * 2022-10-19 2024-04-30 山东大学 Master-slave operation robot control system and method for tremor suppression
CN117017488A (en) * 2023-10-10 2023-11-10 华中科技大学同济医学院附属协和医院 Puncture arm path planning method comprising non-autonomous motion compensation
CN117017488B (en) * 2023-10-10 2024-01-09 华中科技大学同济医学院附属协和医院 Puncture arm path planning method comprising non-autonomous motion compensation

Similar Documents

Publication Publication Date Title
CN110338907A (en) Haptic navigation system for medical image-guided surgery
CN112155729B (en) Intelligent automatic planning method and system for surgical puncture path and medical system
Li et al. An overview of systems and techniques for autonomous robotic ultrasound acquisitions
Burgner et al. A telerobotic system for transnasal surgery
Ning et al. Autonomic robotic ultrasound imaging system based on reinforcement learning
Chiu et al. 3-D image guidance for minimally invasive robotic coronary artery bypass
US20040009459A1 (en) Simulation system for medical procedures
CN105796177A (en) Systems and methods for guiding a medical instrument
CN109646089A (en) A kind of spine and spinal cord body puncture based on multi-mode medical blending image enters waypoint intelligent positioning system and method
CN110711030B (en) Femoral head necrosis minimally invasive surgery navigation system and navigation method based on AR technology
CN109273091A (en) A kind of percutaneous nephrolithy based on data in art takes stone system of virtual operation
Li et al. Image-guided navigation of a robotic ultrasound probe for autonomous spinal sonography using a shadow-aware dual-agent framework
Zheng et al. A novel respiratory follow-up robotic system for thoracic-abdominal puncture
Li et al. Rl-tee: Autonomous probe guidance for transesophageal echocardiography based on attention-augmented deep reinforcement learning
Riener et al. VR for medical training
He et al. Endoscopic path planning in robot-assisted endoscopic nasal surgery
Zhang et al. Summary of medical robot technology development
Narula et al. Future prospects of artificial intelligence in robotics software, a healthcare perspective
Wang et al. Study on haptic feedback functions for an interventional surgical robot system
Patel et al. Improved automatic bone segmentation using large-scale simulated ultrasound data to segment real ultrasound bone surface data
Gültekin et al. “Hey Siri! Perform a type 3 hysterectomy. Please watch out for the ureter!” What is autonomous surgery and what are the latest developments?
Stoianovici et al. Robotic tools for minimally invasive urologic surgery
Abolmaesumi et al. Introduction to special section on surgical robotics
Troccaz et al. Medical image computing and computer-aided medical interventions applied to soft tissues: Work in progress in urology
Ma et al. Liver tumor segmentation and radio frequency ablation treatment design based on CT image

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20191018

WD01 Invention patent application deemed withdrawn after publication