WO2017206519A1 - Surgical navigation system and method based on a medical robot - Google Patents

Surgical navigation system and method based on a medical robot

Info

Publication number
WO2017206519A1
Authority
WO
WIPO (PCT)
Prior art keywords
organ
surgical navigation
lesion area
target tissue
tissue
Prior art date
Application number
PCT/CN2017/070666
Other languages
English (en)
Chinese (zh)
Inventor
张贯京
Original Assignee
深圳市前海康启源科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市前海康启源科技有限公司 filed Critical 深圳市前海康启源科技有限公司
Publication of WO2017206519A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots

Definitions

  • the present invention relates to the field of medical device technology, and in particular, to a surgical navigation system and method based on a medical robot.
  • in kidney stone surgery, X-ray or magnetic resonance images are taken before the operation. Doctors locate and record the position of the stones or lesions from these images and then plan the procedure. Without image guidance, the surgeon holding the surgical instruments must determine the location of the lesion by experience, so the operation is extremely risky, its success rate is low, the surgical wounds are numerous and large, and the operation takes a long time; many problems remain to be solved.
  • in kidney stone surgery, the common practice in domestic hospitals is as follows: first, a magnetic resonance scan is performed, and the doctor calculates the position of the stone from the image, thereby determining the direction and depth of the puncture.
  • between imaging and surgery, however, the position of the stone can change greatly.
  • during the operation, the doctor can only estimate the position of the stone by experience, or enlarge the puncture hole to observe the stone with the naked eye; this easily causes the puncture to go too deep and pierce a blood vessel, or enlarges the wound, causing the patient greater pain.
  • in addition, the existing surgical navigation system cannot automatically identify the lesion area of a tissue or organ, nor automatically guide the doctor to the lesion area, and therefore cannot ensure the accuracy and safety of the operation.
  • moreover, owing to the structural complexity of tissues and organs, ambient light casts backlight shadows on the lesion area, which interferes with the doctor's vision while operating inside the human body; this hinders the surgical procedure and affects its accuracy and safety.
  • finally, because of the structural complexity of the target tissue and organ, the existing surgical navigation system requires the doctor to manually adjust the display size, display angle, and display orientation of the ultrasound image and the lesion area, which inconveniences the doctor's surgical procedure and reduces the efficiency of the operation.
  • the main object of the present invention is to provide a medical robot-based surgical navigation system and method that solve two problems of existing surgical navigation systems: their inability to automatically recognize the lesion area of a tissue or organ, and the backlight shadows that ambient light produces on the lesion area, both of which affect the accuracy and safety of surgery.
  • the present invention provides a medical robot-based surgical navigation system including a first robot arm, a second robot arm, and a robot body.
  • the medical robot-based surgical navigation system includes: an image acquisition module, configured to control an ultrasonic probe on the first robot arm to capture an ultrasound image of the target tissue or organ during the patient's operation; a pathological positioning module, configured to compare the ultrasound image of the target tissue or organ with a reference image of the normal tissue or organ to locate the lesion area, to establish a spatial coordinate system that takes the operating table on which the patient lies as the horizontal plane and is oriented by the position and direction of the ultrasonic probe, and to calculate the position coordinates of the lesion area in that coordinate system; and a surgical navigation module, configured to generate a surgical navigation command from the position coordinates of the lesion area, to control an infrared locator in the ultrasonic probe to generate an infrared light guiding point, and to drive the movement of the first robot arm according to the surgical navigation command so that the infrared light guiding point illuminates the lesion area.
  • the pathological positioning module is configured to determine the difference in texture distribution between the ultrasound image of the target tissue or organ and the reference image of the normal tissue or organ, and to locate the lesion area of the target tissue or organ according to that difference.
  • the texture distribution differences include the differences in tissue structure, size, and contour that arise where a human tissue or organ has developed lesions.
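  • as an illustration of this comparison step, the following sketch flags image regions whose texture statistics deviate from a co-registered reference; the patent does not specify an algorithm, so the windowed mean/variance measure, the window size, and the threshold below are all assumptions:

```python
# Illustrative sketch only: the patent does not disclose the texture-comparison
# algorithm, so this assumes a simple local-statistics approach (windowed mean
# and standard deviation) to flag regions whose texture deviates from the reference.
import numpy as np

def locate_lesion_region(target_img: np.ndarray,
                         reference_img: np.ndarray,
                         window: int = 16,
                         threshold: float = 0.25) -> np.ndarray:
    """Return a boolean mask of windows whose texture differs from the reference.

    Both images are assumed to be co-registered, same-shape 2D ultrasound
    slices normalized to [0, 1].
    """
    h, w = target_img.shape
    mask = np.zeros((h // window, w // window), dtype=bool)
    for i in range(h // window):
        for j in range(w // window):
            t = target_img[i*window:(i+1)*window, j*window:(j+1)*window]
            r = reference_img[i*window:(i+1)*window, j*window:(j+1)*window]
            # Compare first- and second-order statistics as a crude texture proxy.
            diff = abs(t.mean() - r.mean()) + abs(t.std() - r.std())
            mask[i, j] = diff > threshold
    return mask
```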
  • the surgical navigation command includes the distance and direction between the first robot arm and the lesion area of the target tissue or organ, and the infrared light guiding point is a visible infrared dot used during the patient's operation to guide the doctor to the location of the lesion.
  • the medical robot-based surgical navigation system further includes a navigation display module, configured to enlarge the ultrasound image and the lesion area according to the surgical navigation command and to display them at different angles on the display screen on the surface of the robot body, for the doctor's reference during the operation.
  • the navigation display module is configured to automatically adjust the display size of the ultrasound image and the lesion area according to the distance between the first robot arm and the target tissue or organ, and to automatically adjust their display angle and display orientation on the screen according to the direction between the first robot arm and the target tissue or organ.
  • the present invention also provides a surgical navigation method based on a medical robot that includes a first robot arm, a second robot arm, and a robot body.
  • the medical robot-based surgical navigation method includes the steps of: controlling an ultrasonic probe on the first robot arm to capture an ultrasound image of the target tissue or organ in real time during the patient's operation; comparing the ultrasound image of the target tissue or organ with a reference image of the normal tissue or organ to locate the lesion area; taking the operating table on which the patient lies flat as the horizontal plane and establishing a spatial coordinate system from the position and direction of the ultrasonic probe; calculating the position coordinates of the lesion area based on the spatial coordinate system; generating a surgical navigation command according to the position coordinates of the lesion area; controlling an infrared locator in the ultrasonic probe to generate an infrared light guiding point; and driving the moving direction of the first robot arm according to the surgical navigation command so that the infrared light guiding point illuminates the lesion area.
  • in the medical robot-based surgical navigation method, the step of comparing the ultrasound image of the target tissue or organ with the reference image of the normal tissue or organ to locate the lesion area comprises the following steps: comparing the two images to determine the difference in texture distribution between them; and locating the lesion area of the target tissue or organ according to that difference, wherein the texture distribution difference includes the differences in tissue structure, size, and contour that arise where the tissue or organ has developed lesions.
  • the surgical navigation command includes the distance and direction between the first robot arm and the lesion area of the target tissue or organ, and the infrared light guiding point is a visible infrared dot used during the patient's operation to guide the doctor to the location of the lesion.
  • the medical robot-based surgical navigation method further comprises the step of enlarging the ultrasound image and the lesion area according to the surgical navigation command and displaying them at different angles on the display screen on the surface of the robot body, for the doctor's reference during the operation.
  • the step of enlarging the ultrasound image and the lesion area according to the surgical navigation command and displaying them at different angles on the display screen comprises the following steps: automatically adjusting the display size of the ultrasound image and the lesion area according to the distance between the first robot arm and the target tissue or organ; and automatically adjusting their display angle and display orientation on the screen according to the direction between the first robot arm and the target tissue or organ.
  • by adopting the above technical solutions, the medical robot-based surgical navigation system and method of the present invention achieve the following technical effects: the intraoperative ultrasound image of the target tissue or organ is fused with the preoperative reference image, the lesion area is automatically identified, tracked, and located so that its position is clearly visible, and the system automatically navigates to the lesion area to guide the doctor's operation; the concentrated light emitted by the spotlight eliminates the backlight shadow that ambient light produces on the lesion area, which significantly improves the accuracy and safety of the operation.
  • the medical robot-based surgical navigation system and method of the present invention can also enlarge the ultrasound image and the lesion area and display them at different angles on the display screen for the doctor's reference during the operation, which facilitates the doctor's work and improves the efficiency of the operation.
  • FIG. 1 is a schematic diagram of the application environment of a preferred embodiment of the medical robot-based surgical navigation system of the present invention.
  • FIG. 2 is a schematic diagram of functional modules of the medical robot of FIG. 1.
  • FIG. 3 is a flow chart of a preferred embodiment of a surgical navigation method based on a medical robot of the present invention.
  • FIG. 1 is a schematic diagram of an application environment of a preferred embodiment of a surgical navigation system based on a medical robot according to the present invention.
  • the surgical navigation system 10 is installed and operated in a medical robot 01, which can be placed in an operating room, and an operating table 02 for the patient to lie flat for surgery is placed in the operating room.
  • the medical robot 01 includes, but is not limited to, a first robot arm 1, a second robot arm 2, and a robot body 3.
  • the first robot arm 1 is provided with an ultrasonic probe 11 for capturing an ultrasound image of the patient's tissue or organ during the operation; the ultrasound image may be a 3D image of the tissue or organ.
  • the ultrasonic probe 11 has an infrared locator 12 built therein for generating an infrared light guiding point for guiding a doctor to find a position of a tissue organ lesion during a patient's operation.
  • the second robot arm 2 is provided with a spotlight 13 for emitting light and illuminating the lesion area to eliminate backlight shadows generated by the ambient light on the lesion area.
  • the outer surface of the robot body 3 is provided with a display screen 14 for displaying an ultrasonic image of a tissue organ during a patient's operation for a doctor's surgical reference during the surgery.
  • FIG. 2 is a schematic diagram of functional modules of the medical robot of FIG. 1.
  • the medical robot 01 includes, but is not limited to, a surgical navigation system 10, an ultrasonic probe 11, an infrared positioner 12, a spotlight 13, a display screen 14, a memory 15, and a microprocessor 16.
  • the ultrasonic probe 11, the infrared positioner 12, the spotlight 13, the display screen 14, and the memory 15 are all connected to the microprocessor 16 via a data bus, through which the microprocessor 16 exchanges information with the surgical navigation system 10.
  • the memory 15 may be a read-only memory (ROM), an electrically erasable programmable memory (EEPROM), a flash memory (FLASH), or the like, for storing the program instruction code constituting the surgical navigation system 10.
  • the microprocessor 16 may be a microcontroller (MCU), a data processing chip, or an information processing unit with data processing functions for executing the surgical navigation system 10 to provide guidance during the patient's procedure.
  • the surgical navigation system 10 includes, but is not limited to, an image acquisition module 101, a pathological positioning module 102, a surgical navigation module 103, an illumination concentrating module 104, and a navigation display module 105.
  • a module, as referred to in the present invention, is a series of computer program instruction segments that are stored in the memory 15 of the medical robot 01, can be executed by the microprocessor 16, and perform a fixed function.
  • the image acquisition module 101 is configured to control the ultrasonic probe 11 of the first robot arm 1 to take an ultrasound image of the target tissue and organ during the operation of the patient.
  • the ultrasonic probe 11 may employ a three-dimensional ultrasonic probe that acquires a three-dimensional ultrasonic image of a target tissue organ by emitting a pyramid-shaped volume ultrasonic beam.
  • the pathological positioning module 102 is configured to compare the ultrasound image of the target tissue or organ with the reference image of the normal tissue or organ to locate the lesion area of the target tissue or organ.
  • the reference image of the normal tissue organ is pre-stored in the memory 15 as a reference for comparison with the ultrasonic image.
  • the pathological positioning module 102 determines the difference in texture distribution between the ultrasound image of the target tissue or organ and the reference image of the normal tissue or organ, and locates the lesion area according to that difference; the texture distribution difference includes the differences in tissue structure, size, and contour that arise where the human tissue or organ has developed lesions.
  • the pathological positioning module 102 is further configured to establish a spatial coordinate system that takes the operating table 02 on which the patient lies as the horizontal plane and is oriented by the position and direction of the ultrasonic probe 11, and to calculate the position coordinates of the lesion area based on that coordinate system.
  • the position coordinates of the lesion area include the position and direction of the lesion area with respect to the ultrasonic probe 11.
  • specifically, the pathological positioning module 102 establishes a spatial coordinate system XYZ that takes the operating table 02 on which the patient lies as the horizontal plane and is oriented by the position and direction of the ultrasonic probe 11; from the probe's position and direction in this coordinate system, the position coordinates of any point in the ultrasound image can be calculated, and the position and direction of the target tissue relative to the ultrasonic probe 11 can then be determined.
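  • a minimal sketch of this coordinate calculation follows, assuming the standard rigid-body transform that the description implies; the patent gives no formulas, so the rotation, translation, and example values below are hypothetical:

```python
# Illustrative sketch only: the patent describes a table-plane coordinate
# system XYZ oriented by the probe pose but gives no formulas. This assumes
# a standard rigid transform, world_point = R @ image_point + t, where R and
# t are the probe's orientation and position in the table frame.
import numpy as np

def image_point_to_world(point_img: np.ndarray,
                         probe_rotation: np.ndarray,
                         probe_position: np.ndarray) -> np.ndarray:
    """Map a 3D point from the ultrasound-image frame to the table frame.

    point_img      -- (3,) point in the probe/image coordinate frame (mm)
    probe_rotation -- (3, 3) rotation of the probe in the table frame
    probe_position -- (3,) position of the probe in the table frame (mm)
    """
    return probe_rotation @ point_img + probe_position

# Example: a lesion voxel 40 mm below the probe face, probe tilted 30 degrees
# about the table's X axis (all values hypothetical).
theta = np.deg2rad(30.0)
R = np.array([[1.0, 0.0, 0.0],
              [0.0, np.cos(theta), -np.sin(theta)],
              [0.0, np.sin(theta),  np.cos(theta)]])
t = np.array([100.0, 250.0, 80.0])   # probe position over the table (mm)
lesion_world = image_point_to_world(np.array([0.0, 0.0, 40.0]), R, t)
```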
  • the surgical navigation module 103 is configured to generate a surgical navigation command according to the position coordinates of the lesion area, to control the infrared locator 12 in the ultrasonic probe 11 to generate an infrared light guiding point, and to drive the movement of the first robot arm 1 according to the surgical navigation command so that the infrared light guiding point illuminates the lesion area.
  • the surgical navigation command includes the distance and direction between the first robot arm 1 and the lesion area of the target tissue or organ, and the infrared light guiding point is a visible infrared dot that guides the doctor to the location of the lesion during the patient's operation.
  • the surgical navigation module 103 controls the moving direction of the first robot arm 1 according to the surgical navigation command so that the infrared light guiding point falls on the lesion area of the target tissue or organ, helping the doctor find the lesion quickly and accurately and thereby improving the accuracy and safety of the operation.
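  • the patent describes the navigation command only as carrying distance and direction between the first robot arm and the lesion area; the sketch below shows one possible encoding, assuming both positions are known in the table coordinate system (the NavigationCommand structure is an illustrative assumption, not the patent's format):

```python
# Illustrative sketch only: encodes a navigation command as the straight-line
# distance and the unit direction from the arm tip to the lesion, both
# expressed in the table frame. The data structure is hypothetical.
from dataclasses import dataclass
import numpy as np

@dataclass
class NavigationCommand:
    distance_mm: float       # straight-line distance from arm tip to lesion
    direction: np.ndarray    # unit vector pointing from arm tip to lesion

def make_navigation_command(arm_tip_world: np.ndarray,
                            lesion_world: np.ndarray) -> NavigationCommand:
    offset = lesion_world - arm_tip_world
    distance = float(np.linalg.norm(offset))
    return NavigationCommand(distance, offset / distance)
```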
  • the illumination concentrating module 104 is configured to control the spotlight 13 on the second robot arm 2 to concentrate light on the lesion area of the target tissue or organ, eliminating the backlight shadow produced there by ambient light. Owing to the structural complexity of the target tissue and organ, ambient light casts backlight shadows on the lesion area, which interferes with the doctor's vision while operating inside the human body and thus affects the accuracy and safety of the operation.
  • the spotlight 13 concentrates light on the lesion area of the target tissue or organ to eliminate the backlight shadow produced by ambient light, thereby improving the surgical illumination, facilitating the smooth completion of the doctor's operation, and improving the accuracy and safety of the operation.
  • the navigation display module 105 is configured to enlarge the ultrasound image and the lesion area according to the surgical navigation command and to display them at different angles on the display screen 14.
  • the navigation display module 105 automatically adjusts the display size of the ultrasound image and the lesion area according to the distance between the first robot arm 1 and the target tissue or organ, and automatically adjusts their display angle and display orientation on the display screen 14 according to the direction between the first robot arm 1 and the target tissue or organ, for the doctor's reference during the operation, further improving the accuracy and safety of the operation.
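  • no mapping from distance and direction to display parameters is given in the patent; the sketch below assumes an inverse-distance zoom and a yaw/pitch viewing angle derived from the unit direction vector, with all constants hypothetical:

```python
# Illustrative sketch only: maps arm-to-target distance to a zoom factor and
# the arm-to-target unit direction to a viewing yaw/pitch. The reference
# distance, zoom limits, and angle convention are assumptions.
import numpy as np

def display_parameters(distance_mm: float, direction: np.ndarray,
                       ref_distance_mm: float = 200.0,
                       max_zoom: float = 8.0) -> dict:
    # Zoom in as the arm approaches the target, clamped to a sane range.
    zoom = float(np.clip(ref_distance_mm / max(distance_mm, 1.0), 1.0, max_zoom))
    # Derive a viewing yaw/pitch (degrees) from the unit direction vector.
    yaw = float(np.degrees(np.arctan2(direction[1], direction[0])))
    pitch = float(np.degrees(np.arcsin(np.clip(direction[2], -1.0, 1.0))))
    return {"zoom": zoom, "yaw_deg": yaw, "pitch_deg": pitch}
```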
  • the present invention also provides a medical robot-based surgical navigation method, which is applied to the medical robot 01.
  • FIG. 3 is a flow chart of a preferred embodiment of the medical robot-based surgical navigation method of the present invention.
  • the medical robot-based surgical navigation method includes the following steps:
  • Step S31: the ultrasonic probe of the first robot arm is controlled to capture an ultrasound image of the target tissue or organ in real time during the patient's operation; specifically, the image acquisition module 101 controls the ultrasonic probe 11 of the first robot arm 1 to capture the ultrasound image in real time during the operation.
  • the ultrasonic probe 11 can employ a three-dimensional ultrasonic probe that acquires a three-dimensional ultrasonic image of a target tissue and organ by emitting a pyramid-shaped volumetric ultrasonic beam.
  • Step S32: the ultrasound image of the target tissue or organ is compared with the reference image of the normal tissue or organ to locate the lesion area; specifically, the pathological positioning module 102 compares the ultrasound image of the target tissue or organ with the reference image of the normal tissue or organ to locate the lesion area of the target tissue or organ.
  • the pathological positioning module 102 determines the difference in texture distribution between the ultrasound image of the target tissue or organ and the reference image of the normal tissue or organ, and locates the lesion area according to that difference; the texture distribution difference includes the differences in tissue structure, size, and contour that arise where the human tissue or organ has developed lesions.
  • Step S33: the operating table on which the patient lies flat is taken as the horizontal plane and a spatial coordinate system is established from the position and direction of the ultrasonic probe; specifically, the pathological positioning module 102 takes the operating table 02 on which the patient lies flat as the horizontal plane and establishes the spatial coordinate system from the position and direction of the ultrasonic probe 11.
  • the position coordinates of the lesion area include the position and direction of the lesion area with respect to the ultrasonic probe 11.
  • the pathological positioning module 102 establishes the spatial coordinate system XYZ from the position and direction of the ultrasonic probe 11, with the operating table 02 on which the patient lies as the horizontal plane.
  • Step S34: the position coordinates of the lesion area are calculated based on the spatial coordinate system; specifically, the pathological positioning module 102 calculates, from the position and direction of the ultrasonic probe 11 in the coordinate system XYZ, the position coordinates of any point in the ultrasound image, from which the position and direction of the target tissue relative to the ultrasonic probe 11 can be determined.
  • Step S35: a surgical navigation command is generated according to the position coordinates of the lesion area; specifically, the surgical navigation module 103 generates the surgical navigation command, which includes the distance and direction between the first robot arm 1 and the lesion area of the target tissue or organ.
  • Step S36: the infrared locator in the ultrasonic probe is controlled to generate an infrared light guiding point; specifically, the surgical navigation module 103 controls the infrared locator 12 in the ultrasonic probe 11 to generate the infrared light guiding point, a visible infrared dot that guides the doctor to the location of the tissue or organ lesion during the patient's operation.
  • Step S37: the first robot arm is driven according to the surgical navigation command so that the infrared light guiding point illuminates the lesion area; specifically, the surgical navigation module 103 controls the moving direction of the first robot arm 1 according to the surgical navigation command so that the infrared light guiding point falls on the lesion area of the target tissue or organ, helping the doctor find the lesion quickly and accurately and improving the accuracy and safety of the operation.
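  • the patent states only that the arm's moving direction is driven by the navigation command until the guiding point reaches the lesion area; the following sketch assumes a naive step-toward-target loop, with get_arm_tip and move_arm_tip as hypothetical robot I/O callbacks:

```python
# Illustrative sketch only: no controller is specified in the patent. This
# steps the arm tip toward the lesion along the command direction until the
# infrared guiding point is within tolerance of the target.
import numpy as np

def drive_arm_to_lesion(get_arm_tip, move_arm_tip, lesion_world: np.ndarray,
                        step_mm: float = 2.0, tolerance_mm: float = 1.0,
                        max_steps: int = 500) -> bool:
    """Step the arm tip toward the lesion until within tolerance_mm."""
    for _ in range(max_steps):
        offset = lesion_world - get_arm_tip()
        distance = float(np.linalg.norm(offset))
        if distance <= tolerance_mm:
            return True                    # guiding point is on the lesion
        # Move one bounded step along the direction toward the lesion.
        move_arm_tip(offset / distance * min(step_mm, distance))
    return False
```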
  • Step S38: the spotlight on the second robot arm is controlled to concentrate light on the lesion area, eliminating the backlight shadow produced there by ambient light; specifically, the illumination concentrating module 104 controls the spotlight 13 on the second robot arm 2 to shine concentrated light on the lesion area of the target tissue or organ. Owing to the structural complexity of the target tissue and organ, ambient light casts backlight shadows on the lesion area, which interferes with the doctor's vision while operating inside the human body and thus affects the accuracy and safety of the operation.
  • the spotlight 13 concentrates light on the lesion area of the target tissue or organ to eliminate the backlight shadow produced by ambient light, thereby improving the surgical illumination, facilitating the smooth completion of the doctor's operation, and improving the accuracy and safety of the operation.
  • Step S39: the ultrasound image and the lesion area are enlarged according to the surgical navigation command and displayed at different angles on the display screen; specifically, the navigation display module 105 enlarges the ultrasound image and the lesion area according to the surgical navigation command and displays them at different angles on the display screen 14.
  • the navigation display module 105 automatically adjusts the display size of the ultrasound image and the lesion area according to the distance between the first robot arm 1 and the target tissue or organ, and automatically adjusts their display angle and display orientation on the display screen 14 according to the direction between the first robot arm 1 and the target tissue or organ, for the doctor's reference during the operation, further improving the accuracy and safety of the operation.
  • in summary, the medical robot-based surgical navigation system and method of the present invention fuse the intraoperative ultrasound image of the target tissue or organ with the preoperative reference image, automatically identify, track, and locate the lesion area so that its position is clearly visible, and automatically navigate to the lesion area to guide the doctor's operation; the concentrated light of the spotlight eliminates the backlight shadow that ambient light produces on the lesion area, significantly improving the accuracy and safety of the operation.
  • the medical robot-based surgical navigation system and method of the present invention can also enlarge the ultrasound image and the lesion area and display them at different angles on the display screen for the doctor's reference during the operation, which facilitates the doctor's work and improves the efficiency of the operation.

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Robotics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Manipulator (AREA)
  • Surgical Instruments (AREA)

Abstract

The invention relates to a medical robot-based surgical navigation system (10) and method. The method comprises the steps of: controlling an ultrasonic probe (11) of a first robot arm (1) to capture an ultrasound image of a target tissue/organ in real time (S31); comparing the ultrasound image of the target tissue/organ with a reference image of a normal tissue/organ to locate a lesion area on the target tissue/organ (S32); calculating the position coordinates of the lesion area (S34); generating a surgical navigation command based on the position coordinates of the lesion area (S35); controlling an infrared locator (12) in the ultrasonic probe (11) to generate an infrared light guiding point (S36); driving, according to the surgical navigation command, the first robot arm (1) to move so that the infrared light guiding point is projected onto the lesion area (S37); and controlling a spotlight (13) on a second robot arm (2) to shine light onto the lesion area so as to eliminate shadows (S38). The surgical navigation system (10) can automatically locate a lesion area and guide a doctor during surgery, which facilitates the doctor's operation and improves surgical efficiency as well as the accuracy and safety of the procedure.
PCT/CN2017/070666 2016-06-04 2017-01-09 Surgical navigation system and method based on a medical robot WO2017206519A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610393200.7 2016-06-04
CN201610393200.7A CN105943161A (zh) 2016-06-04 2016-09-21 Surgical navigation system and method based on a medical robot

Publications (1)

Publication Number Publication Date
WO2017206519A1 (fr) 2017-12-07

Family

ID=56908872

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/070666 WO2017206519A1 (fr) 2017-01-09 Surgical navigation system and method based on a medical robot

Country Status (2)

Country Link
CN (1) CN105943161A (fr)
WO (1) WO2017206519A1 (fr)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105943161A (zh) * 2016-06-04 2016-09-21 深圳市前海康启源科技有限公司 Surgical navigation system and method based on a medical robot
CN106491216B (zh) * 2016-10-28 2019-06-28 苏州朗开医疗技术有限公司 Diagnostic in-vivo target object positioning system and medical positioning system
CN107257430B (zh) * 2017-04-26 2019-10-15 努比亚技术有限公司 Photographing control method, terminal, and computer-readable storage medium
CN107714178A (zh) * 2017-10-28 2018-02-23 深圳市前海安测信息技术有限公司 Surgical navigation and positioning robot and control method therefor
CN107669340A (zh) * 2017-10-28 2018-02-09 深圳市前海安测信息技术有限公司 3D-image surgical navigation robot and control method therefor
CN108030980B (zh) * 2017-12-06 2020-11-24 泉州市如万电子商务有限公司 Humidifier with nasal cavity humidification function
CN108114366B (zh) * 2018-01-31 2020-08-07 张振坤 Comprehensive treatment device for drug intervention in medical oncology
US11612438B2 (en) * 2018-09-05 2023-03-28 Point Robotics Medtech Inc. Navigation system and method for medical operation by a robotic system using a tool
CN113648058B (zh) * 2021-08-20 2023-01-13 苏州康多机器人有限公司 Surgical auxiliary positioning method and system
CN114984412B (zh) * 2022-03-25 2023-07-21 清华大学 Closed-loop blood flow control system and control method therefor
CN116196111B (zh) * 2023-05-05 2023-10-31 北京衔微医疗科技有限公司 Ophthalmic surgical robot system and control method therefor

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102512246A (zh) * 2011-12-22 2012-06-27 中国科学院深圳先进技术研究院 Surgical navigation system and method
US20120265071A1 * 2011-03-22 2012-10-18 Kuka Laboratories Gmbh Medical Workstation
US20120277585A1 * 2011-04-29 2012-11-01 Medtronic Navigation, Inc. Method and Apparatus for Calibrating and Re-aligning an Ultrasound Image Plane to a Navigation Tracker
CN103230304A (zh) * 2013-05-17 2013-08-07 深圳先进技术研究院 Surgical navigation system and method therefor
EP3025665A1 * 2014-11-26 2016-06-01 MASMEC S.p.A. Computer-assisted system for guiding a diagnostic/surgical instrument in the body of a patient
CN105943161A (zh) * 2016-06-04 2016-09-21 深圳市前海康启源科技有限公司 Surgical navigation system and method based on a medical robot

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050267368A1 (en) * 2003-07-21 2005-12-01 The Johns Hopkins University Ultrasound strain imaging in tissue therapies
US20050261591A1 (en) * 2003-07-21 2005-11-24 The Johns Hopkins University Image guided interventions with interstitial or transmission ultrasound
KR100681233B1 (ko) * 2004-10-28 2007-02-09 김재황 Laparoscopic surgery monitoring apparatus and display method
US8808164B2 (en) * 2008-03-28 2014-08-19 Intuitive Surgical Operations, Inc. Controlling a robotic surgical tool with a display monitor
EP2515761A4 * 2009-12-21 2015-04-29 Terumo Corp Excitation, detection, and projection system for visualizing target cancerous tissue
US9498185B2 (en) * 2010-11-12 2016-11-22 Konica Minolta, Inc. Ultrasound diagnostic apparatus and ultrasound diagnostic system
JP5908852B2 (ja) * 2013-02-06 2016-04-26 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasonic diagnostic apparatus and control program therefor
US10292771B2 (en) * 2013-03-15 2019-05-21 Synaptive Medical (Barbados) Inc. Surgical imaging systems
EP3033987A4 * 2013-09-27 2017-05-17 Olympus Corporation Endoscope system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120265071A1 (en) * 2011-03-22 2012-10-18 Kuka Laboratories Gmbh Medical Workstation
US20120277585A1 (en) * 2011-04-29 2012-11-01 Medtronic Navigation, Inc. Method and Apparatus for Calibrating and Re-aligning an Ultrasound Image Plane to a Navigation Tracker
CN102512246A (zh) * 2011-12-22 2012-06-27 中国科学院深圳先进技术研究院 Surgical navigation system and method
CN103230304A (zh) * 2013-05-17 2013-08-07 深圳先进技术研究院 Surgical navigation system and method therefor
EP3025665A1 * 2014-11-26 2016-06-01 MASMEC S.p.A. Computer-assisted system for guiding a diagnostic/surgical instrument in the body of a patient
CN105943161A (zh) * 2016-06-04 2016-09-21 深圳市前海康启源科技有限公司 Surgical navigation system and method based on a medical robot

Also Published As

Publication number Publication date
CN105943161A (zh) 2016-09-21

Similar Documents

Publication Publication Date Title
WO2017206519A1 (fr) Surgical navigation system and method based on a medical robot
US10582972B2 3D system and method for guiding objects
CN110573105B (zh) Robotic device for minimally invasive medical intervention on soft tissue
ES2907252T3 (es) System for performing automated surgical and interventional procedures
JP3492697B2 (ja) Surgical guiding device with reference and localization frames
CN108135563B (zh) Light- and shadow-guided needle positioning system and method
US11602403B2 Robotic tool control
JP2010200894A (ja) Surgery support system and surgical robot system
CA3088277A1 (fr) System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
WO2013116240A1 (fr) Guidance of multiple medical devices
WO2019080358A1 (fr) Surgical navigation robot using 3D images and control method therefor
CN109481018A (zh) Navigation device and method for use in medical procedures
CN111870344B (zh) Preoperative navigation method, system, and terminal device
EP3398552A1 (fr) Control of a medical image viewer from the surgeon's camera
CN116077155B (zh) Surgical navigation method based on an optical tracking device and a robotic arm, and related apparatus
WO2019080317A1 (fr) Robot for surgical navigation and position indication, and control method therefor
Adebar et al. Registration of 3D ultrasound through an air–tissue boundary
WO2017206520A1 (fr) Medical robot for surgical assistance
Adebar et al. Instrument-based calibration and remote control of intraoperative ultrasound for robot-assisted surgery
CN116585036A (zh) Needle holder, puncture surgical robot using same, surgical navigation method, and storage medium
CN115462885A (zh) Percutaneous puncture method and system
US20230210603A1 Systems and methods for enhancing imaging during surgical procedures
WO2019075074A1 (fr) System and method for identifying and marking a target in a fluoroscopic three-dimensional reconstruction
US20220071703A1 Systems for monitoring ablation progress using remote temperature probes
US11711596B2 System and methods for determining proximity relative to an anatomical structure

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17805470

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 31.05.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17805470

Country of ref document: EP

Kind code of ref document: A1