WO2022188352A1 - Augmented-reality-based contactless teleoperation system for interventional robots, and calibration method - Google Patents

Augmented-reality-based contactless teleoperation system for interventional robots, and calibration method Download PDF

Info

Publication number
WO2022188352A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot, augmented reality, calibration, interventional, shape
Prior art date
Application number
PCT/CN2021/111946
Other languages
English (en)
French (fr)
Inventor
高安柱
林泽才
杨广中
陈卫东
艾孝杰
Original Assignee
上海交通大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海交通大学 filed Critical 上海交通大学
Publication of WO2022188352A1 publication Critical patent/WO2022188352A1/zh

Links

Images

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30: Surgical robots
    • A61B 34/35: Surgical robots for telesurgery
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/70: Manipulators specially adapted for use in surgery
    • A61B 34/74: Manipulators with manual electric input means
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/30: Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361: Image-producing devices, e.g. surgical cameras
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2034/108: Computer aided selection or customisation of medical implants or cutting guides
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2051: Electromagnetic tracking systems
    • A61B 2034/2065: Tracking using image or pattern recognition
    • A61B 2090/309: Devices for illuminating a surgical field using white LEDs

Definitions

  • The present invention relates to the technical field of intracavitary intervention and, in particular, to an augmented-reality-based contactless teleoperation system for interventional robots and a calibration method therefor.
  • A Chinese invention patent with application number 201510802654.0 discloses a surgical robot system using augmented reality technology and a control method thereof.
  • That technical solution provides augmented reality only for rigid surgical instruments and cannot accommodate the real-time deformation of flexible robots; moreover, its mode of operation requires the doctor to directly touch an operating handle, which increases the surgeon's risk of cross-infection.
  • In view of the above, the inventor believes that the known surgical robot system and its control method can neither overlay a virtual model on a flexible robot for enhanced display nor avoid cross-infection of doctors; a technical solution addressing these problems is therefore needed.
  • The purpose of the present invention is to provide a contactless teleoperation system and a calibration method for interventional robots based on augmented reality.
  • An augmented-reality-based contactless teleoperation system for interventional robots includes a head-mounted AR device, PC1, PC2, a grating demodulator, a magnetic field generator, an insertion tube, a redundant robotic arm, a drive unit, a continuum robot, an LED, an endoscopic camera, a shape sensor, and an electromagnetic sensor. The head-mounted AR device is connected to PC1, and PC2 is connected to the grating demodulator, the magnetic field generator, the redundant robotic arm, and the drive unit, respectively.
  • The shape sensor, the electromagnetic sensor, the continuum robot, the insertion tube, the redundant robotic arm, the drive unit, the endoscopic camera, and the LED together constitute the flexible robot.
  • The head-mounted AR device superimposes the flexible robot model onto the real scene; the objects onto which the model is superimposed are the continuum robot and the insertion-tube portion of the endoscope catheter, and the overall length of this portion is determined by the length of the shape sensor.
  • The grating demodulator and the shape sensor constitute the shape sensing system; the magnetic field generator and the electromagnetic sensor constitute the electromagnetic tracking system.
  • The workflow of the system is as follows:
  • Step 1: PC1 reconstructs a 3D model of the intracavitary anatomy from the 2D planar images of the preoperative CT scan and sets the optimal viewing position of the endoscopic image inside the AR device;
  • Step 2: calibrate the system;
  • Step 3: PC2 reconstructs the shape of the flexible robot from the wavelengths measured by the shape sensing system; using the calibration information of step 2, the reconstructed shape is transformed into the AR device coordinate system, and the transformed data together with the endoscopic images are sent to PC1 by wireless communication;
  • Step 4: the AR device communicates with PC1 in real time, and the doctor wears the AR device;
  • Step 5: the doctor controls the motion of the flexible robot through gesture recognition.
  • Step 2 includes the calibration of the AR device and the electromagnetic tracking system, the calibration of the AR device and the robotic arm, the calibration of the AR device and the anatomical structure, the calibration of the shape sensor and the electromagnetic sensor, and the calibration of the AR device and the world reference frame.
  • In step 4, the doctor can observe in the AR device the three-dimensional shape of the flexible robot and the three-dimensional model of the intracavitary anatomy superimposed on the real scene, and can watch the endoscopic image in real time.
  • In step 5, the AR device recognizes the doctor's gesture, calculates the position of the wrist relative to the AR device, and parses different gestures into different flag values; after the flexible robot receives a flag value, it performs the corresponding motion.
  • When the wrist of the doctor's hand approaches the AR device, the robot advances / rotates forward / bends to the left.
  • The electromagnetic sensor has six degrees of freedom, and the relationship between the electromagnetic sensor coordinate system and the shape sensor coordinate system is calibrated through a redefined virtual plane.
  • The present invention also provides a calibration method for an augmented-reality-based contactless teleoperation system for interventional robots; the method involves one of the above systems and includes the following steps:
  • S3: the flexible robot performs a preliminary movement at the entrance of the cavity being accessed, and the electromagnetic sensor at the tip of the continuum robot collects point cloud data of part of the anatomical wall as the source cloud G;
  • S5: match the point clouds P and Q with the ICP algorithm and iteratively obtain the transformation between the 3D anatomical model in the AR device reference frame and the actual anatomy.
  • Compared with the prior art, the present invention has the following beneficial effects:
  • The present invention reduces the doctor's risk of cross-infection by controlling the robot with a contactless gesture recognition method, mapping different gestures to different robot motions.
  • The present invention applies augmented reality technology to a flexible surgical robot, overcoming the limitation of previous augmented reality techniques that addressed only the display of rigid instruments; it provides the surgeon with an immersive experience during intracavitary intervention of the flexible robot and further increases the safety of the operation.
  • The present invention calibrates the relationship between the electromagnetic sensor coordinate system and the shape sensor coordinate system through a redefined virtual plane, solving the problem that absolute shape information is difficult to obtain because the base frame of the shape sensor is floating.
  • Fig. 2 is the working flow chart of the present invention;
  • Fig. 3 is the schematic of the spatial calibration among the components of the present invention;
  • Fig. 4 is the structural diagram of the rigid sleeve of the present invention;
  • Fig. 5 is the overall data flow diagram of the present invention.
  • The present invention provides a contactless teleoperation system and a calibration method for interventional robots based on augmented reality. As shown in Fig. 1, the system mainly includes: a head-mounted AR device, PC1, PC2, a grating demodulator, a magnetic field generator, an insertion tube, a redundant robotic arm, a drive unit, a continuum robot, an LED, an endoscopic camera, a shape sensor, and an electromagnetic sensor.
  • The shape sensor, electromagnetic sensor, continuum robot, insertion tube, redundant robotic arm, drive unit, endoscopic camera, and LED constitute the flexible robot.
  • The superimposed objects are the continuum robot and the insertion-tube portion of the endoscope catheter, whose total length is determined by the length of the shape sensor; the grating demodulator and the shape sensor constitute the shape sensing system; the magnetic field generator and the electromagnetic sensor constitute the electromagnetic tracking system.
  • The head-mounted AR device is worn by the doctor.
  • The binocular camera of the AR device recognizes the doctor's gestures, calculates the spatial position of the wrist relative to the AR device, and superimposes the flexible robot and anatomical structure models onto the corresponding real scene.
  • PC1 is connected to the AR device; the AR device receives in real time the flexible robot shape information and endoscopic images sent by PC1, and PC1 receives in real time the gesture information recognized by the AR device.
  • PC2 is connected to the magnetic field generator, the grating demodulator, and the flexible robot, and reconstructs the flexible robot in real time.
  • The flexible robot has three degrees of freedom: translation, rotation, and bending.
  • The continuum robot is a self-contacting structure actuated by cables.
  • The left and right bending of the continuum robot is realized by two motors in the drive unit stretching the cables; the redundant robotic arm has seven degrees of freedom in total and provides the translational and rotational motion of the flexible robot; the electromagnetic sensor has six degrees of freedom and acquires the pose of the flexible robot's tip in real time; the shape sensor is a multi-core optical fiber embedded in the lumen of the flexible robot to sense the robot's shape.
  • PC1 reconstructs the 3D model of the intracavitary anatomy from the 2D planar images of the preoperative CT scan and, according to the doctor's operating habits, sets the optimal viewing position of the endoscopic image inside the AR device.
  • The system is then calibrated in five respects, as shown in Figure 3: the calibration of the AR device and the electromagnetic tracking system, the calibration of the AR device and the robotic arm, the calibration of the AR device and the anatomical structure, the calibration of the shape sensor and the electromagnetic sensor, and the calibration of the AR device and the world reference frame.
  • Through these five calibration steps, the reference frame of each part is transformed into the AR device coordinate system.
  • PC2 reconstructs the shape of the flexible robot in real time from the wavelengths measured by the shape sensing system.
  • The reconstructed shape is transformed into the AR device coordinate system, and the transformed data together with the endoscopic image are sent to PC1 via wireless communication.
  • The AR device communicates with PC1 in real time while the doctor wears the AR device.
  • In the AR device, the doctor can observe the three-dimensional shape of the flexible robot and the three-dimensional model of the intracavitary anatomy superimposed on the real scene.
  • The doctor can also view the endoscopic image in real time.
  • The doctor controls the motion of the flexible robot through gesture recognition.
  • The AR device recognizes the doctor's gesture, calculates the position of the wrist relative to the AR device, and parses different gestures into different flag values.
  • After the flexible robot receives a flag value, it performs the corresponding motion according to that value.
  • When the wrist approaches the AR device, the robot advances / rotates forward / bends to the left, and vice versa when the wrist moves away.
  • The overall data flow, shown in Figure 5, includes real-time data, visualization data, and a priori calibration data.
  • The base coordinate system of the shape sensor is a floating frame, so obtaining the shape of the flexible robot relative to the AR device coordinate system is a major challenge.
  • With the aid of a six-degree-of-freedom electromagnetic sensor, the present invention calibrates the relationship between the electromagnetic sensor coordinate system and the shape sensor coordinate system through a redefined virtual plane; the shape information of the flexible robot is then transformed into the AR device coordinate system through the calibrated relationship between the electromagnetic tracking system and the AR device, as shown in Figure 3.
  • A rigid sleeve with two parallel holes is fixed at the tip of the continuum robot, the two holes being parallel to the axial direction of the sleeve.
  • As shown in Fig. 4, a 6-DOF electromagnetic sensor (E1) is fixed in hole 1 and its coordinate system {E1} is taken as the coordinate system of hole 1; in addition, another 6-DOF electromagnetic sensor (E3) is inserted into hole 2.
  • Hole 1 and hole 2 have the same length, and the length of each electromagnetic sensor equals the length of the hole.
  • The poses $T_{E1}$ and $T_{E3}$ of the two six-degree-of-freedom electromagnetic sensors are obtained through the electromagnetic tracking system, and the relationship between the two sensors is computed as ${}^{E1}T_{E3} = T_{E1}^{-1}\,T_{E3}$.
  • Owing to manufacturing errors, hole 1 may not be exactly parallel to hole 2.
  • The Z axis of each electromagnetic sensor is parallel to the axial direction of its hole, so the angle between hole 1 and hole 2 caused by manufacturing errors is $\theta = \arccos(z_{E1} \cdot z_{E3})$.
  • The rotation matrix $R'$ taking the vector $z_{E1}$ to $z_{E3}$ is obtained from the Rodrigues rotation formula, $R' = I + \sin\theta\,[\omega]_{\times} + (1-\cos\theta)\,[\omega]_{\times}^{2}$,
  • where $\omega$ is the unit vector along $z_{E1} \times z_{E3}$ and $[\omega]_{\times}$ is the skew-symmetric matrix of $\omega$.
  • $R = R_{E1}^{-1} R' R_{E1}$ compensates for the parallelism error between hole 1 and hole 2 caused by manufacturing.
  • The electromagnetic sensor in hole 2 is then replaced by the shape sensor; the last fiber Bragg grating (FBG) of the shape sensor lies inside the sleeve, and the base coordinate system {S} of the shape sensor is defined at this last FBG. Then, taking a flat plane as the reference plane, the root of the shape-sensing portion of the flexible robot is fixed on the reference plane and the shape sensor is calibrated on the plane, including straight-line setup and torsion-compensation calibration; during torsion compensation the sensing portion of the shape sensor should be kept in a bent state. After calibration, the X-Y plane of the shape sensor coordinate system {S} is parallel to the reference plane and, neglecting manufacturing errors, hole 2 is parallel to its Z axis.
  • While ensuring that the flexible robot does not rotate about its own axis, the rigid sleeve at the tip of the continuum robot is moved to three points on the reference plane in turn.
  • The positions of the electromagnetic sensor on the flexible robot at these points are $P_1$, $P_2$, $P_3$, and the rotation matrix $R_c$ is collected at the third point.
  • The shape reconstruction can be transformed into the space of the AR device as $C_H(s) = {}^{H}T_{EM}\,T_{E1}\,{}^{E1}T_{S}\,C(s)$,
  • where $C(s)$ is the three-dimensional shape of the flexible robot in the shape sensor coordinate system
  • and $C_H(s)$ is the three-dimensional shape of the flexible robot in the AR device coordinate system.
  • The main calibration process is as follows:
  • The flexible robot performs a preliminary movement at the entrance of the cavity being accessed, and the electromagnetic sensor at the tip of the continuum robot collects point cloud data of part of the anatomical wall as the source cloud G.
  • The traditional contact-based master-slave control method is improved: a contactless gesture recognition method is used for teleoperation of the flexible robot.
  • Three gestures are defined to map the motions of the flexible robot: bending, translation, and rotation. Bending is accomplished by pulling the drive cables of the flexible robot, translation is provided by the Cartesian motion of the robotic arm, and rotation is realized by rotating the last joint of the robotic arm.
  • The AR device is used to recognize the surgeon's gestures; three gestures in total, "OPEN", "OK", and "FINGER", are mapped to robot motions. The "OPEN", "FINGER", and "OK" gestures are mapped to translation, rotation, and bending, respectively. After a gesture is recognized, the Euclidean distance between the wrist and the AR device is obtained in real time; if the distance decreases, the robot moves forward, and vice versa. Rotation and bending follow the same rule.
  • The calibration of the AR device and the world reference frame is completed by the AR device's own SLAM function.
  • Those skilled in the art know that, in addition to implementing the system provided by the present invention and its devices, modules, and units as purely computer-readable program code, the method steps can be logically programmed so that the same functions are realized in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, and embedded microcontrollers. Therefore, the system provided by the present invention and its devices, modules, and units can be regarded as hardware components, and the devices, modules, and units included therein for realizing various functions can also be regarded as structures within the hardware components.
  • The devices, modules, and units for realizing various functions can likewise be regarded both as software modules implementing the method and as structures within hardware components.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Robotics (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Endoscopes (AREA)
  • Manipulator (AREA)

Abstract

The present invention provides an augmented-reality-based contactless teleoperation system for interventional robots and a calibration method. The system mainly consists of a head-mounted AR device, a shape sensing system, an electromagnetic tracking system, a continuum robot, and other components, and its calibration comprises five main steps. The present invention applies augmented reality technology to a flexible surgical robot, overcoming the limitation of previous augmented reality techniques that addressed only the display of rigid instruments, providing the doctor with an immersive surgical experience, and further increasing surgical safety. With the aid of augmented reality, the intracavitary anatomy and the intervening robot are displayed in 3D see-through fashion, and the robot is controlled by a contactless gesture recognition method in which different gestures are mapped to different robot motions. The invention solves the problem that absolute shape information is difficult to obtain because the base coordinate system of the shape sensor is a floating base, and it cleverly converts the calibration of the AR device and the electromagnetic tracking system into an AX = XB problem, solving their transformation relationship more efficiently.

Description

Augmented-reality-based contactless teleoperation system for interventional robots, and calibration method

Technical Field

The present invention relates to the technical field of intracavitary intervention and, in particular, to an augmented-reality-based contactless teleoperation system for interventional robots and a calibration method.

Background Art

Current teleoperated robot-assisted intracavitary interventions require the surgeon to control the robot with an operating lever under 2D X-ray guidance. This contact-based mode of surgery increases the surgeon's workload and exposes them to a potentially infectious environment; in addition, the 2D images lack depth information, which is not only unintuitive but also increases the risk of misjudgment.

Chinese invention patent application No. 201510802654.0 discloses a surgical robot system using augmented reality technology and a control method thereof. That solution provides augmented reality only for rigid surgical instruments and cannot accommodate the real-time deformation of flexible robots; moreover, its mode of operation requires the doctor to be in direct contact with an operating handle, which increases the surgeon's risk of cross-infection.

In view of the above related art, the inventor believes that the surgical robot system and control method described above can neither provide virtual-real superimposed enhanced display of a flexible robot nor avoid cross-infection of doctors; a technical solution is therefore needed to remedy these problems.
Summary of the Invention

In view of the defects in the prior art, the purpose of the present invention is to provide an augmented-reality-based contactless teleoperation system for interventional robots and a calibration method.

An augmented-reality-based contactless teleoperation system for interventional robots provided according to the present invention includes a head-mounted AR device, PC1, PC2, a grating demodulator, a magnetic field generator, an insertion tube, a redundant robotic arm, a drive unit, a continuum robot, an LED, an endoscopic camera, a shape sensor, and an electromagnetic sensor; the head-mounted AR device is connected to PC1, and PC2 is connected to the grating demodulator, the magnetic field generator, the redundant robotic arm, and the drive unit, respectively.

Preferably, the shape sensor, the electromagnetic sensor, the continuum robot, the insertion tube, the redundant robotic arm, the drive unit, the endoscopic camera, and the LED constitute a flexible robot.

Preferably, the head-mounted AR device superimposes the flexible robot model onto the real scene; the objects onto which the flexible robot model is superimposed are the continuum robot and the insertion-tube portion of the endoscope catheter, whose total length is determined by the length of the shape sensor.

Preferably, the grating demodulator and the shape sensor constitute a shape sensing system, and the magnetic field generator and the electromagnetic sensor constitute an electromagnetic tracking system.
Preferably, the workflow of the system is as follows:

Step 1: PC1 reconstructs a 3D model of the intracavitary anatomy from the 2D planar images of the preoperative CT scan and sets the optimal viewing position of the endoscopic image inside the AR device;

Step 2: calibrate the system;

Step 3: PC2 reconstructs the shape of the flexible robot from the wavelengths measured by the shape sensing system; using the calibration information of step 2, the reconstructed shape is transformed into the AR device coordinate system, and the transformed data together with the endoscopic image are sent to PC1 by wireless communication;

Step 4: the AR device communicates with PC1 in real time, and the doctor wears the AR device;

Step 5: the doctor controls the motion of the flexible robot through gesture recognition.
Preferably, step 2 includes the calibration of the AR device and the electromagnetic tracking system, the calibration of the AR device and the robotic arm, the calibration of the AR device and the anatomical structure, the calibration of the shape sensor and the electromagnetic sensor, and the calibration of the AR device and the world reference frame; in step 4, the doctor can observe in the AR device the three-dimensional shape of the flexible robot and the three-dimensional model of the intracavitary anatomy superimposed on the real scene, and watches the endoscopic image in real time.

Preferably, in step 5 the AR device recognizes the doctor's gesture, calculates the position of the wrist relative to the AR device, and parses different gestures into different flag values; after receiving a flag value, the flexible robot performs the corresponding motion.

Preferably, when the wrist of the doctor's hand approaches the AR device, the robot advances / rotates forward / bends to the left.

Preferably, the electromagnetic sensor has six degrees of freedom, and the relationship between the electromagnetic sensor coordinate system and the shape sensor coordinate system is calibrated through a redefined virtual plane.
The present invention also provides a calibration method for an augmented-reality-based contactless teleoperation system for interventional robots; the method involves the system described above and includes the following steps:

S1: scan the intracavitary anatomy by CT to obtain 2D scan images;

S2: reconstruct a 3D virtual model of the anatomy from the 2D CT images as the target point cloud P;

S3: the flexible robot performs a preliminary movement at the entrance of the cavity being accessed, and the electromagnetic sensor at the tip of the continuum robot collects point cloud data of part of the anatomical wall as the source cloud G;
S4: transform the point cloud G from the electromagnetic tracking system reference frame to the AR device reference frame via $Q = {}^{H}T_{EM}\,G$, obtaining the point cloud Q;
S5: match the point clouds P and Q with the ICP algorithm and iteratively obtain the transformation between the 3D anatomical model in the AR device reference frame and the actual anatomy.
Compared with the prior art, the present invention has the following beneficial effects:

1. The present invention reduces the doctor's risk of cross-infection by controlling the robot with a contactless gesture recognition method, mapping different gestures to different robot motions.

2. The present invention cleverly converts the calibration of the AR device and the electromagnetic tracking system into an AX = XB problem, solving their transformation relationship more efficiently.

3. The present invention applies augmented reality technology to a flexible surgical robot, overcoming the limitation of previous augmented reality techniques that addressed only the display of rigid instruments; it provides the surgeon with an immersive surgical experience during intracavitary intervention of the flexible robot and further increases the safety of the operation.

4. The present invention calibrates the relationship between the electromagnetic sensor coordinate system and the shape sensor coordinate system through a redefined virtual plane, solving the problem that absolute shape information is difficult to obtain because the base frame of the shape sensor is floating.
Brief Description of the Drawings

Other features, objects, and advantages of the present invention will become more apparent upon reading the detailed description of non-limiting embodiments with reference to the following drawings:

Fig. 1 is the system diagram of the present invention;

Fig. 2 is the working flow chart of the present invention;

Fig. 3 is the schematic of the spatial calibration among the components of the present invention;

Fig. 4 is the structural diagram of the rigid sleeve of the present invention;

Fig. 5 is the overall data flow diagram of the present invention.
Detailed Description of the Embodiments

The present invention is described in detail below with reference to specific embodiments. The following embodiments will help those skilled in the art to further understand the present invention but do not limit it in any form. It should be noted that those of ordinary skill in the art can make several variations and improvements without departing from the concept of the present invention, all of which fall within the protection scope of the present invention.
The present invention provides an augmented-reality-based contactless teleoperation system for interventional robots and a calibration method. As shown in Fig. 1, the system mainly includes: a head-mounted AR device, PC1, PC2, a grating demodulator, a magnetic field generator, an insertion tube, a redundant robotic arm, a drive unit, a continuum robot, an LED, an endoscopic camera, a shape sensor, and an electromagnetic sensor. The shape sensor, electromagnetic sensor, continuum robot, insertion tube, redundant robotic arm, drive unit, endoscopic camera, and LED constitute the flexible robot. The AR device superimposes the flexible robot model onto the real scene; the superimposed objects are the continuum robot and the insertion-tube portion of the endoscope catheter, whose total length is determined by the length of the shape sensor. The grating demodulator and the shape sensor constitute the shape sensing system; the magnetic field generator and the electromagnetic sensor constitute the electromagnetic tracking system.

The head-mounted AR device is worn by the doctor. The binocular camera of the AR device recognizes the doctor's gestures, computes the spatial position of the wrist relative to the AR device, and superimposes the flexible robot and anatomical structure models onto the corresponding real scene. PC1 is connected to the AR device; the AR device receives in real time the flexible robot shape information and endoscopic images sent by PC1, and PC1 receives in real time the gesture information recognized by the AR device. PC2 is connected to the magnetic field generator, the grating demodulator, and the flexible robot; it reconstructs the shape of the flexible robot in real time and communicates wirelessly with PC1. The flexible robot has three degrees of freedom: translation, rotation, and bending. The continuum robot is a self-contacting structure actuated by cables; its left and right bending is realized by two motors in the drive unit stretching the cables. The redundant robotic arm has seven degrees of freedom in total and provides the translational and rotational motion of the flexible robot. The electromagnetic sensor has six degrees of freedom and acquires the pose of the flexible robot's tip in real time. The shape sensor is a multi-core optical fiber embedded in the lumen of the flexible robot to sense the robot's shape.
The workflow is shown in Fig. 2. First, PC1 reconstructs a 3D model of the intracavitary anatomy from the 2D planar images of the preoperative CT scan and, according to the doctor's operating habits, sets the optimal viewing position of the endoscopic image inside the AR device.
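As a concrete illustration of this reconstruction step, the following minimal Python sketch extracts an anatomical surface mesh from a CT volume with the marching-cubes algorithm; the volume array, iso-level, and voxel spacing are illustrative assumptions, not values specified in the patent.

```python
# Minimal sketch: surface reconstruction from a stack of 2D CT slices.
# Assumes the slices are already loaded into a NumPy volume (z, y, x);
# the iso-level and voxel spacing below are placeholder values.
import numpy as np
from skimage import measure

def reconstruct_surface(ct_volume: np.ndarray,
                        iso_level: float = 300.0,
                        spacing=(1.0, 0.7, 0.7)):
    """Extract a triangle mesh of the anatomy from a CT volume."""
    verts, faces, normals, _ = measure.marching_cubes(
        ct_volume, level=iso_level, spacing=spacing)
    return verts, faces, normals

# The vertices can later be sampled as the target point cloud P used in
# the anatomical registration step described below.
```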
Then the system is calibrated in five respects, as shown in Fig. 3: the calibration of the AR device and the electromagnetic tracking system, the calibration of the AR device and the robotic arm, the calibration of the AR device and the anatomical structure, the calibration of the shape sensor and the electromagnetic sensor, and the calibration of the AR device and the world reference frame. Through these five calibration steps, the reference frame of each part is transformed into the AR device coordinate system.

Next, PC2 reconstructs the shape of the flexible robot in real time from the wavelengths measured by the shape sensing system; using the calibration information of step 2, the reconstructed shape is transformed into the AR device coordinate system, and the transformed data together with the endoscopic image are sent to PC1 via wireless communication.

Next, the AR device communicates with PC1 in real time while the doctor wears the AR device. In the AR device the doctor can observe the three-dimensional shape of the flexible robot and the three-dimensional model of the intracavitary anatomy superimposed on the real scene; in addition, the doctor can view the endoscopic image in real time.
Finally, the doctor controls the motion of the flexible robot through gesture recognition. The AR device recognizes the doctor's gesture, computes the position of the wrist relative to the AR device, and parses different gestures into different flag values; after receiving a flag value, the flexible robot performs the corresponding motion. In addition, when the wrist of the doctor's hand approaches the AR device, the robot advances / rotates forward / bends to the left, and vice versa when it moves away. The overall data flow, shown in Fig. 5, includes real-time data, visualization data, and a priori calibration data.
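A minimal sketch of this gesture-to-flag mapping is given below. The flag names and the robot-side interface are hypothetical; only the mapping logic (the gesture selects the motion type, the change in wrist distance selects the direction) follows the text.

```python
# Hypothetical sketch of the gesture -> flag -> motion mapping described above.
from enum import Enum

class Flag(Enum):
    TRANSLATE = 1   # "OPEN" gesture
    ROTATE = 2      # "FINGER" gesture
    BEND = 3        # "OK" gesture

GESTURE_TO_FLAG = {"OPEN": Flag.TRANSLATE, "FINGER": Flag.ROTATE, "OK": Flag.BEND}

def command_from_gesture(gesture: str, prev_dist: float, curr_dist: float):
    """Map a recognized gesture and the wrist-to-headset distance change
    to a (flag, direction) command; the wrist moving closer means forward."""
    flag = GESTURE_TO_FLAG.get(gesture)
    if flag is None:
        return None                     # unrecognized gesture: no motion
    direction = +1 if curr_dist < prev_dist else -1
    return flag, direction
```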
The present invention cleverly converts the calibration of the AR device and the electromagnetic tracking system into an AX = XB problem and thus solves their transformation relationship more efficiently, as shown in Fig. 3.
First, the six-degree-of-freedom electromagnetic sensor is fixed on a non-magnetic calibration board carrying a localization marker. The board is then moved within the field of view of the AR device; the camera of the AR device estimates the pose of the localization marker, and the poses of the marker and of the electromagnetic sensor are recorded. Finally, the pose estimates of the marker and the poses of the electromagnetic sensor are used to construct the following equations:

$$ {}^{H}T_{M,i}\;{}^{M}T_{E} = {}^{H}T_{EM}\;{}^{EM}T_{E,i} \qquad (1) $$

$$ {}^{H}T_{M,i+1}\;{}^{M}T_{E} = {}^{H}T_{EM}\;{}^{EM}T_{E,i+1} \qquad (2) $$

where ${}^{H}T_{M}$ is the transformation between the AR device and the localization marker, ${}^{M}T_{E}$ is the transformation between the electromagnetic sensor and the localization marker, ${}^{H}T_{EM}$ is the transformation between the AR device and the base frame of the electromagnetic tracking system, ${}^{EM}T_{E}$ is the transformation between the base frame of the electromagnetic tracking system and the electromagnetic sensor, and $i = 1, 2, \dots$ indexes the data groups. Equations (1) and (2) can be rewritten as:

$$ {}^{M}T_{E} = ({}^{H}T_{M,i})^{-1}\;{}^{H}T_{EM}\;{}^{EM}T_{E,i} \qquad (3) $$

$$ {}^{M}T_{E} = ({}^{H}T_{M,i+1})^{-1}\;{}^{H}T_{EM}\;{}^{EM}T_{E,i+1} \qquad (4) $$

If we let $A = {}^{H}T_{M,i+1}\,({}^{H}T_{M,i})^{-1}$, $B = {}^{EM}T_{E,i+1}\,({}^{EM}T_{E,i})^{-1}$, and $X = {}^{H}T_{EM}$, equating (3) and (4) gives:

$$ AX = XB \qquad (5) $$

The value of X is obtained by solving AX = XB.
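For illustration, a minimal numerical AX = XB solver is sketched below. The patent does not specify a solver, so this sketch fits the rotation part by a Procrustes alignment of rotation-vector pairs and the translation part by linear least squares; the inputs are lists of 4x4 relative motions A_i and B_i built from consecutive marker and sensor poses.

```python
# Sketch of an AX = XB solve for X = the AR-device-to-EM-tracker transform.
import numpy as np
from scipy.spatial.transform import Rotation

def solve_ax_xb(As, Bs):
    # Rotation part: find Rx minimizing sum ||alpha_i - Rx beta_i||^2,
    # where alpha/beta are the rotation vectors of A_i and B_i (Kabsch fit).
    alphas = [Rotation.from_matrix(A[:3, :3]).as_rotvec() for A in As]
    betas = [Rotation.from_matrix(B[:3, :3]).as_rotvec() for B in Bs]
    H = sum(np.outer(b, a) for a, b in zip(alphas, betas))
    U, _, Vt = np.linalg.svd(H)
    Rx = Vt.T @ U.T
    if np.linalg.det(Rx) < 0:           # enforce a proper rotation
        Vt[-1] *= -1
        Rx = Vt.T @ U.T
    # Translation part: stack (R_Ai - I) tx = Rx t_Bi - t_Ai and solve.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx, *_ = np.linalg.lstsq(C, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```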
Because the shape-sensing portion of the multi-core fiber is shorter than the combined length of the insertion tube and the continuum robot, the base coordinate system of the shape sensor is a floating frame, and obtaining the shape of the flexible robot relative to the AR device coordinate system is a major challenge. With the aid of a six-degree-of-freedom electromagnetic sensor, the present invention calibrates the relationship between the electromagnetic sensor coordinate system and the shape sensor coordinate system through a redefined virtual plane; the shape information of the flexible robot is then transformed into the AR device coordinate system through the calibrated relationship between the electromagnetic tracking system and the AR device, as shown in Fig. 3. The calibration procedure of the electromagnetic sensor and the shape sensor is as follows:
A rigid sleeve with two parallel holes is fixed at the tip of the continuum robot, the two holes being parallel to the axial direction of the sleeve. As shown in Fig. 4, a six-degree-of-freedom electromagnetic sensor (E1) is fixed in hole 1 and its coordinate system {E1} is taken as the coordinate system of hole 1; in addition, another six-degree-of-freedom electromagnetic sensor (E3) is inserted into hole 2. Hole 1 and hole 2 have the same length, and the length of each electromagnetic sensor equals the length of the hole. The poses $T_{E1}$ and $T_{E3}$ of the two sensors are obtained through the electromagnetic tracking system, and the relationship between them is computed as:

$$ {}^{E1}T_{E3} = T_{E1}^{-1}\,T_{E3} \qquad (6) $$
Owing to manufacturing errors, hole 1 may not be exactly parallel to hole 2. Since the Z axis of each electromagnetic sensor is parallel to the axial direction of its hole, the angle between hole 1 and hole 2 caused by manufacturing errors is:

$$ \theta = \arccos(z_{E1} \cdot z_{E3}) \qquad (7) $$
where $R_{E1} = [x_{E1}, y_{E1}, z_{E1}]$ and $R_{E3} = [x_{E3}, y_{E3}, z_{E3}]$ are the rotation matrices of $T_{E1}$ and $T_{E3}$, respectively. The rotation matrix $R'$ taking the vector $z_{E1}$ to $z_{E3}$ is obtained from the Rodrigues rotation formula:

$$ R' = I + \sin\theta\,[\omega]_{\times} + (1 - \cos\theta)\,[\omega]_{\times}^{2} \qquad (8) $$

where $\omega$ is the unit vector along $z_{E1} \times z_{E3}$ and $[\omega]_{\times}$ is the skew-symmetric matrix of $\omega$. $R = R_{E1}^{-1} R' R_{E1}$ compensates the parallelism error between hole 1 and hole 2 caused by manufacturing. Once the electromagnetic sensor (E3) is withdrawn from hole 2, hole 2 loses its reference frame, so {E12} is defined as the coordinate system of hole 2: the origin of {E12} coincides with that of {E3}, and the rotation matrix of {E12} is obtained by applying the transformation $R$ to the rotation matrix of {E1}, i.e. $R_{E12} = R_{E1}R = R'R_{E1}$. Finally, the transformation of hole 2 relative to hole 1 is:

$$ {}^{E1}T_{E12} = T_{E1}^{-1} \begin{bmatrix} R'R_{E1} & p_{E3} \\ \mathbf{0} & 1 \end{bmatrix} \qquad (9) $$

where $p_{E3}$ is the position of {E3}.
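A sketch of the compensation in equations (7)-(9), assuming the rotation parts R_E1 and R_E3 of the tracked poses as inputs:

```python
# Sketch of the hole-parallelism compensation: rotate z_E1 onto z_E3 with
# the Rodrigues formula, then express the correction in the {E1} frame.
import numpy as np

def skew(v):
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def parallel_compensation(R_E1, R_E3):
    z1, z3 = R_E1[:, 2], R_E3[:, 2]
    theta = np.arccos(np.clip(z1 @ z3, -1.0, 1.0))           # eq. (7)
    w = np.cross(z1, z3)
    if np.linalg.norm(w) < 1e-12:       # holes already parallel
        return np.eye(3), np.eye(3)
    K = skew(w / np.linalg.norm(w))
    R_prime = np.eye(3) + np.sin(theta) * K \
        + (1.0 - np.cos(theta)) * (K @ K)                    # eq. (8)
    R = R_E1.T @ R_prime @ R_E1         # correction in the {E1} frame
    return R_prime, R
```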
The electromagnetic sensor in hole 2 is then replaced by the shape sensor; the last fiber Bragg grating (FBG) of the shape sensor lies inside the sleeve, and the base coordinate system {S} of the shape sensor is defined at this last FBG. Next, taking a flat plane as the reference plane, the root of the shape-sensing portion of the flexible robot is fixed on the reference plane and the shape sensor is calibrated on the plane, including straight-line setup and torsion-compensation calibration; during torsion compensation the sensing portion of the shape sensor should be kept in a bent state. After calibration, the X-Y plane of the shape sensor coordinate system {S} is parallel to the reference plane and, neglecting manufacturing errors, hole 2 is parallel to its Z axis. While ensuring that the flexible robot does not rotate about its own axis, the rigid sleeve at the tip of the continuum robot is moved to three points on the reference plane; the positions of the electromagnetic sensor on the flexible robot at these points are $P_1$, $P_2$, $P_3$, and the rotation matrix $R_c$ is collected at the third point. A virtual plane is constructed from the three points, and the vector $r$ perpendicular to that plane is found:

$$ r_{21} = P_1 - P_2 \qquad (10) $$

$$ r_{32} = P_2 - P_3 \qquad (11) $$

$$ r = r_{32} \times r_{21} \qquad (12) $$
As shown in Fig. 4, the angle $\psi$ between the X axis of {E12} and the virtual plane can be solved from:

$$ \psi = \arcsin\!\left(\frac{x_3 \cdot r}{\lVert x_3 \rVert\,\lVert r \rVert}\right) \qquad (13) $$

where $R_3 = R_c R = [x_3, y_3, z_3]$. Because the virtual plane is parallel to the reference plane, and the X-Y plane of the shape sensor is parallel to the reference plane, the angle between the X axis of {E12} and the X-Y plane of the shape sensor coordinate system is $\psi$. Finally, the transformation ${}^{E1}T_{S}$ between the shape sensor frame {S} and the electromagnetic sensor frame {E1} can be obtained:

$$ {}^{E12}T_{S} = \begin{bmatrix} R_z(\psi) & [0, 0, -l]^{\top} \\ \mathbf{0} & 1 \end{bmatrix} \qquad (14) $$

$$ {}^{E1}T_{S} = {}^{E1}T_{E12}\;{}^{E12}T_{S} \qquad (15) $$

where $R_z(\psi)$ denotes the rotation about the Z axis by $\psi$ and $l$ is the length of the electromagnetic sensor. The shape reconstruction can therefore be transformed into the space of the AR device as follows:

$$ C_H(s) = {}^{H}T_{EM}\;T_{E1}\;{}^{E1}T_{S}\;C(s) \qquad (16) $$

where $C(s)$ is the three-dimensional shape of the flexible robot in the shape sensor coordinate system and $C_H(s)$ is the three-dimensional shape of the flexible robot in the AR device coordinate system.
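The two numerical steps above, measuring ψ from the virtual plane (equations (10)-(13)) and pushing the reconstructed centerline into the AR frame (equation (16)), can be sketched as follows; the arcsin form of (13) follows the reconstruction given above and should be read as an assumption.

```python
# Sketch of the virtual-plane angle measurement and the final frame chain.
import numpy as np

def angle_to_plane(P1, P2, P3, R_c, R):
    """Angle psi between the X axis of {E12} and the virtual plane built
    from the three touch points P1, P2, P3 on the reference plane."""
    r21 = P1 - P2                                    # eq. (10)
    r32 = P2 - P3                                    # eq. (11)
    r = np.cross(r32, r21)                           # eq. (12), plane normal
    r = r / np.linalg.norm(r)
    x3 = (R_c @ R)[:, 0]                             # X axis of {E12}
    return np.arcsin(np.clip(x3 @ r, -1.0, 1.0))     # eq. (13)

def shape_to_ar(C_s, H_T_EM, T_E1, E1_T_S):
    """Eq. (16): map shape points (N, 3) from {S} into the AR device frame."""
    T = H_T_EM @ T_E1 @ E1_T_S       # {S} -> {E1} -> EM base -> AR device
    pts = np.hstack([C_s, np.ones((len(C_s), 1))])   # homogeneous coords
    return (pts @ T.T)[:, :3]
```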
To see through the anatomy and superimpose the 3D virtual model of the intracavitary anatomy on the real scene, the AR device and the anatomical model must be calibrated. The main calibration process is as follows:

1. Scan the intracavitary anatomy by CT to obtain 2D scan images.

2. Reconstruct a 3D virtual model of the anatomy from the 2D CT images as the target point cloud P.

3. The flexible robot performs a preliminary movement at the entrance of the cavity being accessed, and the electromagnetic sensor at the tip of the continuum robot collects point cloud data of part of the anatomical wall as the source cloud G.
4. Transform the point cloud G from the electromagnetic tracking system reference frame to the AR device reference frame via $Q = {}^{H}T_{EM}\,G$, obtaining the point cloud Q.
5. Match the point clouds P and Q with the ICP algorithm and iteratively obtain the transformation between the 3D anatomical model in the AR device reference frame and the actual anatomy.
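A compact point-to-point ICP sketch for step 5 is given below; nearest neighbours come from a k-d tree and each iteration aligns the clouds with the SVD (Kabsch) solution. The patent does not name an ICP variant, so this is a generic implementation; an off-the-shelf ICP (e.g., from Open3D) would serve equally.

```python
# Minimal point-to-point ICP: estimate the rigid transform mapping the
# measured wall cloud Q (in the AR frame) onto the CT model cloud P.
import numpy as np
from scipy.spatial import cKDTree

def icp(Q, P, iters=50):
    T = np.eye(4)
    src, tree = Q.copy(), cKDTree(P)
    for _ in range(iters):
        _, idx = tree.query(src)        # closest model point per sample
        tgt = P[idx]
        mu_s, mu_t = src.mean(axis=0), tgt.mean(axis=0)
        H = (src - mu_s).T @ (tgt - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:        # avoid a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T                    # accumulate the overall transform
    return T
```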
To further simplify control of the robot, the traditional contact-based master-slave control method is improved and a contactless gesture recognition method is used for teleoperation of the flexible robot. In the present invention, three gestures are defined to map the motions of the flexible robot: bending, translation, and rotation. Bending is accomplished by pulling the drive cables of the flexible robot, translation is provided by the Cartesian motion of the robotic arm, and rotation is realized by rotating the last joint of the robotic arm.

The AR device is used to recognize the surgeon's gestures; three gestures in total, "OPEN", "OK", and "FINGER", are mapped to robot motions. The "OPEN", "FINGER", and "OK" gestures are mapped to translation, rotation, and bending, respectively. After a gesture is recognized, the Euclidean distance between the wrist and the AR device is obtained in real time; if the distance decreases, the robot moves forward, and vice versa. Rotation and bending follow the same rule.
Because the computed distance between the AR device and the wrist contains noise, the data must be filtered. The present invention filters it with a Kalman filter; since the motion of the wrist is irregular, the prediction model of the Kalman filter is designed as y = x, where x is the optimal estimate of the Kalman filter at the previous moment and y is the predicted value at the next moment.
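With the identity prediction model y = x, the filter reduces to a scalar Kalman filter; a sketch with illustrative noise variances follows (q and r are tuning assumptions, not values given in the patent):

```python
# Scalar Kalman filter for the wrist-to-headset distance, prediction y = x.
class ScalarKalman:
    def __init__(self, x0, p0=1.0, q=1e-3, r=1e-2):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def update(self, z):
        self.p += self.q                # predict: identity model, y = x
        k = self.p / (self.p + self.r)  # Kalman gain
        self.x += k * (z - self.x)      # correct with measurement z
        self.p *= (1.0 - k)
        return self.x
```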
For the calibration of the AR device and the robotic arm, a localization marker is fixed at the end of the robotic arm; the end of the arm is then moved within the field of view of the AR device, the poses of the arm and of the localization marker are collected, and the resulting AX = XB problem is solved. The calibration of the AR device and the world reference frame is accomplished by the AR device's own SLAM function.

Those skilled in the art know that, in addition to implementing the system provided by the present invention and its devices, modules, and units as purely computer-readable program code, the method steps can be logically programmed so that the same functions are realized in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Therefore, the system provided by the present invention and its devices, modules, and units can be regarded as hardware components; the devices, modules, and units included therein for realizing various functions can be regarded as structures within the hardware components, or as both software modules implementing the method and structures within the hardware components.

Specific embodiments of the present invention have been described above. It should be understood that the present invention is not limited to the specific embodiments described; those skilled in the art can make various changes or modifications within the scope of the claims without affecting the essence of the present invention. Where no conflict arises, the embodiments of the present application and the features of the embodiments may be combined with one another arbitrarily.

Claims (10)

  1. An augmented-reality-based contactless teleoperation system for interventional robots, characterized by comprising a head-mounted AR device, PC1, PC2, a grating demodulator, a magnetic field generator, an insertion tube, a redundant robotic arm, a drive unit, a continuum robot, an LED, an endoscopic camera, a shape sensor, and an electromagnetic sensor, wherein the head-mounted AR device is connected to PC1, and PC2 is connected to the grating demodulator, the magnetic field generator, the redundant robotic arm, and the drive unit, respectively.
  2. The augmented-reality-based contactless teleoperation system for interventional robots according to claim 1, characterized in that the shape sensor, the electromagnetic sensor, the continuum robot, the insertion tube, the redundant robotic arm, the drive unit, the endoscopic camera, and the LED constitute a flexible robot.
  3. The augmented-reality-based contactless teleoperation system for interventional robots according to claim 2, characterized in that the head-mounted AR device superimposes the flexible robot model onto the real scene, the objects onto which the flexible robot model is superimposed are the continuum robot and the insertion-tube portion of the endoscope catheter, and the total length of the continuum robot and the insertion-tube portion is determined by the length of the shape sensor.
  4. The augmented-reality-based contactless teleoperation system for interventional robots according to claim 3, characterized in that the grating demodulator and the shape sensor constitute a shape sensing system, and the magnetic field generator and the electromagnetic sensor constitute an electromagnetic tracking system.
  5. The augmented-reality-based contactless teleoperation system for interventional robots according to claim 1, characterized in that the workflow of the system is as follows:
    Step 1: PC1 reconstructs a 3D model of the intracavitary anatomy from the 2D planar images of the preoperative CT scan and sets the optimal viewing position of the endoscopic image inside the AR device;
    Step 2: calibrate the system;
    Step 3: PC2 reconstructs the shape of the flexible robot from the wavelengths measured by the shape sensing system; using the calibration information of step 2, the reconstructed shape is transformed into the AR device coordinate system, and the transformed data together with the endoscopic image are sent to PC1 by wireless communication;
    Step 4: the AR device communicates with PC1 in real time, and the doctor wears the AR device;
    Step 5: the doctor controls the motion of the flexible robot through gesture recognition.
  6. The augmented-reality-based contactless teleoperation system for interventional robots according to claim 5, characterized in that step 2 includes the calibration of the AR device and the electromagnetic tracking system, the calibration of the AR device and the robotic arm, the calibration of the AR device and the anatomical structure, the calibration of the shape sensor and the electromagnetic sensor, and the calibration of the AR device and the world reference frame; and in step 4 the doctor can observe in the AR device the three-dimensional shape of the flexible robot and the three-dimensional model of the intracavitary anatomy superimposed on the real scene, and watches the endoscopic image in real time.
  7. The augmented-reality-based contactless teleoperation system for interventional robots according to claim 5, characterized in that in step 5 the AR device recognizes the doctor's gesture, calculates the position of the wrist relative to the AR device, and parses different gestures into different flag values; after receiving a flag value, the flexible robot performs the corresponding motion.
  8. The augmented-reality-based contactless teleoperation system for interventional robots according to claim 7, characterized in that when the wrist of the doctor's hand approaches the AR device, the robot advances / rotates forward / bends to the left.
  9. The augmented-reality-based contactless teleoperation system for interventional robots according to claim 1, characterized in that the electromagnetic sensor has six degrees of freedom, and the relationship between the electromagnetic sensor coordinate system and the shape sensor coordinate system is calibrated through a redefined virtual plane.
  10. A calibration method for an augmented-reality-based contactless teleoperation system for interventional robots, characterized in that the method involves the augmented-reality-based contactless teleoperation system for interventional robots according to any one of claims 1-9 and includes the following steps:
    S1: scan the intracavitary anatomy by CT to obtain 2D scan images;
    S2: reconstruct a 3D virtual model of the anatomy from the 2D CT images as the target point cloud P;
    S3: the flexible robot performs a preliminary movement at the entrance of the cavity being accessed, and the electromagnetic sensor at the tip of the continuum robot collects point cloud data of part of the anatomical wall as the source cloud G;
    S4: transform the point cloud G from the electromagnetic tracking system reference frame to the AR device reference frame via $Q = {}^{H}T_{EM}\,G$, obtaining the point cloud Q;
    S5: match the point clouds P and Q with the ICP algorithm and iteratively obtain the transformation between the 3D anatomical model in the AR device reference frame and the actual anatomy.
PCT/CN2021/111946 2021-03-08 2021-08-11 Augmented-reality-based contactless teleoperation system for interventional robots, and calibration method WO2022188352A1 (zh)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202110251635.9A CN112914731A (zh) 2021-03-08 2021-06-08 Augmented-reality-based contactless teleoperation system for interventional robots, and calibration method
CN202110251635.9 2021-03-08
CN202110674129.0 2021-06-17
CN202110674129.0A CN113229941B (zh) 2021-03-08 2021-06-17 Augmented-reality-based contactless teleoperation system for interventional robots, and calibration method

Publications (1)

Publication Number Publication Date
WO2022188352A1 true WO2022188352A1 (zh) 2022-09-15

Family

ID=76173434

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/111946 WO2022188352A1 (zh) 2021-03-08 2021-08-11 Augmented-reality-based contactless teleoperation system for interventional robots, and calibration method

Country Status (2)

Country Link
CN (2) CN112914731A (zh)
WO (1) WO2022188352A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115511962A (zh) * 2022-09-20 2022-12-23 上海人工智能创新中心 Active target detection method and system based on an optoelectronic tactile sensor
CN116572249A (zh) * 2023-06-07 2023-08-11 哈尔滨理工大学 Teleoperation control method for a flexible robotic arm based on a tri-modal switching mechanism
CN116999178A (zh) * 2023-10-07 2023-11-07 北京科鹏医疗器械有限公司 Dual-frequency-filtered intuitive master-slave mapping method for natural-orifice endoscope operation

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112914731A (zh) * 2021-03-08 2021-06-08 上海交通大学 Augmented-reality-based contactless teleoperation system for interventional robots, and calibration method
CN115542889A (zh) * 2021-06-30 2022-12-30 上海微觅医疗器械有限公司 Robot preoperative navigation method and system, storage medium, and computer device
CN114373046B (zh) * 2021-12-27 2023-08-18 达闼机器人股份有限公司 Method, apparatus, and storage medium for assisting robot operation
CN114931437B (zh) * 2022-07-25 2022-10-18 中国科学院自动化研究所 Sensing continuum robot, interventional sensing system, and method

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102341046A (zh) * 2009-03-24 2012-02-01 伊顿株式会社 Surgical robot system using augmented reality technology and control method thereof
CN104739519A (zh) * 2015-04-17 2015-07-01 中国科学院重庆绿色智能技术研究院 Force-feedback surgical robot control system based on augmented reality
CN204542390U (zh) * 2015-04-17 2015-08-12 中国科学院重庆绿色智能技术研究院 Force-feedback surgical robot control system based on augmented reality
CN105050525A (zh) * 2013-03-15 2015-11-11 直观外科手术操作公司 Shape sensor systems for tracking interventional instruments and methods of use
CN105208960A (zh) * 2013-05-16 2015-12-30 直观外科手术操作公司 Systems and methods for robotic medical system integration with external imaging
US20180116732A1 (en) * 2016-10-18 2018-05-03 The Board Of Trustees Of The Leland Stanford Junior University Real-time Three Dimensional Display of Flexible Needles Using Augmented Reality
CN108406725A (zh) * 2018-02-09 2018-08-17 华南理工大学 Force-feedback human-computer interaction system and method based on electromagnetic theory and motion tracking
CN109758230A (zh) * 2019-02-26 2019-05-17 中国电子科技集团公司信息科学研究院 Neurosurgical navigation method and system based on augmented reality technology
CN209392094U (zh) * 2018-06-20 2019-09-17 深圳大学 Augmented-reality surgical system
US20200188028A1 (en) * 2017-08-21 2020-06-18 The Trustees Of Columbia University In The City Of New York Systems and methods for augmented reality guidance
WO2020231157A1 (ko) * 2019-05-16 2020-11-19 서울대학교병원 Augmented-reality colonoscopy system and monitoring method using same
CN112914731A (zh) * 2021-03-08 2021-06-08 上海交通大学 Augmented-reality-based contactless teleoperation system for interventional robots, and calibration method

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5226195B2 (ja) * 2006-07-28 2013-07-03 オリンパスメディカルシステムズ株式会社 Endoscope apparatus and method for operating endoscope apparatus
CN101099657A (zh) * 2007-07-13 2008-01-09 上海大学 Device and method for detecting the spatial shape of a slender flexible rod
CN101147668B (zh) * 2007-11-09 2010-06-02 清华大学 Wireless in-vivo cavity image acquisition system and device
CN102162738B (zh) * 2010-12-08 2012-11-21 中国科学院自动化研究所 Calibration method for a combined camera and inertial-sensor position and attitude system
US10952600B2 (en) * 2014-07-10 2021-03-23 Covidien Lp Endoscope system
CN105266897B (zh) * 2015-11-25 2018-03-23 上海交通大学医学院附属第九人民医院 Augmented-reality-based microsurgical navigation system and navigation method
CN108883028A (zh) * 2016-03-15 2018-11-23 皇家飞利浦有限公司 Optical-fiber real-shape-sensing feeding tube
CN106648077A (zh) * 2016-11-30 2017-05-10 南京航空航天大学 Adaptive dynamic stereoscopic augmented-reality operation navigation system based on real-time tracking and multi-source information fusion
WO2018195221A1 (en) * 2017-04-18 2018-10-25 Intuitive Surgical Operations, Inc. Graphical user interface for planning a procedure
US11254019B2 (en) * 2019-03-05 2022-02-22 The Boeing Company Automatic calibration for a robot optical sensor
CN110010249B (zh) * 2019-03-29 2021-04-27 北京航空航天大学 Augmented-reality surgical navigation method and system based on video overlay, and electronic device
WO2020221311A1 (zh) * 2019-04-30 2020-11-05 齐鲁工业大学 Mobile robot control system and control method based on wearable device
CN110706279B (zh) * 2019-09-27 2021-09-07 清华大学 Full-course pose estimation method based on a global map and multi-sensor information fusion
CN110711030B (zh) * 2019-10-21 2021-04-23 北京国润健康医学投资有限公司 AR-technology-based minimally invasive surgical navigation system and navigation method for femoral head necrosis
CN111202583A (zh) * 2020-01-20 2020-05-29 上海奥朋医疗科技有限公司 Method, system, and medium for tracking the motion of an operating table
CN212466186U (zh) * 2020-02-11 2021-02-05 中国医学科学院北京协和医院 Augmented-reality-technology-based bone tumor surgery assistance system
CN111329587A (zh) * 2020-02-19 2020-06-26 上海理工大学 Surgical registration system using a shape-sensing optical fiber grid

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102341046A (zh) * 2009-03-24 2012-02-01 伊顿株式会社 Surgical robot system using augmented reality technology and control method thereof
CN105050525A (zh) * 2013-03-15 2015-11-11 直观外科手术操作公司 Shape sensor systems for tracking interventional instruments and methods of use
CN105208960A (zh) * 2013-05-16 2015-12-30 直观外科手术操作公司 Systems and methods for robotic medical system integration with external imaging
CN104739519A (zh) * 2015-04-17 2015-07-01 中国科学院重庆绿色智能技术研究院 Force-feedback surgical robot control system based on augmented reality
CN204542390U (zh) * 2015-04-17 2015-08-12 中国科学院重庆绿色智能技术研究院 Force-feedback surgical robot control system based on augmented reality
US20180116732A1 (en) * 2016-10-18 2018-05-03 The Board Of Trustees Of The Leland Stanford Junior University Real-time Three Dimensional Display of Flexible Needles Using Augmented Reality
US20200188028A1 (en) * 2017-08-21 2020-06-18 The Trustees Of Columbia University In The City Of New York Systems and methods for augmented reality guidance
CN108406725A (zh) * 2018-02-09 2018-08-17 华南理工大学 Force-feedback human-computer interaction system and method based on electromagnetic theory and motion tracking
CN209392094U (zh) * 2018-06-20 2019-09-17 深圳大学 Augmented-reality surgical system
CN109758230A (zh) * 2019-02-26 2019-05-17 中国电子科技集团公司信息科学研究院 Neurosurgical navigation method and system based on augmented reality technology
WO2020231157A1 (ko) * 2019-05-16 2020-11-19 서울대학교병원 Augmented-reality colonoscopy system and monitoring method using same
CN112914731A (zh) * 2021-03-08 2021-06-08 上海交通大学 Augmented-reality-based contactless teleoperation system for interventional robots, and calibration method
CN113229941A (zh) * 2021-03-08 2021-08-10 上海交通大学 Augmented-reality-based contactless teleoperation system for interventional robots, and calibration method

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115511962A (zh) * 2022-09-20 2022-12-23 上海人工智能创新中心 Active target detection method and system based on an optoelectronic tactile sensor
CN115511962B (zh) * 2022-09-20 2024-05-28 上海人工智能创新中心 Active target detection method and system based on an optoelectronic tactile sensor
CN116572249A (zh) * 2023-06-07 2023-08-11 哈尔滨理工大学 Teleoperation control method for a flexible robotic arm based on a tri-modal switching mechanism
CN116572249B (zh) * 2023-06-07 2024-01-02 哈尔滨理工大学 Teleoperation control method for a flexible robotic arm based on a tri-modal switching mechanism
CN116999178A (zh) * 2023-10-07 2023-11-07 北京科鹏医疗器械有限公司 Dual-frequency-filtered intuitive master-slave mapping method for natural-orifice endoscope operation
CN116999178B (zh) * 2023-10-07 2024-01-12 北京科鹏医疗器械有限公司 Dual-frequency-filtered intuitive master-slave mapping method for natural-orifice endoscope operation

Also Published As

Publication number Publication date
CN113229941B (zh) 2023-05-26
CN112914731A (zh) 2021-06-08
CN113229941A (zh) 2021-08-10

Similar Documents

Publication Publication Date Title
WO2022188352A1 (zh) Augmented-reality-based contactless teleoperation system for interventional robots, and calibration method
AU2021203525B2 (en) Navigation of tubular networks
US20210282662A1 (en) Systems and methods for deformation compensation using shape sensing
CN110215284B (zh) Visualization system and method
US9827057B2 (en) Estimation of a position and orientation of a frame used in controlling movement of a tool
EP2928407B1 (en) Collision avoidance during controlled movement of image capturing device and manipulatable device movable arms
KR102214145B1 (ko) Systems and methods for adjusting a medical device using a reduced search space
JPWO2018159338A1 (ja) Medical support arm system and control device
US20230200630A1 (en) Method for positioning an endoscope with flexible shaft
Lee et al. From medical images to minimally invasive intervention: Computer assistance for robotic surgery
JP2017511712A (ja) Systems and methods for non-rigid deformation of tissue for virtual navigation of interventional tools
WO2018013198A1 (en) Systems and methods for displaying an instrument navigator in a teleoperational system
CN113180828A (zh) Constrained motion control method for surgical robot based on screw theory
Bihlmaier et al. Endoscope robots and automated camera guidance
Dumpert et al. Semi-autonomous surgical tasks using a miniature in vivo surgical robot
CN117297773A (zh) Surgical instrument control method, surgical robot, and storage medium
Zheng et al. Automatic Tracking Motion Based on Flexible Forbidden Virtual Fixtures Design in Robot Assisted Nasal Surgery
WO2023018684A1 (en) Systems and methods for depth-based measurement in a three-dimensional view
CN117064545A (zh) Augmented-reality-based telepresence method and system for robotic intracavitary intervention
WO2023233280A1 (en) Generating imaging pose recommendations
Fernando Robotics for surgeries

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21929819

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21929819

Country of ref document: EP

Kind code of ref document: A1