CN112914731A - Interventional robot contactless teleoperation system based on augmented reality and calibration method - Google Patents

Interventional robot contactless teleoperation system based on augmented reality and calibration method

Info

Publication number
CN112914731A
CN112914731A (application number CN202110251635.9A)
Authority
CN
China
Prior art keywords
robot
augmented reality
contactless
equipment
shape
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110251635.9A
Other languages
Chinese (zh)
Inventor
高安柱
林泽才
杨广中
陈卫东
艾孝杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN202110251635.9A priority Critical patent/CN112914731A/en
Publication of CN112914731A publication Critical patent/CN112914731A/en
Priority to CN202110674129.0A priority patent/CN113229941B/en
Priority to PCT/CN2021/111946 priority patent/WO2022188352A1/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 34/35 Surgical robots for telesurgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • A61B 34/74 Manipulators with manual electric input means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/30 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361 Image-producing devices, e.g. surgical cameras
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/107 Visualisation of planned trajectories or target regions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/108 Computer aided selection or customisation of medical implants or cutting guides
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2051 Electromagnetic tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/30 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A61B 2090/309 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using white LEDs

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Robotics (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Endoscopes (AREA)
  • Manipulator (AREA)

Abstract

The invention provides an interventional robot contactless teleoperation system based on augmented reality, which comprises a head-mounted AR device, a PC1, a PC2, a grating demodulator, a magnetic field generator, an insertion tube, a redundant mechanical arm, a driving unit, a continuum robot, an LED, an endoscope camera, a shape sensor and an electromagnetic sensor; the head-mounted AR device is connected with the PC1, and the PC2 is respectively connected with the grating demodulator, the magnetic field generator, the redundant mechanical arm and the driving unit. The system reduces the risk of cross infection for doctors: assisted by augmented reality, it provides a 3D see-through view of the intraluminal anatomy and the interventional robot, and controls the robot through contactless gesture recognition, mapping different gestures to different robot motions. It solves the problem that absolute shape information is difficult to obtain because the base coordinate system of the shape sensor is a floating base, and it converts the calibration of the AR device and the electromagnetic tracking system into an AX = XB solving problem, so that the transformation between the two is solved more efficiently.

Description

Interventional robot contactless teleoperation system based on augmented reality and calibration method
Technical Field
The invention relates to the technical field of intraluminal intervention, and in particular to an interventional robot contactless teleoperation system based on augmented reality and a calibration method.
Background
Current teleoperated robot-assisted endoluminal interventional procedures require the surgeon to control the robot with a joystick under 2D X-ray guidance. Such contact-based operation increases the surgeon's workload and exposes the surgeon to a potentially infectious environment, and the lack of depth information in 2D images is not only unintuitive but also increases the risk of misjudgment.
Chinese invention patent application No. 201510802654.0 discloses a surgical robot system using augmented reality technology and a control method thereof. That scheme provides augmented reality only for rigid surgical instruments and cannot accommodate the real-time deformation of a flexible robot; in addition, its operation mode requires the doctor to directly contact an operating handle, which increases the surgeon's risk of cross infection.
In view of the above related art, the inventors believe that the above surgical robot system and its control method suffer from two problems: the flexible robot cannot be given an augmented display with virtual-real superposition, and the contact operation is prone to causing cross infection of doctors. A technical solution is therefore needed to address these problems.
Disclosure of Invention
Aiming at the defects in the prior art, the invention aims to provide an interventional robot contactless teleoperation system based on augmented reality and a calibration method.
The interventional robot contactless teleoperation system based on augmented reality comprises a head-mounted AR device, a PC1, a PC2, a grating demodulator, a magnetic field generator, an insertion tube, a redundant mechanical arm, a driving unit, a continuum robot, an LED, an endoscope camera, a shape sensor and an electromagnetic sensor, wherein the head-mounted AR device is connected with the PC1, and the PC2 is respectively connected with the grating demodulator, the magnetic field generator, the redundant mechanical arm and the driving unit.
Preferably, the shape sensor, the electromagnetic sensor, the continuum robot, the insertion tube, the redundant robotic arm, the drive unit, the endoscopic camera, and the LED constitute a flexible robot.
Preferably, the head-mounted AR device superimposes a flexible robot model onto the real scene; the superimposed objects are the continuum robot and the catheter of the endoscope insertion tube section, and their total length is determined by the length of the shape sensor.
Preferably, the grating demodulator and the shape sensor constitute a shape sensing system; the magnetic field generator and the electromagnetic sensor constitute an electromagnetic tracking system.
Preferably, the workflow of the system is as follows:
Step 1: the PC1 reconstructs a three-dimensional model of the intraluminal anatomical structure using the two-dimensional planar images of a preoperative CT scan, and sets the optimal observation position of the endoscope image in the AR device;
Step 2: the system is calibrated;
Step 3: the PC2 reconstructs the shape of the flexible robot using the wavelengths measured by the shape sensing system, transforms the reconstructed shape to the AR device coordinate system according to the calibration information of Step 2, and transmits the transformed data and the endoscope image to the PC1 through wireless communication;
Step 4: the AR device communicates with the PC1 in real time, and the doctor wears the AR device;
Step 5: the doctor controls the movement of the flexible robot through gesture recognition.
Preferably, the calibration in Step 2 comprises calibration of the AR device and the electromagnetic tracking system, calibration of the AR device and the mechanical arm, calibration of the AR device and the anatomical structure, calibration of the shape sensor and the electromagnetic sensor, and calibration of the AR device and the world reference frame. In Step 4, the doctor can observe, in the AR device, the three-dimensional shape of the flexible robot and the three-dimensional model of the anatomical cavity superimposed on the real scene, and can also observe the endoscope image in real time.
Preferably, in Step 5 the AR device recognizes the doctor's gesture and calculates the position of the wrist relative to the AR device; different gestures are parsed into different marker flags, and upon receiving a flag the flexible robot performs the corresponding motion.
Preferably, when the wrist of the doctor's hand approaches the AR device, the robot is advanced / rotated forward / bent to the left, and vice versa.
Preferably, the electromagnetic sensor has six degrees of freedom, and the relationship between the electromagnetic sensor coordinate system and the shape sensor coordinate system is calibrated through a redefined virtual plane.
The invention also provides a calibration method of the interventional robot contactless teleoperation system based on augmented reality, the method comprises the interventional robot contactless teleoperation system based on augmented reality, and the method comprises the following steps:
S1: scanning the intraluminal anatomy by CT to obtain two-dimensional scan images;
S2: reconstructing a 3D virtual model of the anatomical structure from the 2D CT images as the target point cloud P;
S3: the flexible robot performs an initial motion at the entrance of the interventional cavity, and the electromagnetic sensor at the tip of the continuum robot acquires point cloud data of part of the anatomical wall surface as the source cloud G;
s4: by passing
Figure BDA0002966310910000031
Transferring the point cloud G from the reference system of the electromagnetic tracking system to the reference system of the AR equipment to be used as a point cloud Q;
S5: matching the point clouds P and Q using the ICP (Iterative Closest Point) algorithm, and obtaining through iteration the transformation between the 3D model of the anatomical structure and the actual anatomy in the reference frame of the AR device.
Compared with the prior art, the invention has the following beneficial effects:
1. The invention reduces the risk of cross infection of doctors by controlling the robot through contactless gesture recognition, mapping different gestures to different motions of the robot.
2. The invention converts the calibration of the AR device and the electromagnetic tracking system into an AX = XB solving problem, and thereby solves the transformation between the two more efficiently.
3. To further simplify control of the robot, the invention improves the traditional contact-based master-slave control method and teleoperates the flexible robot using a contactless gesture recognition method.
4. The invention calibrates the relationship between the electromagnetic sensor coordinate system and the shape sensor coordinate system through a redefined virtual plane, solving the problem that absolute shape information is difficult to obtain because the base coordinate system of the shape sensor is a floating base, and successfully applies augmented reality to the augmented display of a flexible robot.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a system diagram of the present invention;
FIG. 2 is a flow chart of the operation of the present invention;
FIG. 3 is a diagram of the present invention for calibrating a system;
FIG. 4 is a view of the rigid sleeve of the present invention;
FIG. 5 is an overall data flow diagram of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific embodiments. The following embodiments will assist those skilled in the art in further understanding the invention, but are not intended to limit the invention in any way. It should be noted that various changes and modifications can be made by those skilled in the art without departing from the spirit of the invention, all of which fall within the scope of the present invention.
The invention provides an interventional robot contactless teleoperation system based on augmented reality and a calibration method. As shown in fig. 1, the system mainly comprises: a head-mounted AR device, a PC1, a PC2, a grating demodulator, a magnetic field generator, an insertion tube, a redundant mechanical arm, a driving unit, a continuum robot, an LED, an endoscope camera, a shape sensor, and an electromagnetic sensor. The shape sensor, the electromagnetic sensor, the continuum robot, the insertion tube, the redundant mechanical arm, the driving unit, the endoscope camera and the LED constitute the flexible robot. The AR device superimposes the flexible robot model onto the real scene; the superimposed objects are the continuum robot and the catheter of the endoscope insertion tube section, whose total length is determined by the length of the shape sensor. The grating demodulator and the shape sensor form the shape sensing system; the magnetic field generator and the electromagnetic sensor form the electromagnetic tracking system.
The head-mounted AR device is worn by the doctor. The binocular camera of the AR device recognizes the doctor's gestures and computes the spatial position of the wrist relative to the AR device, while the flexible robot and the anatomical structure model are superimposed onto the corresponding real scene. The PC1 is connected with the AR device: the AR device receives in real time the flexible robot shape information and endoscope images sent by the PC1, and the PC1 receives in real time the gesture information recognized by the AR device. The PC2 is connected with the magnetic field generator, the grating demodulator and the flexible robot; it reconstructs the shape of the flexible robot in real time and communicates wirelessly with the PC1. The flexible robot has three degrees of freedom: translation, rotation and bending. The continuum robot is of a self-contact structure and adopts a rope-driven mode; two motors in the driving unit stretch the ropes to bend it left and right. The redundant mechanical arm has seven degrees of freedom in total and provides the translational and rotational motion of the flexible robot. The electromagnetic sensor has six degrees of freedom and acquires the pose of the tip of the flexible robot in real time. The shape sensor is a multi-core optical fiber embedded in the lumen of the flexible robot to sense the robot's shape.
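As a purely illustrative aside on the rope-drive actuation just described, a minimal sketch follows; the constant-curvature relation Δl = r·θ for a pair of antagonistic ropes and all parameter names are textbook assumptions, not values taken from the patent:

```python
# Hedged sketch: antagonistic rope actuation for one bending DOF of a
# rope-driven continuum segment. The relation dl = r * theta and the
# parameter names are textbook assumptions, not the patent's kinematics.
def rope_displacements(theta_rad: float, guide_radius_m: float = 0.004):
    """Return (left, right) rope length changes for a bend angle theta.

    A positive theta bends the segment left: the left rope shortens by
    r * theta while the right rope pays out the same amount.
    """
    dl = guide_radius_m * theta_rad
    return -dl, +dl
```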
The workflow is shown in fig. 2. First, the PC1 reconstructs a three-dimensional model of the intraluminal anatomical structure using the two-dimensional planar images of a preoperative CT scan, and sets the optimal observation position of the endoscope image in the AR device according to the doctor's operating habits.
Then the system is calibrated. As shown in fig. 3, this comprises five calibrations: calibration of the AR device and the electromagnetic tracking system, calibration of the AR device and the mechanical arm, calibration of the AR device and the anatomical structure, calibration of the shape sensor and the electromagnetic sensor, and calibration of the AR device and the world reference frame. Through these five calibration steps, the reference frame of each part is transformed into the coordinate frame of the AR device.
Next, the PC2 reconstructs the shape of the flexible robot in real time using the wavelength measured by the shape sensing system, transforms the reconstructed shape to the AR device coordinate system based on the calibration information of step 2, and transmits the transformed data and the endoscope image to the PC1 through wireless communication.
Then, the AR device communicates with the PC1 in real time while the doctor wears the AR device. In the AR device, the doctor can observe the three-dimensional shape of the flexible robot and the three-dimensional model of the anatomical cavity superimposed on the real scene, and can also observe the endoscope image in real time.
Finally, the doctor controls the movement of the flexible robot through gesture recognition. The AR device recognizes the doctor's gestures and calculates the position of the wrist relative to the AR device; different gestures are parsed into different marker flags, and upon receiving a flag the flexible robot performs the corresponding motion. When the wrist of the doctor's hand approaches the AR device, the robot is advanced / rotated forward / bent to the left, and vice versa. The overall data flow, shown in fig. 5, comprises real-time data, visualization data, and prior calibration data.
The invention converts the calibration of the AR device and the electromagnetic tracking system into an AX = XB solving problem, and solves the transformation between the two efficiently, as shown in fig. 3.
First, a six-degree-of-freedom electromagnetic sensor is fixed on a non-magnetic calibration plate bearing a positioning identification code. The calibration plate is then moved within the field of view of the AR device; the camera of the AR device estimates the pose of the positioning identification code, and the poses of the identification code and of the electromagnetic sensor are recorded. Finally, the pose estimates of the positioning identification code and the poses of the electromagnetic sensor are used to construct the following equations:
$${}^{H}T_{M}^{(i)} = {}^{H}T_{EM}\;{}^{EM}T_{E}^{(i)}\;{}^{E}T_{M} \tag{1}$$

$${}^{H}T_{M}^{(j)} = {}^{H}T_{EM}\;{}^{EM}T_{E}^{(j)}\;{}^{E}T_{M} \tag{2}$$

wherein ${}^{H}T_{M}$ is the transformation between the AR device (H) and the positioning identification code (M), ${}^{E}T_{M}$ is the fixed transformation between the electromagnetic sensor (E) and the positioning identification code, ${}^{H}T_{EM}$ is the transformation between the AR device and the base coordinate system of the electromagnetic tracking system (EM), and ${}^{EM}T_{E}^{(i)}$ is the transformation between the electromagnetic tracking system base coordinate system and the electromagnetic sensor for data group i, i = 1, 2, …. Combining formulae (1) and (2) to eliminate the constant ${}^{E}T_{M}$ gives

$$A = {}^{H}T_{M}^{(i)}\,\bigl({}^{H}T_{M}^{(j)}\bigr)^{-1} \tag{3}$$

$$B = {}^{EM}T_{E}^{(i)}\,\bigl({}^{EM}T_{E}^{(j)}\bigr)^{-1} \tag{4}$$

If we let $X = {}^{H}T_{EM}$, the following is obtained:

$$AX = XB \tag{5}$$

The value of X is obtained by solving AX = XB.
Since the shape-sensing portion of the multi-core fiber is shorter than the combined length of the insertion tube and the continuum robot, the base coordinate system of the shape sensor is a floating coordinate system, and obtaining the shape of the flexible robot relative to the AR device coordinate system is a major challenge. The invention therefore calibrates the relationship between the electromagnetic sensor coordinate system and the shape sensor coordinate system through a redefined virtual plane, using the six-degree-of-freedom electromagnetic sensor. The shape information of the flexible robot is then transformed to the AR device coordinate system through the calibrated relationship between the electromagnetic tracking system and the AR device, as shown in fig. 3. The calibration process of the electromagnetic sensor and the shape sensor is as follows:
a rigid sleeve with two parallel holes was fixed to the end of the continuum robot. Rigid sleeve as shown in fig. 4, a six degree of freedom electromagnetic sensor is fixed to the bore 1 and another six degree of freedom electromagnetic sensor is inserted into the bore 2. The length of the holes 1 and 2 is the same and the length of the electromagnetic sensor is equal to the length of the hole. The pose T of two six-degree-of-freedom electromagnetic sensors can be obtained through an electromagnetic tracking systemE1And TE2And the relationship of the hole 2 with respect to the hole 1 can be calculated by the following formula
Figure BDA0002966310910000061
Due to manufacturing tolerances, hole 1 may not be exactly parallel to hole 2. The angle between holes 1 and 2 caused by the manufacturing error is:

$$\theta = \arccos\left(z_{E1} \cdot z_{E2}\right) \tag{7}$$

wherein $R_{E1} = [x_{E1}, y_{E1}, z_{E1}]$ and $R_{E2} = [x_{E2}, y_{E2}, z_{E2}]$ are the rotation matrices of $T_{E1}$ and $T_{E2}$, respectively. The rotation matrix R taking the vector $z_{E1}$ to $z_{E2}$ is obtained from the Rodrigues rotation formula:

$$R = I + \sin\theta\,[\omega]_{\times} + (1-\cos\theta)\,[\omega]_{\times}^{2} \tag{8}$$

wherein $\omega = z_{E1} \times z_{E2}$ (normalized) and $[\omega]_{\times}$ is the skew-symmetric matrix of $\omega$. R compensates the parallelism error of holes 1 and 2 due to manufacturing. Finally, the compensated transformation between hole 1 and hole 2 is:

$${}^{E1}\tilde{T}_{E2} = \begin{bmatrix} R & 0 \\ 0 & 1 \end{bmatrix}\,{}^{E1}T_{E2} \tag{9}$$
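The parallelism compensation of equations (6)-(9) can be sketched as follows, using the reconstructed forms above; all function and variable names are illustrative:

```python
# Sketch of the parallelism compensation (eqs. (6)-(9)); names illustrative.
import numpy as np

def compensated_hole_transform(T_e1, T_e2):
    """Transform of hole 2 w.r.t. hole 1, corrected for non-parallel holes."""
    T_12 = np.linalg.inv(T_e1) @ T_e2                  # eq. (6)
    z1, z2 = T_e1[:3, 2], T_e2[:3, 2]                  # hole axes (unit)
    w = np.cross(z1, z2)
    if np.linalg.norm(w) < 1e-9:                       # holes already parallel
        return T_12
    theta = np.arccos(np.clip(z1 @ z2, -1.0, 1.0))     # eq. (7)
    w = w / np.linalg.norm(w)
    K = np.array([[0, -w[2], w[1]],
                  [w[2], 0, -w[0]],
                  [-w[1], w[0], 0]])                   # skew-symmetric [w]x
    Rc = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * K @ K  # eq. (8)
    T_fix = np.eye(4)
    T_fix[:3, :3] = Rc
    return T_fix @ T_12                                # eq. (9)
```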
the electromagnetic sensor of the hole 2 is replaced by a shape sensor, the last fiber bragg grating FBG of which is located in the sleeve, and the base coordinate system of the shape sensor is defined at the last FBG. And then, fixing the part of the flexible robot with shape sensing on the reference surface by taking a flat plane as the reference surface, and calibrating the shape sensor on the plane, wherein the calibration comprises straight line arrangement and torsion compensation calibration, and the torsion compensation process is to enable the sensing part of the shape sensor to be in a bending state. After calibration is complete, the X-Y plane of the shape sensor coordinate system is parallel to the reference plane and the Z axis is parallel to the hole 2. And under the condition of ensuring that the flexible robot does not rotate around the self axial direction, the rigid sleeves at the tail end of the continuum robot are respectively moved to three points of the reference surface. The position of the electromagnetic sensor on the flexible robot at each point is P1,P2,P3And collecting a rotation matrix R of a third pointc. Establishing a virtual plane using three points and finding a unit vector perpendicular to the plane
Figure BDA0002966310910000066
Figure BDA0002966310910000067
Figure BDA0002966310910000068
Figure BDA0002966310910000069
As shown in fig. 4, the angle ψ between the X axis of the electromagnetic sensor and the virtual plane is solved from:

$$\psi = \arcsin\left(x_c \cdot n\right) \tag{13}$$

wherein $R_c = [x_c, y_c, z_c]$. Because the virtual plane is parallel to the reference surface and the X-Y plane of the shape sensor coordinate system is parallel to the reference surface, the X axis of the electromagnetic sensor coordinate system makes the angle ψ with the X-Y plane of the shape sensor coordinate system. Finally, the transformation ${}^{E}T_{S}$ between the shape sensor and electromagnetic sensor coordinate systems is obtained as:

$$R_z(\psi) = \begin{bmatrix} \cos\psi & -\sin\psi & 0 \\ \sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix} \tag{14}$$

$${}^{E}T_{S} = \begin{bmatrix} R_z(\psi) & [0,\,0,\,l]^{T} \\ 0 & 1 \end{bmatrix} \tag{15}$$
wherein l is the length of the electromagnetic sensor. The reconstructed shape information can then be transformed into the space of the AR device as follows:

$$C(s)^{H} = {}^{H}T_{EM}\;{}^{EM}T_{E}\;{}^{E}T_{S}\;C(s) \tag{16}$$

wherein C(s) is the three-dimensional shape information of the flexible robot in the shape sensor coordinate system, and $C(s)^{H}$ is the three-dimensional shape information of the flexible robot in the AR device coordinate system.
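The virtual-plane calibration and the shape transformation chain of equations (10)-(16) can be sketched as follows, using the equation forms reconstructed above; all names are illustrative:

```python
# Sketch of the virtual-plane calibration and shape transform chain
# (eqs. (10)-(16)); names and forms follow the reconstruction above.
import numpy as np

def plane_normal(p1, p2, p3):
    """Unit normal of the virtual plane through P1, P2, P3 (eqs. (10)-(12))."""
    n = np.cross(p2 - p1, p3 - p1)
    return n / np.linalg.norm(n)

def twist_angle(Rc, n):
    """Angle psi between the EM sensor X axis and the virtual plane (eq. (13))."""
    xc = Rc[:, 0]
    return np.arcsin(np.clip(xc @ n, -1.0, 1.0))

def shape_to_ar(C_s, T_h_em, T_em_e, T_e_s):
    """Map Nx3 shape points from the shape-sensor frame to the AR frame (eq. (16))."""
    pts = np.hstack([C_s, np.ones((len(C_s), 1))])   # homogeneous coordinates
    return (T_h_em @ T_em_e @ T_e_s @ pts.T).T[:, :3]
```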
In order to visualize the anatomy and overlay the 3D virtual model of the intraluminal anatomy on the real scene, the AR device and the anatomical model need to be calibrated. The main calibration process is as follows:
1. The intraluminal anatomy is scanned by CT to obtain two-dimensional scan images.
2. A 3D virtual model of the anatomical structure is reconstructed from the 2D CT images as the target point cloud P.
3. The flexible robot performs an initial motion at the entrance of the interventional cavity, and the electromagnetic sensor at the tip of the continuum robot acquires point cloud data of part of the anatomical wall surface as the source cloud G.
4. The point cloud G is transferred from the reference frame of the electromagnetic tracking system to the reference frame of the AR device through ${}^{H}T_{EM}$, yielding the point cloud Q.
5. The point clouds P and Q are matched using the ICP (Iterative Closest Point) algorithm, and the transformation between the 3D model of the anatomical structure and the actual anatomy in the reference frame of the AR device is obtained through iteration.
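Step 5 could be sketched, for example, with Open3D's ICP implementation; the correspondence distance and the cloud handling below are illustrative assumptions, not the patent's stated implementation:

```python
# A minimal sketch of step 5 with Open3D's point-to-point ICP.
# The max correspondence distance (in the cloud's units) is an assumption.
import numpy as np
import open3d as o3d

def register_anatomy(P, Q, init=np.eye(4), max_corr_dist=5.0):
    """Register source cloud Q (EM samples in the AR frame, Nx3) to target
    cloud P (CT model, Mx3); returns the 4x4 transformation found by ICP."""
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(Q))
    tgt = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(P))
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, max_corr_dist, init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation
```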
In order to further simplify control of the robot, the traditional contact-based master-slave control method is improved: the flexible robot is teleoperated using contactless gesture recognition. In the present invention, three gestures are defined and mapped to the motions of the flexible robot: bending, translational motion and rotational motion. Bending is produced by pulling the drive ropes of the flexible robot, translational motion is provided by Cartesian motion of the mechanical arm, and rotational motion is achieved by rotation of the last joint of the mechanical arm.
The AR device is used to recognize the surgeon's gestures; three gestures are mapped to the actions of the robot: "OPEN", "OK" and "POINTER". The "OPEN", "POINTER" and "OK" gestures are mapped to translational motion, rotational motion and bending, respectively. After a gesture is recognized, the Euclidean distance between the wrist and the AR device is acquired in real time; if the distance decreases, the robot moves forward, and vice versa. The rotational motion and the bending follow the same rule.
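A minimal sketch of this gesture-to-motion mapping is given below; the dead-band threshold and all names are illustrative assumptions:

```python
# Hedged sketch: the recognized gesture selects the degree of freedom and the
# sign of the wrist-distance change selects the direction. The dead-band
# value and names are illustrative assumptions.
GESTURE_TO_DOF = {"OPEN": "translate", "POINTER": "rotate", "OK": "bend"}

def gesture_command(gesture, prev_dist, dist, deadband=0.005):
    """Return (dof, direction) or None if no motion should be issued."""
    dof = GESTURE_TO_DOF.get(gesture)
    if dof is None or abs(dist - prev_dist) < deadband:
        return None
    # Wrist approaching the AR device -> advance / rotate forward / bend left.
    return dof, (+1 if dist < prev_dist else -1)
```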
Since the computed distance between the AR device and the wrist contains a certain amount of noise, the data needs to be filtered. The invention applies a Kalman filter to the wrist motion. Because the wrist motion is irregular, the prediction model of the Kalman filter is designed as:

y = x

wherein x is the optimal estimate of the Kalman filter at the previous moment and y is the predicted value at the next moment.
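A minimal one-dimensional Kalman filter with this random-walk prediction model might look as follows; the noise variances are illustrative assumptions:

```python
# A minimal 1-D Kalman filter for the wrist-to-AR-device distance, using the
# random-walk prediction model y = x described above. The process and
# measurement noise variances q and r are illustrative assumptions.
class DistanceFilter:
    def __init__(self, q=1e-4, r=4e-2):
        self.x = None   # optimal estimate of the distance
        self.p = 1.0    # estimate variance
        self.q, self.r = q, r

    def update(self, z):
        if self.x is None:          # initialize on the first measurement
            self.x = z
            return self.x
        self.p += self.q            # predict: y = x (no motion model)
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)  # correct with the new measurement z
        self.p *= (1.0 - k)
        return self.x
```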
For calibration of the AR device and the mechanical arm, a positioning identification code is fixed at the end of the mechanical arm; the end of the arm is then moved within the field of view of the AR device while the arm poses and the identification code poses are collected, and the resulting AX = XB problem is solved. Calibration of the AR device and the world reference frame is performed by the SLAM function of the AR device itself.
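For illustration, the relative motions fed to the AX = XB solver sketched earlier could be built from consecutive pose samples as follows; this pairing scheme is an assumption consistent with equations (3) and (4):

```python
# Illustrative reuse of solve_ax_xb (sketched earlier) for the AR-device /
# mechanical-arm calibration; the pose pairing is an assumption.
import numpy as np

def relative_motions(Ts):
    """A_i = T_i @ inv(T_{i+1}) from a sequence of absolute poses (4x4)."""
    return [Ts[k] @ np.linalg.inv(Ts[k + 1]) for k in range(len(Ts) - 1)]

# X_arm = solve_ax_xb(relative_motions(marker_poses_in_AR),
#                     relative_motions(arm_end_poses))
```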
Those skilled in the art will appreciate that, in addition to being implemented as pure computer-readable program code, the system and its various devices, modules and units provided by the invention can be implemented entirely by logically programming the method steps in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, the system and its various devices, modules and units can be regarded as a hardware component; the devices, modules and units included therein for realizing various functions can be regarded as structures within the hardware component, and can also be regarded as both software modules for implementing the method and structures within the hardware component.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.

Claims (10)

1. An interventional robot contactless teleoperation system based on augmented reality, characterized in that it comprises a head-mounted AR device, a PC1, a PC2, a grating demodulator, a magnetic field generator, an insertion tube, a redundant mechanical arm, a driving unit, a continuum robot, an LED, an endoscope camera, a shape sensor and an electromagnetic sensor, wherein the head-mounted AR device is connected with the PC1, and the PC2 is respectively connected with the grating demodulator, the magnetic field generator, the redundant mechanical arm and the driving unit.
2. The augmented reality based interventional robot contactless teleoperation system of claim 1, wherein the shape sensor, the electromagnetic sensor, the continuum robot, the insertion tube, the redundant mechanical arm, the driving unit, the endoscopic camera and the LED constitute a flexible robot.
3. The augmented reality based interventional robot contactless teleoperation system of claim 2, wherein the head-mounted AR device superimposes a flexible robot model onto the real scene; the superimposed objects are the continuum robot and the catheter of the endoscope insertion tube section, and the total length of the continuum robot and the catheter of the endoscope insertion tube section is determined by the length of the shape sensor.
4. The augmented reality based interventional robot contactless teleoperation system of claim 3, wherein the grating demodulator and the shape sensor constitute a shape sensing system; the magnetic field generator and the electromagnetic sensor constitute an electromagnetic tracking system.
5. The augmented reality based interventional robot contactless teleoperation system according to claim 1, characterized in that the workflow of the system is as follows:
Step 1: the PC1 reconstructs a three-dimensional model of the intraluminal anatomical structure using the two-dimensional planar images of a preoperative CT scan, and sets the optimal observation position of the endoscope image in the AR device;
Step 2: the system is calibrated;
Step 3: the PC2 reconstructs the shape of the flexible robot using the wavelengths measured by the shape sensing system, transforms the reconstructed shape to the AR device coordinate system according to the calibration information of Step 2, and transmits the transformed data and the endoscope image to the PC1 through wireless communication;
Step 4: the AR device communicates with the PC1 in real time, and the doctor wears the AR device;
Step 5: the doctor controls the movement of the flexible robot through gesture recognition.
6. The augmented reality based interventional robot contactless teleoperation system of claim 5, wherein the calibration in Step 2 comprises calibration of the AR device and the electromagnetic tracking system, calibration of the AR device and the mechanical arm, calibration of the AR device and the anatomical structure, calibration of the shape sensor and the electromagnetic sensor, and calibration of the AR device and the world reference frame; and in Step 4, the doctor can observe in the AR device the three-dimensional shape of the flexible robot and the three-dimensional model of the anatomical cavity superimposed on the real scene, and can also observe the endoscope image in real time.
7. The augmented reality-based interventional robot contactless teleoperation system of claim 5, wherein in Step 5 the AR device recognizes the doctor's gesture and calculates the position of the wrist relative to the AR device; different gestures are parsed into different marker flags, and upon receiving a flag the flexible robot performs the corresponding motion.
8. The augmented reality based interventional robot contactless teleoperation system of claim 7, wherein when the wrist of the doctor's hand approaches the AR device, the robot is advanced / rotated forward / bent to the left, and vice versa.
9. The system of claim 1, wherein the electromagnetic sensor has six degrees of freedom, and the relationship between the electromagnetic sensor coordinate system and the shape sensor coordinate system is calibrated by the redefined virtual plane.
10. A calibration method for an augmented reality based interventional robot contactless teleoperation system, the method comprising an augmented reality based interventional robot contactless teleoperation system according to any one of claims 1-9, the method comprising the steps of:
S1: scanning the intraluminal anatomy by CT to obtain two-dimensional scan images;
S2: reconstructing a 3D virtual model of the anatomical structure from the 2D CT images as the target point cloud P;
S3: the flexible robot performs an initial motion at the entrance of the interventional cavity, and the electromagnetic sensor at the tip of the continuum robot acquires point cloud data of part of the anatomical wall surface as the source cloud G;
S4: transferring the point cloud G from the reference frame of the electromagnetic tracking system to the reference frame of the AR device through the calibrated transformation ${}^{H}T_{EM}$, to obtain the point cloud Q;
S5: matching the point clouds P and Q using the ICP (Iterative Closest Point) algorithm, and obtaining through iteration the transformation between the 3D model of the anatomical structure and the actual anatomy in the reference frame of the AR device.
CN202110251635.9A 2021-03-08 2021-03-08 Interventional robot contactless teleoperation system based on augmented reality and calibration method Pending CN112914731A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202110251635.9A CN112914731A (en) 2021-03-08 2021-03-08 Interventional robot contactless teleoperation system based on augmented reality and calibration method
CN202110674129.0A CN113229941B (en) 2021-03-08 2021-06-17 Interventional robot non-contact teleoperation system based on augmented reality and calibration method
PCT/CN2021/111946 WO2022188352A1 (en) 2021-03-08 2021-08-11 Augmented-reality-based interventional robot non-contact teleoperation system, and calibration method therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110251635.9A CN112914731A (en) 2021-03-08 2021-03-08 Interventional robot contactless teleoperation system based on augmented reality and calibration method

Publications (1)

Publication Number Publication Date
CN112914731A true CN112914731A (en) 2021-06-08

Family

ID=76173434

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110251635.9A Pending CN112914731A (en) 2021-03-08 2021-03-08 Interventional robot contactless teleoperation system based on augmented reality and calibration method
CN202110674129.0A Active CN113229941B (en) 2021-03-08 2021-06-17 Interventional robot non-contact teleoperation system based on augmented reality and calibration method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202110674129.0A Active CN113229941B (en) 2021-03-08 2021-06-17 Interventional robot non-contact teleoperation system based on augmented reality and calibration method

Country Status (2)

Country Link
CN (2) CN112914731A (en)
WO (1) WO2022188352A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114373046A (en) * 2021-12-27 2022-04-19 达闼机器人有限公司 Method and device for assisting robot to operate and storage medium
CN114931437A (en) * 2022-07-25 2022-08-23 中国科学院自动化研究所 Sensing type continuum robot, intervention sensing system and method
WO2022188352A1 (en) * 2021-03-08 2022-09-15 上海交通大学 Augmented-reality-based interventional robot non-contact teleoperation system, and calibration method therefor
WO2023274270A1 (en) * 2021-06-30 2023-01-05 上海微觅医疗器械有限公司 Robot preoperative navigation method and system, storage medium, and computer device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115511962B (en) * 2022-09-20 2024-05-28 上海人工智能创新中心 Target active detection method and system based on photoelectric tactile sensor
CN116572249B (en) * 2023-06-07 2024-01-02 哈尔滨理工大学 Flexible mechanical arm teleoperation control method based on three-mode switching mechanism
CN116999178B (en) * 2023-10-07 2024-01-12 北京科鹏医疗器械有限公司 Dual-frequency filtering visual master-slave mapping method operated by natural channel endoscope

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5226195B2 (en) * 2006-07-28 2013-07-03 オリンパスメディカルシステムズ株式会社 Endoscope apparatus and method for operating endoscope apparatus
CN101099657A (en) * 2007-07-13 2008-01-09 上海大学 Thin long flexible rod spatial shape detecting device and method
CN101147668B (en) * 2007-11-09 2010-06-02 清华大学 Radio system and device for sampling image in a creature body cavity
US20110306986A1 (en) * 2009-03-24 2011-12-15 Min Kyu Lee Surgical robot system using augmented reality, and method for controlling same
CN102162738B (en) * 2010-12-08 2012-11-21 中国科学院自动化研究所 Calibration method of camera and inertial sensor integrated positioning and attitude determining system
WO2014150509A1 (en) * 2013-03-15 2014-09-25 Intuitive Surgical Operations, Inc. Shape sensor systems for tracking interventional instruments and methods of use
US9592095B2 (en) * 2013-05-16 2017-03-14 Intuitive Surgical Operations, Inc. Systems and methods for robotic medical system integration with external imaging
US10952600B2 (en) * 2014-07-10 2021-03-23 Covidien Lp Endoscope system
CN204542390U (en) * 2015-04-17 2015-08-12 中国科学院重庆绿色智能技术研究院 A kind of force feedback surgery operation robot control system based on augmented reality
CN104739519B (en) * 2015-04-17 2017-02-01 中国科学院重庆绿色智能技术研究院 Force feedback surgical robot control system based on augmented reality
CN105266897B (en) * 2015-11-25 2018-03-23 上海交通大学医学院附属第九人民医院 A kind of microsurgery navigation system and air navigation aid based on augmented reality
EP3429549B1 (en) * 2016-03-15 2022-04-20 Koninklijke Philips N.V. Fiber-optic realshape sensing feeding tube
US20180116732A1 (en) * 2016-10-18 2018-05-03 The Board Of Trustees Of The Leland Stanford Junior University Real-time Three Dimensional Display of Flexible Needles Using Augmented Reality
CN106648077A (en) * 2016-11-30 2017-05-10 南京航空航天大学 Adaptive dynamic stereoscopic augmented reality navigation system based on real-time tracking and multi-source information fusion
WO2018195221A1 (en) * 2017-04-18 2018-10-25 Intuitive Surgical Operations, Inc. Graphical user interface for planning a procedure
WO2019040493A1 (en) * 2017-08-21 2019-02-28 The Trustees Of Columbia University In The City Of New York Systems and methods for augmented reality guidance
CN108406725A (en) * 2018-02-09 2018-08-17 华南理工大学 Force feedback man-machine interactive system and method based on electromagnetic theory and mobile tracking
CN209392094U (en) * 2018-06-20 2019-09-17 深圳大学 A kind of surgery systems of augmented reality
CN109758230B (en) * 2019-02-26 2021-04-13 中国电子科技集团公司信息科学研究院 Neurosurgery navigation method and system based on augmented reality technology
US11254019B2 (en) * 2019-03-05 2022-02-22 The Boeing Company Automatic calibration for a robot optical sensor
CN110010249B (en) * 2019-03-29 2021-04-27 北京航空航天大学 Augmented reality operation navigation method and system based on video superposition and electronic equipment
WO2020221311A1 (en) * 2019-04-30 2020-11-05 齐鲁工业大学 Wearable device-based mobile robot control system and control method
KR102313319B1 (en) * 2019-05-16 2021-10-15 서울대학교병원 AR colonoscopy system and method for monitoring by using the same
CN110706279B (en) * 2019-09-27 2021-09-07 清华大学 Global position and pose estimation method based on information fusion of global map and multiple sensors
CN110711030B (en) * 2019-10-21 2021-04-23 北京国润健康医学投资有限公司 Femoral head necrosis minimally invasive surgery navigation system and navigation method based on AR technology
CN111202583A (en) * 2020-01-20 2020-05-29 上海奥朋医疗科技有限公司 Method, system and medium for tracking movement of surgical bed
CN212466186U (en) * 2020-02-11 2021-02-05 中国医学科学院北京协和医院 Bone tumor surgery auxiliary system based on augmented reality technology
CN111329587A (en) * 2020-02-19 2020-06-26 上海理工大学 Surgical registration system using shape sensing fiber optic mesh
CN112914731A (en) * 2021-03-08 2021-06-08 上海交通大学 Interventional robot contactless teleoperation system based on augmented reality and calibration method

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022188352A1 (en) * 2021-03-08 2022-09-15 上海交通大学 Augmented-reality-based interventional robot non-contact teleoperation system, and calibration method therefor
WO2023274270A1 (en) * 2021-06-30 2023-01-05 上海微觅医疗器械有限公司 Robot preoperative navigation method and system, storage medium, and computer device
CN114373046A (en) * 2021-12-27 2022-04-19 达闼机器人有限公司 Method and device for assisting robot to operate and storage medium
CN114373046B (en) * 2021-12-27 2023-08-18 达闼机器人股份有限公司 Method, device and storage medium for assisting robot operation
CN114931437A (en) * 2022-07-25 2022-08-23 中国科学院自动化研究所 Sensing type continuum robot, intervention sensing system and method

Also Published As

Publication number Publication date
WO2022188352A1 (en) 2022-09-15
CN113229941A (en) 2021-08-10
CN113229941B (en) 2023-05-26

Similar Documents

Publication Publication Date Title
CN113229941B (en) Interventional robot non-contact teleoperation system based on augmented reality and calibration method
US20230107693A1 (en) Systems and methods for localizing, tracking and/or controlling medical instruments
US12089804B2 (en) Navigation of tubular networks
US12082920B2 (en) Systems and methods for deformation compensation using shape sensing
CN109069217B (en) System and method for pose estimation in image-guided surgery and calibration of fluoroscopic imaging system
US8073528B2 (en) Tool tracking systems, methods and computer products for image guided surgery
US8108072B2 (en) Methods and systems for robotic instrument tool tracking with adaptive fusion of kinematics information and image information
Burschka et al. Navigating inner space: 3-d assistance for minimally invasive surgery
CN104302241B (en) The registration arrangement and method of the Medical Devices of search space for using reduction
Lee et al. From medical images to minimally invasive intervention: Computer assistance for robotic surgery
CN102458293A (en) Virtual measurement tool for minimally invasive surgery
CN102458294A (en) Virtual measurement tool for minimally invasive surgery
WO2009045827A2 (en) Methods and systems for tool locating and tool tracking robotic instruments in robotic surgical systems
Staub et al. Contour-based surgical instrument tracking supported by kinematic prediction
Lu et al. A unified monocular camera-based and pattern-free hand-to-eye calibration algorithm for surgical robots with RCM constraints
WO2023018684A1 (en) Systems and methods for depth-based measurement in a three-dimensional view
Chen et al. Intuitive Teleoperation Control for Flexible Robotic Endoscopes Under Unknown Environmental Interferences
Lu Suture thread detection and 3D model reconstruction for automated surgical knot tying with a vision-based robotic system
Hashemi 3D Shape Estimation Of Tendon-Driven Catheters Using Ultrasound Imaging
CABRAS et al. Thèse présentée par
Salajegheh Imaging of surgical tools as a new paradigm for surgeon computer-interface in minimally invasive surgery
Navarro et al. An Approach to Perception Enhancement in Robotized Surgery using Computer Vision

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210608

WD01 Invention patent application deemed withdrawn after publication