CN109288591B - Surgical robot system - Google Patents

Surgical robot system

Info

Publication number
CN109288591B
Authority
CN
China
Prior art keywords
unit
sensing
pose
endoscope
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811497442.6A
Other languages
Chinese (zh)
Other versions
CN109288591A (en)
Inventor
师云雷
王家寅
何超
朱祥
夏玉辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Microport Medbot Group Co Ltd
Original Assignee
Shanghai Microport Medbot Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Microport Medbot Group Co Ltd
Priority to CN201811497442.6A
Publication of CN109288591A
Application granted
Publication of CN109288591B
Legal status: Active
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30: Surgical robots
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2055: Optical tracking systems
    • A61B 2034/302: Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

The present invention provides a surgical robot system comprising a first mechanical arm, a second mechanical arm, a pose acquisition unit, an image processing unit, and a display unit. A surgical instrument is mounted on the first mechanical arm, and an endoscope is mounted on the second mechanical arm. The pose acquisition unit acquires the position of the surgical instrument and the pose of the endoscope in the display unit coordinate system. The image processing unit generates an image containing the relative positional relationship between the surgical instrument and the endoscope's field of view, based on the position of the surgical instrument, the pose of the endoscope, and the endoscope's field of view. The display unit displays this image, so that when the surgical instrument needs to be adjusted, the adjustment can be made according to the displayed relative positional relationship, and the instrument can be conveniently and accurately brought into the endoscope's field of view.

Description

Surgical robot system
Technical Field
The invention relates to the technical field of medical instruments, in particular to a surgical robot system.
Background
In robotic surgery, a nurse is often required to assist the chief surgeon in replacing surgical instruments, or in bringing an instrument back into the endoscope's field of view when it has moved out of the surgeon's view. However, because the abdominal wall is opaque, the nurse cannot directly observe the spatial relationship between the surgical instrument and the endoscope (laparoscope) and must rely on blind or experience-based trial and error; the problem is worse when the laparoscope is a non-zero-degree scope. During such adjustments the surgical instrument may injure internal organs, creating surgical risk. Near fragile organs such as the heart or liver, an accidental puncture is especially likely to cause a surgical accident.
Patent application CN107016685A provides an augmented-reality projection method for real-time matching of the surgical scene. Its approach is to rapidly reconstruct a three-dimensional virtual model from medical images, track and overlay this model onto the real surgical scene in real time, and superimpose the model on the scene via micro-projection, giving the surgeon more direct information about the patient's interior, guiding the operation, reducing accidental injury, and improving the success rate. However, this solution does not help with the adjustment of surgical instruments.
Patent application CN104274247A provides a medical-operation navigation method comprising: a first step of three-dimensionally reconstructing a medical image to obtain a virtual model; and a second step of fusing the virtual model with the patient's body part by augmented reality, so as to navigate the medical operation in the fused state. This method gives the surgeon more accurate and intuitive information about the patient's body, and fusing the virtual model with the patient preserves intuitiveness, but it likewise does not make surgical instruments easier to adjust.
Invention patent CN102341046B discloses a surgical robot system using augmented reality and a control method thereof. A master interface of the surgical robot, installed on a master robot that controls slave robots including robot arms, comprises: a screen display unit for displaying an endoscopic image corresponding to the image signal supplied from a surgical endoscope; one or more arm operating units for controlling the respective robot arms; and an augmented-reality unit that generates virtual surgical-instrument information so that a virtual surgical instrument is displayed on the screen according to the user's operation of the arm operating unit. By displaying the actual and the virtual surgical instrument simultaneously, the master interface solves the technical problem that network communication delay makes the operator's hand motion inconsistent with the slave robot's motion seen on screen. Although this patent also uses augmented reality, its main purpose is to use the virtual instrument information to overcome the delay effect, to support teaching, to keep the operator focused, or to let the operator sense in real time the organ contact caused by the virtual instrument's movement; it does not address the safety problem of adjusting instruments during the operation.
Disclosure of Invention
The invention aims to provide a surgical robot system to solve the problem that surgical instruments are inconvenient to adjust in the prior art.
In order to solve the above technical problem, the present invention provides a surgical robot system, including: a first mechanical arm, a second mechanical arm, a pose acquisition unit, an image processing unit, and a display unit; wherein:
a surgical instrument is mounted on the first mechanical arm;
an endoscope is mounted on the second mechanical arm;
the pose acquisition unit is used for acquiring the position of the surgical instrument in the display unit coordinate system and the pose of the endoscope in the display unit coordinate system;
the image processing unit is used for obtaining an image containing relative position relation information of the surgical instrument and the endoscope visual field according to the position of the surgical instrument in the display unit coordinate system, the pose of the endoscope in the display unit coordinate system and the endoscope visual field; and
the display unit displays the image formed by the image processing unit.
Optionally, in the surgical robot system, the pose acquisition unit is further configured to acquire a pose of the surgical instrument in the coordinate system of the display unit;
the image processing unit is used for obtaining an image containing relative pose relation information of the surgical instrument and the endoscope field of view according to the poses of the surgical instrument and the endoscope in the display unit coordinate system and the endoscope field of view.
Optionally, in the surgical robot system, the position of the display unit in the global coordinate system remains unchanged;
the pose acquisition unit includes:
the first sensing unit is arranged on the first mechanical arm;
the second sensing unit is arranged on the second mechanical arm;
a sensing processing unit that acquires the poses of the first sensing unit and the second sensing unit in the sensing processing unit coordinate system, respectively;
a coordinate calculation unit, which obtains the poses of the surgical instrument and the endoscope in the sensing processing unit coordinate system according to the poses of the first and second sensing units in that coordinate system, the pose mapping relationship between the first sensing unit and the surgical instrument tip in the current configuration, and the pose mapping relationship between the second sensing unit and the endoscope tip, and takes these as the poses of the surgical instrument and the endoscope in the display unit coordinate system.
Optionally, in the surgical robot system, the first sensing unit and the second sensing unit are both selected from optical targets, the sensing processing unit is selected from an optical tracker, or,
the first sensing unit and the second sensing unit are selected from magnetic field sensors, and the sensing processing unit is selected from a magnetic field generator.
Optionally, in the surgical robot system, the posture of the display unit in the global coordinate system remains unchanged;
the pose acquisition unit includes:
the first sensing unit is arranged on the first mechanical arm and the surgical instrument and used for acquiring the motion states of all joints on the first mechanical arm and the surgical instrument;
the second sensing unit is arranged on the second mechanical arm and the endoscope and used for acquiring the motion states of all joints on the second mechanical arm and the endoscope;
a coordinate calculation unit that acquires the motion states acquired by the first sensing unit and the second sensing unit, and acquires, as the poses of the surgical instrument and the endoscope in the display unit coordinate system, the pose descriptions of the surgical instrument and the endoscope in a global coordinate system from a kinematic model.
Optionally, in the surgical robot system, the display unit moves with the operator's eyes: its position in the global coordinate system is matched to the position of the operator's eyes, and the two move by the same amount;
the pose acquisition unit includes:
the first sensing unit is arranged on the first mechanical arm;
the second sensing unit is arranged on the second mechanical arm;
a third sensing unit disposed on the display unit;
a sensing processing unit that acquires poses of the first sensing unit, the second sensing unit, and the third sensing unit each in the sensing processing unit coordinate system;
and a coordinate calculation unit, which obtains the poses of the surgical instrument and the endoscope in the display unit coordinate system according to the poses of the first, second, and third sensing units in the sensing processing unit coordinate system, together with the preset pose mapping relationships between the first sensing unit and the surgical instrument tip, between the second sensing unit and the endoscope tip, and between the third sensing unit and the display unit.
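To illustrate the coordinate change this arrangement implies (the code below is an explanatory sketch, not part of the patent; all function and variable names are invented): the tracker reports both the display unit's pose and the instrument tip's pose in the sensing-processing-unit frame, so re-expressing the tip in the display-unit frame is one matrix inverse and one product of homogeneous transforms.

```python
import numpy as np

def invert_pose(T):
    """Invert a 4x4 homogeneous transform without a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T          # rotation part: transpose
    Ti[:3, 3] = -R.T @ t      # translation part: rotated back and negated
    return Ti

def tip_in_display_frame(T_sensor_display, T_sensor_tip):
    """Re-express a tip pose, measured in the sensing-processing-unit frame,
    in the display-unit frame: T_display_tip = T_sensor_display^-1 @ T_sensor_tip."""
    return invert_pose(T_sensor_display) @ T_sensor_tip
```

The same two-step pattern applies to the organ-model pose in the later claims: anything measured in the tracker frame is carried into the display frame by left-multiplying with the inverse of the display unit's tracked pose.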
Optionally, in the surgical robot system, the first sensing unit, the second sensing unit and the third sensing unit are all selected from magnetic field sensors, the sensing processing unit is selected from magnetic field generators, or,
the first sensing unit, the second sensing unit and the third sensing unit are all selected from optical targets, and the sensing processing unit is selected from an optical tracker.
Optionally, in the surgical robot system, the display unit is selected from a head-mounted display.
Optionally, in the surgical robot system, the surgical robot system further includes an imaging device, and the imaging device is used for modeling a body surface and an internal organ environment of a human body;
the pose acquisition unit is also used for acquiring the pose of the internal organ environment of the human body in the coordinate system of the display unit;
according to the pose of the human internal organ environment in the coordinate system of the display unit, the image obtained by the image processing unit also comprises relative pose relation information of the human internal organ environment model and the surgical instrument.
Optionally, in the surgical robotic system, the imaging device is selected from one or more of a CT device, an MRI device, and a thermal imaging device.
Optionally, in the surgical robot system, the position of the display unit in the global coordinate system remains unchanged;
the pose acquisition unit includes:
the first sensing unit is arranged on the first mechanical arm;
the second sensing unit is arranged on the second mechanical arm;
the third sensing unit is arranged at the characteristic points of the body surface of the human body;
a sensing processing unit that acquires poses of the first sensing unit, the second sensing unit, and the third sensing unit each in the sensing processing unit coordinate system;
a coordinate calculation unit which obtains the poses of the surgical instrument and the endoscope in the sensing processing unit coordinate system according to the poses of the first sensing unit and the second sensing unit in the sensing processing unit coordinate system, and the pose mapping relationship between the first sensing unit and the surgical instrument tip in the current configuration and the pose mapping relationship between the second sensing unit and the endoscope tip in the current configuration, and takes the poses of the surgical instrument and the endoscope in the sensing processing unit coordinate system as the poses of the surgical instrument and the endoscope in the display unit coordinate system;
and the coordinate calculation unit is also used for obtaining the pose of the human internal organ environment model in the coordinate system of the sensing processing unit through image matching according to the pose of the third sensing unit in the coordinate system of the sensing processing unit and the positions of the feature points in the human body surface and the internal organ environment model, and taking the pose as the pose of the human internal organ environment model in the coordinate system of the display unit.
Optionally, in the surgical robot system, the display unit moves with the operator's eyes: its position in the global coordinate system is matched to the position of the operator's eyes, and the two move by the same amount;
the pose acquisition unit includes:
the first sensing unit is arranged on the first mechanical arm;
the second sensing unit is arranged on the second mechanical arm;
a third sensing unit disposed on the display unit;
the fourth sensing unit is arranged at the characteristic points of the body surface of the human body;
a sensing processing unit that acquires poses of the first sensing unit, the second sensing unit, the third sensing unit, and the fourth sensing unit each in the sensing processing unit coordinate system;
a coordinate calculation unit, which obtains the respective poses of the surgical instrument and the endoscope in the coordinate system of the display unit according to the poses of the first sensing unit, the second sensing unit and the third sensing unit in the coordinate system of the sensing processing unit, and preset pose mapping relations of the first sensing unit and the surgical instrument tip, pose mapping relations of the second sensing unit and the endoscope tip, and pose mapping relations of the third sensing unit and the display unit;
the coordinate calculation unit further obtains the pose of the human internal organ environment model in the sensing processing unit coordinate system through image matching according to the pose of the fourth sensing unit in the sensing processing unit coordinate system and the positions of the feature points in the human body surface and internal organ environment model, and obtains the pose of the human internal organ environment model in the display unit coordinate system according to the pose of the display unit in the sensing processing unit coordinate system.
Optionally, in the surgical robot system, the surgical robot system further includes a master hand end and a slave hand end; the first mechanical arm and the second mechanical arm are located at the slave hand end, and the master hand end can control the movement of the first mechanical arm and the second mechanical arm.
In the surgical robot system provided by the present invention, the system includes: a first mechanical arm, a second mechanical arm, a pose acquisition unit, an image processing unit, and a display unit. A surgical instrument is mounted on the first mechanical arm, and an endoscope is mounted on the second mechanical arm. The pose acquisition unit acquires the position of the surgical instrument and the pose of the endoscope in the display unit coordinate system. The image processing unit obtains an image containing the relative positional relationship between the surgical instrument and the endoscope field of view according to the position of the surgical instrument, the pose of the endoscope, and the endoscope field of view. The display unit displays the image formed by the image processing unit, so that when the surgical instrument needs to be adjusted, the adjustment can be made according to the displayed relative positional relationship, and the surgical instrument can be conveniently and accurately brought into the field of view of the endoscope.
Drawings
FIG. 1 is a schematic view of an operating room layout using a surgical robotic system according to an embodiment of the present invention;
FIG. 2 is a schematic structural view of a surgical robotic system according to an embodiment of the present invention;
FIG. 3 is a functional block diagram of a surgical robotic system according to an embodiment of the present invention;
FIG. 4 is another schematic structural view of a surgical robotic system according to an embodiment of the present invention;
FIG. 5 is a schematic view of another functional module of a surgical robotic system in accordance with an embodiment of the present invention;
FIG. 6 is a schematic illustration of the display effect of the surgical robotic system of the embodiment of the present invention;
In FIGS. 1 to 6:
10 - slave hand end; 20 - master hand end;
100 - first robotic arm; 110 - second robotic arm; 120 - pose acquisition unit; 121a - first sensing unit; 121b - second sensing unit; 122 - sensing processing unit; 123 - coordinate calculation unit; 130 - image processing unit; 140 - display unit; 150 - surgical instrument; 160 - endoscope;
200 - first robotic arm; 210 - second robotic arm; 220 - pose acquisition unit; 221a - first sensing unit; 221b - second sensing unit; 221c - third sensing unit; 222 - sensing processing unit; 223 - coordinate calculation unit; 230 - image processing unit; 240 - display unit; 250 - surgical instrument; 260 - endoscope;
310 - kidney; 320 - gallbladder; 350 - surgical instrument; 360 - endoscope.
Detailed Description
The surgical robotic system of the present invention is described in further detail below with reference to the drawings and specific embodiments. Advantages and features of the invention will become apparent from the following description and the claims. Note that the drawings are in greatly simplified form and not to precise scale; they serve only to illustrate the embodiments conveniently and clearly. In particular, for illustrative effect, the drawings often use different display scales.
A core idea of the present invention is to provide a surgical robot system including: a first mechanical arm, a second mechanical arm, a pose acquisition unit, an image processing unit, and a display unit. A surgical instrument is mounted on the first mechanical arm, and an endoscope is mounted on the second mechanical arm. The pose acquisition unit acquires the position of the surgical instrument and the pose of the endoscope in the display unit coordinate system. The image processing unit obtains an image containing the relative positional relationship between the surgical instrument and the endoscope field of view according to the position of the surgical instrument, the pose of the endoscope, and the endoscope field of view. The display unit displays this image, so that when the surgical instrument needs to be adjusted, the adjustment can be made according to the displayed relative positional relationship, and the instrument can be conveniently and accurately brought into the endoscope's field of view.
In the embodiments of the present application, the types of the endoscope and the surgical instrument are not particularly limited. For example, the endoscope may be any detection instrument that collects surgical-environment information (including but not limited to tissue, organ, and blood-vessel information and the state of surgical instruments and consumables), and the surgical instrument may be a surgical tool such as scissors, a needle holder, a grasper, an electric knife, or electrocoagulation forceps.
[Embodiment 1]
Take a teleoperated laparoscopic surgical robot system as an example. FIG. 1 is a schematic view of the operating-room layout of such a system. The teleoperated laparoscopic surgical robot includes a master hand end 20 and a slave hand end 10. The master hand end 20 comprises a master manipulator, and the slave hand end 10 comprises robotic arms carrying a surgical instrument or a laparoscope. During the operation, the chief surgeon operates the master manipulator of the master hand end 20 according to the surgical environment acquired by the laparoscope, thereby controlling the surgical instruments mounted on the robotic arms of the slave hand end 10 to perform the minimally invasive surgery. A nurse at the patient side assists by replacing surgical instruments and adjusting their positions.
Specifically, referring to fig. 2 and 3, the surgical robot system includes: a first robot arm 100, a second robot arm 110, a pose acquisition unit 120, an image processing unit 130, and a display unit 140; wherein the first robotic arm 100 has a surgical instrument 150 mounted thereon; an endoscope 160 is mounted on the second robot arm 110; the pose acquisition unit 120 is configured to acquire a pose or position of the surgical instrument 150 in the coordinate system of the display unit 140 and a pose of the endoscope 160 in the coordinate system of the display unit 140; the image processing unit 130 is configured to form an image containing relative position or pose relationship information between the surgical instrument 150 and the endoscope viewing field according to the pose or position of the surgical instrument 150 in the coordinate system of the display unit 140, the pose of the endoscope 160 in the coordinate system of the display unit 140, and the endoscope viewing field; and the display unit 140 displays the image formed by the image processing unit 130. There may also be a greater number of robotic arms located on the slave end 10, for example, there may also be a third robotic arm, a fourth robotic arm, etc., which may also be loaded with surgical instruments, respectively. Here, "pose" includes position and attitude.
Depending on requirements, either the pose acquisition unit 120 acquires the poses of the surgical instrument 150 and the endoscope 160 in the coordinate system of the display unit 140, so that the image processing unit 130 produces an image containing the relative pose relationship between the surgical instrument 150 and the endoscope field of view; or the pose acquisition unit 120 acquires only the position of the surgical instrument 150 and the pose of the endoscope 160 in that coordinate system, so that the image processing unit 130 produces an image containing their relative positional relationship. For example, when the nurse cares only about where the distal end of the surgical instrument 150 lies relative to the endoscope field of view, it suffices to provide the position of the distal end point of the surgical instrument 150 in the coordinate system of the display unit 140.
In the embodiments of the present application, the endoscope 160 may be a 3D or 2D endoscope; further, it may be a 0° endoscope, a 30° endoscope, or an endoscope with another specific angle. The actual region the endoscope 160 can see (its "field of view") is a conical volume. Specifically, the field of view is determined by the depth to which the endoscope 160 enters the body and by the endoscope's viewing angle, which in turn is determined by the endoscope's characteristics (e.g., the angle of the endoscope lens and the composition and type of its objective lens group).
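Since the field of view is a cone fixed by insertion depth and viewing angle, deciding whether an instrument tip lies inside it reduces to a point-in-cone test. The sketch below is an illustration under the assumption that all quantities are expressed in one common frame (e.g., the display-unit frame); the function and parameter names are invented, not taken from the patent.

```python
import numpy as np

def in_endoscope_fov(point, tip_pos, view_axis, half_angle_deg, depth):
    """Check whether a 3D point lies inside the conical field of view that
    starts at the endoscope tip, opens along view_axis with the given
    half-angle, and extends `depth` along the axis."""
    axis = np.asarray(view_axis, float)
    axis = axis / np.linalg.norm(axis)
    v = np.asarray(point, float) - np.asarray(tip_pos, float)
    along = v @ axis                      # distance along the viewing axis
    if along <= 0 or along > depth:
        return False                      # behind the tip or beyond the depth
    radial = np.linalg.norm(v - along * axis)
    return radial <= along * np.tan(np.radians(half_angle_deg))
```

A display built on such a test could, for instance, color the rendered instrument tip differently depending on whether it is inside or outside the cone.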
In the embodiment of the present application, the display unit 140 is preferably a 3D display, so that the display unit 140 can better represent the image of the field of view of the endoscope 160, thereby facilitating the adjustment of the surgical instrument 150 by the nurse. Further, the display unit 140 may also be a device for displaying information, such as a flat panel display, a projector, or a holographic imaging device. The display unit 140 and the image processing unit 130 may be independent of each other or integrated together; further, the display unit 140 and the image processing unit 130 may be provided on a surgical device such as a video cart.
With continuing reference to fig. 2 and fig. 3, in the embodiment of the present application, the pose acquisition unit 120 includes: a first sensing unit 121a, wherein the first sensing unit 121a is disposed on the first robot arm 100; a second sensing unit 121b, the second sensing unit 121b being disposed on the second robot arm 110; a sensing processing unit 122, wherein the sensing processing unit 122 acquires the poses of the first sensing unit 121a and the second sensing unit 121b in the coordinate system of the sensing processing unit 122; a coordinate calculation unit 123, wherein the coordinate calculation unit 123 obtains the respective poses of the distal end of the surgical instrument 150 and the distal end of the endoscope 160 in the coordinate system of the sensing processing unit 122 according to the respective poses of the first sensing unit 121a and the second sensing unit 121b in the coordinate system of the sensing processing unit 122, the pose mapping relationship between the first sensing unit 121a and the distal end of the surgical instrument 150 in the current configuration, and the pose mapping relationship between the second sensing unit 121b and the distal end of the endoscope 160.
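The composition performed by the coordinate calculation unit 123 can be written as a product of homogeneous transforms: the tip pose in the tracker frame is the measured target pose multiplied by the calibrated target-to-tip mapping. The following numpy sketch illustrates that product only; it is not the patent's implementation, and the names are invented.

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def tip_pose_in_sensor_frame(T_sensor_marker, T_marker_tip):
    """T_sensor_marker: pose of the sensing unit (e.g. optical target) as
    measured by the sensing processing unit (e.g. optical tracker).
    T_marker_tip: calibrated mapping from the sensing unit to the instrument
    (or endoscope) tip in the current configuration."""
    return T_sensor_marker @ T_marker_tip
```

The same one-line product applies to both arms: each sensing unit's measured pose is chained with its own mapping to give the corresponding tip pose.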
The present embodiment does not particularly limit how the "pose mapping relationship between the first sensing unit 121a and the distal end of the surgical instrument 150 in the current configuration" and the "pose mapping relationship between the second sensing unit 121b and the distal end of the endoscope 160" are acquired. For example, the first sensing unit 121a may be disposed at the proximal end of the surgical instrument 150 and the second sensing unit 121b at the proximal end of the endoscope 160. During movement of the slave hand end, the relative pose between the first sensing unit 121a and the surgical instrument 150 is fixed, as is that between the second sensing unit 121b and the endoscope 160, so both mapping relationships can be obtained at initialization. Alternatively, the first sensing unit 121a may be disposed on the first robotic arm and the second sensing unit 121b on the second robotic arm; in that case, the pose mapping relationships between the first sensing unit 121a and the distal end of the surgical instrument 150, and between the second sensing unit 121b and the distal end of the endoscope 160, must be determined from a kinematic model according to the motion state of each joint in the current configuration.
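Determining a sensing-unit-to-tip mapping from joint motion states is ordinary forward kinematics: chain the per-joint transforms for the current configuration. The sketch below uses a deliberately simplified planar chain of revolute joints about z, not the actual arm geometry of the patent; it only illustrates the chaining.

```python
import numpy as np

def rot_z(q):
    """Homogeneous transform for a revolute joint rotating by q about z."""
    c, s = np.cos(q), np.sin(q)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1, 0],
                     [0,  0, 0, 1]])

def trans(x, y, z):
    """Homogeneous transform for a pure translation (a rigid link)."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def forward_kinematics(joint_angles, link_lengths):
    """Pose of the chain's tip relative to its base: alternate a revolute
    joint about z with a translation along the link."""
    T = np.eye(4)
    for q, L in zip(joint_angles, link_lengths):
        T = T @ rot_z(q) @ trans(L, 0, 0)
    return T
```

A real arm would substitute its own per-joint transforms (e.g., from Denavit-Hartenberg parameters), but the product structure is the same.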
Generally, the surgical instrument includes an interface portion connected to the end of a robotic arm, an instrument shaft, and an end effector at the distal end of the instrument shaft. For serpentine surgical instruments, a serpentine joint may also be provided between the instrument shaft and the end effector. In the present invention, the "pose of the surgical instrument in the display unit coordinate system" may refer to the pose of part or all of the surgical instrument in the display unit coordinate system: for example, the pose of the interface portion, of the instrument shaft, or of the whole instrument, or the poses of the interface portion, the instrument shaft, and the serpentine joint (if any). Depending on surgical requirements, the operator may take into account the type of the end effector of the surgical instrument (e.g., whether it could cause serious injury to the human body) and its length (e.g., whether it is negligible). If the pose of the end effector or of the serpentine joint in the display unit coordinate system needs to be considered, then, in addition to acquiring the motion states of all joints between the sensing unit and the instrument shaft, the motion states of the serpentine joint and of all joints in the end effector are further acquired, and the corresponding pose in the display unit coordinate system is determined according to a kinematic model. The same applies to endoscopes, particularly serpentine endoscopes.
In a preferred embodiment, the first sensing unit 121a and the second sensing unit 121b are both selected from optical targets, and the sensing processing unit 122 is selected from optical trackers.
The following is specifically explained with reference to fig. 2 and 3:
the sensing processing unit 122 measures the poses of the first sensing unit 121a and the second sensing unit 121b, obtaining their poses in the sensing processing unit 122 coordinate system {S}, denoted ${}^{S}T_{s1}$ and ${}^{S}T_{s2}$. The coordinate calculation unit 123 receives these poses from the sensing processing unit 122 and, according to the preset pose mapping relationship ${}^{s1}T_{ins}$ between the first sensing unit 121a and the distal end of the surgical instrument 150 and the pose mapping relationship ${}^{s2}T_{endo}$ between the second sensing unit 121b and the distal end of the endoscope 160, obtains the pose of the distal end of the surgical instrument 150 in the sensing processing unit 122 coordinate system, ${}^{S}T_{ins} = {}^{S}T_{s1}\,{}^{s1}T_{ins}$, and the pose of the distal end of the endoscope 160, ${}^{S}T_{endo} = {}^{S}T_{s2}\,{}^{s2}T_{endo}$. The poses ${}^{S}T_{ins}$ and ${}^{S}T_{endo}$ are then transformed into the coordinate system of the display unit 140. Since the display unit 140 is disposed on the video cart in this embodiment, its pose in the global coordinate system remains unchanged and no matching with the human-eye coordinate system is required; therefore, ${}^{S}T_{ins}$ and ${}^{S}T_{endo}$ can be used directly as the poses of the surgical instrument 150 and the endoscope 160 in the display unit 140 coordinate system, ${}^{D}T_{ins}$ and ${}^{D}T_{endo}$.
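The pose composition described above can be illustrated with 4x4 homogeneous transforms. The following is a minimal sketch, not part of the disclosed embodiment; all numerical values and names are hypothetical:

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical measured pose of the first sensing unit in the tracker frame {S}
S_T_s1 = make_pose(np.eye(3), np.array([0.10, 0.20, 0.50]))

# Hypothetical fixed mapping from the sensing unit to the instrument distal end
s1_T_ins = make_pose(np.eye(3), np.array([0.0, 0.0, 0.35]))

# Pose of the instrument distal end in {S}: composition of the two transforms
S_T_ins = S_T_s1 @ s1_T_ins
```

The same composition applies to the endoscope chain with the second sensing unit.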
In an alternative embodiment, the first sensing unit 121a and the second sensing unit 121b may also be selected from magnetic field sensors, and the sensing processing unit 122 is selected from magnetic field generators. Preferably, the first sensing unit 121a is disposed at a proximal end of the first robotic arm 100 close to the surgical instrument 150, and the second sensing unit 121b is disposed at a proximal end of the second robotic arm 110 close to the endoscope 160.
In another alternative embodiment, the first sensing unit 121a and the second sensing unit 121b are position sensors, such as encoders (code discs), for sensing the motion state of each joint of the mechanical arm and the surgical instrument (for a rotational joint, the motion state is a rotation angle; for a translational joint, a displacement). The coordinate calculation unit 123 then obtains the poses of the distal end of the surgical instrument 150 and the distal end of the endoscope 160 in the global coordinate system based on the robot arm kinematic model.
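Obtaining a distal-end pose from joint readings, as in this alternative embodiment, amounts to forward kinematics: chaining one transform per joint. A simplified sketch (revolute joints about z only, hypothetical link geometry; a real arm would use its full kinematic model):

```python
import numpy as np

def rot_z(theta):
    """Homogeneous transform for a rotation of theta about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[0, 0], T[0, 1], T[1, 0], T[1, 1] = c, -s, s, c
    return T

def translate(x, y, z):
    """Homogeneous transform for a pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def forward_kinematics(joint_angles, link_offsets):
    """Chain revolute joints (each read from an encoder) with fixed link offsets."""
    T = np.eye(4)
    for theta, link in zip(joint_angles, link_offsets):
        T = T @ rot_z(theta) @ link
    return T

# Two-link planar example: each link extends 0.3 m along its local x axis
links = [translate(0.3, 0.0, 0.0), translate(0.3, 0.0, 0.0)]
tip = forward_kinematics([0.0, np.pi / 2], links)
```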
Then, the image processing unit 130 receives the poses ${}^{D}T_{ins}$ and ${}^{D}T_{endo}$ of the surgical instrument 150 and the endoscope 160 in the display unit 140 coordinate system and, together with the endoscope field of view, obtains an image containing the relative pose relationship information of the surgical instrument and the endoscope field of view. Specifically, the image processing unit 130 determines the pose of the field of view of the endoscope 160 from ${}^{D}T_{endo}$, and hence the pose of the virtual model of the surgical instrument 150 and the pose of the virtual model of the field of view of the endoscope 160, forming an image containing the relative pose relationship information of the surgical instrument and the endoscope field of view. The display unit 140 displays this image. Further, the image processing unit 130 sets the image on one display frame; the image formed by the image processing unit 130 may include one or more display frames, and each display frame may contain a different image, one of which contains the relative pose relationship information between the surgical instrument and the endoscope field of view. When the surgical instrument 150 needs to be adjusted so that it enters the field of view of the endoscope 160, the adjustment can be performed according to the image containing the relative pose relationship information displayed on the display unit 140, so that the surgical instrument 150 can be conveniently and accurately guided into the field of view of the endoscope 160.
In this embodiment, during the surgical procedure, when the surgical instrument is located outside the field of view of the endoscope, the main operating doctor cannot continue the operation. To ensure the normal and safe progress of the operation, the nurse needs to adjust the surgical instrument into the field of view of the endoscope. The field of view that the endoscope can actually observe is a conical three-dimensional region, i.e., the surgical instrument needs to be adjusted into the range of this conical region. If the nurse is only concerned with whether the surgical instrument is within the endoscope field of view, the image containing the relative pose relationship information may provide only the relative positional relationship between the distal end of the surgical instrument and the field of view of the endoscope. In that case, the virtual display model of the surgical instrument 150 may be a scaled or non-scaled model of the entire surgical instrument 150, or another identifier or character representing the distal end of the surgical instrument 150, as long as the relative positional relationship between the distal end of the surgical instrument 150 and the field of view of the endoscope 160 is clearly conveyed; the virtual model of the field of view of the endoscope 160 may be represented by a conical or frustum-shaped graphic. If the nurse also needs to be concerned with the path along which the surgical instrument moves into the field of view of the endoscope (e.g., from what position and at what angle it enters), the image needs to provide the relative pose (position and orientation) relationship between the distal end of the surgical instrument and the field of view of the endoscope, and the virtual display model of the surgical instrument 150 needs to be a scaled or non-scaled model of the entire surgical instrument 150.
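The "is the instrument inside the conical field of view" question can be answered geometrically. A sketch under assumed parameters (the half-angle, depth limit, and all coordinates are hypothetical, not values from the disclosure):

```python
import numpy as np

def in_endoscope_fov(tip_pos, cam_pos, cam_axis, half_angle_rad, max_depth):
    """Check whether a point lies inside a conical field of view.

    cam_axis is a unit vector along the endoscope viewing direction.
    """
    v = np.asarray(tip_pos, float) - np.asarray(cam_pos, float)
    depth = np.dot(v, cam_axis)          # distance along the viewing axis
    if depth <= 0 or depth > max_depth:  # behind the lens or beyond visible depth
        return False
    radial = np.linalg.norm(v - depth * cam_axis)
    return bool(radial <= depth * np.tan(half_angle_rad))

# Instrument tip 5 cm straight ahead of the lens, assumed 35-degree half-angle cone
inside = in_endoscope_fov([0.0, 0.0, 0.05], [0.0, 0.0, 0.0],
                          np.array([0.0, 0.0, 1.0]), np.radians(35), 0.15)
```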
[ example two ]
The main difference between the second embodiment and the first embodiment is that, in the second embodiment, the display unit moves along with the operator's eyes: the positions of the display unit and of the operator's eyes are matched in the global coordinate system, and their motion variations are consistent. For example, the display unit is a head-mounted display. Therefore, when the image processing unit forms the image of the relative pose relationship between the surgical instrument and the field of view of the endoscope from their respective poses in the display unit coordinate system and from the endoscope's field of view, the image takes into account the view of the wearer of the movable display, i.e., matching with the human-eye coordinate system is performed as well. The structure and function of the surgical robot system in the second embodiment are described in detail below; for parts not described, reference may be made to the first embodiment.
Specifically, referring to fig. 4 and 5, the surgical robot system includes: a first robot arm 200, a second robot arm 210, a pose acquisition unit 220, an image processing unit 230, and a display unit 240; wherein the first mechanical arm 200 is provided with a surgical instrument 250; an endoscope 260 is hung on the second robot arm 210; the pose acquisition unit 220 is configured to acquire the pose or position of the surgical instrument 250 in the coordinate system of the display unit 240 and the pose of the endoscope 260 in the coordinate system of the display unit 240; the image processing unit 230 is configured to obtain an image containing relative position or pose relationship information between the surgical instrument 250 and the endoscope viewing field according to the pose or position of the surgical instrument 250 in the display unit 240 coordinate system, the pose of the endoscope 260 in the display unit 240 coordinate system, and the viewing field of the endoscope 260; and the display unit 240 displays the image formed by the image processing unit 230.
In the embodiment of the present application, the pose acquisition unit 220 includes: a first sensing unit 221a, wherein the first sensing unit 221a is disposed on the first robot arm 200; a second sensing unit 221b, the second sensing unit 221b being disposed on the second robot arm 210; a third sensing unit 221c, wherein the third sensing unit 221c is disposed on the display unit 240; a sensing processing unit 222, wherein the sensing processing unit 222 acquires the poses of the first sensing unit 221a, the second sensing unit 221b, and the third sensing unit 221c in the coordinate system of the sensing processing unit 222; a coordinate calculation unit 223, wherein the coordinate calculation unit 223 obtains the respective poses of the distal end of the surgical instrument 250 and the distal end of the endoscope 260 in the coordinate system of the display unit 240 according to the respective poses of the first sensing unit 221a, the second sensing unit 221b and the third sensing unit 221c in the coordinate system of the sensing processing unit 222 and the pose mapping relationship of the first sensing unit 221a and the distal end of the surgical instrument 250, the pose mapping relationship of the second sensing unit 221b and the distal end of the endoscope 260, and the pose mapping relationship of the third sensing unit 221c and the display unit 240.
Wherein the first sensing unit 221a, the second sensing unit 221b and the third sensing unit 221c are all selected from magnetic field sensors, and the sensing processing unit 222 is selected from a magnetic field generator. Preferably, the first sensing unit 221a is disposed at an end of the first robot arm 200 close to the surgical instrument 250, the second sensing unit 221b is disposed at an end of the second robot arm 210 close to the endoscope 260, and the third sensing unit 221c may be disposed on a frame of a head-mounted glasses (display unit 240).
In the embodiment of the present application, the sensing processing unit 222 measures the poses of the first sensing unit 221a, the second sensing unit 221b, and the third sensing unit 221c, obtaining their poses in the sensing processing unit 222 coordinate system {S}, denoted ${}^{S}T_{s1}$, ${}^{S}T_{s2}$, and ${}^{S}T_{s3}$. The coordinate calculation unit 223 receives these poses from the sensing processing unit 222 and, according to the preset pose mapping relationship ${}^{s1}T_{ins}$ between the first sensing unit 221a and the distal end of the surgical instrument 250, the pose mapping relationship ${}^{s2}T_{endo}$ between the second sensing unit 221b and the distal end of the endoscope 260, and the pose mapping relationship ${}^{s3}T_{disp}$ between the third sensing unit 221c and the display unit 240, obtains the pose of the surgical instrument 250 in the sensing processing unit 222 coordinate system, ${}^{S}T_{ins} = {}^{S}T_{s1}\,{}^{s1}T_{ins}$, the pose of the endoscope 260, ${}^{S}T_{endo} = {}^{S}T_{s2}\,{}^{s2}T_{endo}$, and the pose of the display unit 240, ${}^{S}T_{disp} = {}^{S}T_{s3}\,{}^{s3}T_{disp}$. The poses of the surgical instrument 250 and the endoscope 260 are then transformed into the display unit 240 coordinate system, yielding ${}^{D}T_{ins} = ({}^{S}T_{disp})^{-1}\,{}^{S}T_{ins}$ and ${}^{D}T_{endo} = ({}^{S}T_{disp})^{-1}\,{}^{S}T_{endo}$.
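Transforming a tracker-frame pose into the display frame, as done here for the head-mounted display, is a rigid-transform inversion followed by a composition. A minimal sketch with hypothetical poses:

```python
import numpy as np

def invert_pose(T):
    """Invert a rigid homogeneous transform using R^T, without a general inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# Hypothetical poses measured in the tracker frame {S}
S_T_disp = np.eye(4)
S_T_disp[:3, 3] = [0.0, 0.0, 1.0]   # display unit 1 m along z
S_T_ins = np.eye(4)
S_T_ins[:3, 3] = [0.2, 0.0, 1.0]    # instrument distal end

# Pose of the instrument in the display frame {D}
D_T_ins = invert_pose(S_T_disp) @ S_T_ins
```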
In this embodiment, the display unit 240 is selected from a pair of head-mounted glasses, and the pose of the head-mounted glasses can be finally obtained through the third sensing unit 221c, so that the pose can be matched with the coordinate system of the human eye, and the adjustment of the surgical instrument 250 by the nurse is further facilitated.
Then, likewise, the image processing unit 230 receives the poses ${}^{D}T_{ins}$ and ${}^{D}T_{endo}$ of the surgical instrument 250 and the endoscope 260 in the display unit 240 coordinate system and, according to these poses and the preset field of view of the endoscope 260, determines the poses of the virtual display model of the surgical instrument 250 and of the virtual display model of the field of view of the endoscope 260, forming an image containing the relative pose relationship information of the surgical instrument and the endoscope field of view. The display unit 240 displays this image. Further, the image processing unit 230 sets the image on one display frame, and the display unit 240 displays the image formed by the image processing unit 230. When the surgical instrument 250 needs to be adjusted so that it enters the field of view of the endoscope 260, the adjustment can be performed according to the image containing the relative pose relationship information displayed on the display unit 240, so that the surgical instrument 250 can be conveniently and accurately guided into the field of view of the endoscope 260.
[ EXAMPLE III ]
The third embodiment is different from the first embodiment in that the surgical robot system further includes an imaging device, the imaging device is configured to model a body surface and an internal organ environment of a human body, the pose acquisition unit is further configured to acquire a pose of the internal organ environment of the human body in the coordinate system of the display unit, and the image obtained by the image processing unit further includes relative pose relationship information between the model of the internal organ environment of the human body and the surgical instrument.
Specifically, the surgical robot system includes: the system comprises a first mechanical arm, a second mechanical arm, a pose acquisition unit, an image processing unit and a display unit; wherein a surgical instrument is hung on the first mechanical arm; an endoscope is hung on the second mechanical arm. The pose acquisition unit comprises a first sensing unit, a second sensing unit, a sensing processing unit and a coordinate calculation unit. The first sensing unit is arranged on the first mechanical arm; the second sensing unit is arranged on the second mechanical arm; the sensing processing unit is used for acquiring the poses of the surgical instrument and the endoscope in a coordinate system of the sensing processing unit; the coordinate calculation unit is used for obtaining the pose of the surgical instrument in the display unit coordinate system according to the poses of the first sensing unit and the second sensing unit in the sensing processing unit coordinate system respectively, and the pose of the endoscope in the display unit coordinate system. The image processing unit is used for displaying the relative position or the pose relation of the surgical instrument and the view field of the endoscope according to the pose of the surgical instrument in the display unit coordinate system, the pose of the endoscope in the display unit coordinate system and the view field of the endoscope.
On this basis, in this embodiment, the surgical robot system further includes an imaging device, the imaging device is configured to model a body surface and an internal organ environment of a human body, the pose acquisition unit further includes a third sensing unit, the third sensing unit is configured to acquire a pose of the internal organ environment model of the human body in a display unit coordinate system, the image processing unit superimposes the internal organ environment model of the human body with a surgical instrument model and an endoscope view field model, a formed image includes not only relative pose relationship information between the surgical instrument and the endoscope view field, but also relative pose relationship information between the surgical instrument and the internal organ environment of the human body, and the display unit further displays an image formed by the image processing unit. Wherein the imaging device is selected from one or more of a CT device, an MRI device, and a thermal imaging device.
Further, the third sensing unit is, for example, an auxiliary positioning sensor. Before the operation, the imaging device acquires the environment model of the body surface and internal organs of the human body. During surgical preparation, auxiliary positioning sensors may be attached to the skin over distinctive bony landmarks of the patient, including but not limited to feature points such as the clavicle end, the acromion, the xiphoid process, and the thoracic vertebrae, and the sensing processing unit acquires the pose of each feature point determined by the auxiliary positioning sensors in the sensing processing unit coordinate system. The coordinate calculation unit then obtains the pose of the body surface and internal organ environment model in the sensing processing unit coordinate system by an image registration method, such as automatic feature-point registration with a point-cloud registration algorithm, or software registration combined with manual input, according to the positions of the feature points in the model, preferably in combination with physical data of the patient such as height, weight, and body fat percentage, and thereby obtains the pose of the internal organ environment model of the human body in the sensing processing unit coordinate system. The method for acquiring the positions of the feature points in the body surface and internal organ environment model is not limited: they may be marked manually, or obtained through a graphics algorithm according to the characteristics of each feature point.
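One common feature-point registration technique that fits the description above is the Kabsch/SVD method, which finds the rigid transform aligning the model landmarks to the measured landmarks. A sketch with hypothetical landmark coordinates (a pure translation is used so the expected result is obvious):

```python
import numpy as np

def rigid_register(model_pts, measured_pts):
    """Kabsch/SVD: rigid transform (R, t) mapping model points onto measured points."""
    P = np.asarray(model_pts, float)
    Q = np.asarray(measured_pts, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)            # cross-covariance of centered point sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])           # guard against a reflection solution
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Hypothetical landmark positions (clavicle end, acromion, xiphoid, vertebra)
model = np.array([[0.00, 0.00, 0.00],
                  [0.30, 0.00, 0.00],
                  [0.15, -0.20, 0.05],
                  [0.15, 0.10, -0.10]])
measured = model + np.array([0.5, 0.2, 1.0])   # same shape, translated
R, t = rigid_register(model, measured)
```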
Further, the coordinate calculation unit obtains the pose of the internal organ environment model of the human body in the display unit coordinate system from its pose in the sensing processing unit coordinate system. Again, since the display unit is provided on the video cart, its pose in the global coordinate system remains unchanged, so matching with the human-eye coordinate system is not required: the poses of the surgical instrument, the endoscope, and the internal organ environment model in the sensing processing unit coordinate system are taken directly as their poses in the display unit coordinate system. The image processing unit forms an image containing the relative pose relationship information of the surgical instrument and the endoscope field of view as well as that of the surgical instrument and the internal organ environment of the human body, according to these poses and the field of view of the endoscope, so that the operator (nurse) perceives a completely transparent abdominal-cavity environment. As shown in fig. 6, not only the surgical instrument 350 and the endoscope 360 but also organs in the human body, such as the kidney 310 and the gallbladder 320, can be seen, which greatly enhances the safety of the surgical operation.
In the present embodiment, since the movement of the surgical instrument 350 within the model of the internal organ environment of the human body must be taken into consideration, the surgical instrument 350, the internal organ environment, and the field-of-view model of the endoscope 360 are displayed at equal scale, and the orientation of the surgical instrument 350 relative to the field of view of the endoscope 360 is also taken into account.
The third undescribed part of this embodiment can be referred to the description of the first embodiment, and the third embodiment is not described again.
In addition, in an alternative embodiment of the third embodiment, the display unit may move along with the operator's eyes, with matched position in the global coordinate system and consistent motion variation, for example a head-mounted display. In this case, the pose of the head-mounted display in the sensing processing unit coordinate system needs to be acquired in order to obtain the pose of the internal organ environment model of the human body in the display unit coordinate system. Specifically, the pose acquisition unit includes: the first sensing unit, disposed on the first mechanical arm; the second sensing unit, disposed on the second mechanical arm; a third sensing unit, disposed on the display unit; a fourth sensing unit, disposed at the feature points of the body surface of the human body; a sensing processing unit, which acquires the poses of the first, second, third, and fourth sensing units in the sensing processing unit coordinate system; and a coordinate calculation unit, which obtains the respective poses of the surgical instrument and the endoscope in the display unit coordinate system according to the poses of the first, second, and third sensing units in the sensing processing unit coordinate system, and the preset pose mapping relationships of the first sensing unit and the surgical instrument tip, of the second sensing unit and the endoscope tip, and of the third sensing unit and the display unit. The coordinate calculation unit further obtains the pose of the internal organ environment model of the human body in the sensing processing unit coordinate system through image matching, according to the pose of the fourth sensing unit in the sensing processing unit coordinate system and the positions of the feature points in the body surface and internal organ environment model, and obtains its pose in the display unit coordinate system according to the pose of the display unit in the sensing processing unit coordinate system. For the remaining details, reference may be made to the second embodiment.
In summary, the surgical robot system provided by the present invention includes: a first mechanical arm, a second mechanical arm, a pose acquisition unit, an image processing unit, and a display unit; wherein a surgical instrument is hung on the first mechanical arm; an endoscope is hung on the second mechanical arm; the pose acquisition unit is used for acquiring the pose of the surgical instrument in the display unit coordinate system and the pose of the endoscope in the display unit coordinate system; the image processing unit is used for obtaining an image containing relative pose relationship information of the surgical instrument and the endoscope field of view according to the pose of the surgical instrument in the display unit coordinate system, the pose of the endoscope in the display unit coordinate system, and the endoscope field of view; and the display unit displays the image formed by the image processing unit. Thus, when the surgical instrument needs to be adjusted, the adjustment can be performed according to the image displayed on the display unit, so that the surgical instrument can be conveniently and accurately adjusted into the field of view of the endoscope.
The above description is only for the purpose of describing the preferred embodiments of the present invention, and is not intended to limit the scope of the present invention, and any variations and modifications made by those skilled in the art based on the above disclosure are within the scope of the appended claims.

Claims (7)

1. A surgical robotic system, comprising: a first mechanical arm, a second mechanical arm, a pose acquisition unit, an image processing unit, and a display unit; wherein,
surgical instruments are hung on the first mechanical arm;
the endoscope is hung on the second mechanical arm;
the pose acquisition unit is used for acquiring the pose of the surgical instrument in the display unit coordinate system and the pose of the endoscope in the display unit coordinate system;
the image processing unit is used for obtaining an image containing relative position relation information of the surgical instrument and the endoscope view field according to the pose of the surgical instrument in the display unit coordinate system, the pose of the endoscope in the display unit coordinate system and the endoscope view field; and
the display unit displays the image formed by the image processing unit;
the display unit moves along with the movement of the eyes of the operator, the positions of the display unit and the eyes of the operator are matched in the global coordinate system, and the movement variation is consistent;
the pose acquisition unit includes:
the first sensing unit is arranged on the first mechanical arm;
the second sensing unit is arranged on the second mechanical arm;
a third sensing unit disposed on the display unit;
a sensing processing unit that acquires poses of the first sensing unit, the second sensing unit, and the third sensing unit each in the sensing processing unit coordinate system;
and the coordinate calculation unit is used for obtaining the poses of the surgical instrument and the endoscope in the coordinate system of the display unit according to the poses of the first sensing unit, the second sensing unit and the third sensing unit in the coordinate system of the sensing processing unit, and preset pose mapping relations of the first sensing unit and the surgical instrument tail end, pose mapping relations of the second sensing unit and the endoscope tail end and pose mapping relations of the third sensing unit and the display unit.
2. A surgical robotic system as claimed in claim 1, wherein the first sensing unit, the second sensing unit and the third sensing unit are each a magnetic field sensor and the sensing processing unit is a magnetic field generator; or,
the first sensing unit, the second sensing unit and the third sensing unit are each an optical target and the sensing processing unit is an optical tracker.
3. The surgical robotic system as claimed in claim 1, wherein the display unit is a head-mounted display.
4. A surgical robotic system as claimed in claim 1,
the surgical robot system further comprises an imaging device for modeling the body surface and internal organ environment of the human body;
the pose acquisition unit is also used for acquiring the pose of the internal organ environment of the human body in the coordinate system of the display unit;
and, according to the pose of the human internal organ environment in the display unit coordinate system, the image obtained by the image processing unit further comprises relative pose relation information between the human internal organ environment model and the surgical instrument.
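Once both the organ model and the instrument tip are expressed in the display unit coordinate system, the relative pose information of claim 4 is a single transform between the two. A sketch with hypothetical poses (the patent does not give numeric values):

```python
import numpy as np

def inv_pose(T):
    """Invert a rigid 4x4 homogeneous transform."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# Hypothetical poses already expressed in the display unit coordinate system:
T_disp_model = np.eye(4); T_disp_model[:3, 3] = [0.0, 0.0, 0.60]    # organ model origin
T_disp_tool  = np.eye(4); T_disp_tool[:3, 3]  = [0.02, -0.01, 0.55]  # instrument tip

# Relative pose of the instrument with respect to the organ model -- the quantity
# the image processing unit would render for the operator.
T_model_tool = inv_pose(T_disp_model) @ T_disp_tool
offset = T_model_tool[:3, 3]
distance = np.linalg.norm(offset)
print(f"tip offset in model frame: {np.round(offset, 3)}, distance: {distance:.3f} m")
```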
5. A surgical robotic system as claimed in claim 4, wherein the imaging device is selected from one or more of a CT device, an MRI device and a thermal imaging device.
6. A surgical robotic system as claimed in claim 4,
the pose acquisition unit includes:
a fourth sensing unit arranged at feature points on the body surface of the human body;
the sensing processing unit acquires the pose of the fourth sensing unit under the sensing processing unit coordinate system;
the coordinate calculation unit further obtains, through image matching, the pose of the human internal organ environment model in the sensing processing unit coordinate system, according to the pose of the fourth sensing unit in the sensing processing unit coordinate system and the positions of the feature points in the model of the human body surface and internal organ environment; it then obtains the pose of the human internal organ environment model in the display unit coordinate system according to the pose of the display unit in the sensing processing unit coordinate system.
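Claim 6 registers the preoperative model to the tracker frame by matching the measured body-surface feature points to the same points in the model. The patent does not name the matching algorithm; one standard choice for point-based rigid registration is the Kabsch/SVD method, sketched below with made-up feature points.

```python
import numpy as np

def rigid_register(P, Q):
    """Kabsch algorithm: find R, t minimizing ||R @ P + t - Q|| for paired 3xN point sets.
    P: feature points in the preoperative model frame; Q: the same points as measured
    by the fourth sensing units in the sensing-processing-unit (tracker) frame."""
    cP, cQ = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - cP) @ (Q - cQ).T               # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Hypothetical body-surface feature points in the model frame (metres), non-coplanar:
P = np.array([[0.0, 0.1, 0.0, 0.05],
              [0.0, 0.0, 0.1, 0.05],
              [0.0, 0.0, 0.0, 0.08]])
# Same points as seen by the tracker: rotated 90 degrees about z and translated.
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([[0.2], [0.3], [0.1]])
Q = Rz @ P + t_true

R, t = rigid_register(P, Q)
print(np.allclose(R, Rz), np.allclose(t, t_true))  # recovered pose matches
```

With the model registered in the tracker frame, composing with the display unit's tracker-frame pose gives the model in the display unit coordinate system, as the claim describes.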
7. A surgical robotic system as claimed in any one of claims 1 to 6, further comprising a master manipulator and a slave manipulator, wherein the first mechanical arm and the second mechanical arm are located on the slave manipulator, and the master manipulator can control the movement of the first mechanical arm and the second mechanical arm.
CN201811497442.6A 2018-12-07 2018-12-07 Surgical robot system Active CN109288591B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811497442.6A CN109288591B (en) 2018-12-07 2018-12-07 Surgical robot system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811497442.6A CN109288591B (en) 2018-12-07 2018-12-07 Surgical robot system

Publications (2)

Publication Number Publication Date
CN109288591A CN109288591A (en) 2019-02-01
CN109288591B true CN109288591B (en) 2021-12-03

Family

ID=65142527

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811497442.6A Active CN109288591B (en) 2018-12-07 2018-12-07 Surgical robot system

Country Status (1)

Country Link
CN (1) CN109288591B (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110101455B (en) * 2019-04-30 2021-01-01 微创(上海)医疗机器人有限公司 Display device and surgical robot
CN110464471B (en) * 2019-09-10 2020-12-01 深圳市精锋医疗科技有限公司 Surgical robot and control method and control device for tail end instrument of surgical robot
CN111276032A (en) * 2020-02-29 2020-06-12 中山大学中山眼科中心 Virtual operation training system
US20230256607A1 (en) * 2020-08-19 2023-08-17 Beijing Surgerii Robotics Company Limited Robot system and control method thereof
CN112353361B (en) * 2020-09-21 2023-07-25 常州市速瑞医疗科技有限公司 3D pleuroperitoneal cavity system based on master-slave integrated intelligent mirror supporting robot
CN112043397B (en) * 2020-10-08 2021-09-24 深圳市精锋医疗科技有限公司 Surgical robot and motion error detection method and detection device thereof
CN112155736B (en) * 2020-10-12 2021-09-10 德智鸿(上海)机器人有限责任公司 Double-arm surgical robot
CN112245011B (en) * 2020-10-23 2022-02-01 上海微创医疗机器人(集团)股份有限公司 Surgical robot system, adjustment method, storage medium, and terminal
CN112472297B (en) * 2020-11-26 2022-03-29 上海微创医疗机器人(集团)股份有限公司 Pose monitoring system, pose monitoring method, surgical robot system and storage medium
CN112587244A (en) * 2020-12-15 2021-04-02 深圳市精锋医疗科技有限公司 Surgical robot and control method and control device thereof
CN113143461B (en) * 2021-01-26 2023-08-01 合肥工业大学 Man-machine cooperative minimally invasive endoscope holding robot system
CN113014871B (en) * 2021-02-20 2023-11-10 青岛小鸟看看科技有限公司 Endoscopic image display method and device and endoscopic surgery auxiliary system
CN113384347B (en) * 2021-06-16 2022-07-08 瑞龙诺赋(上海)医疗科技有限公司 Robot calibration method, device, equipment and storage medium
CN113633387B (en) * 2021-06-21 2024-01-26 安徽理工大学 Surgical field tracking supporting laparoscopic minimally invasive robot touch interaction method and system
CN114099005B (en) * 2021-11-24 2023-09-15 重庆金山医疗机器人有限公司 Method for judging whether instrument is in visual field or is shielded or not and energy display method
CN113876427B (en) * 2021-12-03 2022-03-08 南京利昂医疗设备制造有限公司 Detection device and method for intracavity clamp
CN114564050A (en) * 2022-03-03 2022-05-31 瑞龙诺赋(上海)医疗科技有限公司 Operation platform positioning system, pose information determination method and device
CN117671012B (en) * 2024-01-31 2024-04-30 临沂大学 Method, device and equipment for calculating absolute and relative pose of endoscope in operation

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9789608B2 (en) * 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
EP2925250B1 (en) * 2012-11-30 2017-07-26 Olympus Corporation Operation support system
EP2988696A4 (en) * 2013-04-25 2016-11-30 Intuitive Surgical Operations Surgical equipment control input visualization field
JP6257371B2 (en) * 2014-02-21 2018-01-10 オリンパス株式会社 Endoscope system and method for operating endoscope system
EP3125807B1 (en) * 2014-03-28 2022-05-04 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging of surgical scenes from multiport perspectives
CN105232155B (en) * 2015-09-08 2018-11-09 微创(上海)医疗机器人有限公司 Operating robot adjusts system

Also Published As

Publication number Publication date
CN109288591A (en) 2019-02-01

Similar Documents

Publication Publication Date Title
CN109288591B (en) Surgical robot system
US20230215007A1 (en) Systems and methods for using registered fluoroscopic images in image-guided surgery
CN107049492B (en) Surgical robot system and method for displaying position of surgical instrument
KR102501099B1 (en) Systems and methods for rendering on-screen identification of instruments in teleoperated medical systems
WO2018159338A1 (en) Medical support arm system and control device
EP3395282B1 (en) Endoscopic view of invasive procedures in narrow passages
WO2018129532A1 (en) Systems and methods for registering elongate devices to three dimensional images in image-guided procedures
KR20140112207A (en) Augmented reality imaging display system and surgical robot system comprising the same
KR20170127561A (en) System and method for on-screen identification of instruments in a remotely operated medical system
Breedveld et al. Theoretical background and conceptual solution for depth perception and eye-hand coordination problems in laparoscopic surgery
CN106333715B (en) Laparoscopic surgical system
CN113645919A (en) Medical arm system, control device, and control method
US11992283B2 (en) Systems and methods for controlling tool with articulatable distal portion
Noonan et al. Gaze contingent articulated robot control for robot assisted minimally invasive surgery
CN110169821B (en) Image processing method, device and system
CN113180828A (en) Operation robot constrained motion control method based on rotation theory
CN114727848A (en) Visualization system and method for ENT procedures
CN112641514A (en) Minimally invasive interventional navigation system and method
US20210259776A1 (en) Hybrid simulation model for simulating medical procedures
CN114795495A (en) Master-slave operation minimally invasive surgery robot system
CN116829091A (en) Surgical assistance system and presentation method
Looi et al. KidsArm—An image-guided pediatric anastomosis robot
Mylonas et al. Gaze contingent depth recovery and motion stabilisation for minimally invasive robotic surgery
Breedveld Observation, manipulation, and eye-hand coordination problems in minimally invasive surgery
CN113081273B (en) Punching auxiliary system and surgical robot system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Room 101, block B, building 1, No. 1601, Zhangdong Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai, 201203

Applicant after: Shanghai MicroPort MedBot (Group) Co., Ltd.

Address before: 201203, 501, Newton Road, Zhangjiang hi tech park, Shanghai, Pudong New Area

Applicant before: Microport (Shanghai) Medbot Co.,Ltd.

GR01 Patent grant