CN116421325A - Surgical robot cooperation system and calibration method thereof - Google Patents


Info

Publication number
CN116421325A
CN116421325A (application CN202310470692.5A)
Authority
CN
China
Prior art keywords
surgical robot
relative
pose
data
surgical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310470692.5A
Other languages
Chinese (zh)
Inventor
Name withheld at the inventor's request
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yida Medical Beijing Health Technology Co ltd
Original Assignee
Yida Medical Beijing Health Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yida Medical Beijing Health Technology Co ltd filed Critical Yida Medical Beijing Health Technology Co ltd
Priority to CN202310470692.5A
Publication of CN116421325A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES › A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE › A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
        • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
        • A61B 34/30: Surgical robots
        • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
        • A61B 2034/2046: Tracking techniques
        • A61B 2034/2065: Tracking using image or pattern recognition
        • A61B 2034/2068: Tracking using pointers, e.g. pointers having reference marks for determining coordinates of body points
        • A61B 2034/2072: Reference field transducer attached to an instrument or patient
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS › Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
        • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
        • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The specification relates to the technical field of medical robots, and in particular discloses a surgical robot collaboration system and a calibration method thereof. The method comprises: acquiring first image data and second image data captured by a binocular camera, the first image data containing the first identification feature and the second image data containing the second identification feature; determining first relative pose data of the first identification feature with respect to the binocular camera based on the first image data, and second relative pose data of the second identification feature with respect to the binocular camera based on the second image data; and calibrating the surgical robot collaboration system according to the first relative pose data and the second relative pose data. This scheme enables calibration of a surgical robot collaboration system comprising a plurality of surgical robots.

Description

Surgical robot cooperation system and calibration method thereof
Technical Field
The specification relates to the technical field of medical robots, in particular to a surgical robot cooperation system and a calibration method thereof.
Background
Mainstream laparoscopic robots contain three to four robotic arms with active fixed points (ARCM). In surgery, a first arm typically holds an endoscope, second and third arms hold surgical instruments, and a fourth arm, if present, assists with the task. In some types of surgery, efficiency may be improved by adding a fifth arm (a second auxiliary arm). Because of its structure, an ARCM robotic arm occupies a large amount of space near the operating area, and when there are too many arms, the working range of each arm shrinks because of the risk of mutual collision.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiment of the specification provides a surgical robot cooperation system and a calibration method thereof, which are used to solve the prior-art problem that, when the number of robotic arms is too large, the working range of each arm is reduced by the risk of mutual collision.
The embodiment of the specification provides a calibration method of a surgical robot cooperation system, wherein the surgical robot cooperation system comprises a first surgical robot and a second surgical robot; the first surgical robot is provided with a first identification feature, and the second surgical robot is provided with a second identification feature; the method comprises the following steps:
acquiring first image data and second image data captured by a binocular camera, the first image data containing the first identification feature and the second image data containing the second identification feature;
determining first relative pose data of the first identification feature with respect to the binocular camera based on the first image data, and determining second relative pose data of the second identification feature with respect to the binocular camera based on the second image data;
and calibrating the surgical robot collaboration system according to the first relative pose data and the second relative pose data.
In one embodiment, calibrating the surgical robotic collaboration system based on the first relative pose data and the second relative pose data includes:
calculating a first transformation matrix of the first surgical robot relative to the binocular camera according to the first relative pose data, and a second transformation matrix of the binocular camera relative to the second surgical robot according to the second relative pose data;
and calculating a target pose transformation matrix between the first surgical robot and the second surgical robot based on the first transformation matrix and the second transformation matrix, so as to calibrate the surgical robot collaboration system.
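The chaining of the two matrices can be sketched numerically. The snippet below is an illustrative example only (the rotation angles, translations, and variable names are hypothetical, not taken from the patent): it builds a first transformation matrix (first robot to camera) and a second transformation matrix (camera to second robot) as 4×4 homogeneous transforms and multiplies them to obtain the target pose transformation matrix between the two robots.

```python
import numpy as np

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(theta):
    """Rotation about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical calibration results:
# T_a_cam: pose of the camera expressed in robot A's frame (first transformation matrix)
# T_cam_b: pose of robot B expressed in the camera frame (second transformation matrix)
T_a_cam = make_transform(rot_z(np.pi / 2), np.array([1.0, 0.0, 0.5]))
T_cam_b = make_transform(rot_z(-np.pi / 2), np.array([0.0, 2.0, -0.5]))

# Target pose transformation between the two robots: chain through the camera frame.
T_a_b = T_a_cam @ T_cam_b
```

With these example numbers the two rotations cancel, so the target matrix reduces to a pure translation between the robot bases.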
In one embodiment, calibrating the surgical robotic collaboration system based on the first relative pose data and the second relative pose data includes:
determining first pose information of the first surgical robot relative to the second surgical robot based on the first relative pose data and the second relative pose data;
and calculating a target pose transformation matrix between the first surgical robot and the second surgical robot according to the first pose information of the first surgical robot relative to the second surgical robot, so as to calibrate the surgical robot collaboration system.
In one embodiment, the base of the first surgical robot is rigidly connected to the operating table and the base of the second surgical robot is rigidly connected to the operating table;
correspondingly, calculating a target pose transformation matrix between the first surgical robot and the second surgical robot according to the first pose information comprises:
determining second pose information of the first surgical robot relative to the second surgical robot according to the rigid-connection relation data between the base of the first surgical robot and the operating table and between the base of the second surgical robot and the operating table;
fusing the first pose information and the second pose information to obtain fused pose information;
and calculating a target pose transformation matrix between the first surgical robot and the second surgical robot according to the fused pose information.
In one embodiment, fusing the first pose information and the second pose information to obtain fused pose information comprises:
performing noise reduction on the first pose information and the second pose information to obtain noise-reduced first pose information and noise-reduced second pose information;
fusing the noise-reduced first pose information and the noise-reduced second pose information to obtain fused pose information, wherein the fusion comprises at least one of: weighted averaging, Kalman filtering, multi-Bayesian estimation, and neural network processing.
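As one concrete instance of the averaging option listed above, the sketch below fuses two position estimates by inverse-variance weighting, which is the static, single-step special case of a Kalman update. All positions and noise variances are hypothetical, purely for illustration.

```python
import numpy as np

def fuse_inverse_variance(x1, var1, x2, var2):
    """Fuse two independent estimates of the same quantity by weighting each
    with the inverse of its variance; returns the fused value and its variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * x1 + w2 * x2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Hypothetical example: two estimates of robot A's position relative to robot B,
# one from the binocular camera (noisier), one from the bed-mounted sensors.
p_camera = np.array([0.500, 1.200, 0.300])   # metres
p_sensor = np.array([0.504, 1.196, 0.302])   # metres
p_fused, var_fused = fuse_inverse_variance(p_camera, 4e-6, p_sensor, 1e-6)
```

Because the sensor estimate is assumed four times less noisy here, it receives 80% of the weight, and the fused variance is smaller than either input variance.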
The embodiment of the specification provides a calibration device of a surgical robot cooperation system, wherein the surgical robot cooperation system comprises a first surgical robot and a second surgical robot; the first surgical robot is provided with a first identification feature, and the second surgical robot is provided with a second identification feature; the device comprises:
The acquisition module is used for acquiring the first image data and the second image data captured by the binocular camera; the first image data contains the first identification feature and the second image data contains the second identification feature;
a determining module for determining first relative pose data of the first identification feature with respect to the binocular camera based on the first image data; determining second relative pose data of the second identification feature relative to the binocular camera based on the second image data;
and the calibration module is used for calibrating the surgical robot collaboration system according to the first relative pose data and the second relative pose data.
The embodiment of the specification also provides a surgical robot cooperation system, which comprises:
a first surgical robot having a first identification feature disposed thereon;
a second surgical robot having a second identification feature disposed thereon;
a binocular camera for acquiring first image data containing the first identification feature and second image data containing the second identification feature;
The controller is used for acquiring the first image data and the second image data captured by the binocular camera; for determining first relative pose data of the first identification feature with respect to the binocular camera based on the first image data; for determining second relative pose data of the second identification feature with respect to the binocular camera based on the second image data; and for calibrating the surgical robot collaboration system according to the first relative pose data and the second relative pose data.
In one embodiment, the robotic arm of the first surgical robot is a robotic arm with a passive fixed point; the robotic arm of the second surgical robot is a robotic arm with an active fixed point.
In one embodiment, the controller includes a first control terminal for controlling operation of the robotic arm of the first surgical robot and a second control terminal for controlling operation of the robotic arm of the second surgical robot;
the controller also comprises an interactive selection user interface for selecting the left-hand arm, the right-hand arm, or the endoscope-holding robotic arm; the interactive selection user interface is also used for displaying the installation and usage status of the surgical instrument, obtained by automatically reading the information in its built-in chip.
In one embodiment, the base of the first surgical robot is rigidly connected to the operating table and the base of the second surgical robot is rigidly connected to the operating table;
the controller is configured to determine first pose information of the first surgical robot relative to the second surgical robot based on the first relative pose data and the second relative pose data; to determine second pose information of the first surgical robot relative to the second surgical robot according to the rigid-connection relation data between the base of the first surgical robot and the operating table and between the base of the second surgical robot and the operating table; and to fuse the first pose information and the second pose information into fused pose information and calculate a target pose transformation matrix between the first surgical robot and the second surgical robot from the fused pose information.
In one embodiment, the first identifying feature comprises a checkerboard fixedly connected to the robotic arm of the first surgical robot.
In one embodiment, the first identifying feature comprises a plurality of non-repeating patterns formed on the end of the robotic arm of the first surgical robot by a preset surface treatment process.
The embodiment of the specification also provides medical equipment, which comprises a processor and a memory for storing instructions executable by the processor, wherein the processor realizes the steps of the surgical robot collaboration system calibration method in any embodiment when executing the instructions.
The embodiments of the present disclosure also provide a computer readable storage medium having stored thereon computer instructions that, when executed, implement the steps of the surgical robot collaboration system calibration method described in any of the embodiments above.
In this embodiment of the present disclosure, a calibration method for a surgical robot collaboration system is provided. The surgical robot collaboration system includes a first surgical robot and a second surgical robot; a first identification feature is provided on the first surgical robot and a second identification feature on the second surgical robot. First image data and second image data captured by a binocular camera may be acquired, the first image data containing the first identification feature and the second image data containing the second identification feature. First relative pose data of the first identification feature with respect to the binocular camera may be determined from the first image data, and second relative pose data of the second identification feature with respect to the binocular camera from the second image data; the surgical robot collaboration system may then be calibrated according to the first relative pose data and the second relative pose data. With this scheme, when more robotic arms are needed for an operation, the first surgical robot and the second surgical robot can cooperate. Each may have at least one robotic arm, and by cooperating, multiple surgical robots can complete extremely complex procedures and improve surgical efficiency, while avoiding the mutual interference and restricted motion that result from adding more arms to a single robot.
In addition, in the above scheme, identification features are provided on the first surgical robot and the second surgical robot. After image data are acquired by the binocular camera, the first relative pose data of the first surgical robot relative to the binocular camera and the second relative pose data of the second surgical robot relative to the binocular camera are obtained, and calibration of the surgical robot collaboration system can then be achieved based on these data, so that the first surgical robot and the second surgical robot operate in the same coordinate system and can cooperatively execute the surgical procedure.
Drawings
The accompanying drawings are included to provide a further understanding of the specification, and are incorporated in and constitute a part of this specification. In the drawings:
FIG. 1 is a schematic view of an application scenario of a surgical robot collaboration system in an embodiment of the present specification;
FIG. 2 is a schematic view of an application scenario of a surgical robot collaboration system calibration method in an embodiment of the present specification;
FIG. 3 is a schematic structural view of a doctor control device in a surgical robot collaboration system in an embodiment of the present specification;
FIG. 4 is a schematic structural view of a surgical robot collaboration system in an embodiment of the present specification;
FIG. 5 is a schematic structural view of a first surgical robot in an embodiment of the present specification;
FIG. 6 is a schematic structural view of a second surgical robot in an embodiment of the present specification;
FIG. 7 is a schematic view of calibration using a binocular camera in an embodiment of the present specification;
FIG. 8 is a schematic view of a checkerboard in an embodiment of the present specification;
FIG. 9 is a schematic view of a surgical instrument in a surgical field in an embodiment of the present specification;
FIG. 10 is a schematic view of a scenario in which poses are identified by sensors in an embodiment of the present specification;
FIG. 11 is a schematic view of a scenario in which a hybrid calibration method is used in an embodiment of the present specification;
FIG. 12 is a schematic view of the visual interaction flow of a surgical robot collaboration system in an embodiment of the present specification;
FIG. 13 is a schematic view of the sound interaction flow of a surgical robot collaboration system in an embodiment of the present specification;
FIG. 14 is a block diagram of the structure of a controller module in an embodiment of the present specification;
FIG. 15 is a block diagram of the module configuration of a controller in an embodiment of the present specification;
FIG. 16 is a flow chart of a control algorithm of the controller in an embodiment of the present specification;
FIG. 17 is a schematic view of a controller scheme for selecting a control object in an embodiment of the present specification;
FIG. 18 is a schematic view of an interface for controller selection interaction in an embodiment of the present specification;
FIG. 19 is a flow chart of a surgical robot collaboration system calibration method in an embodiment of the present specification;
FIG. 20 is a block diagram of a calibration device for a surgical robot collaboration system in an embodiment of the present specification;
FIG. 21 is a schematic view of the composition of a medical device in an embodiment of the present specification.
Detailed Description
The principles and spirit of the present specification will be described below with reference to several exemplary embodiments. It should be understood that these embodiments are presented merely to enable one skilled in the art to better understand and practice the present description, and are not intended to limit the scope of the present description in any way. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Those skilled in the art will appreciate that the embodiments of the present description may be implemented as a system, an apparatus, a method, or a computer program product. Accordingly, the present disclosure may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of hardware and software.
Embodiments of the present disclosure provide a surgical robot collaboration system. Fig. 1 shows a schematic view of an application scenario of a surgical robot collaboration system according to an embodiment of the present disclosure. In this application scenario, as shown in fig. 1, the surgical robot collaboration system in this embodiment may include a first surgical robot 100, a second surgical robot 200, a doctor control device 300, and an auxiliary device 400.
The first surgical robot 100 may include at least one robotic arm 110 and a mounting device 120. The mounting device 120 may be coupled to the surgical bed 40 or other movable device to avoid robot interference.
The second surgical robot 200 may include an image trolley 201 and a patient trolley 202. The patient trolley 202 may include at least one image arm 221 and an instrument arm 222.
The physician control device 300 may include a manipulation unit 301, a display device 302, and a base 303.
In a use scenario, at least one operator 50 (e.g., a healthcare worker) may participate in the use of the surgical robot to manipulate either the first surgical robot 100 or the second surgical robot 200 via the physician control device 300 to perform a particular surgical procedure.
As shown in fig. 1, in one embodiment, a first surgical robot 100 may have one or more robotic arms 110. One or more robotic arms 110 may be mounted on a surgical bed or stand-alone trolley, and may be used to hold surgical instruments or endoscopes.
As shown in fig. 1, in one embodiment, the second surgical robot 200 may have more than three robotic arms and at least one physician console. Each of the robotic arms may be used to hold a surgical instrument or endoscope, and each of the physician control devices may control the operation of one or more of the robotic arms.
In one embodiment, the physician control device 300 may have a physician operation unit, a visual display unit, and a human-machine adjustment unit; the operator 50 controls the first surgical robot 100 and the second surgical robot 200 by interacting with the physician control device 300.
Referring to fig. 2, a schematic diagram of the interaction relationships between the modules of the surgical robot collaboration system is shown. As shown in fig. 2, the controller 500 may be configured to receive real-time information acquired by the sensors of the first surgical robot 100, the second surgical robot 200, and the doctor control device 300, perform instruction calculation, and then send the calculated instructions to the execution units of the first surgical robot 100, the second surgical robot 200, and the doctor control device 300. Optionally, the controller 500 has a plurality of control terminals, corresponding respectively to the instruction transceivers of the first surgical robot 100, the second surgical robot 200, and the doctor control device 300, so that multiple robots can operate in coordination.
The controller 500 may have control software running therein for controlling the first surgical robot 100 and the second surgical robot 200 to perform a desired surgical operation. Control instructions are calculated based on feedback information of the first surgical robot 100 and the second surgical robot 200 and issued to the execution units of the first surgical robot 100 and the second surgical robot 200.
Referring to fig. 3, a schematic structural diagram of a doctor control device in the embodiment of the present specification is shown. As shown in fig. 3, the doctor control device may include an image part 310, a carriage part 320, an adjusting part 330, and manipulation arms 340. The two manipulation arms 3401, 3402 detect the operator's hand motion through their end control handles, serving as the motion-control input of the whole system. The carriage part 320 is the base bracket on which the other components are mounted; it has movable casters 3201, so it can be moved or fixed as required, and is provided with a foot switch 3202 for detecting switching control signals from the operator. The adjusting part 330 can electrically adjust the positions of devices such as the control arm 3301, the image unit 3302, and the operator armrest 3303, i.e. it provides a human-machine parameter adjustment function. The image part 310 provides the operator with stereoscopic images from the imaging system, giving reliable image information for performing the surgical procedure.
As shown in fig. 4, in the present embodiment, the surgical robot collaboration system may include: a first surgical robot 100, a second surgical robot 200, a controller 500, and a binocular camera 600.
The first surgical robot 100 may have a first identification feature disposed thereon. The second surgical robot 200 may have a second identification feature disposed thereon.
The first surgical robot 100 may be a PRCM (passive fixed point) surgical robot. As shown in fig. 5, the PRCM surgical robot 100 includes at least one robotic arm 110 on which surgical instruments 130 may be mounted for illumination, cutting, clamping, and other surgical operations. The PRCM surgical robot also includes a trolley 121 with movement and positioning functions, and a connection mechanism 122 that can connect the robotic arm 110 to the trolley 121 or the operating table, so as to avoid workspace interference with the second surgical robot. The PRCM surgical robot is provided with auxiliary equipment capable of pose recognition.
In one embodiment, the second surgical robot may be an ARCM (active fixed point) surgical robot. As shown in fig. 6, the ARCM surgical robot includes a patient trolley 202 and an image trolley 201. The patient trolley 202 includes a robotic arm 220, which may comprise at least one image arm 221 and at least two instrument arms 222. The image arm 221 carries an endoscope 2210, such as a 3D laparoscope, used to acquire information on human tissue and organs, the surgical instruments, and the surgical environment. The instrument arm 222 carries surgical instruments 2220, including active instruments and passive instruments. The image trolley 201 in the ARCM surgical robot may provide auxiliary image data for an operator or an assistant (e.g., a nurse). The image trolley 201 may include an image processing device 211 and an image display device 212; the image processing device 211 performs image processing and transmits the processed images to the image display device 212 and the image part 310 of the doctor control device 300.
In this embodiment, the operation is implemented by combining ARCM and PRCM robots: the PRCM robot is small, does not occupy the ARCM workspace, can be flexibly positioned, and can meet highly coordinated operation requirements.
The binocular camera 600 may be used to acquire the first image data, which contains the first identification feature, and the second image data, which contains the second identification feature. The binocular camera 600 may be a binocular video camera or two cameras whose relative pose has been determined.
The controller 500 may be communicatively connected to the binocular camera 600 for acquiring the first image data and the second image data captured by the binocular camera 600. The controller 500 may also be configured to determine first relative pose data of the first identification feature with respect to the binocular camera 600 based on the first image data, and second relative pose data of the second identification feature with respect to the binocular camera 600 based on the second image data. The controller 500 may further be configured to calibrate the surgical robot collaboration system based on the first relative pose data and the second relative pose data.
The binocular camera 600 is adopted to perform mutual calibration between the robotic-arm units. In a use scenario, a fixedly connected checkerboard or other identification feature is present on each robotic-arm unit, together with one binocular camera (or two cameras with a determined relative pose) whose field of view covers the checkerboard or other identification-feature information. Through binocular recognition, the relative pose of each checkerboard can be obtained, and hence the relative positions of the first surgical robot and the second surgical robot; this information is transmitted to the controller unit for calculating the actual control instructions.
As shown in fig. 7, a first checkerboard 1001 is fixedly connected to the first surgical robot 100, and a second checkerboard 2001 to the second surgical robot 200. The binocular camera 600 may capture the first and second image data by imaging the first checkerboard 1001 and the second checkerboard 2001 on the first surgical robot 100 and the second surgical robot 200, and may transmit these image data to the controller 500. The controller 500 may determine first relative pose data of the first checkerboard 1001 with respect to the binocular camera 600 based on the first image data, determine second relative pose data of the second checkerboard 2001 with respect to the binocular camera 600 based on the second image data, and calibrate the surgical robot collaboration system according to the first relative pose data and the second relative pose data.
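For a rectified binocular camera, the 3D position of a checkerboard corner in the camera frame can be recovered from the disparity between the two views (depth z = f·b/d). The sketch below is a generic illustration of that step, not the patent's specific algorithm; the focal length, baseline, principal point, and pixel coordinates are all hypothetical values.

```python
import numpy as np

def triangulate(uv_left, uv_right, f, baseline, cx, cy):
    """Recover a 3D point in the camera frame from a rectified stereo pair.
    Disparity d = u_left - u_right; depth z = f * baseline / d."""
    u_l, v_l = uv_left
    u_r, _ = uv_right
    d = u_l - u_r
    z = f * baseline / d
    x = (u_l - cx) * z / f
    y = (v_l - cy) * z / f
    return np.array([x, y, z])

# Hypothetical intrinsics: focal length in pixels, baseline in metres,
# principal point at the image centre of a 1280x720 sensor.
f, b, cx, cy = 800.0, 0.10, 640.0, 360.0
P_cam = triangulate((720.0, 360.0), (680.0, 360.0), f, b, cx, cy)
```

A real system would first rectify the images and detect the corner in both views; this sketch only shows the geometry once matched pixel coordinates are available.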
Referring to fig. 8, a schematic diagram of a checkerboard in the present embodiment is shown. P is any feature point on the checkerboard. When the binocular camera determines the pose, the parallax information of the two-dimensional feature points acquired by the binocular camera yields the three-dimensional position of the feature point in the camera coordinate system O_c: ^camera P = (x_c, y_c, z_c). When the feature point is fixed on any robot, its position relative to the robot coordinate system O_r can likewise be obtained: ^robot P = (x_r, y_r, z_r). When O_c and O_r are both fixed, there exists, for any feature point P on the checkerboard, a 4×4 conversion matrix ^robot T_camera satisfying the following equation:

^robot P = ^robot T_camera · ^camera P

For the first surgical robot and the second surgical robot, the pose conversion matrix ^robotA T_robotB between them can then be obtained by taking the camera coordinate system as an intermediate variable:

^robotA T_robotB = ^robotA T_camera · ^camera T_robotB
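This chaining of homogeneous transforms can be sketched numerically; the rotations, translations, and the sample point below are illustrative values, not calibration results from this patent:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical calibration results: pose of the camera in robot A's frame,
# and pose of robot B's base in the camera frame.
T_A_cam = make_T(np.eye(3), np.array([0.5, 0.0, 0.3]))   # ^robotA T_camera
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
T_cam_B = make_T(Rz90, np.array([0.2, 0.1, 0.0]))        # ^camera T_robotB

# ^robotA T_robotB = ^robotA T_camera . ^camera T_robotB
T_A_B = T_A_cam @ T_cam_B

# A point expressed in robot B's frame can now be mapped into robot A's frame.
p_B = np.array([0.1, 0.0, 0.0, 1.0])
p_A = T_A_B @ p_B
```

With such a matrix the controller can express a target commanded in one robot's coordinate system in the other robot's coordinate system, which is the essence of the calibration step.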
In other embodiments, the first and second identification features may include a plurality of non-repeating patterns formed on the ends of the mechanical arms of the first surgical robot 100 and the second surgical robot 200 through a preset surface treatment process. As shown in fig. 9, the surface of the distal end of the surgical instrument may carry a special surface treatment, such as a non-repeating pattern, a two-dimensional code, or a bar code, to assist in identifying the pose information and instrument type information of the instrument.
Alternatively, a trained image recognition module may be present in the controller 500, which can recognize the pose of the instrument tip in the field of view from the images acquired by the binocular camera 600. When an instrument appears in the field of view, the image recognition module in the controller 500 obtains the pose and instrument type information of the surgical instruments in the camera's field of view and, by combining this with the current joint angle signals from the joint sensors, derives the pose relationship between the bases of the surgical instruments.
In some embodiments of the present invention, the bases of both the first surgical robot 100 and the second surgical robot 200 are coupled to the surgical bed 40, and their relative pose relationship may be obtained by one or more of a position sensor, an electromagnetic encoder, or an IMU device.
As shown in fig. 10, the first surgical robot 100 and the second surgical robot 200 may be rigidly connected to the operating table 40. The relative pose between the robots can be obtained through sensor recognition and transmitted to the controller 500; that is, a pose conversion matrix between the first surgical robot 100 and the second surgical robot 200 can be determined and used to calculate the control instructions actually sent to the execution unit.
In some embodiments of the present invention, the hybrid calibration of the robot collaboration system may be performed by combining two methods, a binocular camera identification method and a rigid connection identification method. Specifically, the controller 500 may determine first pose information of the first surgical robot 100 with respect to the second surgical robot 200 based on the first relative pose data and the second relative pose data. The controller 500 may also determine second pose information of the first surgical robot 100 with respect to the second surgical robot 200 based on the rigid connection relationship data between the base of the first surgical robot 100 and the surgical bed 40 and the rigid connection relationship data between the base of the second surgical robot 200 and the surgical bed 40. Then, the controller 500 may perform fusion processing on the first pose information and the second pose information to obtain fused pose information; based on the fused pose information, a target pose conversion matrix between the first surgical robot 100 and the second surgical robot 200 is calculated.
Referring to fig. 11, a schematic view of a scene calibrated by a hybrid calibration method in the embodiment of the present disclosure is shown, the first pose information is obtained by identifying the checkerboard 1001 and the checkerboard 2001 by using the binocular camera 600, and the second pose information can be obtained by using the sensor and rigid connection relationship data. Then, the first pose information and the second pose information may be sent to the controller 500 for fusion processing, so as to obtain a target pose conversion matrix between the first surgical robot 100 and the second surgical robot 200.
In one embodiment, the controller may perform noise reduction processing on the first pose information and the second pose information to obtain the first pose information after the noise reduction processing and the second pose information after the noise reduction processing. The controller can also perform fusion processing on the first pose information after the noise reduction processing and the second pose information after the noise reduction processing to obtain fused pose information; the fusion process includes at least one of: average weighting processing, kalman filtering processing, multi-Bayesian estimation, and neural network processing.
In this embodiment, the pose information fusion may adopt a centralized or decentralized fusion structure. That is, the fusion processing may be performed after the noise reduction processing, or the noise reduction processing may be performed after the fusion processing. Optionally, the data fusion can be performed by adopting a weighted average method, a Kalman filtering method, a multi-Bayesian estimation method, an artificial neural network method and the like, so that errors of information acquired by two systems are filtered, and the obtained result is more accurate. The accuracy of the calibration can be improved through the mixed calibration, so that the accuracy of the operation is improved, and the operation effect is improved.
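A minimal sketch of the weighted-average branch of this fusion uses inverse-variance weights; the readings and variances below are hypothetical, not measurements from this system:

```python
import numpy as np

def fuse_poses(p1, p2, var1, var2):
    """Inverse-variance weighted average of two independent position estimates.
    The less noisy source gets the larger weight; with equal variances this
    reduces to the plain average."""
    w1 = var2 / (var1 + var2)
    w2 = var1 / (var1 + var2)
    return w1 * p1 + w2 * p2

# Hypothetical readings: a camera-based estimate (noisier) and a
# rigid-connection estimate (more precise) of the same translation.
p_camera = np.array([0.702, 0.101, 0.298])   # assumed variance 1e-4
p_rigid  = np.array([0.700, 0.100, 0.300])   # assumed variance 1e-6

p_fused = fuse_poses(p_camera, p_rigid, 1e-4, 1e-6)
# The fused estimate lies between the two readings, close to the rigid one.
```

Kalman filtering, multi-Bayesian estimation, or a neural network could replace this simple weighting, as the text lists.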
In some embodiments of the present description, the robot system may employ visual interaction for status prompting and safety protection in a use scenario. As shown in fig. 12, the system may indicate its operating status by a warning light when entering/exiting teleoperated surgical operation. A safety alert may be triggered during the surgical procedure, for example when safety requirements are not met. Different safety alerts can correspond to different light prompts: the alarm subsystem provides interactive prompts with different indicator-lamp colors and flashing frequencies according to the operating condition.
In some embodiments of the present description, the robot system may employ voice interaction for additional safety protection in a use scenario, since the operating doctor uses an immersive surgical display device. As shown in fig. 13, the system may give voice prompts through a speaker when entering/exiting teleoperated surgical operation. A safety alert may be triggered during the surgical procedure, for example when safety requirements are not met. Different safety alerts can correspond to different alarm sounds: the alarm subsystem provides interactive prompts with different buzzer sounds according to the operating condition.
In some embodiments of the present description, the controller includes a first control terminal and a second control terminal. The first control terminal is used for controlling the operation of the mechanical arm of the first surgical robot; the second control terminal is used for controlling the operation of the mechanical arm of the second surgical robot. The controller also includes an interactive selection user interface for selecting the left-hand, right-hand, or endoscope-holding mechanical arm. The interactive selection user interface is also used for displaying the installation and use conditions of the surgical instrument, obtained by automatically identifying the information of the built-in chip.
As shown in fig. 14, the controller is configured to transmit and receive control instructions to and from the first surgical robot and the second surgical robot. It comprises a first control terminal, a second control terminal, a communication unit, and a synchronous processing unit.
The first control terminal is used for receiving the motion information sent by the first surgical robot sensor and sending a control instruction for controlling the motion of the first surgical robot.
And the second control terminal is used for receiving the motion information sent by the second surgical robot sensor and sending a control instruction for controlling the motion of the second surgical robot.
The communication unit is used for sending the data stored in the first control terminal and the second control terminal to the synchronous processing unit;
And the synchronous processing unit is used for processing the real-time data received by the first control terminal and the second control terminal, and planning real-time motion instructions through control software so as to realize the cooperative operation of the first surgical robot and the second surgical robot.
As shown in fig. 15, the controller of the present invention is composed of a robotics unit, a logic control unit and a joint control unit, and performs motion control based on the hardware state of the robot received by the controller.
The logic control unit is used for controlling the motion state of the robot and sending commands to the robotics unit and the joint control unit; it makes instruction decisions based on the information of the robotics unit and the joint control unit.
The robotics unit performs kinematics and dynamics planning calculations based on feedback information from the joint control unit and the motion conditions from the logic control unit, sends the expected joint trajectories to the joint control unit, and uses the planning information to assist the logic control unit in making running-state decisions.
The joint control unit issues the control instructions for each joint of the robot based on the planned trajectory of the robot and the motion state of the logic control unit.
As shown in fig. 16, the present embodiment can perform motor drive control based on a combination of one or more of a current loop, a speed loop, and a position loop. After receiving a position command from the doctor control device, the controller calculates the required commands based on the position deviation Δθ, the speed deviation Δv, and the current deviation ΔI, sends them to the motor M, and drives each joint of the first surgical robot and the second surgical robot.
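One tick of such a cascaded position/speed/current loop can be sketched as below; the purely proportional structure and the gains are illustrative assumptions, not drive parameters from this patent:

```python
class CascadedJointController:
    """One control tick of a position -> speed -> current cascade.
    Purely proportional loops; the gains are illustrative assumptions."""

    def __init__(self, kp_pos=10.0, kp_speed=5.0, kp_cur=2.0):
        self.kp_pos = kp_pos        # position-loop gain
        self.kp_speed = kp_speed    # speed-loop gain
        self.kp_cur = kp_cur        # current-loop gain

    def step(self, theta_ref, theta, speed, current):
        d_theta = theta_ref - theta         # position deviation (delta theta)
        speed_ref = self.kp_pos * d_theta   # position loop sets the speed target
        d_speed = speed_ref - speed         # speed deviation (delta v)
        cur_ref = self.kp_speed * d_speed   # speed loop sets the current target
        d_cur = cur_ref - current           # current deviation (delta I)
        return self.kp_cur * d_cur          # command sent to motor M

# One tick: joint at rest, commanded to move toward theta_ref = 1.0 rad.
ctrl = CascadedJointController()
cmd = ctrl.step(theta_ref=1.0, theta=0.0, speed=0.0, current=0.0)
```

Each inner loop corrects the deviation left by the outer loop, which is why the current loop can run at a much higher rate than the position loop in a real drive.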
Referring to fig. 17, the system contains a first surgical robot, a second surgical robot, a plurality of display units, and a plurality of mechanical arm units. To ensure accurate pose control of the surgical operation, the initial selection and switching of the control objects of the doctor control device need to be implemented through appropriate interaction.
In one embodiment, the control flow shown in fig. 17 is adopted for the UI setting of the control object and the pose calibration of each mechanical arm. Pose detection between the mechanical arms is performed in real time; when a need to replace the control object is identified, or the pose deviation exceeds a set threshold, the corresponding operation is performed again.
Referring to fig. 18, a user may select the left-hand, right-hand, or endoscope-holding mechanical arm through an interactive user interface, physical buttons, or the like. Optionally, the installation and use conditions of the surgical instrument can be automatically identified from the information of the built-in chip and displayed on the interactive selection display interface.
The embodiment of the specification also provides a calibration method of the surgical robot collaboration system. Fig. 19 shows a flowchart of a surgical robot collaboration system calibration method in an embodiment of the present disclosure. Although the present description provides method and apparatus structures as shown in the following examples or figures, more or fewer steps or modular units may be included in the methods or apparatuses based on conventional or non-inventive labor. For steps or apparatus structures without a logically necessary causal relationship, the execution order, or the module structure of the apparatus, is not limited to the execution order or module structure shown in the drawings and described in the embodiments of the present specification. In an actual apparatus or end product, the method or module structure may be executed sequentially or in parallel according to the embodiments or the connections shown in the drawings (for example, in a parallel-processor or multi-threaded processing environment, or even in a distributed processing environment).
Specifically, as shown in fig. 19, the calibration method of the surgical robot collaboration system provided in an embodiment of the present disclosure may include the following steps.
Step S191, acquiring first image data and second image data acquired by a binocular camera; the first image data comprises a first identification feature; the second image data includes a second identification feature therein.
The method in the embodiments of the present description may be applied to a controller of a surgical robot collaboration system. The surgical robot collaboration system may further include a first surgical robot and a second surgical robot. The first surgical robot is provided with a first identification feature thereon; a second identification feature is disposed on the second surgical robot. The first identification feature and the second identification feature may be checkerboards or non-repeating patterns formed by a surface treatment process.
The controller may be in communication with the binocular camera and may acquire first image data and second image data acquired by the binocular camera. The first image data may include a first identification feature therein. The second image data may include a second identification feature therein.
Step S192, determining first relative pose data of the first recognition feature with respect to the binocular camera based on the first image data; second relative pose data of the second identification feature with respect to the binocular camera is determined based on the second image data.
Based on the first image data acquired by the binocular camera, depth data of the first identification feature in the image can be acquired, from which the first relative pose data of the first identification feature relative to the binocular camera are obtained. Likewise, the second relative pose data of the second identification feature with respect to the binocular camera may be determined based on the second image data.
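How binocular depth data yield a feature point's position relative to the camera can be illustrated with rectified-stereo triangulation under a pinhole model; the focal length, baseline, principal point, and pixel coordinates below are assumed values, not parameters of camera 600:

```python
import numpy as np

def stereo_to_3d(uL, uR, v, f, baseline, cx, cy):
    """Triangulate one feature point from a rectified stereo pair.
    Assumes identical left/right pinhole intrinsics (an assumption)."""
    disparity = uL - uR              # pixel disparity between the two views
    z = f * baseline / disparity     # depth along the optical axis
    x = (uL - cx) * z / f            # lateral offset in the camera frame
    y = (v - cy) * z / f             # vertical offset in the camera frame
    return np.array([x, y, z])

# Hypothetical camera: f = 800 px, baseline = 0.06 m, principal point (640, 360).
p_cam = stereo_to_3d(uL=700, uR=652, v=400, f=800, baseline=0.06, cx=640, cy=360)
```

Repeating this for every corner of a checkerboard gives the point set from which the feature's full 6-DOF pose relative to the camera can be fitted.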
Step S193, calibrating the surgical robot collaboration system according to the first relative pose data and the second relative pose data.
After the first relative pose data and the second relative pose data are obtained, a pose conversion matrix between the first surgical robot and the second surgical robot, that is, a conversion matrix between the coordinate system of the first surgical robot and the coordinate system of the second surgical robot, may be obtained. Through the pose conversion matrix, the surgical robot collaboration system can be calibrated.
In the above embodiment, when more mechanical arms are needed to perform an operation, the operation can be performed through cooperation of the first surgical robot and the second surgical robot. Both robots can have at least one mechanical arm, and through cooperation of a plurality of surgical robots, extremely complex surgery can be completed and operation efficiency improved, while the mutual interference and motion limitation of each arm caused by adding mechanical arms to a single robot are avoided. After the identification features are arranged on the first surgical robot and the second surgical robot, image data are acquired through the binocular camera to obtain the first relative pose data of the first surgical robot relative to the binocular camera and the second relative pose data of the second surgical robot relative to the binocular camera. Calibration of the surgical robot cooperation system can then be achieved based on the first relative pose data and the second relative pose data, so that the first surgical robot and the second surgical robot operate in the same coordinate system and the robot cooperation system can cooperatively execute surgical operations.
In some embodiments of the present description, calibrating the surgical robot collaboration system based on the first relative pose data and the second relative pose data may include: calculating a first conversion matrix of the first surgical robot relative to the binocular camera according to the first relative pose data; calculating a second conversion matrix of the binocular camera relative to the second surgical robot according to the second relative pose data; and calculating, based on the first conversion matrix and the second conversion matrix, a target pose conversion matrix between the first surgical robot and the second surgical robot so as to calibrate the surgical robot cooperation system.
In some embodiments of the present description, calibrating the surgical robotic collaboration system based on the first relative pose data and the second relative pose data may include: determining first pose information of the first surgical robot relative to the second surgical robot based on the first relative pose data and the second relative pose data; and calculating a target pose conversion matrix between the first surgical robot and the second surgical robot according to the first pose information of the first surgical robot relative to the second surgical robot so as to calibrate the surgical robot cooperation system.
In some embodiments of the present disclosure, the base of the first surgical robot is rigidly connected to the surgical bed and the base of the second surgical robot is rigidly connected to the surgical bed; accordingly, calculating a target pose conversion matrix between the first surgical robot and the second surgical robot according to the first pose information of the first surgical robot relative to the second surgical robot may include: determining second pose information of the first surgical robot relative to the second surgical robot according to the rigid connection relation data between the base of the first surgical robot and the operating table and the rigid connection relation data between the base of the second surgical robot and the operating table; the first pose information and the second pose information are fused, and fused pose information is obtained; and calculating a target pose conversion matrix between the first surgical robot and the second surgical robot according to the fused pose information. By means of mixed calibration, the accuracy of system calibration can be improved.
In some embodiments of the present disclosure, the fusing processing of the first pose information and the second pose information to obtain fused pose information may include: carrying out noise reduction processing on the first pose information and the second pose information to obtain the first pose information after the noise reduction processing and the second pose information after the noise reduction processing; the first pose information after the noise reduction processing and the second pose information after the noise reduction processing are fused, so that fused pose information is obtained; the fusion process includes at least one of: average weighting processing, kalman filtering processing, multi-Bayesian estimation, and neural network processing. Through noise reduction and fusion processing, the accuracy of pose information can be improved, and the calibration accuracy is further improved.
In some embodiments of the present description, first pose information of the first surgical robot relative to the second surgical robot may be determined based on the first relative pose data and the second relative pose data obtained through binocular recognition, and a first pose conversion matrix between the first surgical robot and the second surgical robot may be determined according to the first pose information. The second pose information of the first surgical robot relative to the second surgical robot can be determined according to the rigid connection relation data between the base of the first surgical robot and the operating table and the rigid connection relation data between the base of the second surgical robot and the operating table, and a second pose conversion matrix between the first surgical robot and the second surgical robot determined according to the second pose information. Then, the first pose conversion matrix and the second pose conversion matrix can be fused to obtain the target pose conversion matrix. In this embodiment, the matrix fusion may be performed by a Kalman filtering method or the like.
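One simple way to fuse two 4×4 pose conversion matrices, sketched below, is to weighted-average the translations and project the averaged rotation back onto SO(3) via SVD. This is an illustrative alternative to the Kalman-filter matrix fusion mentioned above, and the input poses are hypothetical values:

```python
import numpy as np

def fuse_transforms(T1, T2, w1=0.5, w2=0.5):
    """Blend two 4x4 pose conversion matrices: weighted-average the
    translations, then project the averaged rotation back onto SO(3)
    with an SVD so the result is again a proper rotation."""
    t = w1 * T1[:3, 3] + w2 * T2[:3, 3]
    M = w1 * T1[:3, :3] + w2 * T2[:3, :3]
    U, _, Vt = np.linalg.svd(M)
    R = U @ Vt
    if np.linalg.det(R) < 0:         # enforce det(R) = +1
        U[:, -1] *= -1
        R = U @ Vt
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical estimates: camera-derived pose vs. rigid-connection pose
# differing by a small rotation about z (illustrative values only).
theta = 0.2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
T_cam_est = np.eye(4)
T_link_est = np.eye(4)
T_link_est[:3, :3] = Rz
T_fused = fuse_transforms(T_cam_est, T_link_est)
```

The SVD projection guarantees the fused matrix is still a valid rigid-body transform, which a plain element-wise average would not.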
Based on the same inventive concept, the embodiment of the specification also provides a calibration device of a surgical robot cooperation system, wherein the surgical robot cooperation system comprises a first surgical robot and a second surgical robot; the first surgical robot is provided with a first identification feature and the second surgical robot is provided with a second identification feature. Because the principle by which the surgical robot collaboration system calibration device solves the problem is similar to that of the surgical robot collaboration system calibration method, the implementation of the device can refer to the implementation of the method, and repeated description is omitted. As used below, the term "unit" or "module" may be a combination of software and/or hardware that implements the intended function. While the apparatus described in the following embodiments is preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated. Fig. 20 is a block diagram of a configuration of a surgical robot cooperation system calibration apparatus according to an embodiment of the present specification; as shown in fig. 20, the apparatus includes an acquisition module 21, a determination module 22, and a calibration module 23, the configurations of which are described below.
The acquisition module 21 is used for acquiring first image data and second image data acquired by the binocular camera; the first image data comprises a first identification feature; the second image data includes a second identification feature;
the determining module 22 is configured to determine first relative pose data of the first identification feature with respect to the binocular camera based on the first image data; determining second relative pose data of the second identification feature with respect to the binocular camera based on the second image data;
the calibration module 23 is configured to calibrate the surgical robot collaboration system according to the first relative pose data and the second relative pose data.
From the above description, it can be seen that the following technical effects are achieved in the embodiments of the present specification: when more mechanical arms are needed to perform an operation, the operation can be performed cooperatively by the first surgical robot and the second surgical robot, each of which can have at least one mechanical arm; through cooperation of a plurality of surgical robots, operation efficiency is improved, and the mutual interference and motion limitation of each arm caused by adding mechanical arms to a single robot can be avoided. In addition, identification features are arranged on the first surgical robot and the second surgical robot; after image data are acquired through the binocular camera, the first relative pose data of the first surgical robot relative to the binocular camera and the second relative pose data of the second surgical robot relative to the binocular camera are obtained. Calibration of the surgical robot cooperation system can then be achieved based on the first relative pose data and the second relative pose data, so that the first surgical robot and the second surgical robot operate in the same coordinate system and the robot cooperation system can cooperatively perform surgical operations.
The embodiment of the present disclosure further provides a medical device. Referring to the schematic diagram in fig. 21 of the composition of a medical device based on the surgical robot collaboration system calibration method provided in the embodiments of the present disclosure, the medical device may specifically include an input device 31, a processor 32, and a memory 33. The memory 33 is for storing processor-executable instructions, and the processor 32, when executing the instructions, implements the steps of the surgical robot collaboration system calibration method of any of the embodiments described above.
In this embodiment, the input device may specifically be one of the main means for exchanging information between the user and the computer system. The input device may include a keyboard, mouse, camera, scanner, light pen, handwriting input board, voice input apparatus, etc.; the input device is used to input raw data, and the programs for processing these data, into the computer. The input device may also obtain data transmitted from other modules, units, and devices. The processor may be implemented in any suitable manner. For example, the processor may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, an embedded microcontroller, and so on. The memory may in particular be a memory device for storing information in modern information technology.
The embodiment of the specification also provides a computer storage medium based on the surgical robot collaboration system calibration method, wherein the computer storage medium stores computer program instructions, and the steps of the surgical robot collaboration system calibration method in any embodiment are realized when the computer program instructions are executed.
It will be apparent to those skilled in the art that the modules or steps of the embodiments described above may be implemented with a general-purpose computing device: they may be concentrated on a single computing device or distributed across a network of computing devices, and they may alternatively be implemented in program code executable by computing devices, so that they may be stored in a storage device for execution by the computing devices. In some cases, the steps shown or described may be performed in a different order than herein, or they may be separately fabricated into individual integrated-circuit modules, or multiple modules or steps among them may be fabricated into a single integrated-circuit module. Thus, embodiments of the present specification are not limited to any specific combination of hardware and software.
It is to be understood that the above description is intended to be illustrative, and not restrictive. Many embodiments and many applications other than the examples provided will be apparent to those of skill in the art upon reading the above description. The scope of the disclosure should, therefore, be determined not with reference to the above description, but instead should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
The above description is only of preferred embodiments of the present invention and is not intended to limit the present invention, but various modifications and variations can be made to the embodiments of the present invention by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present specification should be included in the protection scope of the present specification.

Claims (10)

1. A method for calibrating a surgical robot collaboration system, wherein the surgical robot collaboration system comprises a first surgical robot and a second surgical robot; the first surgical robot is provided with a first identification feature, and the second surgical robot is provided with a second identification feature; the method comprises the following steps:
acquiring first image data and second image data acquired by a binocular camera; the first image data comprises the first identification feature; the second image data includes the second identification feature therein;
determining first relative pose data of the first identification feature with respect to the binocular camera based on the first image data; determining second relative pose data of the second identification feature relative to the binocular camera based on the second image data;
And calibrating the surgical robot collaboration system according to the first relative pose data and the second relative pose data.
2. The surgical robot collaboration system calibration method of claim 1, wherein calibrating the surgical robot collaboration system based on the first relative pose data and the second relative pose data comprises:
calculating a first transformation matrix of the first surgical robot relative to the binocular camera according to the first relative pose data; calculating a second transformation matrix of the binocular camera relative to the second surgical robot according to the second relative pose data;
and calculating a target pose conversion matrix between the first surgical robot and the second surgical robot based on the first conversion matrix and the second conversion matrix so as to calibrate the surgical robot cooperation system.
3. The surgical robot collaboration system calibration method of claim 1, wherein calibrating the surgical robot collaboration system based on the first relative pose data and the second relative pose data comprises:
Determining first pose information of the first surgical robot relative to the second surgical robot based on the first relative pose data and the second relative pose data;
and calculating a target pose conversion matrix between the first surgical robot and the second surgical robot according to the first pose information of the first surgical robot relative to the second surgical robot so as to calibrate the surgical robot cooperation system.
4. The surgical robot collaboration system calibration method of claim 3, wherein the base of the first surgical robot is rigidly connected to an operating table and the base of the second surgical robot is rigidly connected to the operating table;
calculating a target pose conversion matrix between the first surgical robot and the second surgical robot according to first pose information of the first surgical robot relative to the second surgical robot, comprising:
determining second pose information of the first surgical robot relative to the second surgical robot according to the rigid connection relation data between the base of the first surgical robot and the operating table and the rigid connection relation data between the base of the second surgical robot and the operating table;
The first pose information and the second pose information are fused, so that fused pose information is obtained;
and calculating a target pose conversion matrix between the first surgical robot and the second surgical robot according to the fused pose information.
5. A surgical robotic collaboration system, comprising:
a first surgical robot having a first identification feature disposed thereon;
a second surgical robot having a second identification feature disposed thereon;
a binocular camera for acquiring first image data and second image data; the first image data comprises the first identification feature; the second image data includes the second identification feature therein;
a controller for acquiring the first image data and the second image data acquired by the binocular camera; for determining first relative pose data of the first identification feature relative to the binocular camera based on the first image data; for determining second relative pose data of the second identification feature relative to the binocular camera based on the second image data; and for calibrating the surgical robot cooperation system according to the first relative pose data and the second relative pose data.
6. The surgical robot cooperation system of claim 5, wherein the robotic arm of the first surgical robot comprises a robotic arm with a passive stationary point, and the robotic arm of the second surgical robot comprises a robotic arm with an active stationary point.
7. The surgical robot cooperation system of claim 6, wherein the controller further comprises an interactive selection user interface for selecting a left-hand, right-hand, or endoscope-holding robotic arm; the interactive selection user interface is further used to display the installation and usage status of a surgical instrument, obtained by automatically reading information from the instrument's built-in chip.
8. The surgical robot cooperation system of claim 5, wherein the base of the first surgical robot is rigidly connected to an operating table and the base of the second surgical robot is rigidly connected to the operating table;
the controller is configured to determine first pose information of the first surgical robot relative to the second surgical robot based on the first relative pose data and the second relative pose data; is further configured to determine second pose information of the first surgical robot relative to the second surgical robot according to rigid-connection data between the base of the first surgical robot and the operating table and rigid-connection data between the base of the second surgical robot and the operating table; and is further configured to fuse the first pose information and the second pose information to obtain fused pose information, and to calculate a target pose conversion matrix between the first surgical robot and the second surgical robot according to the fused pose information.
9. The surgical robot cooperation system of claim 5, wherein the first identification feature comprises a checkerboard fixedly connected to a robotic arm of the first surgical robot; alternatively, the first identification feature comprises a plurality of non-repeating patterns formed on the end of the robotic arm of the first surgical robot by a surface treatment process.
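A checkerboard feature as in claim 9 is typically localized by matching detected corner pixels against a known 3D model of the board. As a small illustrative sketch (the helper name is an assumption; the patent does not prescribe this), the model is simply a grid of inner-corner coordinates on the z = 0 plane, in the layout expected by common pose solvers such as OpenCV's `cv2.solvePnP`:

```python
import numpy as np

def checkerboard_object_points(rows, cols, square_size):
    """3D coordinates of the inner checkerboard corners in the board frame
    (all on the z = 0 plane), ordered row by row as pose solvers such as
    cv2.solvePnP expect when paired with cv2.findChessboardCorners output."""
    grid = np.zeros((rows * cols, 3), dtype=np.float64)
    grid[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square_size
    return grid
```

With these object points, detected corner pixels, and the binocular camera's intrinsics, a standard PnP solve would yield the feature-to-camera pose that the claims call the first relative pose data.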
10. A medical device, comprising a processor and a memory for storing processor-executable instructions, wherein the instructions, when executed by the processor, implement the steps of the method of any one of claims 1 to 4.
CN202310470692.5A 2023-04-27 2023-04-27 Surgical robot cooperation system and calibration method thereof Pending CN116421325A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310470692.5A CN116421325A (en) 2023-04-27 2023-04-27 Surgical robot cooperation system and calibration method thereof

Publications (1)

Publication Number Publication Date
CN116421325A true CN116421325A (en) 2023-07-14

Family

ID=87088992

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310470692.5A Pending CN116421325A (en) 2023-04-27 2023-04-27 Surgical robot cooperation system and calibration method thereof

Country Status (1)

Country Link
CN (1) CN116421325A (en)

Similar Documents

Publication Publication Date Title
US11007023B2 (en) System and method of registration between devices with movable arms
US11941734B2 (en) Rendering tool information as graphic overlays on displayed images of tools
KR102117273B1 (en) Surgical robot system and method for controlling the same
US11534246B2 (en) User input device for use in robotic surgery
US8828023B2 (en) Medical workstation
KR101705921B1 (en) Synthetic representation of a surgical robot
CN106725857B (en) Robot system
US11880513B2 (en) System and method for motion mode management
US11703952B2 (en) System and method for assisting operator engagement with input devices
KR20220004950A (en) Systems and Methods for Imaging Devices and Integrated Motion
CN114098981A (en) Head and neck auxiliary traction surgical robot with two cooperative arms and control method thereof
CN116421325A (en) Surgical robot cooperation system and calibration method thereof
US20210030502A1 (en) System and method for repositioning input control devices
CN116600732A (en) Augmented reality headset for surgical robot
WO2023192204A1 (en) Setting and using software remote centers of motion for computer-assisted systems
WO2022147074A1 (en) Systems and methods for tracking objects crossing body wall for operations associated with a computer-assisted system
Casals et al. Robotic aids for laparoscopic surgery problems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination