WO2023022258A1 - Image-information-based artificial-intelligence surgical guide system for a laparoscope holder robot - Google Patents

Image-information-based artificial-intelligence surgical guide system for a laparoscope holder robot (Download PDF)

Info

Publication number
WO2023022258A1
Authority
WO
WIPO (PCT)
Prior art keywords
surgical
data
robot
surgery
tool
Prior art date
Application number
PCT/KR2021/011021
Other languages
English (en)
Korean (ko)
Inventor
황희선
노경석
김정준
김종찬
박지현
공성호
Original Assignee
한국로봇융합연구원 (Korea Institute of Robotics and Technology Convergence)
서울대학교병원 (Seoul National University Hospital)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국로봇융합연구원 (Korea Institute of Robotics and Technology Convergence) and 서울대학교병원 (Seoul National University Hospital)
Publication of WO2023022258A1

Links

Images

Classifications

    • A61B 34/00 Computer-aided surgery; manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/25 User interfaces for surgical systems
    • A61B 34/30 Surgical robots
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/361 Image-producing devices, e.g. surgical cameras
    • A61B 90/50 Supports for surgical instruments, e.g. articulated arms
    • G16H 30/40 ICT specially adapted for processing medical images, e.g. editing
    • A61B 2017/00119 Electrical control of surgical instruments with audible or visual output: alarm; indicating an abnormal situation
    • A61B 2017/00203 Electrical control of surgical instruments with speech control or speech recognition
    • A61B 2034/2055 Tracking techniques: optical tracking systems
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A61B 2034/252 User interfaces for surgical systems indicating steps of a surgical procedure
    • A61B 2034/256 User interfaces for surgical systems having a database of accessory information, e.g. including context-sensitive help or scientific articles
    • A61B 2090/506 Supports for surgical instruments using a parallelogram linkage, e.g. pantograph
    • A61B 2560/0493 Special user inputs or interfaces controlled by voice

Definitions

  • The present invention relates to an image-information-based artificial-intelligence surgical guide system and guide method for a laparoscopic robot.
  • Surgery refers to treating a disease by cutting, incising, or otherwise manipulating skin, mucous membranes, or other tissue using medical instruments.
  • Open surgery, which incises and opens the skin at the surgical site to treat, reshape, or remove internal organs, causes problems such as bleeding, side effects, patient pain, and scarring, so robotic surgery has recently emerged as an alternative.
  • A surgical robot is a robot capable of substituting for the surgical operations performed by a surgeon. Compared to a human, such robots can perform more accurate and precise movements and enable remote surgery.
  • Surgical robots currently being developed worldwide include bone surgery robots, laparoscopic surgery robots, and stereotactic surgery robots.
  • A surgical robot device generally consists of a master console and a slave robot. When the operator manipulates a control lever (for example, a handle) on the master console, an instrument coupled to or held by the robot arm of the slave robot is manipulated to perform the surgery.
  • The surgical robot is provided with a robot arm for surgical manipulation, and an instrument is mounted on the front end of the robot arm. Because the instrument moves together with the robot arm, the patient's skin is punctured at one point and the instrument is inserted there to perform surgery. When the surgical area is wide, however, the advantages of robotic surgery can be halved: the skin may need to be incised along the path the instrument moves, or perforated once for each surgical site.
  • To avoid this, a virtual rotation center point is set at a predetermined position at the distal end of the instrument mounted on the front end of the robot arm, and the robot arm is controlled so that the instrument rotates around this point. This virtual center point is called the 'remote center' or remote center of motion (RCM).
  • The present invention has been devised to solve the above conventional problems. According to an embodiment, its purpose is to provide an image-information-based laparoscopic robot artificial-intelligence surgical guide system that maps the guide surgical image data against the currently captured image data and, when a step is missing from the surgical sequence or a change exceeds a threshold value, transmits a notification signal or controls the robot to capture an image of the location of the corresponding event right before surgery is completed.
  • A further purpose of the present invention is to provide a guide system that memorizes the location of a tool (gauze, a mass, etc.) inserted into the affected area during surgery, automatically photographs that location right before surgery is completed to determine whether the tool has been removed, and, if it is judged not to have been removed, controls the robot to photograph the location immediately before surgery is completed.
  • A further purpose of the present invention is to provide a guide system that stores a momentary robot posture to be memorized through a voice command or a foot pedal, and later, at the user's request, establishes a movement plan to return to and show the memorized view.
  • A further purpose of the present invention is to provide a guide system that can recognize situations in which movement to a target point is impossible due to the environment changed during surgery (e.g. organ movement) and the current surgical tool position, display objects that become obstacles during movement on the screen, and display the best achievable view considering the kinematic characteristics of the laparoscopic robot.
  • An object of the present invention can be achieved by an image-information-based laparoscopic robot artificial-intelligence surgical guide system that monitors a surgical procedure based on image data captured by the laparoscopic camera of a laparoscopic camera holder robot, comprising: a data collection unit that collects surgical image data; a surgical image learning DB that stores surgical learning data obtained by learning the collected surgical image data, classified by surgery type and operator; an image processing device that receives the current surgical image data captured by the camera and exchanges data, through communication in a non-real-time control area, with a controller that controls driving of the holder robot; a guide monitoring unit that generates surgical guide data by comparing the surgical learning data with the current surgical image data captured by the camera; and a notification means that presents the surgical guide data.
  • The surgical learning data may be characterized in that surgical sequence characteristics are learned for each surgery type and each operator.
  • The guide monitoring unit may include a comparison analysis unit that compares and analyzes the current surgical image data and the surgical learning data in real time, and an event determination unit that determines, based on this comparative analysis, whether a sequence step is missing or whether a change in the current surgical image data relative to the surgical learning data exceeds a threshold value; the notification means transmits a notification signal when such an event occurs.
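The event logic described in the bullet above (a missing sequence step, or a change beyond a threshold) can be sketched as follows. This is an illustrative assumption, not the patent's implementation: step names, the change-score inputs, and the threshold value are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str      # "missing_step" or "excess_change"
    detail: str    # which surgical step triggered the event

def detect_events(learned_steps, observed_steps, change_scores, threshold=0.5):
    """Return guide events: learned steps absent from the observed sequence,
    and steps whose change score versus the learned data exceeds the threshold."""
    events = []
    observed = set(observed_steps)
    for step in learned_steps:
        if step not in observed:
            events.append(Event("missing_step", step))
    for step, score in change_scores.items():
        if score > threshold:
            events.append(Event("excess_change", step))
    return events

# Hypothetical sequence for one surgery type and operator:
events = detect_events(
    learned_steps=["insufflation", "dissection", "ligation", "closure"],
    observed_steps=["insufflation", "dissection", "closure"],
    change_scores={"dissection": 0.7, "closure": 0.1},
)
for e in events:
    print(e.kind, e.detail)
# missing_step ligation
# excess_change dissection
```

Each emitted event would feed the notification means, and optionally the controller, which can drive the holder robot to re-image the event location before closing.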
  • the controller may control driving of the holder robot to capture an image of a location where the event occurs.
  • The image processing device recognizes surgical tools in the current surgical image data and identifies their location and type, exchanging data with the controller in the non-real-time control area. The surgical learning data learns the position and direction characteristics of surgical tools according to the surgical sequence; the comparison analysis unit compares these learned characteristics against the surgical tools in the current surgical image data, and the event determination unit determines whether the position and direction of a surgical tool have changed by more than a threshold value relative to the sequence.
  • The image processing device also recognizes removal-target tools in the current surgical image data, identifying their location and type, and exchanges data with the controller in the non-real-time control area; when a removal-target tool is recognized, the controller controls driving of the holder robot to capture an image of the tool's position right before surgery is completed.
  • The system may further include a removal decision unit that determines, once a removal-target tool has been recognized, whether it has actually been removed; when the tool is judged not to have been removed, the controller controls driving of the holder robot to capture an image of its position immediately before completion of the operation.
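The removal-target bookkeeping described above can be sketched minimally as follows; the class, method names, tool identifiers, and coordinates are hypothetical, not the patent's implementation.

```python
class RemovalTracker:
    """Remembers where removal-target tools (gauze, etc.) were last seen."""

    def __init__(self):
        self._inserted = {}   # tool id -> last known (x, y, z) position

    def on_tool_seen(self, tool_id, position):
        # Called when the image processor recognizes a removal-target tool.
        self._inserted[tool_id] = position

    def on_tool_removed(self, tool_id):
        # Called when the removal decision unit judges the tool removed.
        self._inserted.pop(tool_id, None)

    def positions_to_revisit(self):
        # Right before closing, every still-present position is returned so
        # the controller can drive the holder robot to image each one.
        return list(self._inserted.values())

tracker = RemovalTracker()
tracker.on_tool_seen("gauze-1", (12.0, 4.5, 80.0))
tracker.on_tool_seen("clip-2", (3.0, 9.0, 75.0))
tracker.on_tool_removed("clip-2")
print(tracker.positions_to_revisit())  # [(12.0, 4.5, 80.0)]
```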
  • The control input may be a position command based on the displayed image, and the controller may control driving of the holder robot so that the position of the image is changed based on the voice control input.
  • The voice command processing device may include a voice command DB that learns each speaker's characteristics and is classified by voice control command; it recognizes a voice control command from the voice data and exchanges data with the controller in the non-real-time control area.
  • The system may further include a robot posture storage unit that commands storage of the robot's posture at a specific point in time, or during a specific time range, during surgery; the controller controls driving of the holder robot to switch back to the stored posture at the user's request.
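The posture store-and-recall behaviour above can be sketched as follows. Slot names, joint values, and the linear joint-space interpolation used for the return plan are illustrative assumptions; a real controller would also enforce joint limits and collision constraints.

```python
class PostureStore:
    def __init__(self):
        self._slots = {}

    def save(self, name, joints):
        # Triggered by a voice command or a foot-pedal press during surgery.
        self._slots[name] = tuple(joints)

    def plan_move(self, name, current, steps=4):
        # Simple joint-space movement plan from the current posture back to
        # the stored one: evenly spaced interpolated waypoints.
        target = self._slots[name]
        return [
            tuple(c + (t - c) * k / steps for c, t in zip(current, target))
            for k in range(1, steps + 1)
        ]

store = PostureStore()
store.save("pre-ligation", [0.0, 0.4, 1.0])       # hypothetical joint angles
plan = store.plan_move("pre-ligation", current=[0.4, 0.0, 1.0], steps=4)
print(plan[-1])  # (0.0, 0.4, 1.0): the final waypoint is the stored posture
```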
  • When a tool that matches neither the surgical tool DB nor the removal-target tool DB exists in the current surgical image data, the system may further include an obstacle recognition unit that recognizes it as an obstacle and marks it in the surgical image data.
  • Therefore, the image-information-based laparoscopic robot artificial-intelligence surgical guide system maps the guide surgical image data against the currently captured image data and, if a step in the surgical sequence is missing or a change exceeds the threshold value, sends a notification signal or controls the robot to capture an image of the location of the corresponding event right before surgery is completed.
  • In addition, the system memorizes the position of a tool (gauze, a mass, etc.) inserted into the affected area during surgery and automatically photographs that position right before surgery is completed to determine whether the tool has been removed; if it is judged not to have been removed, the system can control the robot to photograph the position immediately before surgery is completed.
  • In addition, the momentary robot posture to be memorized can be stored through a voice command or foot pedal, and a movement plan can later be established, at the user's request, to return to and show the memorized view.
  • Furthermore, the system can recognize situations in which movement to the target point cannot be performed due to the environment changed during surgery (organ movement) and the current surgical tool position; objects that become obstacles during movement can be displayed on the screen, and the best achievable view can be displayed considering the kinematic characteristics of the laparoscopic robot. The current position is saved automatically when a move command to a previous position is given, so the robot can later move back according to a return command.
  • FIG. 1 is a configuration diagram of a laparoscopic holder robot system having a laparoscopic mounting adapter and an RCM structure according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram showing a driving mechanism having an RCM structure according to an embodiment of the present invention
  • FIG. 3 is a side view of a laparoscopic holder robot having a laparoscopic mounting adapter and an RCM structure according to an embodiment of the present invention
  • Figure 4 is a side cross-sectional view of the laparoscopic adapter device attached to the laparoscope according to an embodiment of the present invention
  • FIG. 5 is a side cross-sectional view of a detachable unit according to an embodiment of the present invention.
  • FIG. 6 is a front view of a detachable unit according to an embodiment of the present invention.
  • FIG. 7 is a block diagram showing a control flow of a controller according to an embodiment of the present invention.
  • FIG. 8 is a diagram of a laparoscopic camera holder robot control system according to an embodiment of the present invention.
  • FIG. 9 is a block diagram of an external control device and a controller that are communicatively connected by an external interface according to an embodiment of the present invention.
  • FIG. 10 is a block diagram of a four-mode laparoscopic camera holder robot control system according to an embodiment of the present invention.
  • FIG. 11 is a flowchart of a method for controlling a laparoscopic camera holder robot according to an embodiment of the present invention
  • FIG. 12 shows an example screen when a grid command is input in voice command mode according to an embodiment of the present invention, and the screen when the voice command is the number 4.
  • FIG. 13 is a block diagram of a laparoscopic robot artificial intelligence surgery guide system based on image information according to an embodiment of the present invention
  • FIG. 14 is a block diagram of a data collection unit according to an embodiment of the present invention.
  • FIG. 15 is a block diagram of a learning DB according to an embodiment of the present invention.
  • FIG. 16 shows a block diagram of a guide monitoring unit according to an embodiment of the present invention.
  • FIG. 1 shows a configuration diagram of a laparoscopic holder robot system having a laparoscopic mounting adapter and an RCM structure according to an embodiment of the present invention.
  • First, the configuration and function of the laparoscope holder robot having the laparoscope mounting adapter and the RCM structure according to an embodiment of the present invention are described, focusing on the driving mechanism; second, the control system and control method for the holder robot are described; third, the method and system for monitoring the laparoscopic surgery process based on image information are described.
  • FIG. 2 is a schematic diagram showing a driving mechanism having an RCM structure according to an embodiment of the present invention.
  • Figure 3 shows a side view of the laparoscopic holder robot having a laparoscopic mounting adapter and RCM structure according to an embodiment of the present invention.
  • The laparoscope holder robot 100 having a laparoscope mounting adapter and an RCM structure basically holds a laparoscope 1, which has a camera 2 at one end and an image sensor 3 at the other, and rotates it around a remote center of motion (RCM) point 4.
  • The laparoscope holder robot 100 having a laparoscope mounting adapter and an RCM structure generally includes a body 5, an RCM structure 10, a first rotation drive unit 20, a second rotation drive unit 30, a linear movement device 40, and a laparoscope adapter device 50 having a laparoscope axial rotation device 60.
  • The laparoscope holder robot 100 having the laparoscope mounting adapter and the RCM structure according to an embodiment of the present invention has four degrees of freedom, and the laparoscope adapter device 50 enables the laparoscope 1 to be attached to and detached from the holder robot 100.
  • That is, the laparoscope 1 is tilted up and down about the RCM point 4 by the first rotation drive unit 20; rotated by the second rotation drive unit 30 about the imaginary axis connecting the RCM point 4 and the first rotation joint 11; moved in the longitudinal direction by the linear movement device 40; and rotated about its longitudinal axis by the laparoscope axial rotation device 60, giving four degrees of freedom.
  • the RCM structure 10 has a rear end coupled on the first rotary joint 11 of the body, and a front end coupled to the laparoscope 1 side on the second rotary joint 16.
  • the first rotation driving unit 20 is provided in the body 5 and drives the RCM structure 10 to rotate with respect to the first rotation joint 11, so that the laparoscope 1 rotates around the RCM point 4 .
  • the RCM structure 10 has a structure in which the first link unit 12 and the second link unit 14 are combined.
  • The first rotation joint 11 includes a 1-1 rotation joint 11-1 and a 1-2 rotation joint 11-2 spaced downward from the 1-1 rotation joint 11-1 by a specific interval, and the second rotation joint 16 includes a 2-1 rotation joint 16-1 and a 2-2 rotation joint 16-2.
  • The first link unit 12 includes a 1-1 link 12-1 having one end connected to the 1-1 rotation joint 11-1, and a 1-2 link 12-2 connected to the other end of the 1-1 link 12-1 by the first hinge 13, with its other end connected to the linear movement device 40 through the 2-1 rotation joint 16-1.
  • The second link unit 14 includes a 2-1 link 14-1 having one end connected to the 1-2 rotation joint 11-2, and a 2-2 link 14-2 connected to the other end of the 2-1 link 14-1 by the second hinge 15, with its other end connected to the linear movement device 40 through the 2-2 rotation joint 16-2.
  • the first link unit 12 and the second link unit 14 are hinged at a point where the 1-2 link 12-2 and the 2-1 link 14-1 intersect.
  • the second rotation drive unit 30 is installed on one side of the body 5 and drives the laparoscope 1 to rotate based on a virtual line connecting the RCM point 4 and the first rotation joint 11.
  • In this structure, the RCM point 4 is located where the virtual line connecting the 1-1 rotation joint 11-1 and the 1-2 rotation joint 11-2 intersects the laparoscope 1. Therefore, the laparoscope 1 can be rotated about the RCM point 4 by driving the first rotation drive unit through the RCM structure 10.
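The geometric relation above (the RCM point as the intersection of the virtual joint line with the laparoscope axis) can be illustrated with a small 2D side-view calculation. All coordinates here are hypothetical; this is a sketch of the geometry, not the robot's kinematics.

```python
def line_intersection(p1, p2, q1, q2):
    """Intersection of the line through p1-p2 with the line through q1-q2
    in 2D, assuming the lines are not parallel."""
    (x1, y1), (x2, y2) = p1, p2
    (x3, y3), (x4, y4) = q1, q2
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

# Joints 11-1 and 11-2 define a vertical virtual line (here at x = 0);
# the laparoscope axis runs diagonally through the body wall.
rcm = line_intersection((0.0, 0.0), (0.0, -1.0), (-2.0, 3.0), (2.0, -3.0))
print(rcm)  # (0.0, 0.0): the fixed pivot about which the scope tilts
```

Because the parallelogram linkage keeps this intersection fixed in space, the insertion port is not loaded sideways no matter how the scope is tilted.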
  • the linear movement device 40 is connected to the second rotation joint 16 and is configured to move the laparoscope 1 in the longitudinal direction.
  • the specific configuration, means, and form of the linear movement device 40 are not limited as long as it can move the laparoscope 1 along the longitudinal direction of the laparoscope 1.
  • the laparoscopic adapter device is configured to attach and detach the linear movement device and the laparoscope.
  • Figure 4 shows a side cross-sectional view of the laparoscopic adapter device attached to the laparoscope according to an embodiment of the present invention.
  • FIG. 5 is a cross-sectional side view of a detachable unit according to an embodiment of the present invention.
  • FIG. 6 is a front view of the detachable unit according to an embodiment of the present invention.
  • Figure 7 shows a block diagram showing the control flow of the controller according to an embodiment of the present invention.
  • The laparoscope adapter device 50 includes a fastening means provided on one side of its upper portion, configured to be detachable from the linear movement device 40, and a detachable unit 70 configured to attach and detach the laparoscope 1.
  • This detachable unit 70 is configured to be replaceable according to the size of the diameter of the laparoscope.
  • the laparoscopic axial rotation device 60 may be detachably installed between the linear movement device 40 and the detachable unit 70. Through the laparoscope axial rotation device 60, the laparoscope can be rotated based on the longitudinal axis.
  • According to the present invention, laparoscopes 1 of different diameters can be mounted on the robot 100 like modules. Since laparoscope diameters vary, two to three detachable units can be provided, while the external size of the adapter 50 is kept constant so that it can always be coupled to the robot 100. Since axial rotation is not always required, the laparoscope axial rotation device is itself modular. Since a laparoscope 1 with an inclined angle can also be mounted, the adapter is configured to align the central axis, and the motor drive 62 and the electric motor 61 are installed together as one module.
  • The attachment/detachment unit 70 may include a mounting portion 71 with a cylindrical inner surface into which the laparoscope 1 is inserted and an incision in the longitudinal direction, and a cam clamping member 72 that fixes the laparoscope 1 by tightening the mounting portion 71 when manipulated.
  • The controller 200 controls the driving of the first rotation drive unit 20, the second rotation drive unit 30, the linear movement device 40, and the laparoscope axial rotation device 60 to position the end of the laparoscope 1, thereby adjusting the imaging position of the laparoscopic camera 2, as will be described later.
  • the adapter device 50 is coupled to the laparoscope axial rotation device 60 and is configured to detach only the laparoscope 1 itself.
  • the position recognition unit recognizes a longitudinal movement position of the laparoscope and a rotation angle position based on a longitudinal axis of the laparoscope from a reference position based on an angle criterion and a length criterion.
  • When the laparoscope is re-mounted, the controller 200 controls the driving of the linear movement device 40 and the laparoscope axial rotation device 60 to move the laparoscope 1 from the reference position back to the laparoscope position at the point where the procedure was stopped.
  • At this time, the longitudinal position of the laparoscope 1 relative to the adapter must not have changed.
  • For this purpose, a reference line is provided so that the light-source mechanical part matches the adapter device 50 already mounted on the laparoscope 1, aligning the angle, and the length is matched using the step in the laparoscope. That is, after the laparoscope holder robot 100 is coupled to the laparoscope 1 and mounted in the robot system, the RCM point 4 and the reference position are determined through a calibration process; thereafter, when the laparoscope is detached and re-attached, this calibration process is not needed.
  • FIG. 8 shows a block diagram of a laparoscopic camera holder robot control system according to an embodiment of the present invention.
  • FIG. 9 is a block diagram of an external control device and a controller that are communicatively connected by an external interface according to an embodiment of the present invention.
  • FIG. 10 shows a block diagram of a four-mode laparoscopic camera holder robot control system according to an embodiment of the present invention.
  • the laparoscopic camera holder robot control system is a system for controlling the driving of the aforementioned laparoscopic camera holder robot 100 .
  • The controller 200 basically drives the holder robot 100 based on the control command signal; that is, it controls the driving of the first rotation drive unit 20, the second rotation drive unit 30, the linear movement device 40, and the laparoscope axial rotation device 60.
  • the controller 200 is divided into a real-time control area and a non-real-time control area, and is configured to mutually exchange data through internal communication.
  • The real-time controller 202 of the controller 200 receives control signals from the posture measurement unit and the top-priority control input means in the real-time domain.
  • The top-priority control input means is a means by which the user inputs a top-priority control command signal to the controller 200 in the real-time control area; in an embodiment of the present invention it consists of a foot pedal 110.
  • the user inputs a control signal to the controller 200 through manipulation of the foot pedal 110 .
  • the posture measurement unit is configured to measure the posture data of the holder robot 100 in the real-time control area and transmits the data to the controller 200.
  • the posture measurement unit may be configured with the IMU sensor 120.
  • the display unit 150 is configured to display image data captured by the laparoscopic camera 2 in real time.
  • The external control device 210 is communicatively connected to the external interface 201 provided in the controller 200 and adjusts the position of the end of the laparoscope 1 by means of an external adjustment input, thereby adjusting the position of the image displayed on the display unit 150.
  • the external interface 201 operates in the non-real-time control area and exchanges data with the real-time control area of the controller through internal communication.
  • the external adjustment input is a position command based on the displayed image, and the controller 200 controls the driving of the holder robot 100 so that the position of the image is changed based on the external adjustment input.
  • the external control device 210 is a device by which the operator (user) moves the camera holder robot 100 during surgery; by controlling the position shown in the laparoscopic image, it controls the end position of the laparoscope 1 attached to the camera holder robot 100 and thereby controls the displayed image.
  • the external control device 210 is connected to the camera holder robot controller 200 through the "external interface 201" module in the controller, and the external interface 201 module supports various communication methods (ADS, TCP/IP, Serial, etc.).
  • the external interface 201 module operates in a non-real-time area and exchanges data with the real-time control area of the camera holder robot controller 200 through internal communication.
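Because the external interface 201 module accepts several transports (ADS, TCP/IP, Serial), one way to organize it is a transport-independent parser that forwards decoded commands toward the real-time area. The JSON-lines wire format and all names below are assumptions for illustration; the patent does not specify the protocol.

```python
import json

class ExternalInterface:
    """Sketch of the external interface module: one parser serving
    several byte transports. The wire format (one JSON object per
    line) is an assumed example, not the actual protocol."""

    def __init__(self, handler):
        self.handler = handler  # forwards commands to the real-time area

    def on_bytes(self, data: bytes):
        """Called by any transport (TCP socket, serial port, ...)."""
        for line in data.decode().splitlines():
            msg = json.loads(line)
            self.handler(msg["cmd"], msg.get("args", {}))

received = []
iface = ExternalInterface(lambda cmd, args: received.append((cmd, args)))
iface.on_bytes(b'{"cmd": "move", "args": {"dx": 5, "dy": -3}}\n')
assert received == [("move", {"dx": 5, "dy": -3})]
```

Keeping the parser transport-agnostic matches the text's requirement that external devices follow one predetermined protocol regardless of the communication method.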
  • the image processing device 130 is configured to receive image data captured by the camera 2 and exchange data with the controller 200 through communication in a non-real-time control area.
  • the image processing device 130 learns sample surgical tool images and includes a surgical tool learning DB classified by surgical tool type; it recognizes the surgical tool in the image data, identifies its location and type, and is configured to exchange data with the controller 200 in the non-real-time control domain.
  • the voice command processing device 140 receives voice data from the user through the microphone 6, recognizes voice control commands, and exchanges data with the controller through communication in the non-real-time control area.
  • the voice control input is a position command based on the displayed image, and the controller 200 controls the driving of the holder robot 100 so that the position of the image is changed based on the voice control input.
  • the voice command processing device 140 learns the characteristics of each speaker and includes a voice command DB classified by voice control command; it recognizes the voice control command from the voice data and is configured to exchange data with the controller 200 in the non-real-time control area.
  • when there is an inclination angle at the rear end of the laparoscope 1, the laparoscopic camera holder robot control system is configured to include a laparoscopic inclination angle correction unit that calculates the movement of the laparoscope end coordinate system according to the movement of the screen coordinate system, based on a kinematic map between the inclination angle, the screen coordinate system of the image data, and the laparoscope end coordinate system.
  • a device capable of inputting the installed laparoscopic inclination angle is provided; based on the kinematic map between the screen coordinate system and the laparoscope end coordinate system, the movement of the laparoscope end coordinate system according to the movement of the screen coordinate system is calculated, and the robot motion that generates this laparoscope end coordinate system motion is produced.
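In the simplest planar case, the kinematic map above can be illustrated as a rotation of a screen-space displacement by the inclination angle into the laparoscope end (tip) frame. The full map described in the text also depends on the insertion depth and the complete 3-D pose; this is a reduced sketch under that assumption.

```python
import math

def screen_to_tip(dx_screen, dy_screen, incline_deg):
    """Simplified planar sketch of the kinematic map: a screen-space
    displacement is rotated by the laparoscope inclination angle to
    obtain the required tip-frame displacement."""
    a = math.radians(incline_deg)
    dx_tip = math.cos(a) * dx_screen - math.sin(a) * dy_screen
    dy_tip = math.sin(a) * dx_screen + math.cos(a) * dy_screen
    return dx_tip, dy_tip

# With no inclination the map is the identity.
assert screen_to_tip(1.0, 0.0, 0.0) == (1.0, 0.0)
# A 90-degree inclination swaps the axes.
dx, dy = screen_to_tip(1.0, 0.0, 90.0)
assert abs(dx) < 1e-9 and abs(dy - 1.0) < 1e-9
```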
  • FIG. 11 shows a flow chart of a laparoscopic camera holder robot control method according to an embodiment of the present invention.
  • FIG. 12 shows an example of the screen when a grid command is input in the voice command mode according to an embodiment of the present invention, and the screen after the voice command selects number 4.
  • the laparoscopic camera holder robot control system can be operated in a manual operation mode, an external operation mode, a basic operation mode, and an automatic operation mode.
  • the basic operation mode supports foot pedal and voice command control.
  • the control by the foot pedal, belonging to the real-time control area, is applied with the highest priority over the other modes.
  • the user adjusts the position of the holder robot 100 through the manual operation mode (S2, S3), and image data captured by the laparoscopic camera 2 is displayed on the display unit 150.
  • the user can control the position of the robot 100 after pressing the "manual mode" button on the robot 100 or the controller 200. While the manual operation mode is active, the button is lit; when the button is pressed again, the light turns off and the manual operation mode is released.
  • when the basic operation mode is executed, the posture data measured through the IMU sensor 120 is input in the real-time control area; the user inputs a control signal through the foot pedal 110, and the controller 200 controls the driving of the holder robot 100 in the real-time control area based on the foot pedal 110 input signal (S5).
  • the voice command processing device 140 recognizes a specific voice control command from the voice data, and the controller 200 controls the driving of the holder robot 100 based on the voice control command (S8).
  • the control input by the foot pedal takes precedence over the voice command.
  • the basic operation mode is the mode that operates by default when the power of the camera holder robot 100 is turned on; control can be performed by the foot pedal 110 device directly connected to the real-time control area and by voice commands connected to the non-real-time control area.
  • the foot pedal command connected to the real-time area takes precedence.
  • the voice command processing device 140 receives voice data from the user through the microphone 6, recognizes voice control commands, and exchanges data with the controller through communication in the non-real-time control area.
  • the voice control input is a position command based on the displayed image, and the controller 200 controls the driving of the holder robot 100 so that the position of the image is changed based on the voice control input.
  • the voice command processing device 140 learns the characteristics of each speaker and includes a voice command DB classified by voice control command; it recognizes the voice control command from the voice data and is configured to exchange data with the controller 200 in the non-real-time control area.
  • the voice control command includes a grid voice command; when the grid voice command is issued, the image data is divided into a plurality of regions, an index is displayed in each divided region, and when the user selects a specific index, the controller 200 controls the driving of the holder robot 100 so that the selected index region fills the entire screen.
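The grid voice command can be sketched as follows: the image is divided into numbered regions, and selecting an index yields the pixel the robot should drive the camera toward so that region fills the screen. Row-major numbering starting at 1 is an assumption; the patent does not fix the numbering scheme.

```python
def grid_target(index, cols, rows, width, height):
    """Return the pixel center of the selected grid region.
    index: region number spoken by the user (assumed 1-based,
    row-major); cols x rows: grid size; width, height: image size."""
    i = index - 1
    row, col = divmod(i, cols)
    cx = (col + 0.5) * width / cols
    cy = (row + 0.5) * height / rows
    return cx, cy

# In a 3x3 grid on a 1920x1080 image, region 4 is row 1, column 0.
assert grid_target(4, 3, 3, 1920, 1080) == (320.0, 540.0)
```

The controller would then convert this target pixel into a robot motion through the screen-to-tip kinematic map.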
  • the user issues a movement command through the wirelessly connected microphone 6; a learning algorithm that can learn the characteristics of each speaker is provided, and the learned result can easily be uploaded to the voice command processing device 140.
  • the voice command is based on the displayed image; the image may be moved by up/down, left/right, near/far, and right-rotation/left-rotation commands.
  • the system provides a voice command definition (up, down, left, right, near, far, etc.) of fine-tuning movements according to the degrees of freedom of laparoscope 1 movement, and a scale calibration algorithm optimized for laparoscopic surgery.
  • the screen movement speed, i.e. the robot movement speed, can be optimized to meet user requirements.
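The command vocabulary and scale calibration above can be modeled as a table mapping each voice command to a unit motion that is multiplied by a user-tuned scale. The step directions follow the text (up/down, left/right, near/far, rotation); the numeric defaults and names are illustrative assumptions.

```python
# Unit motion per command: (dx, dy, dz, droll) in screen-aligned axes.
COMMAND_STEPS = {
    "up":    (0, -1, 0, 0), "down":  (0, +1, 0, 0),
    "left":  (-1, 0, 0, 0), "right": (+1, 0, 0, 0),
    "near":  (0, 0, +1, 0), "far":   (0, 0, -1, 0),
    "rotate_right": (0, 0, 0, +1), "rotate_left": (0, 0, 0, -1),
}

def command_to_motion(cmd, scale=10.0):
    """Scale a unit step by the calibrated per-user scale factor."""
    dx, dy, dz, droll = COMMAND_STEPS[cmd]
    return (dx * scale, dy * scale, dz * scale, droll * scale)

assert command_to_motion("left", scale=5.0) == (-5.0, 0.0, 0.0, 0.0)
```

The scale-calibration algorithm mentioned in the text would amount to tuning `scale` (possibly per axis) to the surgeon's preference.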
  • in the external operation mode, the external control device 210 is communicatively connected to the external interface 201 provided in the controller 200, and the controller 200 controls the driving of the holder robot 100 based on an external adjustment input by the user (S11).
  • this external operation mode operates in the non-real-time control area and exchanges data with the real-time control area of the controller 200 through internal communication; the external adjustment input is a position command based on the displayed image, and the controller 200 controls the driving of the holder robot 100 so that the position of the image is changed based on the external adjustment input.
  • based on the control input (position, speed), the robot 100 is controlled through the camera holder robot controller 200.
  • the external control device 210 must transmit data to the camera holder robot controller 200 according to a predetermined protocol, and related APIs are provided.
  • in the automatic operation mode, the image processing device 130, which has a surgical tool learning DB classified by surgical tool type through learning of sample surgical tool images, recognizes the surgical tool in the image data, identifies its location and type, and exchanges data with the controller 200 in the non-real-time control area; the controller 200 controls the driving of the holder robot 100 so that the recognized surgical tool is maintained within a specific area of the image data.
  • this automatic operation mode is executed only when the user designates it; image data is input (S12), and the surgical tool is recognized by the image processing device 130 (S13).
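The automatic operation mode's behavior, keeping the recognized surgical tool within a specific area of the image, can be sketched as a dead-band controller on the tool's bounding-box center: no motion while the tool stays inside a central region, a re-centering correction once it drifts out. The margin value and function names are assumptions.

```python
def tracking_correction(tool_box, frame_w, frame_h, margin=0.25):
    """Return the (dx, dy) screen correction needed to re-center the
    recognized tool, or None if it is already inside the allowed
    central region. tool_box: (x1, y1, x2, y2) bounding box."""
    x1, y1, x2, y2 = tool_box
    cx, cy = (x1 + x2) / 2, (y1 + y2) / 2
    lo_x, hi_x = frame_w * margin, frame_w * (1 - margin)
    lo_y, hi_y = frame_h * margin, frame_h * (1 - margin)
    if lo_x <= cx <= hi_x and lo_y <= cy <= hi_y:
        return None  # tool within the specific area; no robot motion
    return (cx - frame_w / 2, cy - frame_h / 2)

# A tool near the image center needs no correction.
assert tracking_correction((900, 500, 1020, 580), 1920, 1080) is None
# A tool in the upper-left corner triggers a re-centering correction.
assert tracking_correction((0, 0, 100, 80), 1920, 1080) == (-910.0, -500.0)
```

The dead band avoids constant small camera motions, which would be distracting during surgery.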
  • the driving priority of the holder robot 100 is, in order: foot pedal 110 input, voice control command, external adjustment input, and automatic operation mode.
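The stated priority order can be sketched as a simple arbiter that scans active command sources from highest to lowest priority and forwards only the winner to the drive controller. The source names are illustrative.

```python
# Highest to lowest priority, as stated in the text.
PRIORITY = ["foot_pedal", "voice", "external", "auto"]

def arbitrate(pending):
    """pending: dict mapping source name -> command (absent or None
    when inactive). Returns (source, command) for the
    highest-priority active source, or (None, None) if idle."""
    for source in PRIORITY:
        cmd = pending.get(source)
        if cmd is not None:
            return source, cmd
    return None, None

# Voice beats automatic mode; the foot pedal beats everything.
assert arbitrate({"auto": "track", "voice": "up"}) == ("voice", "up")
assert arbitrate({"auto": "track", "foot_pedal": "stop", "voice": "up"})[0] == "foot_pedal"
```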
  • FIG. 13 is a block diagram of an image-information-based laparoscopic robot artificial intelligence surgery guide system according to an embodiment of the present invention.
  • FIG. 14 is a block diagram of a data collection unit according to an embodiment of the present invention.
  • FIG. 15 is a block diagram of a learning DB according to an embodiment of the present invention.
  • FIG. 16 is a block diagram of a guide monitoring unit according to an embodiment of the present invention.
  • the image-information-based laparoscopic robot artificial intelligence surgery guide system is a system that can guide surgery by monitoring the surgical process based on image data captured by the laparoscopic camera 2 of the laparoscopic camera holder robot 100.
  • the image-information-based laparoscopic robot artificial intelligence surgery guide system according to an embodiment of the present invention is configured to further include, in addition to the aforementioned control system, a data collection unit, a data learning unit, a learning DB, a guide monitoring unit, notification means, and the like.
  • the data collection unit is configured to collect surgical image data 311 , sample surgical tool image 312 , sample removal target tool image 313 , and audio data 314 .
  • the data learning unit learns the collected surgical image data 311, sample surgical tool images 312, sample removal target tool images 313, and audio data 314, and each learned dataset is stored in the surgical image learning DB 331, the surgical tool learning DB 332, the removal target tool learning DB 333, and the voice command DB 334 of the learning DB 330.
  • the image processing device 130 receives the current surgical image data captured by the camera and exchanges data with the controller 200 that controls the driving of the holder robot 100 through communication in a non-real-time control area.
  • the data learning unit is configured to learn the collected surgical image data 311, classify the surgical image data by surgery type and operator, and store the surgical learning data in the surgical image learning DB.
  • in the surgical learning data, the surgical sequence characteristics are learned by surgery type and by operator.
  • the data learning unit 320 learns the sample surgical instrument images 312, classifies them by surgical instrument type, and stores them in the surgical instrument learning DB 332.
  • the image processing device 130 is configured to exchange data with the controller 200 in a non-real-time control area by recognizing a surgical tool in the current surgical image data to determine the location and type.
  • from the surgical learning data, the characteristics of the position and direction of the surgical tool according to the surgical sequence can be learned.
  • the data learning unit 320 learns the sample removal target tool images 313, classifies them according to removal target tool types, and stores them in the removal target tool learning DB 333.
  • the image processing device 130 recognizes the tool to be removed in the current surgical image data, identifies the location and type, and exchanges data with the controller 200 in the non-real-time control area.
  • the data learning unit 320 learns the characteristics of each person from the voice data, classifies them according to voice control commands, and stores them in the voice command DB 334.
  • the voice command processing device 140 recognizes a voice control command from voice data and exchanges data with the controller 200 in a non-real-time control area.
  • the guide monitoring unit 350 is basically configured to generate surgical guide data by comparing the surgical learning data stored in the learning DB 330 with the current surgical image data captured by the camera, and the notification means 340 is configured to provide guidance with the surgical guide data.
  • the guide monitoring unit 350 includes a search engine 351 that retrieves the surgical learning data to be compared and analyzed based on the current surgical image data; a comparison and analysis unit 352 that compares and analyzes the current surgical image data and the surgical learning data in real time; and an event determination unit 353 that determines, according to the comparison and analysis of the comparison and analysis unit 352, whether a sequence is missing or whether a change of the current surgical image data relative to the surgical learning data exceeds a threshold.
  • the notification means 340 is configured to transmit a notification signal when an event occurs.
  • when an event occurs, the controller 200 controls the driving of the holder robot 100 to capture an image of the location where the event occurred.
  • from the surgical learning data, the characteristics of the position and direction of the surgical tool according to the surgical sequence can be learned. Therefore, the comparison and analysis unit 352 compares and analyzes the learned position and direction characteristics of the surgical instruments according to the surgical sequence against the surgical instruments in the current surgical image data, and the event determination unit 353 can determine an event as to whether there is a change in the position and direction characteristics according to the sequence that exceeds a threshold value.
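The event determination can be sketched as a threshold test on the deviation between the learned tool pose for the current sequence step and the observed pose. The pose representation (x, y, angle in degrees) and the threshold values below are assumptions for illustration.

```python
import math

def sequence_event(expected_pose, observed_pose, pos_threshold, ang_threshold):
    """Flag an event when the observed tool pose deviates from the
    learned pose for the current surgical sequence step by more than
    a threshold in position or direction."""
    ex, ey, ea = expected_pose
    ox, oy, oa = observed_pose
    pos_err = math.hypot(ox - ex, oy - ey)
    ang_err = abs((oa - ea + 180) % 360 - 180)  # wrapped angle difference
    return pos_err > pos_threshold or ang_err > ang_threshold

# Small deviations stay below both thresholds: no event.
assert sequence_event((100, 100, 0), (102, 101, 2), pos_threshold=10, ang_threshold=15) is False
# A large positional jump triggers an event.
assert sequence_event((100, 100, 0), (180, 100, 2), pos_threshold=10, ang_threshold=15) is True
```

The angle difference is wrapped so that 350° and 10° compare as 20° apart rather than 340°.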
  • the image processing device 130 includes the removal target tool learning DB 333, classified by removal target tool type through learning of the sample removal target tool images 313; the image processing device 130 recognizes the removal target tool in the current surgical image data, identifies its location and type, and can exchange data with the controller 200 in the non-real-time control area.
  • the controller 200 controls the driving of the holder robot 100 to capture an image of the position of the tool to be removed immediately before the surgery is completed.
  • the controller 200 includes a removal determination unit 356 that determines whether the tool to be removed has been removed; when it is determined immediately before the surgery is completed that the tool has not been removed, the driving of the holder robot 100 is controlled to capture an image of the tool to be removed.
  • the voice command processing device 140 is configured to receive voice data from the user through the microphone 6, recognize voice control commands, and exchange data with the controller 200 through communication in a non-real-time control area.
  • the voice control input is a position command based on the displayed image, and the controller 200 controls the driving of the holder robot so that the position of the image is changed based on the voice control input.
  • in the voice command DB, the characteristics of each speaker are learned and classified according to voice control commands; voice control commands are recognized from the voice data, and data is exchanged with the controller in the non-real-time control area.
  • the robot posture storage unit 357 is configured to store, on command, the posture of the robot during surgery at a corresponding time point or during a specific time range. That is, when a storage command is input by a voice command or the foot pedal 110, the robot posture at that moment is stored, and the controller can control the driving of the holder robot to return to the stored robot posture according to the user's request.
  • the driving is controlled so that the momentary robot posture to be memorized is stored through a voice command or the foot pedal 110, and the memorized view is later restored according to the user's request. When a command to move to the previous location is given, the current location is automatically saved before moving, so that the robot can later return to it in turn.
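The posture storage and "move to previous location" behavior, including the automatic save of the current position before moving, can be sketched as a small posture store. Class and method names are assumptions; postures are shown as plain tuples.

```python
class PostureStore:
    """Sketch of the robot posture storage unit 357: 'save' keeps the
    current posture, and 'move to previous' automatically saves the
    current posture before returning, as described in the text."""

    def __init__(self):
        self._stack = []

    def save(self, posture):
        self._stack.append(posture)

    def move_to_previous(self, current_posture):
        if not self._stack:
            return current_posture  # nothing stored; stay in place
        target = self._stack.pop()
        self._stack.append(current_posture)  # auto-save before moving
        return target

store = PostureStore()
store.save((10, 20, 30))
assert store.move_to_previous((50, 60, 70)) == (10, 20, 30)
# The pre-move posture was saved automatically, so we can go back.
assert store.move_to_previous((10, 20, 30)) == (50, 60, 70)
```

The automatic save makes "previous location" commands reversible, which matches the described workflow of toggling between two views during surgery.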
  • a situation in which movement to a target point cannot be achieved due to a changed environment (movement of organs) during surgery may be identified together with the current position of the surgical tool, and an object that becomes an obstacle during movement may be displayed on the screen.
  • the obstacle recognition unit may be configured, when a tool that matches neither the surgical tool DB nor the removal target tool DB exists in the current surgical image data, to recognize that tool as an obstacle and display it in the surgical image data.
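The obstacle recognition rule, flagging any detected tool that matches neither the surgical tool DB nor the removal target tool DB, can be sketched as a set-membership filter. Modeling the learning DBs as plain label sets is a simplification for illustration.

```python
def find_obstacles(detections, surgical_tool_db, removal_tool_db):
    """Return the detections whose label matches neither DB; these
    are flagged as obstacles to display in the surgical image data.
    detections: list of dicts with at least a 'label' key."""
    known = set(surgical_tool_db) | set(removal_tool_db)
    return [d for d in detections if d["label"] not in known]

dets = [{"label": "grasper", "box": (0, 0, 10, 10)},
        {"label": "unknown_object", "box": (5, 5, 20, 20)}]
obstacles = find_obstacles(dets, {"grasper", "scissors"}, {"gauze"})
assert [o["label"] for o in obstacles] == ["unknown_object"]
```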

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Robotics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Radiology & Medical Imaging (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to an image-information-based artificial intelligence surgery guide system for a laparoscope holder robot and, more specifically, to a system for monitoring a surgical process on the basis of image data captured by a laparoscope camera of a laparoscope camera holder robot, thereby guiding the surgery. The image-information-based laparoscope holder robot artificial intelligence surgery guide system comprises: a data collection unit for collecting surgical image data; a surgical image learning DB for learning the collected surgical image data, classifying it by surgery type and by surgeon, and storing the surgical learning data; an image processing device for receiving current surgical image data captured by the camera and exchanging data, through communication in a non-real-time control area, with a controller that controls the driving of the holder robot; a guide monitoring unit for generating surgical guide data by comparing the surgical learning data with the current surgical image data captured by the camera; and a notification means for guiding with the surgical guide data.
PCT/KR2021/011021 2021-08-19 2021-08-19 Système de guidage de chirurgie assisté par intelligence artificielle de robot de maintien de laparoscope basé sur des informations d'image WO2023022258A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2021-0109257 2021-08-19
KR1020210109257A KR102627401B1 (ko) 2021-08-19 2021-08-19 영상정보기반 복강경 로봇 인공지능 수술 가이드 시스템

Publications (1)

Publication Number Publication Date
WO2023022258A1 true WO2023022258A1 (fr) 2023-02-23

Family

ID=85239897

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/011021 WO2023022258A1 (fr) 2021-08-19 2021-08-19 Système de guidage de chirurgie assisté par intelligence artificielle de robot de maintien de laparoscope basé sur des informations d'image

Country Status (2)

Country Link
KR (1) KR102627401B1 (fr)
WO (1) WO2023022258A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101114234B1 (ko) * 2011-05-18 2012-03-05 주식회사 이턴 수술 로봇 시스템 및 그 복강경 조작 방법
WO2017175232A1 (fr) * 2016-04-07 2017-10-12 M.S.T. Medical Surgery Technologies Ltd. Système de commande chirurgical à activation vocale
KR20180100831A (ko) * 2017-03-02 2018-09-12 한국전자통신연구원 수술로봇 카메라의 시점 제어 방법 및 이를 위한 장치
US20190008598A1 (en) * 2015-12-07 2019-01-10 M.S.T. Medical Surgery Technologies Ltd. Fully autonomic artificial intelligence robotic system
KR20190133424A (ko) * 2018-05-23 2019-12-03 (주)휴톰 수술결과에 대한 피드백 제공방법 및 프로그램
WO2020159978A1 (fr) * 2019-01-31 2020-08-06 Intuitive Surgical Operations, Inc. Procédé et systèmes de commande de caméra pour système chirurgical assisté par ordinateur

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7794396B2 (en) 2006-11-03 2010-09-14 Stryker Corporation System and method for the automated zooming of a surgical camera
KR101997566B1 (ko) 2012-08-07 2019-07-08 삼성전자주식회사 수술 로봇 시스템 및 그 제어방법
US9827054B2 (en) * 2014-03-14 2017-11-28 Synaptive Medical (Barbados) Inc. Intelligent positioning system and methods therefore
KR101926123B1 (ko) 2017-12-28 2018-12-06 (주)휴톰 수술영상 분할방법 및 장치
KR101864411B1 (ko) 2017-12-28 2018-06-04 (주)휴톰 수술보조 영상 표시방법 및 프로그램
KR102146672B1 (ko) * 2018-05-23 2020-08-21 (주)휴톰 수술결과에 대한 피드백 제공방법 및 프로그램
KR102008891B1 (ko) * 2018-05-29 2019-10-23 (주)휴톰 수술보조 영상 표시방법, 프로그램 및 수술보조 영상 표시장치
US10383694B1 (en) 2018-09-12 2019-08-20 Johnson & Johnson Innovation—Jjdc, Inc. Machine-learning-based visual-haptic feedback system for robotic surgical platforms
GB2577714B (en) 2018-10-03 2023-03-22 Cmr Surgical Ltd Automatic endoscope video augmentation
KR102195825B1 (ko) 2018-12-12 2020-12-28 (주)헬스허브 알람 기능을 통한 수술 가이드 시스템 및 그 방법
US10729502B1 (en) * 2019-02-21 2020-08-04 Theator inc. Intraoperative surgical event summary
KR102239186B1 (ko) 2019-07-26 2021-04-12 한국생산기술연구원 인공지능 기반 로봇 매니퓰레이터의 자동 제어 시스템 및 방법
US20210059758A1 (en) * 2019-08-30 2021-03-04 Avent, Inc. System and Method for Identification, Labeling, and Tracking of a Medical Instrument

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101114234B1 (ko) * 2011-05-18 2012-03-05 주식회사 이턴 수술 로봇 시스템 및 그 복강경 조작 방법
US20190008598A1 (en) * 2015-12-07 2019-01-10 M.S.T. Medical Surgery Technologies Ltd. Fully autonomic artificial intelligence robotic system
WO2017175232A1 (fr) * 2016-04-07 2017-10-12 M.S.T. Medical Surgery Technologies Ltd. Système de commande chirurgical à activation vocale
KR20180100831A (ko) * 2017-03-02 2018-09-12 한국전자통신연구원 수술로봇 카메라의 시점 제어 방법 및 이를 위한 장치
KR20190133424A (ko) * 2018-05-23 2019-12-03 (주)휴톰 수술결과에 대한 피드백 제공방법 및 프로그램
WO2020159978A1 (fr) * 2019-01-31 2020-08-06 Intuitive Surgical Operations, Inc. Procédé et systèmes de commande de caméra pour système chirurgical assisté par ordinateur

Also Published As

Publication number Publication date
KR102627401B1 (ko) 2024-01-23
KR20230028818A (ko) 2023-03-03

Similar Documents

Publication Publication Date Title
US9630323B2 (en) Operation support system and control method of operation support system
JP3506809B2 (ja) 体腔内観察装置
KR102105142B1 (ko) 입력 장치의 오퍼레이터가 볼 수 있는 디스플레이 영역으로 기구가 진입할 때 기구의 제어를 입력 장치로 전환하는 방법
JP4179846B2 (ja) 内視鏡手術システム
CN108348134B (zh) 内窥镜系统
WO2010021447A1 (fr) Système d’affichage tridimensionnel pour robot chirurgical et son procédé de commande
WO2011149260A2 (fr) Structure à centre de mouvement déporté pour bras de robot chirurgical
WO2023197941A1 (fr) Procédé d'imagerie d'implantation chirurgicale et système d'imagerie
WO2023022258A1 (fr) Système de guidage de chirurgie assisté par intelligence artificielle de robot de maintien de laparoscope basé sur des informations d'image
WO2023022257A1 (fr) Système et procédé de commande de robot de support de caméra de laparoscope
WO2013018985A1 (fr) Système de robot chirurgical
JP3744974B2 (ja) 内視鏡下外科手術装置
JPH09266882A (ja) 内視鏡装置
JP4382894B2 (ja) 視野移動内視鏡システム
JP4554027B2 (ja) 眼科装置
CN110549328A (zh) 作业辅助控制装置和方法、作业图像控制装置和显示方法
KR102535861B1 (ko) 복강경 장착 어뎁터와 rcm구조를 가지는 복강경 홀더 로봇
CN217938392U (zh) 外科手术植入成像系统
WO2022031069A1 (fr) Module d'endoscope chirurgical laparoscopique comportant une couche de revêtement appliquée sur celui-ci
WO2019045531A2 (fr) Ensemble bras médical
WO2011145803A2 (fr) Dispositif médical pour chirurgie
CN218606817U (zh) 一种距离自动调整扶镜机械臂装置
CN217286107U (zh) 一种远程手术显微镜设备
WO2022230814A1 (fr) Système robotisé
CN115005998B (zh) 一种手术机器人系统及其机械臂防干涉调整方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21954305

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE