WO2010021447A1 - Three-dimensional display system for surgical robot and method for controlling same - Google Patents

Three-dimensional display system for surgical robot and method for controlling same

Info

Publication number
WO2010021447A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
user
surgical robot
robot
controller
Prior art date
Application number
PCT/KR2009/001388
Other languages
French (fr)
Korean (ko)
Inventor
원종석
최승욱
Original Assignee
(주)미래컴퍼니
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)미래컴퍼니 filed Critical (주)미래컴퍼니
Priority to CN2009801410053A priority Critical patent/CN102186434B/en
Publication of WO2010021447A1 publication Critical patent/WO2010021447A1/en

Classifications

    • G06T19/00 Manipulating 3D models or images for computer graphics (G: Physics; G06: Computing, calculating or counting; G06T: Image data processing or generation, in general)
    • A61B17/32 Surgical cutting instruments (A: Human necessities; A61: Medical or veterinary science, hygiene; A61B: Diagnosis, surgery, identification)
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B34/37 Master-slave robots (under A61B34/00 Computer-aided surgery, manipulators or robots specially adapted for use in surgery; A61B34/30 Surgical robots)
    • A61B90/37 Surgical systems with images on a monitor during operation (under A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis; A61B90/36 Image-producing devices or illumination devices not otherwise provided for)
    • A61B2090/372 Details of monitor hardware
    • A61B34/30 Surgical robots
    • A61B90/50 Supports for surgical instruments, e.g. articulated arms
    • G06T2210/41 Medical (indexing scheme for image generation or computer graphics, G06T2210/00)

Definitions

  • the present invention relates to a 3D display system of a surgical robot and a control method thereof.
  • medically, surgery refers to treating a disease by cutting, incising, or otherwise manipulating skin, mucous membranes, or other tissue with medical instruments.
  • in particular, open surgery, in which the skin at the surgical site is incised so that the internal organs can be treated, reshaped, or removed, involves problems such as bleeding, side effects, patient pain, and scarring; surgery using robots has therefore recently attracted attention as an alternative.
  • the first approach to implementing a 3D display in a surgical robot system is the microscope (binocular display) method.
  • in the microscope method, the surgeon performs robotic surgery while viewing the surgical site in three dimensions through a microscope installed on the surgical robot; during a long operation, pain in the doctor's shoulders or neck may prevent concentration on the surgery or lead to erroneous operation of the robot.
  • the second approach is the HMD (head-mounted display) method, in which the surgeon wears a helmet-type display and performs surgery while checking the 3D image through it; the HMD is generally heavy and uncomfortable, and the helmet must be removed to look at anything else during surgery.
  • the third approach is the glasses-type polarizing display method; because the 3D image is produced by polarization, the image is generally dark, and the polarized glasses must be removed to look at anything else during surgery.
  • the present invention provides a 3D display system for a surgical robot, and a control method thereof, in which the 3D display is moved according to the operator's position during robotic surgery, so that the operator can conveniently check the three-dimensional image as needed without changing posture or wearing a display device on the body.
  • according to one aspect, there is provided a 3D display system for a surgical robot, comprising a 3D display that shows three-dimensional image information obtained from the surgical site, a robot arm coupled to the 3D display, a surgical robot to which the robot arm is connected, and a controller included in the surgical robot, wherein the controller drives the robot arm to move the 3D display to a predetermined position.
  • the image information may be image information taken by a vision system inserted into a surgical patient and processed to be displayed on a 3D display.
  • the robotic arm may include an articulated link driven by a controller to move the 3D display to a predetermined position.
  • one end of the robot arm may be fixed to a structure of the operating room, and the robot arm may be electrically connected to the surgical robot.
  • the surgical robot is provided with a console unit for user manipulation, and the controller can move the 3D display according to the position of the user's eyes.
  • an operation handle is installed in the console, and the controller may move the 3D display according to a user's operation on the operation handle.
  • the controller is connected to a sensing unit for sensing the position of the user's eyes, and the controller may receive a signal from the sensing unit and move the 3D display according to the position of the user's eyes.
  • the controller is connected to a storage unit that stores, for each user, data on the position to which the 3D display should be moved according to that user's eye position; the controller loads the data corresponding to the user operating the console unit and moves the 3D display to the stored position.
  • according to another aspect, there is provided a method of controlling a 3D display coupled to a robot arm electrically connected to a surgical robot, comprising acquiring information about a user accessing the surgical robot, and driving the robot arm according to the acquired information to move the 3D display.
  • the 3D display may display image information captured by the vision system inserted into the body of the surgical patient, and the information about the user may include information about the position of the user's eyes.
  • the acquiring step may include sensing a position of an eye of the user by a sensor, and receiving a signal from the sensor to derive information about the user. Meanwhile, the information about the user is stored in advance in the form of data regarding the position of the 3D display to be moved according to the position of the user's eyes, and in this case, the obtaining step may include loading the data.
  • according to a preferred embodiment, the 3D display is mounted on an articulated robot arm that is driven manually or automatically to track the position of the operator's eyes, so the operator need not change posture or wear a separate display device to check the 3D image; furthermore, because the position of the 3D display is preset for each operator, accurate three-dimensional image information can always be provided regardless of the operator or the operator's posture.
  • FIG. 1 is a perspective view showing a 3D display system of a surgical robot according to an embodiment of the present invention.
  • Figure 2 is a flow chart showing a 3D display system control method of a surgical robot according to an embodiment of the present invention.
  • 5: operation handle, 10: 3D display
  • 22: sensing unit, 24: storage unit
  • FIG. 1 is a perspective view showing a 3D display system of a surgical robot according to an embodiment of the present invention.
  • referring to FIG. 1, a surgical robot 1, a console unit 3, an operation handle 5, a 3D display 10, a robot arm 12, a controller 20, a sensing unit 22, and a storage unit 24 are shown.
  • the present embodiment uses a so-called 'articulated 3D display' approach: instead of fixing the part that displays the 3D image to the console as in the microscope method, or wearing it on the user's head as in the HMD method, the robot arm 12 is connected to the 3D display 10 so that the display can move freely.
  • a 3D display is a system that reproduces three-dimensional images, comprising software that produces content viewable in 3D and hardware that presents that content in three dimensions. Because the human eyes are about 65 mm apart, when left and right images with a slight parallax (binocular disparity) enter the two eyes, they are transmitted through the optic nerves to the brain, where the two images merge to produce a sense of three-dimensional space.
  • the 3D display is a system that allows a user to feel a virtual three-dimensional feeling in a flat display hardware.
  • a two-dimensional display device simultaneously displays two left and right images and sends them to each eye to create a virtual three-dimensional effect.
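The binocular-disparity principle described above can be sketched numerically. The following is an illustrative example, not from the patent: the standard pinhole stereo relation depth = focal_length x baseline / disparity, with hypothetical example values (only the roughly 65 mm eye separation comes from the text).

```python
# Illustrative sketch of the stereoscopic principle described above, using
# the standard pinhole stereo relation:
#   depth = focal_length * baseline / disparity
# All numeric values below are hypothetical examples.

EYE_BASELINE_MM = 65.0  # approximate human interpupillary distance cited above

def depth_from_disparity(focal_length_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    """Distance (mm) to a point given its left/right image disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_mm / disparity_px

# A larger disparity means the point is perceived as closer.
near = depth_from_disparity(800.0, EYE_BASELINE_MM, 20.0)  # 2600.0 mm
far = depth_from_disparity(800.0, EYE_BASELINE_MM, 5.0)    # 10400.0 mm
assert near < far
```

This is why presenting two slightly offset images, one per eye, creates the virtual depth the passage describes.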
  • methods of delivering separate left and right images include: a polarization method, which separates the left-eye and right-eye images using the light-shielding effect of a combination of orthogonal polarizing elements; a time-division method, in which the left- and right-channel images are alternately displayed in rapid succession on one screen while shutter glasses or the like rapidly block and open each eye so that only the corresponding image reaches it; and an HMD method, which places a helmet-type display in front of the eyes and focuses the image through lenses.
  • the 3D display 10, which may present the left and right images by any of the methods described above, is made compact and lightweight as shown in FIG. 1, and is combined with the robot arm 12, which has a predetermined number of degrees of freedom, so that the user can move the 3D display 10 as desired.
  • accordingly, whether the doctor performing robotic surgery is sitting or standing, the 3D display 10 can be moved to the position the doctor desires by tracking the doctor's posture.
  • this movement of the 3D display 10 may be performed manually or automatically by driving the robot arm 12, so that the display is placed in the position where the user can most comfortably check the 3D image.
  • the 3D display system of the surgical robot according to the present embodiment has a basic structure in which the robot arm 12 is coupled to the 3D display 10, the robot arm 12 is connected to the surgical robot 1, and the movement of the 3D display 10 is controlled by the controller 20 included in the surgical robot 1.
  • the 3D display 10 is a device for displaying three-dimensional image information captured at the surgical site; its detailed configuration may be implemented using any of the various methods described above, such as the polarization method, time-division method, or color-difference method, so a detailed description of the display itself is omitted here.
  • the vision system serves to provide the image information captured during the surgical procedure; it includes devices such as a laparoscope, endoscope, microscope, magnifier, or reflector.
  • a laparoscope is used as an example of a vision system.
  • the image displayed on the 3D display 10 may be an image captured by the laparoscope inserted into the body of the surgical patient; if necessary, the image information captured by the laparoscope may be image-processed so that it can be displayed as a 3D image on the 3D display 10.
  • the robot arm 12 is coupled to the 3D display 10 according to the present exemplary embodiment so that the user can manually or automatically move the 3D display 10.
  • the robot arm 12 is driven by receiving an electrical signal from the controller 20, as described below, and the robot arm 12 according to the present embodiment may be formed as a multi-joint link.
  • for example, the robot arm 12 may be configured as a link mechanism connected by three joints.
  • however, the robot arm 12 need not be configured as a multi-joint link; it may be configured using any of various mechanisms capable of freely moving the 3D display 10 in response to signals from the controller 20.
  • the robot arm 12 is driven by the controller 20.
  • the controller 20 may be implemented in various forms, such as a microprocessor embedded in the surgical robot 1, a control box connected to the surgical robot 1, or a personal computer connected to the surgical robot 1 through wired or wireless communication.
  • in the manual mode the controller 20 receives the user's input, and in the automatic mode it receives pre-stored data or a signal sensed by a sensor; it then determines the final position to which the 3D display 10 should move, calculates how much the robot arm 12 must be driven to move the display from its current position to that final position, and generates and transmits a signal that drives the robot arm 12 by the calculated amount.
  • for example, when the robot arm 12 consists of a multi-joint link, the controller 20 calculates the required rotation of each joint and then transmits a drive signal to each joint so that it rotates by the calculated amount.
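The per-joint calculation just described can be sketched as follows. This is a hypothetical illustration, not the patent's mechanism: a two-link planar arm with assumed link lengths, solved with the standard analytic inverse-kinematics formula, followed by the rotation increment each joint must be driven.

```python
import math

# Hypothetical sketch of the controller step described above: given a target
# display position, compute how much each joint of a simple two-link planar
# arm must rotate. The link geometry and the analytic inverse-kinematics
# solution are illustrative assumptions, not the patent's actual mechanism.

def joint_angles(x: float, y: float, l1: float, l2: float):
    """Elbow-down inverse kinematics for a 2-link planar arm reaching (x, y)."""
    d2 = x * x + y * y
    cos_q2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_q2 <= 1.0:
        raise ValueError("target out of reach")
    q2 = math.acos(cos_q2)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

def drive_signals(current, target):
    """Per-joint rotation increments: how far each joint must turn."""
    return [t - c for c, t in zip(current, target)]

# Move the display from the current joint pose toward a new target point.
target = joint_angles(0.5, 0.5, l1=0.4, l2=0.4)
deltas = drive_signals([0.0, 0.0], list(target))
assert len(deltas) == 2
```

A real multi-joint arm would use more joints and a numerical solver, but the controller's role is the same: target position in, per-joint drive amounts out.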
  • the robot arm 12 is connected to the surgical robot 1. That is, as shown in FIG. 1, the other end of the robot arm 12 having the 3D display 10 coupled to one end thereof may be coupled to the surgical robot 1.
  • the robot arm 12 coupled to the surgical robot 1 may also be electrically connected to the surgical robot 1 to receive a driving signal generated from the controller 20.
  • the surgical robot 1 may be of the master/slave type, in which a master robot that generates and transmits signals according to the user's manipulation and a slave robot that receives those signals and directly applies the manipulations required for surgery to the patient are formed separately, or of the master-slave integrated type, in which the instruments performing the surgical manipulations are mounted on a single, integrated surgical robot 1 operated by the user.
  • the robot arm 12 is not necessarily coupled to the surgical robot 1.
  • for example, the robot arm 12 may be fixed to a structure of the operating room, such as the ceiling, and electrically connected to the surgical robot 1 by cable or by wired/wireless communication, so that the robot arm 12 can still be driven by the controller 20.
  • in this way, the 3D display 10 can be brought in front of the surgeon's eyes no matter what posture the doctor takes.
  • the 3D display 10 may also be freely positioned so that not only the doctor but also a nurse or an assistant can view the 3D image if necessary.
  • the 3D display 10 may be moved by manual operation by a user or automatic operation by preset data or sensors.
  • the controller 20 drives the robot arm 12 so that the 3D display 10 is located in front of the user's eyes.
  • that is, the controller 20 transmits a drive signal to the robot arm 12 so that the 3D display 10 moves to the point where the user's eyes are located; the position of the user's eyes can be determined in various ways.
  • for example, an operation handle 5 is installed on the console unit 3, and the user operates the handle 5 so that the 3D display 10 is positioned in front of the user's eyes.
  • the controller 20 drives the robot arm 12 according to the value input from the manipulation handle 5 to position the 3D display 10 in front of the user's eyes.
  • the operation handle 5 may be a dedicated handle for moving the 3D display 10, or the main handle used to manipulate the surgical instruments may double as the operation handle 5 for moving the 3D display 10.
  • for example, a selection button or the like may be added to the main handle; the main handle is normally used to operate the instruments, and when the 3D display 10 is to be moved, the user presses the selection button and then operates the main handle as the operation handle 5.
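The shared-handle scheme above amounts to a simple mode switch, which can be sketched as follows. The class and attribute names are illustrative stand-ins, not from the patent: handle motions are routed to the instrument normally, and to the display while the selection button is held.

```python
# Hypothetical sketch of the shared-handle scheme described above: the main
# handle normally drives the surgical instrument, and while a selection
# button is held, the same handle motions move the 3D display instead.

class MainHandle:
    def __init__(self):
        self.select_pressed = False
        self.instrument_pos = [0.0, 0.0]
        self.display_pos = [0.0, 0.0]

    def move(self, dx: float, dy: float):
        """Route a handle motion to the display or the instrument."""
        target = self.display_pos if self.select_pressed else self.instrument_pos
        target[0] += dx
        target[1] += dy

handle = MainHandle()
handle.move(1.0, 0.0)             # normal mode: the instrument moves
handle.select_pressed = True
handle.move(0.0, 2.0)             # button held: the 3D display moves instead
assert handle.instrument_pos == [1.0, 0.0]
assert handle.display_pos == [0.0, 2.0]
```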
  • the user may grasp the operation handle 5 and move the 3D display 10 directly.
  • the 3D display 10 may automatically move in front of the user by sensing a point where the user's eyes are located.
  • that is, the sensing unit 22, which detects the user's eyes, is connected to the controller 20; the controller 20 receives the signal transmitted by the sensing unit 22 sensing the position of the user's eyes and drives the robot arm 12 accordingly.
  • the sensing unit 22 may include a sensor for detecting the eye of the user, a cable for signal transmission, and in some cases a processor for processing a signal.
  • for example, when an image sensor is used, the processor analyzes the image obtained from the image sensor to calculate the coordinates of the point where the user's eyes are located; the calculated coordinate values are transmitted to the controller 20, which then moves the 3D display 10 to the position of the user's eyes.
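The pixel-to-coordinate step described above can be sketched with a pinhole camera model. This is an assumed illustration, not the patent's algorithm: the camera intrinsics and the working distance from the camera to the user are hypothetical example values.

```python
# Illustrative sketch (assumed, not the patent's algorithm): converting the
# pixel location of a detected eye into a 3D coordinate for the controller,
# using a pinhole camera model with hypothetical intrinsics and an assumed
# known distance from the camera to the user.

def eye_pixel_to_3d(u: float, v: float, depth_mm: float,
                    fx: float, fy: float, cx: float, cy: float):
    """Back-project a pixel (u, v) at a known depth into camera coordinates."""
    x = (u - cx) * depth_mm / fx
    y = (v - cy) * depth_mm / fy
    return (x, y, depth_mm)

# Example: eye detected 100 px right of the image center, user ~600 mm away.
coords = eye_pixel_to_3d(740.0, 360.0, 600.0, fx=800.0, fy=800.0,
                         cx=640.0, cy=360.0)
assert coords == (75.0, 0.0, 600.0)  # target point passed to the controller
```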
  • the sensing unit 22 need not sense the user's eyes themselves; it may detect the user's face or another part of the body, or, when the user wears a specific device, detect that device, determine the position of the user's eyes from it, and transmit a corresponding signal to the controller 20.
  • for example, when a mark is attached to a device worn by the user, the sensing unit 22 detects that mark, calculates the position of the user's eyes from the position of the mark, and generates and transmits a sensing signal accordingly.
  • in this way, the 3D display system may detect the position or movement of the user and allow the 3D display 10 to automatically track the position of the user's eyes within a predetermined range.
  • furthermore, the point where the 3D display 10 should be located may be set in advance for each user who operates the surgical robot 1, so that the 3D display 10 moves to the preset position when that user operates the surgical robot 1.
  • that is, the storage unit 24, which stores in advance, for each user, data about the position of the 3D display 10 corresponding to that user's eye position, is connected to the controller 20; when a user accesses the surgical robot 1, the controller 20 loads that user's data from the storage unit 24 and drives the robot arm 12 so that the 3D display 10 is positioned in front of the user's eyes.
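The per-user lookup just described can be sketched as a small keyed store. Names and coordinate values are illustrative assumptions, not the patent's data format; the `None` fallback models the case, described later, where no data exists yet and the manual or sensing method is used first.

```python
# Hypothetical sketch of the storage-unit lookup described above: preset
# display positions keyed by user ID (and optionally posture), with a
# fallback when no data has been stored for a user yet.

class DisplayPresetStore:
    def __init__(self):
        self._presets = {}  # (user_id, posture) -> (x, y, z) position in mm

    def save(self, user_id: str, posture: str, position):
        self._presets[(user_id, posture)] = position

    def load(self, user_id: str, posture: str = "sitting"):
        """Return the stored display position, or None so the system
        falls back to manual or sensed positioning on first use."""
        return self._presets.get((user_id, posture))

store = DisplayPresetStore()
store.save("dr_kim", "sitting", (120.0, -40.0, 550.0))
store.save("dr_kim", "standing", (120.0, 210.0, 550.0))

assert store.load("dr_kim") == (120.0, -40.0, 550.0)
assert store.load("dr_lee") is None  # no data yet: use manual/sensing mode
```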
  • in this case, the surgical robot 1 may further include an ID identification unit for identifying the user accessing the surgical robot 1.
  • the ID identification unit may be configured so that an ID card carried by the user is recognized manually, or an ID chip worn on the user's body is recognized automatically, or a fingerprint reader or other known identification device is installed to recognize the user.
  • when a user is identified, the controller 20 retrieves that user's data from the storage unit 24 and positions the 3D display 10 to suit the user.
  • when no data about a user has been stored, the 3D display 10 may first be moved by the manual operation or sensing method described above, and the data used at that time may be added to the storage unit 24 as data about that user; the data related to the position of the 3D display 10 may also be updated in the storage unit as needed, and data may be stored for each posture of each user.
  • accordingly, the controller 20 can load the pre-stored data corresponding to the user's posture and move the 3D display 10 to the desired position.
  • in this way, the 3D display system individually stores, for each user with different physical conditions, the position where the 3D display 10 should be placed, so that the driving of the 3D display 10 can be managed optimally for each user of the surgical robot 1.
  • the 3D display system according to the present embodiment can be used regardless of whether a slave robot operates the surgical instruments; as described above, it can also be used with the instruments of the master-slave integrated surgical robot 1, that is, with so-called 'intelligent instruments'.
  • FIG. 2 is a flowchart illustrating a method of controlling a 3D display system of a surgical robot according to an exemplary embodiment of the present invention.
  • a method of controlling the above-described 3D display system will be described.
  • as a precondition, the 3D display 10 is coupled to a robot arm electrically connected to the surgical robot 1, and the surgical robot 1 includes the controller 20, which controls the movement of the 3D display 10 connected to the surgical robot 1.
  • the present embodiment is characterized by a control method that moves the 3D display 10 to suit the user of the surgical robot 1, so that the display is positioned most appropriately for each user and each user's posture. To this end, information about the user accessing the surgical robot 1 is first obtained (S10).
  • the information about the user corresponds to input information for moving the 3D display 10.
  • for example, the information about the user may be coordinates relating to the position of the user's eyes.
  • as described above, the image displayed on the 3D display 10 may be an image captured by the laparoscope inserted into the body of the surgical patient, and if necessary, image-processed data may be used so that the image information captured by the laparoscope or the like is displayed as a 3D image on the 3D display 10.
  • the information about the user may be obtained either by sensing the position of the user's eyes with a sensor (S12) or by loading pre-stored data about the position of the user's eyes (S16).
  • when the position of the user's eyes is sensed by the sensor (S12), the sensor generates and transmits a predetermined signal, and the controller 20 according to the present embodiment receives this signal and derives the information about the user, such as the coordinates of the eye position (S14).
  • when pre-stored data is loaded (S16), the data may take the form of coordinates of the position to which the 3D display 10 should be moved according to the eye position of each user and each user's posture, as described above. Since the types of data stored for each user and posture, and the specific means for storing and loading them, have already been described, further details are omitted.
  • after sensing the position of the user's eyes (S12) or loading the position to which the 3D display 10 should be moved from the stored data (S16), the robot arm 12 is driven according to the obtained information to move the 3D display 10 (S20).
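The S10 to S20 flow above can be sketched end to end. The function names are illustrative stand-ins for the patent's steps, and the priority given to the sensing path over stored data is an assumption for the example.

```python
# Hedged sketch of the control flow in FIG. 2 (S10 -> S20): obtain user
# information either from a sensor (S12/S14) or from stored data (S16),
# then drive the robot arm to move the display (S20).

def acquire_user_info(sensor_reading=None, stored_position=None):
    """S10: the sensing path (S12/S14) is tried first here; otherwise
    pre-stored data is loaded (S16). The priority is an assumption."""
    if sensor_reading is not None:
        return sensor_reading          # eye coordinates derived in S14
    if stored_position is not None:
        return stored_position         # pre-stored display position, S16
    raise RuntimeError("no user information available")

def move_display(arm_state, target):
    """S20: drive the robot arm so the display reaches the target."""
    arm_state["display_position"] = target
    return arm_state

arm = {"display_position": (0.0, 0.0, 0.0)}
target = acquire_user_info(stored_position=(100.0, 0.0, 500.0))
arm = move_display(arm, target)
assert arm["display_position"] == (100.0, 0.0, 500.0)
```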
  • the 3D display 10 is automatically moved to the optimal position to fit the user.
  • although the 3D display system of the surgical robot and its control method have been described with reference to an embodiment, the invention is not necessarily limited thereto; even if the detailed configurations of the 3D display 10 and the surgical robot 1, the connection between the robot arm 12 and the surgical robot 1, or the specific operating mode of the controller 20 differ, implementations whose overall operation and effects do not differ fall within the scope of the claims, and those skilled in the art will understand that various modifications and changes can be made without departing from the spirit and scope of the present invention described in the claims below.

Abstract

Disclosed are a three-dimensional display system for a surgical robot and a method for controlling same. The three-dimensional display system includes a 3D display for displaying three-dimensional video information acquired from a surgical site, a robotic arm coupled to the three-dimensional display, a surgical robot to which the robotic arm is connected, and a controller contained within the surgical robot. The controller drives the robotic arm to move the three-dimensional display to a predetermined position. In the display system for a surgical robot of the present invention, the three-dimensional display is mounted on an articulated robotic arm, and the robotic arm is driven manually or automatically so that the three-dimensional display tracks and moves along with the position of the operator's eyes, eliminating the need for the operator to change posture to check three-dimensional video information and the need to wear a separate display on the operator's body. Further, the present invention presets the position of the three-dimensional display for each operator, thereby constantly providing accurate three-dimensional video information irrespective of the operator and the operator's posture.

Description

수술용 로봇의 3차원 디스플레이 시스템 및 그 제어방법3D display system of surgical robot and its control method
본 발명은 수술용 로봇의 3D 디스플레이 시스템 및 그 제어방법에 관한 것이다.The present invention relates to a 3D display system of a surgical robot and a control method thereof.
의학적으로 수술이란 피부나 점막, 기타 조직을 의료 기계를 사용하여 자르거나 째거나 조작을 가하여 병을 고치는 말한다. 특히, 수술부위의 피부를 절개하여 열고 그 내부에 있는 기관 등을 치료, 성형하거나 제거하는 개복 수술 등은 출혈, 부작용, 환자의 고통, 흉터 등의 문제로 인하여 최근에는 로봇(robot)을 사용한 수술이 대안으로서 각광받고 있다.Medically, surgery refers to repairing a disease by cutting, slitting, or manipulating skin, mucous membranes, or other tissues with a medical device. In particular, open surgery, which incise the skin of the surgical site and open, treat, shape, or remove the organs inside of the surgical site, has recently been performed using robots due to problems such as bleeding, side effects, patient pain, and scars. This alternative is in the spotlight.
이와 같이 수술용 로봇을 사용하는 로봇 수술의 경우, 의사는 환자의 체내를 3D(3-dimension) 디스플레이를 통해 3차원 영상으로 보면서 수술을 진행하게 된다. 수술용 로봇 시스템에서 3D 디스플레이를 구현하는 방식으로는, 첫째 현미경(binocular display) 방식을 들 수 있다.As described above, in the case of robotic surgery using a surgical robot, the doctor proceeds to surgery while looking at the patient's body as a 3D image through a 3D (3-dimension) display. As a method of implementing a 3D display in a surgical robot system, a first microscope display method may be used.
현미경 방식은 수술용 로봇에 설치된 현미경을 통해 수술 부위를 3차원으로 보면서 로봇 수술을 수행하는 방식으로서, 장시간 수술시 의사의 어깨나 목이 아파 수술에 집중하지 못하거나 로봇의 오동작 등을 야기할 우려가 있다.The microscope method is a method of performing robot surgery while looking at the surgical site in three dimensions through a microscope installed in the surgical robot, and there is a fear that the shoulder or neck of the doctor may not be able to concentrate on the operation or cause malfunction of the robot during a long operation. have.
둘째, HMD(head mount display) 방식을 들 수 있는데, 이는 머리에 헬멧 형태의 디스플레이를 쓰고 이를 통해 3차원 영상을 확인하면서 수술을 수행하는 방식이나, HMD가 대체로 무겁고 불편하며, 수술 중 다른 것을 보기 위해서는 헬멧을 벗어야 하는 번거로움이 있다.Second, HMD (head mount display) is a method that uses a helmet-shaped display on the head and checks three-dimensional images through this operation, but HMD is generally heavy and inconvenient, and sees other things during surgery. In order to remove the helmet is a hassle.
셋째, 안경 형태의 편광 디스플레이(polarizing display) 방식을 들 수 있는데, 이는 편광 방식으로 3차원 영상을 구현하기 때문에 영상이 전반적으로 어둡고, 수술 중 다른 것을 보기 위해서는 편광 안경을 벗어야 한다는 한계가 있다.Third, there is a polarizing display in the form of glasses, which realizes a three-dimensional image in a polarized manner, and thus the image is dark in general, and there is a limitation in that the polarized glasses must be removed to see another one during surgery.
전술한 배경기술은 발명자가 본 발명의 도출을 위해 보유하고 있었거나, 본 발명의 도출 과정에서 습득한 기술 정보로서, 반드시 본 발명의 출원 전에 일반 공중에게 공개된 공지기술이라 할 수는 없다.The background art described above is technical information possessed by the inventors for the derivation of the present invention or acquired during the derivation process of the present invention, and is not necessarily known technology disclosed to the general public before the present application.
본 발명은, 로봇 수술 과정에서 수술자의 위치에 따라 3D 디스플레이를 이동시킴으로써, 수술자가 자세를 바꿔 영상을 보거나 디스플레이 장치를 수술자의 신체에 착용하지 않아도, 필요에 따라 간편하게 3차원 영상을 확인하면서 로봇 수술을 진행할 수 있는 수술용 로봇의 3D 디스플레이 시스템 및 그 제어방법을 제공하는 것이다.The present invention, by moving the 3D display according to the operator's position in the robot surgery process, even if the operator does not change the posture to see the image or wear the display device on the operator's body, robot surgery while simply checking the three-dimensional image as needed It is to provide a 3D display system and a control method of the surgical robot that can proceed.
본 발명이 제시하는 이외의 기술적 과제들은 하기의 설명을 통해 쉽게 이해될 수 있을 것이다.Technical problems other than the present invention will be easily understood through the following description.
본 발명의 일 측면에 따르면, 수술 현장으로부터 획득된 3차원 영상 정보를 표시하는 3D 디스플레이와, 3D 디스플레이에 결합된 로봇 암과, 로봇 암이 연결되는 수술용 로봇과, 수술용 로봇에 포함된 컨트롤러를 포함하되, 컨트롤러는 3D 디스플레이가 소정의 위치로 이동하도록 로봇 암을 구동시키는 것을 특징으로 하는 수술용 로봇의 3D 디스플레이 시스템이 제공된다.According to an aspect of the present invention, a 3D display for displaying three-dimensional image information obtained from the surgical site, a robot arm coupled to the 3D display, a surgical robot to which the robot arm is connected, a controller included in the surgical robot Including, but the controller is provided with a 3D display system of the surgical robot, characterized in that for driving the robot arm to move the 3D display to a predetermined position.
영상 정보는 수술 환자 내에 삽입되는 비전 시스템에 의해 촬영되어 3D 디스플레이에 표시되도록 처리된 영상 정보일 수 있다. 로봇 암은 3D 디스플레이가 소정의 위치로 이동하도록 컨트롤러에 의해 구동되는 다관절 링크를 포함할 수 있다. 로봇 암의 일단부는 수술실의 구조체에 고정되며, 로봇 암은 수술용 로봇과 전기적으로 연결될 수 있다.The image information may be image information taken by a vision system inserted into a surgical patient and processed to be displayed on a 3D display. The robotic arm may include an articulated link driven by a controller to move the 3D display to a predetermined position. One end of the robot arm is fixed to the structure of the operating room, the robot arm may be electrically connected to the surgical robot.
수술용 로봇에는 사용자의 조작을 위한 콘솔부가 설치되며, 컨트롤러는 사용자의 눈의 위치에 상응하여 3D 디스플레이를 이동시킬 수 있다. 이 경우, 콘솔부에는 조작 핸들이 설치되며, 조작 핸들에 대한 사용자 조작에 따라 컨트롤러는 3D 디스플레이를 이동시킬 수 있다.The surgical robot is provided with a console unit for user manipulation, and the controller can move the 3D display according to the position of the user's eyes. In this case, an operation handle is installed in the console, and the controller may move the 3D display according to a user's operation on the operation handle.
컨트롤러에는 사용자의 눈의 위치를 감지하는 센싱부가 연결되며, 컨트롤러는 센싱부로부터 신호를 수신하여, 3D 디스플레이를 사용자의 눈의 위치에 따라 이동시킬 수 있다. 컨트롤러에는 각 사용자별 눈의 위치에 따라 3D 디스플레이가 이동될 위치에 관한 데이터를 저장하는 저장부가 연결되며, 컨트롤러는 콘솔부를 조작하는 사용자에 상응하여 저장부에 저장된 데이터를 로딩하여, 3D 디스플레이를 미리 저장된 위치로 이동시킬 수 있다.The controller is connected to a sensing unit for sensing the position of the user's eyes, and the controller may receive a signal from the sensing unit and move the 3D display according to the position of the user's eyes. The controller is connected to a storage unit for storing data regarding the position of the 3D display to be moved according to the eye position of each user, and the controller loads the data stored in the storage unit in accordance with the user who operates the console unit, and previews the 3D display in advance. You can move it to a stored location.
Meanwhile, according to another aspect of the present invention, there is provided a method of controlling a 3D display coupled to a robot arm electrically connected to a surgical robot, the method comprising acquiring information about a user who has accessed the surgical robot, and driving the robot arm according to the acquired information to move the 3D display.
The 3D display may display image information captured by a vision system inserted into the body of the surgical patient, and the information about the user may include information about the position of the user's eyes.
The acquiring step may include sensing the position of the user's eyes with a sensor, and receiving a signal from the sensor to derive the information about the user. Alternatively, the information about the user may be stored in advance as data on the position to which the 3D display should be moved according to the position of the user's eyes; in this case, the acquiring step may include loading that data.
Aspects, features, and advantages other than those described above will become apparent from the following drawings, claims, and detailed description of the invention.
According to a preferred embodiment of the present invention, the 3D display is mounted on an articulated robot arm that is driven manually or automatically so that the display tracks the position of the operator's eyes. The operator therefore does not need to change posture to view the three-dimensional image, and no separate display device needs to be worn on the operator's body; moreover, because the position of the 3D display can be preset for each operator, accurate three-dimensional image information can always be provided regardless of who the operator is and what posture the operator assumes.
FIG. 1 is a perspective view of a 3D display system for a surgical robot according to an embodiment of the present invention.
FIG. 2 is a flowchart of a method of controlling the 3D display system of a surgical robot according to an embodiment of the present invention.
<Description of reference numerals for major parts of the drawings>
1: surgical robot    3: console unit
5: operation handle    10: 3D display
12: robot arm    20: controller
22: sensing unit    24: storage unit
While the present invention is susceptible to various modifications and may take many forms, specific embodiments are illustrated in the drawings and described in detail below. This is not intended to limit the invention to the particular embodiments; the invention should be understood to cover all modifications, equivalents, and substitutes falling within its spirit and technical scope.
When a component is said to be "connected" or "coupled" to another component, it may be directly connected or coupled to that other component, but intervening components may also be present. In contrast, when a component is said to be "directly connected" or "directly coupled" to another component, it should be understood that no intervening components are present.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this specification, terms such as "comprise" or "have" are intended to indicate the presence of the features, numbers, steps, operations, components, parts, or combinations thereof described in the specification, and do not exclude in advance the presence or possible addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.
In the description with reference to the accompanying drawings, identical components are given the same reference numerals regardless of figure number, and duplicate descriptions of them are omitted. Where a detailed description of related known technology could unnecessarily obscure the subject matter of the present invention, that description is omitted.
FIG. 1 is a perspective view of a 3D display system for a surgical robot according to an embodiment of the present invention. Referring to FIG. 1, a surgical robot 1, a console unit 3, an operation handle 5, a 3D display 10, a robot arm 12, a controller 20, a sensing unit 22, and a storage unit 24 are shown.
This embodiment uses a so-called "articulated 3D display" approach: rather than fixing the part that shows the three-dimensional image to the console, as in a microscope-type viewer, or wearing it on the user's head, as with an HMD, the robot arm 12 is connected to the 3D display 10 so that the 3D display 10 can move freely.
A 3D display is a system that reproduces three-dimensional images; the concept encompasses both the software techniques for producing content that can be viewed in 3D and the hardware that renders such content in 3D. Because the human eyes are roughly 65 mm apart, when images carrying a slight left-right difference (binocular disparity) enter the two eyes, they are transmitted through the optic nerves to the brain, where the two images are fused into one, producing a sense of three-dimensional space.
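The binocular-disparity principle described above can be illustrated numerically. Under a simple pinhole stereo model (the focal length and pixel values below are hypothetical illustrations, not parameters from this document), perceived depth follows depth = focal length x baseline / disparity, with the baseline playing the role of the roughly 65 mm interocular distance:

```python
def depth_from_disparity(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Pinhole stereo model: depth = focal length * baseline / disparity.
    A larger disparity between the left and right images means a nearer point."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_mm / disparity_px

# Eye-like baseline of 65 mm with a nominal 800 px focal length (both assumed values).
near_mm = depth_from_disparity(800.0, 65.0, 100.0)
far_mm = depth_from_disparity(800.0, 65.0, 10.0)
```

The same relationship is why a point with large disparity is perceived as close and a point with near-zero disparity as distant.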
A 3D display is thus a system that creates a virtual sense of depth on flat display hardware: a 2D display device shows two images, left and right, simultaneously and delivers each to the corresponding eye, producing a virtual stereoscopic effect.
3D display techniques include a polarization method, which separates the left-eye and right-eye images using the light-blocking effect of a pair of orthogonal polarizers; a time-division method, in which the left- and right-channel images alternate rapidly on a single screen while shutter glasses or the like repeatedly block and open each eye so that only the image for one direction reaches that eye; an HMD method, in which a helmet-type display device places a screen directly in front of the eyes and lenses bring it into sharp focus; and an anaglyph (color-difference) method, in which glasses with a red filter on one side and a blue filter on the other separate the images of the two channels by color to produce the stereoscopic effect.
In this embodiment, a 3D display 10 capable of presenting separate left and right images by any of the various methods described above is packaged as a small, lightweight module, as shown in FIG. 1, and a robot arm 12 with a certain number of degrees of freedom is coupled to it so that the user can move the 3D display 10 as desired.
For example, the 3D display 10 can track the surgeon's posture and move to the position the surgeon wants, not only when the surgeon operates sitting or standing but even when operating while lying down. This movement of the 3D display 10 is achieved by driving the robot arm 12 manually or automatically so that the display reaches the position most comfortable for the user to view the 3D image.
To this end, the basic structure of the 3D display system for a surgical robot according to this embodiment couples the robot arm 12 to the 3D display 10, connects the robot arm 12 to the surgical robot 1, and controls the movement of the 3D display 10 with the controller 20 included in the surgical robot 1.
The 3D display 10 is a device that displays three-dimensional image information captured at the surgical site, such as an operating room. Its detailed construction may follow any of the various methods described above, such as the polarization, time-division, or anaglyph method, so a detailed description of the construction of the 3D display 10 itself is omitted here.
Meanwhile, a vision system provides the image information captured during surgery and includes devices such as a laparoscope, endoscope, microscope, magnifier, or reflector. In the following, the case where a laparoscope is used as the vision system is described as an example.
The image shown on the 3D display 10 may be an image captured by a laparoscope inserted into the body of the surgical patient; where necessary, the image information captured by the laparoscope or similar device may be image-processed into data suitable for three-dimensional display on the 3D display 10.
As described above, the robot arm 12 is coupled to the 3D display 10 of this embodiment so that the user can move the 3D display 10 manually or automatically. Given that the robot arm 12 is driven by electrical signals from the controller 20, as described below, the robot arm 12 of this embodiment may consist of an articulated link. For example, if a robot arm 12 with three degrees of freedom is used to move the 3D display 10 to a specific point in space, the robot arm 12 may be configured as a link connected by three joints.
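As a rough illustration of how the pose of such an articulated link determines where the display ends up, the sketch below computes the end position of a planar three-joint link from its joint angles. The link lengths, the planar simplification, and the angle values are hypothetical; the document does not specify the actual arm geometry:

```python
import math

def display_position(joint_angles, link_lengths):
    """Planar forward kinematics for an articulated link: each joint angle is
    measured relative to the previous link; returns the (x, y) position of the
    display-carrying end of the chain."""
    x = y = heading = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        heading += angle
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    return x, y

# Three 0.3 m links with all joints straight: the display sits 0.9 m out along x.
straight = display_position([0.0, 0.0, 0.0], [0.3, 0.3, 0.3])
# First joint rotated 90 degrees: the same reach, now along y.
raised = display_position([math.pi / 2, 0.0, 0.0], [0.3, 0.3, 0.3])
```

Driving the joints therefore amounts to choosing angles whose forward kinematics place the display at the desired point.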
However, the robot arm 12 of this embodiment is not necessarily configured as an articulated link; the robot arm 12 may of course be built using any of various mechanisms capable of freely moving the 3D display 10 in response to signals from the controller 20.
The robot arm 12 of this embodiment is driven by the controller 20. The controller 20 may be implemented in various forms, such as a microprocessor embedded in the surgical robot 1, a control box connected to the surgical robot 1, or a personal computer connected to the surgical robot 1 by wired or wireless communication.
The controller 20 receives the user's input in the manual mode, or pre-stored data or a signal sensed by a sensor in the automatic mode, determines the final position to which the 3D display 10 is to move, computes how far the robot arm 12 must be driven to move the 3D display 10 from its current position to that final position, and then generates and transmits signals so that the robot arm 12 is driven by the computed amount.
As described above, when the robot arm 12 is configured as an articulated link, the controller 20 computes the required rotation of each joint and then transmits a drive signal to each joint so that it rotates by the computed amount.
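One conventional way a controller could compute those per-joint rotations for a target point is closed-form inverse kinematics; the two-link planar version below is a minimal sketch under assumed link lengths, not the method actually specified in this document:

```python
import math

def two_link_joint_angles(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm: returns the
    (shoulder, elbow) rotations that place the end of the arm at (x, y)."""
    cos_elbow = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target position out of reach")
    elbow = math.acos(cos_elbow)  # elbow-down branch of the two solutions
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Joint drive signals for a target 0.5 m out and 0.5 m up (lengths in metres, assumed).
shoulder, elbow = two_link_joint_angles(0.5, 0.5, 0.4, 0.4)
# Round-trip check via forward kinematics: the angles reproduce the target.
end_x = 0.4 * math.cos(shoulder) + 0.4 * math.cos(shoulder + elbow)
end_y = 0.4 * math.sin(shoulder) + 0.4 * math.sin(shoulder + elbow)
```

A real arm would add a third joint for display orientation and joint-limit checks, but the compute-then-drive pattern is the same.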
The robot arm 12 of this embodiment is connected to the surgical robot 1. That is, as shown in FIG. 1, the end of the robot arm 12 opposite the end to which the 3D display 10 is coupled may be coupled to the surgical robot 1. The robot arm 12 coupled to the surgical robot 1 is also electrically connected to the surgical robot 1, so that it can receive the drive signals generated by the controller 20.
The surgical robot 1 may be a master/slave system, in which a master robot that generates and transmits the signals required by the user's manipulation and a slave robot that receives signals from the master robot and directly applies the manipulations required for surgery to the patient are provided as separate units, or a master-slave integrated type, in which an instrument mounted on a single integrated surgical robot 1 applies the manipulations required for surgery in response to the user's operation of that surgical robot 1.
However, the robot arm 12 of this embodiment need not be coupled to the surgical robot 1. For example, the robot arm 12 may be fixed to the ceiling of the operating room or the like and electrically connected to the surgical robot 1 by a cable or by wired/wireless communication, so that the robot arm 12 is still driven by the controller 20.
By coupling the 3D display 10 to the robot arm 12 and electrically connecting the robot arm 12 to the surgical robot 1 in this way, the 3D display 10 can be moved to the position the surgeon wants regardless of the posture the surgeon takes, and the 3D display 10 can be positioned freely so that not only the surgeon but also, where necessary, assistants and nurses can view the three-dimensional image.
When the surgical robot 1 is provided with a console unit 3 for user manipulation, the 3D display 10 can be moved by the user's manual operation, or automatically based on preset data or a sensor; in either case, the controller 20 drives the robot arm 12 so that the 3D display 10 is positioned in front of the user's eyes.
That is, the controller 20 of this embodiment transmits drive signals to the robot arm 12 so that the 3D display 10 moves to the point where the user's eyes are located; that point can be determined from the user's manipulation, from preset data, or from values detected by a sensor.
When the 3D display 10 is moved by the user's manual operation, an operation handle 5 is installed on the console unit 3; when the user operates the operation handle 5 to bring the 3D display 10 to eye level, the controller 20 drives the robot arm 12 according to the values input from the operation handle 5 and positions the 3D display 10 in front of the user's eyes.
The operation handle 5 of this embodiment may be a dedicated handle for moving the 3D display 10, or the main handle used to operate the surgical instruments may serve as the operation handle 5 for moving the 3D display 10. In the latter case, a selection button or the like may be added to the main handle: the main handle is normally used to operate the instruments, but when the 3D display 10 is to be moved, the user presses the selection button and then operates the handle, so that the main handle also serves as the operation handle 5 of this embodiment.
Alternatively, the operation handle 5 of this embodiment may be installed on the 3D display 10 itself, so that the user can grasp the operation handle 5 and move the 3D display 10 directly.
Meanwhile, the point where the user's eyes are located may be sensed so that the 3D display 10 moves automatically in front of the user's eyes. In this case, a sensing unit 22 that detects the user's eyes is connected to the controller 20; the sensing unit 22 senses the position of the user's eyes and transmits a signal, which the controller 20 receives in order to drive the robot arm 12.
The sensing unit 22 may include a sensor that detects the user's eyes, a cable for signal transmission, and, where appropriate, a processor that processes the signal. For example, when an image sensor is used to determine the position of the user's eyes, the processor analyzes the image obtained from the image sensor to compute the coordinates of the point where the user's eyes are located; the computed coordinates are transmitted to the controller 20, so that the controller 20 can move the 3D display 10 to the position of the user's eyes.
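The processor step just described, turning an image-sensor detection into eye coordinates for the controller, might look like the following sketch. It back-projects the midpoint between two detected eye pixels through a pinhole camera model at an assumed viewing depth; all camera intrinsics and pixel values are hypothetical, since the document does not specify the sensor:

```python
def eye_target_from_pixels(left_eye_px, right_eye_px, depth_mm, fx, fy, cx, cy):
    """Back-project the midpoint between two detected eye pixels into
    camera-frame coordinates (mm), assuming a pinhole model and a known depth."""
    u = (left_eye_px[0] + right_eye_px[0]) / 2
    v = (left_eye_px[1] + right_eye_px[1]) / 2
    x = (u - cx) * depth_mm / fx
    y = (v - cy) * depth_mm / fy
    return (x, y, depth_mm)

# Assumed intrinsics for a 1280x720 sensor; the user sits about 600 mm away.
target = eye_target_from_pixels((600, 350), (680, 352), 600.0,
                                fx=800.0, fy=800.0, cx=640.0, cy=360.0)
```

The resulting (x, y, z) target, expressed in the camera frame, would then be transformed into the arm's base frame before the controller computes joint motions.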
However, the sensing unit 22 of this embodiment does not necessarily have to sense the user's eyes themselves; it may instead detect the user's face, another part of the body, or, when the user wears a particular fixture, that fixture, determine the position of the user's eyes from it, and transmit a corresponding signal to the controller 20.
For example, if the user wears a dedicated marker that the sensing unit 22 of this embodiment can detect at a specific position on the body and performs robotic surgery, the sensing unit 22 detects the marker, computes the position of the user's eyes from the position of the marker, and generates and transmits a sensing signal accordingly.
In this way, the 3D display system of this embodiment can detect the user's position or movement so that, within a certain range, the 3D display 10 automatically tracks and follows the position of the user's eyes.
Meanwhile, the point where the 3D display 10 should be positioned may be preset for each user who operates the surgical robot 1 of this embodiment, so that when a given user operates the surgical robot 1, the 3D display 10 is moved accordingly.
In this case, a storage unit 24 that stores in advance, for each user, data on the point where the 3D display 10 should be positioned, for example the position of that user's eyes, is connected to the controller 20; when a user accesses the surgical robot 1, the controller 20 loads that user's data from the storage unit 24 and drives the robot arm 12 so that the 3D display 10 is positioned in front of the user's eyes.
To this end, the surgical robot 1 of this embodiment may further include an ID identification unit capable of determining who the user is, so that the user accessing the surgical robot 1 can be identified. The ID identification unit may use any of various identification devices, such as one that manually reads an ID card carried by the user, one that automatically recognizes an ID chip worn on the user's body, or a fingerprint reader installed to recognize the user.
Accordingly, when it is determined that a specific user has accessed the surgical robot 1, the controller 20 retrieves that user's data from the storage unit 24 and positions the 3D display 10 to suit that user.
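The lookup the controller performs against the storage unit can be pictured as a simple keyed store. In the sketch below, the dictionary backing, the user IDs, and the coordinate values are all illustrative assumptions; a missing entry signals that the manual or sensing path described above must be used instead:

```python
# Hypothetical per-user presets: user ID -> (x, y, z) display position in mm.
presets = {
    "surgeon_A": (120.0, -40.0, 1550.0),
    "surgeon_B": (95.0, 10.0, 1480.0),
}

def preset_position(user_id, store=presets):
    """Return the stored display position for the identified user, or None if
    the user has no preset yet and must be positioned manually or by sensing."""
    return store.get(user_id)

def save_preset(user_id, position, store=presets):
    """Store (or update) the position used this session as the user's preset."""
    store[user_id] = position
```

Saving the position after a manual or sensed adjustment is what allows a first-time user's data to be added to the store, mirroring the behavior described in the following paragraphs.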
Meanwhile, for a user whose data is not yet stored in the storage unit 24, the 3D display 10 is moved by the manual operation or sensing method described above, and the data used at that time can then be added to the storage unit 24 as that user's data.
Even for a user whose data is already stored in the storage unit 24, the data on the position of the 3D display 10 may be updated and re-stored as needed, and multiple data entries may be stored for a single user according to posture, movement, and the like. In that case, not only the user's ID but also information about the user's posture or movement is determined, and the controller 20 loads the data stored in advance for that user and that posture to move the 3D display 10 to exactly the desired position.
In this way, the 3D display system of this embodiment individually stores the point where the 3D display 10 should be positioned for each user, whose physical builds and other conditions differ, and manages the driving of the 3D display 10 so that it is optimized for each user of the surgical robot 1.
Meanwhile, because the 3D display system of this embodiment can be used regardless of whether a slave robot for operating surgical instruments is present, it can also be used with the instruments of the master-slave integrated surgical robot 1 described above, the so-called "intelligent instruments."
FIG. 2 is a flowchart of a method of controlling the 3D display system of a surgical robot according to an embodiment of the present invention. A method of controlling the 3D display system described above is explained below.
The 3D display 10 of this embodiment is electrically connected to the surgical robot 1, and the surgical robot 1 is equipped with a controller 20 that controls the movement of the 3D display 10 connected to the surgical robot 1.
This embodiment is characterized by a control method that moves the 3D display 10 to suit the user of the surgical robot 1, positioning the 3D display 10 most appropriately for each user and for each user's posture. To this end, information about the user who has accessed the surgical robot 1 is first acquired (S10).
The information about the user corresponds to the input information for moving the 3D display 10; for example, when the 3D display 10 is to be positioned in front of the user's eyes, the information about the user may be the coordinates of the position of the user's eyes.
As described above, the image shown on the 3D display 10 may be an image captured by a laparoscope inserted into the body of the surgical patient, and where necessary the image information captured by the laparoscope or similar device may be image-processed into data suitable for three-dimensional display on the 3D display 10.
Ways of acquiring the information about the user include sensing the position of the user's eyes with a sensor (S12) and loading pre-stored data on the position of the user's eyes (S16).
Specific means for sensing the position of the user's eyes have been described above, so a detailed description of them is omitted. When the sensor senses the position of the user's eyes, it generates and transmits a corresponding signal, and the controller 20 of this embodiment receives the signal and derives the information about the user, that is, the coordinates of the position of the user's eyes (S14).
When the information about the user is stored in advance as data and loaded as needed, the data may take the form of coordinates of the position to which the 3D display 10 should be moved according to the position of the user's eyes, stored per user and per posture of each user as described above. The form of the data stored per user and per posture, and the specific means for storing and loading it, have been described above, so a detailed description of them is omitted.
As described above, after the position of the user's eyes has been sensed (S12) or the position to which the 3D display 10 should be moved has been loaded from the pre-stored data (S16), the robot arm 12 is driven according to the acquired information to move the 3D display 10 (S20).
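The flow of FIG. 2, acquiring the user information and then driving the arm, can be sketched as a single dispatch function. Every callable and value here is a hypothetical stand-in for the storage unit, sensing unit, and arm driver described above:

```python
def position_display(user_id, presets, sense_eyes, drive_arm):
    """S10: acquire user information, preferring a stored preset (S16) and
    falling back to live eye sensing (S12/S14); S20: drive the robot arm."""
    target = presets.get(user_id)   # S16: load the pre-stored position, if any
    if target is None:
        target = sense_eyes()       # S12/S14: sense the eyes, derive coordinates
    drive_arm(target)               # S20: move the 3D display to the target
    return target

moves = []
stored = position_display("dr_kim", {"dr_kim": (100.0, 0.0, 1500.0)},
                          sense_eyes=lambda: (0.0, 0.0, 1400.0),
                          drive_arm=moves.append)
sensed = position_display("unknown", {},
                          sense_eyes=lambda: (50.0, 20.0, 1450.0),
                          drive_arm=moves.append)
```

The known user is served from storage without invoking the sensor, while the unknown user falls through to the sensing branch; either way the arm receives exactly one target.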
이로써, 본 실시예에 따른 수술용 로봇(1)에 특정 사용자가 억세스할 경우, 해당 사용자에 맞도록 3D 디스플레이(10)가 자동으로 움직여 최적의 위치로 이동하게 된다.Thus, when a specific user accesses the surgical robot 1 according to the present embodiment, the 3D display 10 is automatically moved to the optimal position to fit the user.
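The control sequence above (S16 or S12, then S14, then S20) can be sketched as a single controller routine. The patent describes these steps only at the level of the flowchart, so every interface name below — the controller object, its sensor, and the robot-arm driver — is a hypothetical assumption, not the actual implementation.

```python
def position_display(controller, user_id=None, posture=None):
    """Move the 3D display to suit the current user.

    First tries to load a pre-stored target position for the user (S16);
    if none exists, senses the user's eye position (S12) and derives the
    target coordinates from the sensor signal (S14). Finally drives the
    robot arm to move the display to the target position (S20).
    All attribute and method names here are illustrative.
    """
    target = None
    if user_id is not None:
        # S16: load the pre-stored per-user, per-posture position, if any.
        target = controller.load_stored_position(user_id, posture)
    if target is None:
        # S12: read the eye-position sensor.
        signal = controller.sensor.read()
        # S14: derive the display's target coordinates from the signal.
        target = controller.derive_coordinates(signal)
    # S20: drive the articulated robot arm to move the 3D display.
    controller.robot_arm.move_to(target)
    return target
```

The routine prefers stored data when available and falls back to live sensing, matching the two acquisition paths the description sets out.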
While the 3D display system for a surgical robot and its control method according to an embodiment of the present invention have been described above, the invention is not necessarily limited thereto. Even if the detailed configuration of the 3D display 10 and the surgical robot 1, the manner of connection between the robot arm 12 and the surgical robot 1, or the specific operation of the controller 20 is varied, such alternative configurations fall within the scope of the present invention as long as the overall operation and effect remain the same, and those of ordinary skill in the art will understand that the present invention may be variously modified and changed without departing from the spirit and scope of the invention set forth in the following claims.

Claims (13)

  1. A 3D display system for a surgical robot, comprising:
    a 3D display for displaying three-dimensional image information acquired from a surgical site;
    a robot arm coupled to the 3D display;
    a surgical robot to which the robot arm is connected; and
    a controller included in the surgical robot,
    wherein the controller drives the robot arm to move the 3D display to a predetermined position.
  2. The system of claim 1,
    wherein the image information is image information captured by a vision system inserted into a surgical patient and processed to be displayed on the 3D display.
  3. The system of claim 1,
    wherein the robot arm comprises an articulated link driven by the controller to move the 3D display to the predetermined position.
  4. The system of claim 1,
    wherein one end of the robot arm is fixed to a structure of the operating room, and the robot arm is electrically connected to the surgical robot.
  5. The system of claim 1,
    wherein the surgical robot is provided with a console unit for user operation, and the controller moves the 3D display in accordance with the position of the user's eyes.
  6. The system of claim 5,
    wherein the console unit is provided with an operation handle, and the controller moves the 3D display in accordance with a user operation of the operation handle.
  7. The system of claim 5,
    wherein a sensing unit for detecting the position of the user's eyes is connected to the controller, and the controller receives a signal from the sensing unit and moves the 3D display according to the position of the user's eyes.
  8. The system of claim 5,
    wherein a storage unit storing data on the position to which the 3D display is to be moved according to each user's eye position is connected to the controller, and the controller loads the data stored in the storage unit corresponding to the user operating the console unit and moves the 3D display to the pre-stored position.
  9. A method of controlling a 3D display coupled to a robot arm electrically connected to a surgical robot, the method comprising:
    acquiring information about a user who has accessed the surgical robot; and
    moving the 3D display by driving the robot arm according to the acquired information.
  10. The method of claim 9,
    wherein the 3D display displays image information captured by a vision system inserted into the body of a surgical patient.
  11. The method of claim 9,
    wherein the information about the user includes information about the position of the user's eyes.
  12. The method of claim 9,
    wherein the acquiring comprises:
    sensing the position of the user's eyes by a sensor; and
    deriving the information about the user by receiving a signal from the sensor.
  13. The method of claim 9,
    wherein the information about the user is stored in advance in the form of data on the position to which the 3D display is to be moved according to the position of the user's eyes, and
    the acquiring comprises loading the data.
PCT/KR2009/001388 2008-08-21 2009-03-18 Three-dimensional display system for surgical robot and method for controlling same WO2010021447A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009801410053A CN102186434B (en) 2008-08-21 2009-03-18 Three-dimensional display system for surgical robot and method for controlling same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020080081671A KR100998182B1 (en) 2008-08-21 2008-08-21 3D display system of surgical robot and control method thereof
KR10-2008-0081671 2008-08-21

Publications (1)

Publication Number Publication Date
WO2010021447A1 true WO2010021447A1 (en) 2010-02-25

Family

ID=41707308

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2009/001388 WO2010021447A1 (en) 2008-08-21 2009-03-18 Three-dimensional display system for surgical robot and method for controlling same

Country Status (3)

Country Link
KR (1) KR100998182B1 (en)
CN (1) CN102186434B (en)
WO (1) WO2010021447A1 (en)


Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5800616B2 (en) 2011-07-15 2015-10-28 オリンパス株式会社 Manipulator system
KR101698961B1 (en) * 2015-10-26 2017-01-24 (주)미래컴퍼니 Surgical robot system and laparoscope handling method thereof
KR101706994B1 (en) * 2016-10-17 2017-02-17 (주)미래컴퍼니 Surgical robot system and laparoscope handling method thereof
KR101709911B1 (en) * 2016-10-17 2017-02-27 (주)미래컴퍼니 Surgical robot system and laparoscope handling method thereof
CN107669340A (en) * 2017-10-28 2018-02-09 深圳市前海安测信息技术有限公司 3D image surgical navigational robots and its control method
CN109806002B (en) 2019-01-14 2021-02-23 微创(上海)医疗机器人有限公司 Surgical robot
TWI683132B (en) * 2019-01-31 2020-01-21 創新服務股份有限公司 Application of human face and eye positioning system in microscope
KR102274167B1 (en) 2019-09-05 2021-07-12 큐렉소 주식회사 Robot positioning guide apparautus, method therof and system comprising the same
KR102349862B1 (en) 2020-01-31 2022-01-12 큐렉소 주식회사 Apparatus for providing robotic surgery inforamtion during joint replacement robotic surgery, and method thereof
CN113662494B (en) * 2021-08-17 2023-12-26 岱川医疗(深圳)有限责任公司 Endoscope workstation, control method thereof, control device thereof, and storage medium
KR20240041681A (en) 2022-09-23 2024-04-01 큐렉소 주식회사 Apparatus for planning cutting path of surgical robot, and mehtod thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08187246A (en) * 1995-01-10 1996-07-23 Olympus Optical Co Ltd Manipulator device for operation inside celom
KR20010038799A (en) * 1999-10-27 2001-05-15 김춘호 A telescope auto control system and the control method
US20060100642A1 (en) * 2002-09-25 2006-05-11 Guang-Zhong Yang Control of robotic manipulation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0011455D0 (en) * 2000-05-13 2000-06-28 Mathengine Plc Browser system and method for using it


Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011125007A1 (en) * 2010-04-07 2011-10-13 Sofar Spa Robotized surgery system with improved control
US9360934B2 (en) 2010-04-07 2016-06-07 Transenterix Italia S.R.L. Robotized surgery system with improved control
US11857278B2 (en) 2010-04-07 2024-01-02 Asensus Surgical Italia, S.R.L. Roboticized surgery system with improved control
US11224489B2 (en) 2010-04-07 2022-01-18 Asensus Surgical Italia, S.R.L. Robotized surgery system with improved control
US10251713B2 (en) 2010-04-07 2019-04-09 Transenterix Italia S.R.L. Robotized surgery system with improved control
CN102599876A (en) * 2012-03-08 2012-07-25 珠海迈德豪医用科技有限公司 Endoscope having automatic positioning and viewing unit
KR20190043143A (en) * 2016-10-03 2019-04-25 버브 서지컬 인크. Immersive 3D display for robotic surgery
EP3518730A4 (en) * 2016-10-03 2020-05-27 Verb Surgical Inc. Immersive three-dimensional display for robotic surgery
KR102265060B1 (en) * 2016-10-03 2021-06-16 버브 서지컬 인크. Immersive 3D display for robotic surgery
US11439478B2 (en) 2016-10-03 2022-09-13 Verb Surgical Inc. Immersive three-dimensional display for robotic surgery
US11813122B2 (en) 2016-10-03 2023-11-14 Verb Surgical Inc. Immersive three-dimensional display for robotic surgery
WO2018067611A1 (en) 2016-10-03 2018-04-12 Verb Surgical Inc. Immersive three-dimensional display for robotic surgery
CN110167476A (en) * 2017-09-06 2019-08-23 柯惠Lp公司 Mobile surgical console
WO2019050883A1 (en) * 2017-09-06 2019-03-14 Covidien Lp Mobile surgical control console

Also Published As

Publication number Publication date
CN102186434B (en) 2013-11-06
CN102186434A (en) 2011-09-14
KR100998182B1 (en) 2010-12-03
KR20100023086A (en) 2010-03-04

Similar Documents

Publication Publication Date Title
WO2010021447A1 (en) Three-dimensional display system for surgical robot and method for controlling same
CN106725857B (en) Robot system
US20200405411A1 (en) Patient introducer for a robotic system
KR102512876B1 (en) Medical devices, systems, and methods integrating eye gaze tracking for stereo viewer
JP5938977B2 (en) Head mounted display and surgical system
WO2020147691A1 (en) Imaging system for surgical robot, and surgical robot
JP6939778B2 (en) Control devices, control methods and surgical systems
WO2010093153A2 (en) Surgical navigation apparatus and method for same
JP2021531910A (en) Robot-operated surgical instrument location tracking system and method
US20190314097A1 (en) Secondary instrument control in a computer-assisted teleoperated system
CN110177518A (en) System and method for the detection object in the visual field of image capture apparatus
WO2021071336A1 (en) Gaze detection-based smart glasses display device
WO2018088105A1 (en) Medical support arm and medical system
EP3552573A1 (en) Remote control apparatus and method of identifying target pedal
CN112022357B (en) Doctor console, surgical robot system, and control method for doctor console
WO2015186930A1 (en) Apparatus for real-time interactive transmission of medical image and information and for remote support
JP2004320722A (en) Stereoscopic observation system
JP3482228B2 (en) Manipulator control system by gaze detection
WO2021045546A2 (en) Device for guiding position of robot, method therefor, and system comprising same
US20230270321A1 (en) Drive assembly for surgical robotic system
WO2020231157A1 (en) Augmented reality colonofiberscope system and monitoring method using same
WO2019022464A1 (en) Apparatus and method for controlling surgical visualization system by using detection of movement
EP4192330A1 (en) Jig assembled on stereoscopic surgical microscope for applying augmented reality techniques to surgical procedures
KR102304962B1 (en) Surgical system using surgical robot
WO2023013832A1 (en) Surgical robot control system using headset-based contactless hand-tracking technology

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980141005.3

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09808353

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09808353

Country of ref document: EP

Kind code of ref document: A1