US20150305828A1 - Apparatus for adjusting a robotic surgery plan - Google Patents
- Publication number: US20150305828A1 (application US14/697,840)
- Authority: US (United States)
- Prior art keywords: surgery, image, user interface, diseased part, image associated
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T19/006—Mixed reality
- A61B19/56 (legacy code)
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis, e.g. for luxation treatment or for protecting wound edges
- A61B17/56—Surgical instruments or methods for treatment of bones or joints
- A61B34/25—User interfaces for surgical systems
- A61B34/30—Surgical robots
- A61B90/361—Image-producing devices, e.g. surgical cameras
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G09G5/14—Display of multiple viewports
- G09G5/377—Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
- A61B2019/562, A61B2019/566, A61B2019/568 (legacy codes)
- A61B2034/252—User interfaces for surgical systems indicating steps of a surgical procedure
- A61B2034/256—User interfaces for surgical systems having a database of accessory information
- A61B2034/258—User interfaces for surgical systems providing specific settings for specific users
- A61B2090/365—Augmented reality, i.e. correlating a live optical image with another image
- A61B2090/372—Details of monitor hardware
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
- G06T2207/30004—Biomedical image processing
- G06T2210/41—Medical (indexing scheme for image generation)
- G09G2340/12—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
- G09G2354/00—Aspects of interface with display user
Definitions
- the present invention relates to robotic surgery. More specifically, the present invention relates to an apparatus for adjusting a robotic surgery plan.
- the deepening low birthrate and the aging of the population are acting as catalysts for the development of the robotics industry.
- robots can be utilized in various fields, including operations in biologically dangerous regions such as fire scenes, battlefield reconnaissance, lengthy surgery, etc.
- the main principles in developing medical robots are to provide convenience of use to doctors, to avoid inconvenience to patients, to minimize invasiveness, to minimize patient pain, etc.
- medical robot technology is a technical field combining BT (bio-technology), NT (nano-technology), RT (robot technology), and MT (medical technology).
- although orthopedic surgery using a robot enables elaborate and precise bone cutting, it has the problem of increased surgery time and cost caused by the robot equipment.
- when orthopedic surgery is performed using a robot, a decision must be made about how the robot cuts the bones. If the decision is made before surgery, additional time is required on top of the surgery itself, and it is difficult to apply any anatomical information discovered during surgery. On the other hand, if the decision is made during surgery, the surgery time is increased.
- the surgery plan should be modifiable during surgery, because a surgeon might modify osteoplastic goals based on real anatomical information and the patient's lesions. Such modification of the surgical plan should be made as quickly and securely as possible.
- known orthopedic robot systems only allow planning the surgery beforehand and adjusting the plan slightly during surgery, using an identical user interface.
- ROBODOC (Curexo Technology Corp., USA, California)
- ROBODOC provides a method of deciding the position of an artificial joint based on preoperative CT bone images of a patient, and cutting the bone during surgery in order to insert the artificial joint into the predetermined position (U.S. Pat. No. 6,430,434, etc.).
- however, it has difficulty modifying the position of implants based on intraoperative lesions, or modifying the approach direction of the robot during surgery.
- MAKOplasty (Mako Surgical, USA, Florida) allows the position of an artificial joint to be decided just before surgery in the surgery room, and the surgery plan to be modified after incising the diseased part. Furthermore, it has the advantage that the approach direction of the robot is decided by the doctor, not by the robot: the doctor decides the approach direction by pulling the robot with his hands.
- even so, the surgery plan has to be decided based on CT images of bones. Because the CT images show only the shape of the bones, to modify the surgery plan based on lesions not shown on the CT images, the doctor must observe the lesions with the naked eye while watching the CT images on the display of the robot controller, and then modify the plan based on the status of the lesions. Therefore, MAKOplasty has the same difficulty in modifying the surgery plan as ROBODOC.
- an apparatus for adjusting a robotic surgery plan including: a surgery information storage unit storing an examined first image associated with an inputted robotic surgery plan and a target bone of the surgery; a scene image obtaining unit obtaining, in real time in the surgery room, a second image associated with a diseased part; an image registration unit matching coordinates of the examined first image with coordinates of the second image associated with the diseased part; a user interface displaying the first image and the second image; and a surgery control unit controlling the user interface so that it displays the examined first image superimposed on the second image associated with the diseased part, which is inputted in real time.
- the surgery information storage unit could further store phased cutting options of the robotic surgery plan and related images thereof.
- the surgery control unit could provide at least one image associated with the cutting options applicable to the corresponding surgery step, according to a request for modifying the surgery plan inputted via the user interface.
- the surgery control unit could control the user interface so that at least one image associated with the cutting options is superimposed on the second image associated with the diseased part, and also displayed to be distinguishable from the second image associated with the diseased part.
- the scene image obtaining unit could include an optical camera and a mechanical arm that is attached to the optical camera and supports movements of the optical camera.
- the scene image obtaining unit and the user interface could be attached to each other to be moveable together.
- the surgery control unit could control the user interface so that the user interface displays outlines of the first image to be superimposed on the second image associated with the diseased part.
- the surgery control unit could display the outlines of the first image to be superimposed on the second image associated with the diseased part using augmented reality technology.
- the surgery control unit could modify the robotic surgery plan based on a selected cutting option.
- the apparatus for adjusting a robotic surgery plan may further include a cutting robot that processes the target bone of the surgery according to the modified robotic surgery plan inputted from the surgery control unit.
- the apparatus for adjusting a robotic surgery plan described above can respond actively and promptly to various requests for plan modification during robotic surgery performed according to a pre-inputted sequence.
- FIG. 1 is a schematic diagram of an apparatus for adjusting a robotic surgery plan according to the present invention.
- FIG. 2 illustrates an example of a scene image obtaining unit and a user interface according to the present invention.
- FIG. 3 illustrates another example of the scene image obtaining unit and the user interface according to the present invention.
- FIG. 4 illustrates an example of a screen of cutting options provided during surgery by the apparatus for adjusting a robotic surgery plan according to the present invention.
- FIG. 5 is a flowchart that depicts a method for adjusting a robotic surgery plan according to the present invention.
- FIG. 6 illustrates an example of a screen of the user interface in which the present invention can be applied.
- "tissues" means a part of the tissues of the body.
- "soft tissues" means body tissues other than bones, such as skin and muscles.
- "images" as used herein includes both static images and moving images.
- the present invention considers difficult problems coming up when a robotic surgery plan needs to be modified.
- during surgery, the present invention displays an image associated with the intended modification of the processing plan superimposed on an image of the real diseased part.
- the present invention can support surgeons' judgements effectively in surgery room, where swift decisions are required.
- the present invention provides images of options for modifying the surgery plan using augmented reality technology. Accordingly, the present invention enables surgeons to modify the surgery plan swiftly and securely.
- FIG. 1 is a schematic diagram of an apparatus for adjusting a robotic surgery plan according to the present invention.
- the elements according to the present invention are those elements defined by functional classification, not by physical classification.
- the elements according to the present invention could be defined by functions performed by each of the elements.
- Each of the elements could be implemented as hardware and/or program code performing each function, together with processing units. They could also be implemented so that the functions of two or more elements are included in one element. Therefore, it should be noted that the names of the elements given in the following embodiments are not for distinguishing the elements physically, but for representing the main function performed by each element. Furthermore, it should be noted that the spirit of the present invention is not limited by the names of the elements.
- the apparatus for adjusting a robotic surgery plan comprises a cutting robot 100 to which surgery equipment for cutting bones by using an orthopedic surgery robot is attached, a position measuring unit 200 measuring the position of bones, a surgery control unit 300 finding the position of bones and determining cutting paths, a scene image obtaining unit 330 (for example, cameras etc.), an image registration unit 310 , a user interface 320 , and a surgery information storage unit 340 .
- the position measuring unit 200 measures the position of bones exposed outwards by incising skins and skin tissues in surgery. Digitizers, infrared units, laser units, etc. could be used for measuring the position of bones.
- the surgery control unit 300 determines the real position of the bones by matching three-dimensional shape images of the bones, which are obtained by computed tomography equipment, etc. before surgery, with the three-dimensional position data obtained by the position measuring unit 200. Accordingly, the cutting robot 100 can determine exact cutting positions and cutting paths.
- this step of matching the three-dimensional shape images of the bones obtained before surgery with the three-dimensional position data obtained by the position measuring unit 200 is referred to as registration.
- position registration is a step of calculating the preferred surgery position based on the anatomical position of the bones measured by an anatomical position finder and the surgery robot. Although there are various registration methods, the most representative one is explained hereafter.
- the coordinate systems are classified into a reference coordinate system {F}, a robot coordinate system {R} for the paths programmed in the robot, and a bone coordinate system {B} for the bones of the patient in the real surgery.
- first, the robot coordinate system {R} and the bone coordinate system {B} are converted relative to the same reference coordinate system {F}.
- next, the transformation matrix T between the converted robot coordinate system {R} and the converted bone coordinate system {B} is calculated, and T is applied to the converted robot coordinate system {R}.
- in this way, the processing path of the robot can be applied appropriately according to the real position of the bones.
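The frame conversion above can be sketched with 4x4 homogeneous transforms. This is a minimal illustration, not the patent's implementation; the example poses, and the assumption that both frames were already measured relative to the same reference frame {F}, are invented for illustration:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical poses, both measured relative to the same reference frame {F}.
F_T_R = make_transform(np.eye(3), np.array([0.0, 0.0, 0.5]))  # robot frame {R} in {F}
F_T_B = make_transform(np.eye(3), np.array([0.1, 0.2, 0.4]))  # bone frame {B} in {F}

# T re-expresses points given in the bone frame {B} in the robot frame {R},
# so a programmed cutting path can follow the real position of the bone.
T = np.linalg.inv(F_T_R) @ F_T_B

def apply_to_path(T, points):
    """Map an (N, 3) array of path points through a homogeneous transform."""
    homog = np.hstack([points, np.ones((len(points), 1))])
    return (homog @ T.T)[:, :3]
```

With the identity rotations used here, T reduces to the translation between the two frames; a real system would measure full rotations as well.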
- as registration methods for calculating the transformation matrix T, there are pin-based registration, image-based registration, etc.
- in pin-based registration, CT images are taken before surgery, and the processing path of the robot is then determined based on the CT images. At this time, the reference coordinate system of the robot's processing path is established by the pins in the CT images.
- the registration is performed by matching the real pins inserted into the surgical region with the pins in the CT images, which are the basis of the robot's processing path.
- such pin-based registration may cause patient pain and discomfort because the pins remain inserted in the diseased part from the start to the end of the surgery.
- in image-based registration, the processing path of the robot is determined from CT images of the patient's thighbone obtained before surgery.
- the registration is made by matching three-dimensional images obtained from the CT images with two-dimensional X-ray images of the patient's bones obtained during surgery.
- such a method causes many errors in the process of distinguishing tissues such as bone tissue and ligaments and in the process of detecting edges.
- for this reason, a registration method that matches a particular point of a pre-surgery CT image with a particular point measured by a digitizer during surgery has been used.
- the registration method using the digitizer requires pressing the surface of the thighbone with the tip of a measuring pin at a steady pressure, in order to measure a particular point of the bone tissue with the measuring pin of the digitizer during surgery.
- when pressing the surface of the thighbone, if the pressing force is too small, it causes an error in measuring the particular point, and if the pressing force is too large, it causes cracks in the surface of the bone. Furthermore, the method causes discomfort because many measuring points are needed to reduce the error, and it is difficult for the surgeon to align the measuring pin exactly with a measured point guided by a monitor attached to the surgery equipment.
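The digitizer measurements reduce registration to a rigid alignment of corresponding point sets. The patent does not name an algorithm; the sketch below uses the common Kabsch/SVD method, with invented variable names:

```python
import numpy as np

def rigid_register(ct_points, measured_points):
    """Kabsch/SVD fit of corresponding (N, 3) point sets.

    Returns (R, t) such that R @ p + t maps a CT-space landmark p onto the
    corresponding point digitized on the real bone during surgery.
    """
    ct_c = ct_points.mean(axis=0)
    ms_c = measured_points.mean(axis=0)
    H = (ct_points - ct_c).T @ (measured_points - ms_c)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))               # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ms_c - R @ ct_c
    return R, t
```

In practice, more measured points reduce the error described above, at the cost of the patient discomfort the passage mentions.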
- the surgeon determines a robotic surgery plan considering the three-dimensional surface data of the bones obtained by computed tomography (CT) equipment, etc. before surgery, the status of the patient, etc.
- the determined robotic surgery plan is stored in the surgery information storage unit 340 by the surgery control unit 300 according to the present invention.
- the robotic surgery plan applied in the present invention may comprise a plurality of steps, each having various cutting options applicable to it.
- the surgery information storage unit 340 according to the present invention stores libraries related to such cutting options of each surgery step.
- the surgery information storage unit 340 may be implemented in the form of a database; the term "database" as used in the present invention means a functional element storing information, not a database in the strict sense, such as a relational or object-oriented database.
- the surgery information storage unit 340 could be implemented as various forms of storage elements, including a simple file-based storage element, etc.
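Such a storage unit could be sketched as a simple library keyed by surgery step; the class and field names below are hypothetical stand-ins, not the patent's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class CuttingOption:
    name: str            # e.g. "option 1: conservative resection"
    image_path: str      # pre-rendered overlay image for this option
    cutting_path: list   # waypoints for the cutting robot (hypothetical format)

@dataclass
class SurgeryInfoStorage:
    """File-based stand-in for the surgery information storage unit (340)."""
    options_by_step: dict = field(default_factory=dict)

    def add_option(self, step, option):
        """Register a cutting option under a named surgery step."""
        self.options_by_step.setdefault(step, []).append(option)

    def options_for_step(self, step):
        """Options applicable to a given surgery step (empty if none stored)."""
        return self.options_by_step.get(step, [])
```

A real unit would persist these records to files or a database, per the passage above; the in-memory dict only illustrates the keyed-library idea.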
- accordingly, when the surgeon needs additional information to modify the surgical plan during surgery, the surgery control unit 300 selects appropriate options from the information stored in the surgery information storage unit 340 and provides the selected options.
- the surgery control unit 300 provides at least one image associated with the cutting options applicable to the corresponding surgery step. Furthermore, the surgery control unit 300 could display at least one image associated with the cutting options superimposed on a real-time image associated with the diseased part, and could display those images so as to be distinguishable from one another. For example, to make the images distinguishable, the surgery control unit 300 may display the real-time image associated with the diseased part without any processing, while displaying the image associated with the cutting options using only outlines or translucent gray scales. At this time, the surgery control unit 300 could use augmented reality technology in displaying the two images.
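The translucent-gray-scale overlay described above amounts to alpha blending wherever the option image has content. A minimal sketch, assuming the live camera frame and the option rendering are already registered to the same pixel grid (function and parameter names are invented):

```python
import numpy as np

def superimpose(live_frame, option_overlay, alpha=0.4):
    """Blend a translucent option image over the live frame of the diseased part.

    live_frame: (H, W, 3) uint8 camera image, left unprocessed where the
    overlay is empty. option_overlay: (H, W) uint8 grayscale rendering of a
    cutting option; its nonzero pixels are drawn translucently so the option
    stays visually distinguishable from the real tissue underneath.
    """
    out = live_frame.astype(np.float64)
    mask = option_overlay > 0
    gray = np.repeat(option_overlay[..., None], 3, axis=2).astype(np.float64)
    out[mask] = (1 - alpha) * out[mask] + alpha * gray[mask]
    return out.astype(np.uint8)
```

An outline-only overlay would be the same operation applied to an edge image of the option instead of its filled rendering.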
- the surgery control unit 300 modifies the preset robotic surgery plan by applying the cutting options selected by the surgeon, and controls the cutting robot 100 according to the modified robotic surgery plan.
- the scene image obtaining unit 330 takes pictures of the surgery scene, including the diseased part, in the surgery room and obtains images of the surgery scene.
- the preferred embodiment of the scene image obtaining unit 330 is an optical camera.
- the image registration unit 310 finds the positional relation between the scene image obtaining unit 330 (for example, the optical camera) and the cutting robot 100, and matches the coordinates of the image of the diseased part with the coordinates of the image held by the robot.
- the user interface 320 displays the scene images obtained by the scene image obtaining unit 330 and displays pre-recognized position of bones, which is stored in the surgery information storage unit 340 , to be superimposed on the scene images according to the control of the surgery control unit 300 .
- the user interface 320 displays the two images to be superimposed on each other.
- FIG. 2 shows one embodiment of the scene image obtaining unit and the user interface according to the present invention.
- FIG. 2 illustrates an optical camera as an example of the scene image obtaining unit 330, and shows, as an example of the user interface 320, a display screen to the rear of which the optical camera is attached.
- FIG. 2 shows the camera and the user interface 320 integrated with each other.
- the camera is connected to a mechanical arm 331, and the user can move the camera 330 and the user interface 320 together by moving the mechanical arm 331.
- a sensor included in the mechanical arm 331 can find the position of the camera, and the found position is used in the image registration of the image registration unit 310 according to the present invention. Furthermore, besides the method of using the sensor included in the mechanical arm 331, the positions of the camera and the display could be found by wireless methods such as infrared rays.
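Recovering the camera position from the arm's joint sensors is a forward-kinematics computation. Since the patent does not describe the arm's geometry, the sketch below uses a deliberately simplified planar (2-D) serial arm:

```python
import numpy as np

def camera_pose_from_joints(joint_angles, link_lengths):
    """Planar forward-kinematics sketch: camera position from arm joint sensors.

    A minimal 2-D stand-in for the mechanical arm (331): each sensed joint
    angle and link length contributes one rigid segment, and summing the
    segments gives the camera's position and heading, which the image
    registration unit could then use.
    """
    x = y = 0.0
    theta = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle                 # accumulate joint rotations
        x += length * np.cos(theta)    # advance along the current link
        y += length * np.sin(theta)
    return np.array([x, y]), theta
```

A real arm would use full 3-D transforms (e.g. the homogeneous matrices sketched earlier), but the accumulation pattern is the same.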
- A of FIG. 2 is a front view of the display screen to which the optical camera is attached.
- B of FIG. 2 is a side view of the display screen to which the optical camera is attached.
- C of FIG. 2 is a rear view of the display screen to which the optical camera is attached.
- the computer attached to the robot displays the shape to be processed by the robot superimposed on the image obtained by the optical camera, as can be seen in the display screen shown in A of FIG. 2.
- the surgeon can thereby determine whether the shape to be processed by the robot risks colliding with soft tissues.
- the surgeon can omit cutting that could cause a problem by removing a part of the shape in the surgery plan displayed superimposed by the user interface 320, or can increase the amount of cutting as desired.
- the apparatus for adjusting a robotic surgery plan provides libraries of many possible cutting paths.
- the apparatus displays the shape to be processed using the selected option superimposed on the real image of the surgery currently being shown, thereby helping the surgeon's choice.
- because the apparatus for adjusting a robotic surgery plan according to the present invention knows the position of the bones in advance of surgery, it can display the known position of the bones superimposed on the real position of the bones in the surgery room inputted by the camera, after matching the two. For example, when the apparatus displays the previously known outlines of the bones superimposed on the image of the bones currently being shown, whether the known position of the bones is correct can be easily verified.
- augmented reality technology could be used when the apparatus displays the image associated with the cutting options of the robotic surgery plan superimposed on the real image of the bones shown in the surgery room.
- augmented reality technology superimposes virtual objects on the real world that the user sees with his eyes. It is also called mixed reality (MR), because it shows a single image combining, in real time, a virtual world carrying additional information with the real world.
- augmented reality is a concept of complementing the real world with the virtual world: the leading part is the real world, in spite of the use of a virtual world made by computer graphics.
- the computer graphics have the role of providing information additionally required by the real world; overlapping a three-dimensional virtual image on the real image shown to the user makes the real world and the virtual screen difficult to distinguish.
- in the present invention, augmented reality is achieved by superimposing the image associated with the cutting options of the surgery plan, which is data from the virtual world, on the image of the diseased part in the real world, which shows the target of surgery.
- the apparatus for adjusting a robotic surgery plan points the camera 330 toward the robot, or toward sensors attached to the robot, and displays the previously known outlines of the robot superimposed on the real image of the robot inputted by the camera. Accordingly, whether the measured positional relationship between the robot and the camera is correct can be easily verified.
- FIG. 3 shows another embodiment of the scene image obtaining unit and the user interface.
- FIG. 3 also illustrates, as an example of the scene image obtaining unit 330, an optical camera attached to the mechanical arm 331 so as to move with it.
- the difference from the embodiment of FIG. 2 is that the user interface 320 is not attached to the optical camera but is located away from it, for the user's viewing comfort.
- the optical camera 330 and the user interface 320 could communicate with each other over a wired or wireless network.
- the sensor of the mechanical arm 331 can find position of the camera.
- the found position of the camera is used in image registration of the image registration unit 310 .
- besides using the sensor attached to the mechanical arm 331, the position of the camera could also be found by wireless methods such as infrared rays.
- FIG. 4 shows an example of a screen of cutting options provided during surgery by the apparatus for adjusting a robotic surgery plan according to the present invention.
- the apparatus displays the image to be processed by the robot superimposed on the image obtained by the optical camera, via the user interface 320.
- the apparatus displays alternative cutting options 410, 420, 430 at the top of the main screen 400 of the user interface 320.
- the cutting options are displayed superimposed on the image of the exposed bone.
- the main screen 400 displays the selected cutting option superimposed on the real image of the diseased part.
- the image on the main screen 400 of FIG. 4 is the image displayed when the user selects option 1 of the three options.
- the surgery information storage unit stores libraries of possible cutting paths.
- the apparatus according to the present invention provides the libraries of possible cutting paths to the user, thereby helping the user make choices.
- FIG. 5 is a flowchart that depicts the method for adjusting a robotic surgery plan according to the present invention.
- each step of the method for adjusting a robotic surgery plan according to the present invention is performed by the corresponding element of the apparatus explained with reference to FIG. 1; however, each step should be understood by the function itself that defines it.
- the performer of each step is not limited by the names of the elements given as examples of the performer of each step.
- According to the method for adjusting a robotic surgery plan, in step S510, an image associated with the diseased part in the operating room, obtained by an optical camera, etc., is displayed.
- In step S520, the apparatus matches coordinates of the image associated with the diseased part with coordinates of an image associated with the bone of the surgical target that has already been obtained by equipment such as CT.
- In step S530, the apparatus displays the pre-examined image associated with the bone of the surgical target superimposed on the real-time image associated with the diseased part in the operating room.
- When the apparatus receives a request to modify the surgery plan from the surgeon in step S540, it provides, in step S550, at least one image associated with the cutting options that can be applied to the corresponding surgery step.
- When the cutting option to be applied is selected and inputted in step S560, the apparatus displays, in step S570, the selected cutting option superimposed on the image associated with the diseased part in the operating room.
- In step S580, once the selected cutting option is confirmed, the apparatus modifies the surgery plan by applying the confirmed cutting option.
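Steps S510 to S580 describe a simple control flow. The sketch below renders it as hypothetical Python; every class and method name is an invented stand-in, since the patent describes hardware units, not a software API:

```python
class DemoApparatus:
    """Hypothetical stand-in for the plan-adjusting apparatus.
    All names here are invented for illustration only."""
    def __init__(self):
        self.ct_image = "CT bone image"
        self.plan = ["step-1 default cut"]
        self.displayed = []

    def capture_scene_image(self):                     # S510: live image of diseased part
        return "live diseased-part image"

    def register(self, scene, ct):                     # S520: match coordinate systems
        self.registered = (scene, ct)

    def display_superimposed(self, overlay, scene):    # S530 / S570: overlay on live image
        self.displayed.append((overlay, scene))

    def cutting_options_for_current_step(self):        # S550: library of cutting options
        return ["option 1", "option 2", "option 3"]

    def apply_to_plan(self, option):                   # S580: rewrite the stored plan
        self.plan[0] = option


def run_plan_adjustment(apparatus, wants_change, pick):
    scene = apparatus.capture_scene_image()                      # S510
    apparatus.register(scene, apparatus.ct_image)                # S520
    apparatus.display_superimposed(apparatus.ct_image, scene)    # S530
    if wants_change:                                             # S540: surgeon asks to modify
        options = apparatus.cutting_options_for_current_step()   # S550
        choice = options[pick]                                   # S560: surgeon selects
        apparatus.display_superimposed(choice, scene)            # S570
        apparatus.apply_to_plan(choice)                          # S580: option confirmed
    return apparatus.plan

app = DemoApparatus()
plan = run_plan_adjustment(app, wants_change=True, pick=0)
```

The sketch only fixes the ordering of the steps; a real system would interleave S510 and S530 continuously while the camera streams images.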
- FIG. 6 illustrates an example of a screen of the user interface in which the present invention can be applied.
- The screen of the user interface of FIG. 6 shows an example of providing various processing options that can be applied to a bone transplant surgery, in which real bone is cut and artificial bone is transplanted, and of displaying the selected option superimposed on the real image of the bones.
- In FIG. 6, the images associated with processing options 411, 421, 431 according to the present invention are displayed at the top of the screen. Furthermore, on the top right-hand side of the screen of the user display device, a menu 341 for selecting options is provided, so that the surgeon can select a processing option to be applied to the robotic surgery.
- In FIG. 6, on the bottom left-hand side of the screen 610 of the user display device, an image of the real diseased part is displayed, and on the bottom right-hand side 620, an image of the selected processing option is displayed in addition to the image of the real diseased part.
- The processing option of the selected size 5 is displayed superimposed on the real image of the diseased part using augmented reality, thereby providing the surgeon with the predicted appearance of the diseased part after a bone transplant surgery performed with the size-5 processing option. If the surgeon thinks that the size-5 transplant model does not properly match the status of the real bone of the diseased part, the surgeon can select the processing option of the most appropriate size by selecting another option.
- The surgeon modifies the surgery plan by selecting and confirming the processing option of the most appropriate size. Accordingly, the surgeon can adjust the robotic surgery so that it is performed according to the modified surgery plan.
- Thus, the present invention can deal with various requests for modifying robotic surgery plans actively and promptly.
Abstract
Disclosed are an apparatus for adjusting a robotic surgery plan and a method thereof. The apparatus according to the present invention comprises a surgery information storage unit storing an inputted robotic surgery plan and an examined first image associated with a target bone of surgery, a scene image obtaining unit obtaining a second image associated with a diseased part in real time in the operating room, an image registration unit matching coordinates of the examined first image with coordinates of the second image associated with the diseased part, a user interface displaying the examined first image and the second image associated with the diseased part, and a surgery control unit controlling the user interface so that the user interface displays the examined first image superimposed on the second image associated with the diseased part, which is inputted in real time.
Description
- 1. Field of the Invention
- The present invention relates to robotic surgery. More specifically, the present invention relates to an apparatus for adjusting a robotic surgery plan.
- 2. Description of the Related Art
- Declining birthrates and the aging of the population are acting as catalysts for the development of the robotics industry. As the need for smart robots that work in place of people increases, the worldwide robot market is expanding rapidly. Robots can be utilized in various fields, including operations in biologically dangerous regions such as fire scenes, battlefield reconnaissance, and lengthy surgeries.
- Among these robots, medical robots have been developed with the greatest focus on user convenience. The main principles in developing medical robots are to make them convenient for doctors to use, to cause no inconvenience to patients, and to minimize invasiveness and patient pain. Medical robot technology is a technical field combining BT (Bio-Tech), NT (Nano-Tech), RT (Robot-Tech), and MT (Medical-Tech).
- Although orthopedic surgery using a robot enables elaborate and precise bone cutting, it increases surgery time and cost because of the robot equipment. In addition, when orthopedic surgery is performed using a robot, a decision must be made about how the robot cuts the bones. If the decision is made before surgery, extra time is required in addition to the surgery itself, and it is difficult to apply any anatomical information discovered during surgery. On the other hand, if the decision is made during surgery, the surgery time is increased.
- Although the direction of surgery can be planned before the beginning of surgery based on medical images and the status of a patient, it should be possible to modify the surgery plan during surgery, because a surgeon might revise osteoplastic goals based on the patient's real anatomical information and lesions. Such modification of the surgical plan should be made as quickly and securely as possible. However, among common or known systems, there are few, if any, products that consider these matters. Known orthopedic robot systems only allow planning the surgery beforehand and adjusting the plan slightly during surgery, using an identical user interface.
- For instance, ROBODOC (Curexo Technology Corp, California, USA) provides a method that decides the position of an artificial joint based on preoperative CT bone images of a patient, and cuts the bone during surgery in order to insert the artificial joint into the predetermined position (U.S. Pat. No. 6,430,434, etc.). With this method, however, it is difficult to modify the position of the implant based on intraoperative lesions or to modify the approach direction of the robot during surgery.
- In addition, MAKOplasty (Mako Surgical, Florida, USA) allows deciding the position of an artificial joint just before surgery in the operating room, and modifying the surgery plan after incising the diseased part. Furthermore, it has the advantage that the approach direction of the robot is decided by the doctor, not by the robot: the doctor decides the approach direction by pulling the robot with his or her hands. With MAKOplasty as well, however, the surgery plan has to be decided based on CT images of bones. Because CT images show only the shape of bones, to modify the surgery plan based on lesions not shown in the CT images, the doctor must observe the lesions with the naked eye while watching the CT images on the display of the robot controller, and then modify the plan based on the status of the lesions. Therefore, MAKOplasty, like ROBODOC, has difficulty in modifying the surgery plan.
- Therefore, there is a need for a robot system that can properly apply modifications of the surgery plan during surgery.
- It is an object of the present invention, which overcomes the aforementioned problems, to provide an apparatus enabling a robotic surgery plan to be adjusted actively and flexibly.
- In accordance with one aspect of the present invention, there is provided an apparatus for adjusting a robotic surgery plan, including a surgery information storage unit storing an inputted robotic surgery plan and an examined first image associated with a target bone of surgery, a scene image obtaining unit obtaining a second image associated with a diseased part in real time in the operating room, an image registration unit matching coordinates of the examined first image with coordinates of the second image associated with the diseased part, a user interface displaying the examined first image and the second image associated with the diseased part, and a surgery control unit controlling the user interface so that the user interface displays the examined first image superimposed on the second image associated with the diseased part, which is inputted in real time.
- The surgery information storage unit could further store phased cutting options of the robotic surgery plan and related images thereof.
- The surgery control unit could provide at least one image associated with the cutting options that are applicable to a corresponding surgery step, in response to a request for modifying the surgery plan inputted via the user interface.
- The surgery control unit could control the user interface so that at least one image associated with the cutting options is superimposed on the second image associated with the diseased part while remaining visually distinguishable from it.
- The scene image obtaining unit could include an optical camera and a mechanical arm that is attached to the optical camera and supports movements of the optical camera.
- The scene image obtaining unit and the user interface could be attached to each other to be moveable together.
- The surgery control unit could control the user interface so that the user interface displays outlines of the first image to be superimposed on the second image associated with the diseased part.
- The surgery control unit could display the outlines of the first image to be superimposed on the second image associated with the diseased part using augmented reality technology.
- The surgery control unit could modify the robotic surgery plan based on a selected cutting option.
- The apparatus for adjusting a robotic surgery plan may further include a cutting robot processing a target bone of surgery according to the modified robotic surgery plan inputted from the surgery control unit.
- The apparatus for adjusting a robotic surgery plan described above can respond actively and promptly to various requests to modify the plan during a robotic surgery performed according to a pre-inputted sequence.
- FIG. 1 is a schematic diagram of an apparatus for adjusting a robotic surgery plan according to the present invention.
- FIG. 2 illustrates an example of a scene image obtaining unit and a user interface according to the present invention.
- FIG. 3 illustrates another example of the scene image obtaining unit and the user interface according to the present invention.
- FIG. 4 illustrates an example of a screen of cutting options provided during surgery by the apparatus for adjusting a robotic surgery plan according to the present invention.
- FIG. 5 is a flowchart that depicts a method for adjusting a robotic surgery plan according to the present invention.
- FIG. 6 illustrates an example of a screen of the user interface in which the present invention can be applied.
- Hereinafter, exemplary embodiments of the present invention will be described in detail. However, the present invention is not limited to the exemplary embodiments disclosed below, but can be implemented in various forms. The following exemplary embodiments are described in order to enable those of ordinary skill in the art to embody and practice the invention.
- It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and similarly, a second element could be termed a first element, without departing from the scope of the present invention. The term "and/or" used herein includes any and all combinations of one or more of the associated listed items.
- It will be understood that when an element is referred to as being connected or coupled to another element, it can be directly connected or coupled to the other element or intervening elements may be present.
- The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
- Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meanings as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
- Hereinafter, preferred embodiments of the present invention will be described in detail with the accompanying drawings. In the following description, the same reference numerals denote the same elements to facilitate the overall understanding, and repeated description thereof will be omitted.
- The human body comprises bones, skin, muscles, etc. In this specification, the term "tissues" means a part of the tissues of the body, and the term "soft tissues" means tissues such as skin and muscles, excluding the bones. The term "images" used herein includes both static images and moving images.
- The present invention addresses the difficult problems that arise when a robotic surgery plan needs to be modified.
- For example, when the surgery plan is modified, there may be a case where the positions and angles of the bones to be cut must be changed. On the other hand, there may be a case where the approach directions or angles of the robot must be changed without modifying the positions or angles of the bones to be cut. Furthermore, because of difficulty in accessing the bones due to soft tissues, there may also be a case where the surgery must be finished by hand instead of by the robot.
- Therefore, during surgery, the present invention displays an image of the processing plan intended as a modification, superimposed on an image of the real diseased part. Thus, the present invention can effectively support surgeons' judgements in the operating room, where swift decisions are required. For this, the present invention provides images of the options for modifying the surgery plan using augmented reality technology. Accordingly, the present invention enables surgeons to modify the surgery plan swiftly and securely.
-
FIG. 1 is a schematic diagram of an apparatus for adjusting a robotic surgery plan according to the present invention. - Hereafter, the elements according to the present invention, which will be described by referring to
FIG. 1, are defined by functional classification, not by physical classification. The elements according to the present invention could be defined by the functions performed by each of them. Each element could be implemented as hardware and/or as program code and processing units performing each function, and the functions of two or more elements could be implemented in a single element. Therefore, it should be noted that the names of the elements given in the following embodiments are not intended to distinguish the elements physically, but to represent the main function performed by each element. Furthermore, it should be noted that the spirit of the present invention is not limited by the names of the elements. - As illustrated in
FIG. 1, the apparatus for adjusting a robotic surgery plan according to the present invention comprises a cutting robot 100, to which surgery equipment for cutting bones using an orthopedic surgery robot is attached, a position measuring unit 200 measuring the position of bones, a surgery control unit 300 finding the position of bones and determining cutting paths, a scene image obtaining unit 330 (for example, a camera, etc.), an image registration unit 310, a user interface 320, and a surgery information storage unit 340. - The
position measuring unit 200 measures the position of the bones exposed by incising the skin and skin tissues during surgery. Digitizers, infrared units, laser units, etc. could be used for measuring the position of the bones. - The
surgery control unit 300 determines the real position of the bones by matching three-dimensional shape images of the bones, obtained by computerized tomography equipment, etc. before surgery, with the three-dimensional position data obtained by the position measuring unit 200. Accordingly, the cutting robot 100 can determine exact cutting positions and cutting paths. - Herein, the step of matching the three-dimensional shape images of the bones, obtained by computerized tomography equipment, etc. before surgery, with the three-dimensional position data obtained by the
position measuring unit 200 is referred to as registration. - In robotic surgery, position registration is a step that calculates the preferred surgery position based on the anatomical position of the bones measured by an anatomical position finder and the surgery robot. Although there are various methods for registration, the most representative one is explained hereafter.
- In robotic surgery, the coordinate systems are classified into a reference coordinate system {F}, a robot coordinate system {R} for the paths programmed into the robot, and a bone coordinate system {B} for the bones of the patient in the real surgery. For registration, first, the robot coordinate system {R} is expressed relative to the reference coordinate system {F}, and the bone coordinate system {B} is likewise expressed relative to the reference coordinate system {F}. Thereby the robot coordinate system {R} and the bone coordinate system {B} are both referred to the same reference coordinate system {F}. After that, the transformation matrix T between the converted robot coordinate system {R} and the converted bone coordinate system {B} is calculated, and the transformation matrix T is applied to the converted robot coordinate system {R}. Thus, the processing path of the robot can be applied appropriately according to the real position of the bones.
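The frame conversions described above can be made concrete in a small 2D sketch. Homogeneous matrices and the mapping p_B = inv(T_FB) * T_FR * p_R are one common convention (the patent does not fix one); the poses below are invented purely for illustration:

```python
import math

def make_transform(theta, tx, ty):
    """Homogeneous 3x3 rigid transform: rotate by theta, then translate."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx], [s, c, ty], [0.0, 0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rigid_inverse(t):
    """Inverse of a rigid transform: [R | p]^-1 = [R^T | -R^T p]."""
    r = [[t[j][i] for j in range(2)] for i in range(2)]  # R^T
    tx = -(r[0][0] * t[0][2] + r[0][1] * t[1][2])
    ty = -(r[1][0] * t[0][2] + r[1][1] * t[1][2])
    return [[r[0][0], r[0][1], tx], [r[1][0], r[1][1], ty], [0.0, 0.0, 1.0]]

def apply(t, point):
    x, y = point
    return (t[0][0] * x + t[0][1] * y + t[0][2],
            t[1][0] * x + t[1][1] * y + t[1][2])

# Hypothetical poses of the robot frame {R} and bone frame {B}, each
# expressed relative to the reference frame {F} (values invented).
T_FR = make_transform(math.pi / 2, 1.0, 0.0)
T_FB = make_transform(0.0, 0.0, 2.0)

# T maps a planned path point from {R} into {B}.
T = matmul(rigid_inverse(T_FB), T_FR)

p_R = (1.0, 0.0)        # a path point planned in the robot frame
p_B = apply(T, p_R)     # the same point expressed in the bone frame
```

Real systems work with 4x4 matrices in 3D, but the composition logic is identical: express both frames relative to {F}, then chain one pose with the inverse of the other.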
- As registration methods for calculating the transformation matrix T, there are pin-based registration, image-based registration, etc.
- According to the pin-based registration method, before surgery, CT images are taken with pins inserted through the diseased part above a bone into the bone. After that, the processing path of the robot is determined based on the CT images. At this time, the reference coordinate system of the processing path of the robot is established by the pins in the CT images.
- Once the set-up of the processing path of the robot is complete, the registration is performed by matching the real pins inserted into the surgical region with the pins in the CT images, which are the basis of the processing path of the robot. Such a pin-based registration method may cause pain and discomfort for patients, because the pins remain inserted in the diseased part from the start to the end of the surgery.
- On the other hand, according to the image-based registration method, the processing path of the robot is determined from CT images of the patient's thighbone obtained before surgery. In the early days, the registration was made by matching three-dimensional images obtained from CT images with two-dimensional X-ray images of the patient's bones obtained during surgery. Such a method causes many errors in the process of distinguishing tissues such as bone tissues and ligaments, and in the process of detecting edges. To reduce such errors, a registration method that matches a particular point of a pre-surgery CT image with a particular point measured by a digitizer during surgery has recently come into use. The registration method using the digitizer requires pressing the surface of the thighbone with the tip of a measuring pin at a steady pressure in order to measure the particular point of bone tissue. When pressing the surface of the thighbone, if the pressing force is too small, it causes an error in measuring the particular point, and if the pressing force is too large, it causes cracks in the surface of the bone. Furthermore, the many measuring points needed to reduce the error cause discomfort, and it is difficult for the surgeon to align the measuring pin exactly with a measured point guided by a monitor attached to the surgery equipment.
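Mathematically, the digitizer-based registration described above reduces to a least-squares rigid fit between the CT landmark points and the points touched by the digitizer. A hypothetical 2D sketch of the closed-form fit (real systems fit in 3D with noisy measurements; the landmark coordinates here are invented):

```python
import math

def fit_rigid_2d(ct_pts, measured_pts):
    """Least-squares rigid fit (rotation + translation) mapping CT
    landmark points onto the corresponding digitizer points."""
    n = len(ct_pts)
    pcx = sum(p[0] for p in ct_pts) / n
    pcy = sum(p[1] for p in ct_pts) / n
    qcx = sum(q[0] for q in measured_pts) / n
    qcy = sum(q[1] for q in measured_pts) / n
    num = den = 0.0
    for (px, py), (qx, qy) in zip(ct_pts, measured_pts):
        px, py, qx, qy = px - pcx, py - pcy, qx - qcx, qy - qcy
        num += px * qy - py * qx   # cross terms -> sin(theta)
        den += px * qx + py * qy   # dot terms   -> cos(theta)
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    tx = qcx - (c * pcx - s * pcy)
    ty = qcy - (s * pcx + c * pcy)
    return theta, tx, ty

def transform(theta, tx, ty, p):
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

# Invented CT landmarks, and the "digitizer" readings of the same
# landmarks generated by a known rotation of 30 degrees and a known
# translation of (5, -2), so the fit can be checked against ground truth.
ct = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
true_theta, true_tx, true_ty = math.radians(30), 5.0, -2.0
measured = [transform(true_theta, true_tx, true_ty, p) for p in ct]

theta, tx, ty = fit_rigid_2d(ct, measured)
```

With noisy real measurements the same estimator returns the best-fit pose rather than an exact one, which is why the text notes that many measuring points are needed to keep the error down.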
- Meanwhile, the surgeon determines a robotic surgery plan, considering three-dimensional surface data of the bones obtained by computed tomography (CT) equipment, etc. before surgery, and the status of the patient. The determined robotic surgery plan is stored in the surgery
information storage unit 340 by the surgery control unit 300 according to the present invention. - At this time, the robotic surgery plan applied in the present invention may be composed of a plurality of steps, and has various cutting options that are applicable to each surgery step. The surgery
information storage unit 340 according to the present invention stores libraries related to such cutting options of each surgery step. - The surgery
information storage unit 340 may be implemented in the form of a database. The term "database" used in the present invention means a functional element storing information; it does not mean a database in the strict sense, such as a relational or object-oriented database. The surgery information storage unit 340 could be implemented as various forms of storage elements, including a simple file-based storage element, etc. - The
surgery control unit 300, accordingly, at the step where the surgeon needs additional information to modify the surgical plan during surgery, selects appropriate options from the information stored in the surgery information storage unit 340 and provides the selected options. - Concretely, at the surgeon's request to modify the surgery plan, the
surgery control unit 300 provides at least one image associated with the cutting options that are applicable to the corresponding surgery step. Furthermore, the surgery control unit 300 could display at least one image associated with the cutting options superimposed on a real-time image associated with the diseased part, and could display those images so as to be distinguishable from one another. For example, to make the images distinguishable, the surgery control unit 300 displays the real-time image associated with the diseased part without any processing, while displaying the image associated with the cutting options using only outlines or translucent gray scales. At this time, the surgery control unit 300 could use augmented reality technology in displaying the two images. - In addition, the
surgery control unit 300 according to the present invention modifies the preset robotic surgery plan by applying the cutting options selected by the surgeon, and controls the cutting robot 100 according to the modified robotic surgery plan. - Meanwhile, the scene
image obtaining unit 330 takes pictures of surgery scenes regarding the diseased part in the operating room and obtains images of the surgery scenes. The preferred embodiment of the scene image obtaining unit 330 is an optical camera. - The
image registration unit 310 finds the positional relation between the scene image obtaining unit 330 (for example, the optical camera) and the cutting robot 100, and matches coordinates of the image of the diseased part with coordinates of the image held by the robot. - The
user interface 320 displays the scene images obtained by the scene image obtaining unit 330 and displays the pre-recognized position of the bones, which is stored in the surgery information storage unit 340, superimposed on the scene images under the control of the surgery control unit 300. - At this time, considering the matching relationship between the scene image and the image held by the robot, which is provided by the
image registration unit 310, the user interface 320 displays the two images superimposed on each other. -
FIG. 2 shows one embodiment of the scene image obtaining unit and the user interface according to the present invention. - The embodiment of
FIG. 2 illustrates an optical camera as an example of the scene image obtaining unit 330, and shows a display screen with the optical camera attached to its rear as an example of the user interface 340. - That is, the embodiment of
FIG. 2 shows that the camera and the user interface 340 are integrated with each other. In addition, in the embodiment of FIG. 2, the camera is connected to a mechanical arm 331, and the user can move the camera 330 and the user interface 340 at the same time by moving the mechanical arm 331. - Meanwhile, in the present embodiment, a sensor included in the
mechanical arm 331 can find the position of the camera, and the found position of the camera is used in the image registration of the image registration unit 310 according to the present invention. Furthermore, besides the method of using the sensor included in the mechanical arm 331, the position of the camera and the position of the display could be found by wireless methods such as infrared rays. - In the embodiment of
FIG. 2, it is easy for the user to locate the view in space with the naked eye, because the user interface 340 is located at the same position as the camera 330. - The A of
FIG. 2 is a front view of a display screen to which the optical camera is attached. The B of FIG. 2 is a side view of the display screen to which the optical camera is attached. The C of FIG. 2 is a rear view of the display screen to which the optical camera is attached. - When the surgeon moves the optical camera to a desired position during surgery, the computer attached to the robot displays the shape that is to be processed by the robot superimposed on the image obtained by the optical camera. This can be seen in the display screen shown in A of
FIG. 2. - By moving the camera to a desired position, the surgeon can determine whether the shape that is to be processed by the robot risks conflicting with soft tissues. In addition, the surgeon can omit a cut that could cause a problem by removing part of the shape in the surgery plan displayed superimposed by the
user interface 340, or the surgeon can add to the amount of cutting as desired. - Furthermore, the apparatus for adjusting a robotic surgery plan according to the present invention provides libraries of many possible cutting paths. When the surgeon selects one of the options in the libraries, the apparatus displays the shape that is to be processed using the selected option superimposed on the real image of the surgery currently being shown, thereby helping the surgeon's choice.
- In addition, because the apparatus for adjusting a robotic surgery plan according to the present invention has the position of the bones in advance of surgery, the apparatus can display the known position of the bones superimposed on the real position of the bones in the operating room inputted by the camera, after matching the two kinds of bone positions. For example, when the apparatus displays the previously known outlines of the bones superimposed on the image of the bones currently being shown, it can easily be seen whether the known position of the bones is correct.
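The FIG. 2 discussion above notes that a sensor in the mechanical arm 331 finds the camera position used for the image registration, without saying how. One standard approach is forward kinematics over the arm's joint encoders; a minimal planar sketch, with invented link lengths and joint angles:

```python
import math

def camera_pose_from_joints(link_lengths, joint_angles):
    """Planar forward kinematics: accumulate each joint angle and link
    length along the arm to get the position and heading at the tip,
    where the camera would be mounted."""
    x = y = 0.0
    heading = 0.0
    for length, angle in zip(link_lengths, joint_angles):
        heading += angle                 # each joint rotates the rest of the chain
        x += length * math.cos(heading)  # walk along the current link
        y += length * math.sin(heading)
    return x, y, heading

# Hypothetical two-link arm with both joints bent 90 degrees.
x, y, heading = camera_pose_from_joints([1.0, 1.0], [math.pi / 2, math.pi / 2])
```

A real surgical arm would do the same accumulation in 3D (typically with Denavit-Hartenberg parameters), but the principle of chaining encoder readings into a tip pose is unchanged.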
- Meanwhile, it was mentioned above that augmented reality technology could be used when the apparatus displays the image associated with the cutting options of the robotic surgery plan superimposed on the real image of the bones shown in the operating room.
- Augmented reality technology superimposes virtual objects on the real world that the user sees with his or her own eyes. It is also called mixed reality (MR), because it combines the virtual world, which carries additional information, with the real world in real time and shows them as one image. Research and development of hybrid VR systems combining the real world and the virtual world have been in progress since the late 1990s, especially in the United States and Japan.
- In augmented reality, which complements the real world with the virtual world, the leading part is the real world, despite the use of computer graphics to build the virtual world. The computer graphics provide the information additionally required by the real world: overlapping a three-dimensional virtual image on the real image shown to the user makes the boundary between the real world and the virtual screen ambiguous.
- Therefore, according to the present invention, the augmented reality technology is achieved by superimposing the image associated with the cutting options of the surgery plan, which is data of the virtual world, on the image of the diseased part of the real world, which is the target of surgery.
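The outline or translucent-gray presentation described for the surgery control unit amounts to per-pixel alpha blending of the option image over the untouched scene image. A minimal grayscale sketch, with invented pixel values and no particular imaging library assumed:

```python
# Hypothetical 8-bit grayscale images represented as nested lists.
def blend_option_overlay(scene, option_mask, gray=200, alpha=0.4):
    """Return the scene with the cutting-option region drawn as a
    translucent gray layer, leaving uncovered pixels untouched."""
    out = []
    for y, row in enumerate(scene):
        out_row = []
        for x, pixel in enumerate(row):
            if option_mask[y][x]:
                # classic alpha blend: result = (1 - a) * scene + a * overlay
                out_row.append(round((1 - alpha) * pixel + alpha * gray))
            else:
                out_row.append(pixel)  # real diseased-part image, unprocessed
        out.append(out_row)
    return out

scene = [[100, 100, 100],
         [100, 100, 100],
         [100, 100, 100]]   # live image of the diseased part (invented values)
mask = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]          # pixels covered by the selected cutting option
blended = blend_option_overlay(scene, mask)
```

An outline-only rendering is the degenerate case where the mask contains only the boundary pixels of the option shape; production systems would do the same blend on full-color frames with GPU support.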
- Meanwhile, in the present embodiment, the apparatus for adjusting a robotic surgery plan according to the present invention adjusts the
camera 330 toward the robot or sensors attached to the robot, and displays the previously known outlines of the robot superimposed on the real image of the robot inputted by the camera. Accordingly, it can easily be checked whether the measured positional relationship between the robot and the camera is correct. -
FIG. 3 shows another embodiment of the scene image obtaining unit and the user interface. - The embodiment of
FIG. 3, as an example of the scene image obtaining unit 330, also illustrates an optical camera attached to the mechanical arm 331 so as to move with it. The difference from the embodiment of FIG. 2 is that the user interface 340 is not attached to the optical camera, but is located away from it for comfortable viewing. - The
optical camera 330 and the user interface 340 could communicate with each other over a wired or wireless network. - In this case, as in the previous one, the sensor of the
mechanical arm 331 can find the position of the camera. The found position of the camera is used in the image registration of the image registration unit 310. The position of the camera could also be found using wireless methods such as infrared rays, besides the sensor attached to the mechanical arm 331. -
FIG. 4 shows an example of a screen of cutting options provided during surgery by the apparatus for adjusting a robotic surgery plan according to the present invention. - As illustrated in A of
FIG. 4, during surgery, when the surgeon moves the optical camera 330 to a desired position, in other words, above an exposed bone, the apparatus displays the image that is to be processed by the robot superimposed on the image obtained by the optical camera, via the user interface 340. - Referring to B of
FIG. 4, when the predetermined surgery plan has been suspended, the apparatus displays alternative cutting options on the main screen 400 of the user interface 340. - The cutting options are displayed superimposed on the image of the exposed bone. When the surgeon selects one of those cutting options, the
main screen 400 displays the selected cutting option superimposed on the real image of the diseased part. The image on the main screen 400 of FIG. 4 corresponds to the case in which the user selected option 1 among the three options. - Here, the surgery information storage unit stores libraries of possible cutting paths, and the apparatus according to the present invention provides these libraries to the user, thereby helping the user make a choice.
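Such a library might be organized as a mapping from surgery steps to candidate cutting paths, so that the apparatus can offer only the options applicable to the current step. The structure below is purely illustrative; the step name, option numbers, and path coordinates are invented for the sketch.

```python
# Sketch of a cutting-path library keyed by surgery step (all names illustrative).
CUTTING_LIBRARY = {
    "femur_resection": [
        {"option": 1, "path": [(0, 0), (10, 0), (10, 5)]},
        {"option": 2, "path": [(0, 0), (12, 0), (12, 4)]},
        {"option": 3, "path": [(0, 0), (8, 0), (8, 6)]},
    ],
}

def options_for_step(step):
    """Return the cutting options applicable to the given surgery step."""
    return CUTTING_LIBRARY.get(step, [])

opts = options_for_step("femur_resection")
print([o["option"] for o in opts])  # -> [1, 2, 3]
```

Each returned option carries the geometry needed to render it superimposed on the bone image, which is how the main screen 400 in FIG. 4 would populate its list of alternatives.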
-
FIG. 5 is a flowchart depicting the method for adjusting a robotic surgery plan according to the present invention. - In the explanation of the embodiment hereinafter, although each step of the method for adjusting a robotic surgery plan can be understood as being performed by the corresponding element of the apparatus for adjusting a robotic surgery plan explained with reference to FIG. 1, each step of the method should be defined by the function itself that constitutes that step. In other words, the performer of each step is not limited by the names of the elements given as examples of performers. - According to the method for adjusting a robotic surgery plan, in step S510, an image associated with a diseased part in the surgery room, obtained by the optical camera or the like, is displayed. When the image associated with the diseased part is obtained, in step S520, the apparatus matches the coordinates of that image with the coordinates of an image associated with a bone of the surgical target that has already been obtained by equipment such as CT. When the matching is complete, in step S530, the apparatus displays the pre-examined image associated with the bone of the surgical target superimposed on the real-time image associated with the diseased part in the surgery room.
- After that, upon receiving a request to modify the surgery plan from a surgeon in step S540, the apparatus provides, in step S550, at least one image associated with the cutting options that can be applied to the corresponding surgery step. When a cutting option to be applied is selected and input in step S560, the apparatus, in step S570, displays the selected cutting option superimposed on the image associated with the diseased part in the surgery room. In step S580, once the selected cutting option is fixed, the apparatus modifies the surgery plan by applying the fixed cutting option.
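Read as a whole, steps S510 through S580 form a single control flow. The sketch below substitutes plain strings for real images, registration, and user-interface calls, simply to make the branching of FIG. 5 concrete; every name in it is illustrative rather than an implementation the patent specifies.

```python
def register(live, ct):
    """S520: compute a transform matching CT coordinates to the live image
    (identity placeholder; a real system would run image registration)."""
    return "identity"

def overlay(live, layer, transform):
    """S530/S570: superimpose a layer on the live image (string placeholder)."""
    return f"{live}+{layer}@{transform}"

def adjust_surgery_plan(live, ct, options, selected, confirm):
    """Walk the S510-S580 flow; return the (possibly modified) plan and the
    sequence of screens that would have been displayed."""
    shown = [overlay(live, ct, register(live, ct))]      # S510-S530: display overlay
    plan = "original"
    if options:                                          # S540: modification requested
        choice = options[selected]                       # S550-S560: offer and select
        shown.append(overlay(live, choice, "identity"))  # S570: preview the choice
        if confirm:                                      # S580: fix option, update plan
            plan = f"modified:{choice}"
    return plan, shown

plan, screens = adjust_surgery_plan("live", "ct", ["opt1", "opt2", "opt3"], 0, True)
print(plan)  # -> modified:opt1
```

The key property the flowchart encodes is that the plan is only rewritten after the surgeon has seen the chosen option superimposed on the live image and confirmed it; until then the original plan stays in force.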
-
FIG. 6 illustrates an example of a screen of the user interface to which the present invention can be applied. - The screen of the user interface of
FIG. 6 shows an example of a screen that provides various processing options applicable to a bone transplant surgery, in which real bone is cut and artificial bone is transplanted, and that displays the selected option superimposed on the real image of the bone. - In
FIG. 6, images associated with the processing options are displayed. As illustrated in FIG. 6, on the top right-hand side of the screen of the user display device, a menu 341 for selecting options is provided, so that the surgeon can select the processing option that is to be applied to the robotic surgery. - In
FIG. 6, on the bottom left-hand side 610 of the screen of the user display device, an image of the real diseased part is displayed, and on the bottom right-hand side 620, an image of the selected processing option is displayed together with the image of the real diseased part. In other words, on the bottom right-hand side 620 of the screen, the selected processing option of size 5 is displayed superimposed on the real image of the diseased part using augmented reality, providing the surgeon with the predicted appearance of the diseased part after a bone transplant surgery performed with the size-5 processing option. If the surgeon judges that the size-5 transplant model does not properly match the state of the real bone of the diseased part, the surgeon can choose another option and thereby select the processing option of the most appropriate size. - According to the present invention, after checking in advance a virtual preview of the transplant as each provided processing option would apply to the real diseased part, the surgeon modifies the surgery plan by selecting and confirming the processing option of the most appropriate size. Accordingly, the surgeon can adjust the robotic surgery so that it is performed according to the modified surgery plan.
- According to the present invention described above through the embodiments, various requests for modifying robotic surgery plans can be handled actively and promptly.
- While the invention has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (10)
1. An apparatus for adjusting a robotic surgery plan, comprising:
a surgery information storage unit storing an examined first image associated with an inputted robotic surgery plan and a target bone of surgery;
a scene image obtaining unit obtaining a second image associated with a diseased part in real time in surgery room;
an image registration unit matching coordinates of the examined first image with coordinates of the second image associated with the diseased part;
a user interface displaying the examined first image and the second image associated with the diseased part; and
a surgery control unit controlling the user interface to display the examined first image to be superimposed on the second image associated with the diseased part, which is inputted in real time.
2. The apparatus according to claim 1 , wherein the surgery information storage unit further stores phased cutting options of the robotic surgery plan and related images thereof.
3. The apparatus according to claim 2 , wherein the surgery control unit provides at least one image associated with the cutting options that is applicable to a corresponding surgery step, according to request for modifying the surgery plan inputted via the user interface.
4. The apparatus according to claim 3 , wherein the surgery control unit controls the user interface so that at least one image associated with the cutting options is superimposed on the second image associated with the diseased part, and also displayed to be distinguishable from the second image associated with the diseased part.
5. The apparatus according to claim 1 , wherein the scene image obtaining unit comprises an optical camera and a mechanical arm which is attached to the optical camera and supports movements of the optical camera.
6. The apparatus according to claim 1 , wherein the scene image obtaining unit and the user interface are attached to each other to be moveable together.
7. The apparatus according to claim 1 , wherein the surgery control unit controls the user interface so that the user interface displays outlines of the first image to be superimposed on the second image associated with the diseased part.
8. The apparatus according to claim 7 , wherein the surgery control unit displays the outlines of the first image to be superimposed on the second image associated with the diseased part using augmented reality technology.
9. The apparatus according to claim 3 , wherein the surgery control unit modifies the robotic surgery plan based on a selected cutting option.
10. The apparatus according to claim 9 , further comprising a cutting robot processing a target bone of surgery according to the modified robotic surgery plan inputted from the surgery control unit.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2014-0051464 | 2014-04-29 | ||
KR1020140051464A KR101570857B1 (en) | 2014-04-29 | 2014-04-29 | Apparatus for adjusting robot surgery plans |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150305828A1 true US20150305828A1 (en) | 2015-10-29 |
Family
ID=54333686
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/697,840 Abandoned US20150305828A1 (en) | 2014-04-29 | 2015-04-28 | Apparatus for adjusting a robotic surgery plan |
Country Status (2)
Country | Link |
---|---|
US (1) | US20150305828A1 (en) |
KR (1) | KR101570857B1 (en) |
Cited By (65)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150140535A1 (en) * | 2012-05-25 | 2015-05-21 | Surgical Theater LLC | Hybrid image/scene renderer with hands free control |
US20150332458A1 (en) * | 2014-05-16 | 2015-11-19 | CUREXO, Inc | Method for detecting positions of tissues and apparatus using the same |
US20160078682A1 (en) * | 2013-04-24 | 2016-03-17 | Kawasaki Jukogyo Kabushiki Kaisha | Component mounting work support system and component mounting method |
US20160228191A1 (en) * | 2013-09-24 | 2016-08-11 | Koninklijke Philips N.V. | Method of calculating a surgical intervention plan |
CN107221032A (en) * | 2017-06-28 | 2017-09-29 | 华中科技大学鄂州工业技术研究院 | Virtual laparoscope hepatic cyst excision fenestration operation teaching method and system |
CN108697473A (en) * | 2015-12-29 | 2018-10-23 | 皇家飞利浦有限公司 | Image guiding robot assembles ablation |
US10134166B2 (en) * | 2015-03-24 | 2018-11-20 | Augmedics Ltd. | Combining video-based and optic-based augmented reality in a near eye display |
US20190201130A1 (en) * | 2017-12-28 | 2019-07-04 | Ethicon Llc | Communication of data where a surgical network is using context of the data and requirements of a receiving system / user to influence inclusion or linkage of data and metadata to establish continuity |
US10499997B2 (en) | 2017-01-03 | 2019-12-10 | Mako Surgical Corp. | Systems and methods for surgical navigation |
US10650594B2 (en) | 2015-02-03 | 2020-05-12 | Globus Medical Inc. | Surgeon head-mounted display apparatuses |
US10646283B2 (en) | 2018-02-19 | 2020-05-12 | Globus Medical Inc. | Augmented reality navigation systems for use with robotic surgical systems and methods of their use |
CN111563932A (en) * | 2020-05-18 | 2020-08-21 | 苏州立威新谱生物科技有限公司 | Overlapping coaxial surgical operation control method, system and readable storage medium |
US10759074B2 (en) * | 2016-06-24 | 2020-09-01 | Zünd Systemtechnik Ag | System for cutting of cutting stock |
US20200367733A1 (en) * | 2018-02-21 | 2020-11-26 | Olympus Corporation | Medical system and medical system operating method |
US10939977B2 (en) | 2018-11-26 | 2021-03-09 | Augmedics Ltd. | Positioning marker |
US11153555B1 (en) | 2020-05-08 | 2021-10-19 | Globus Medical Inc. | Extended reality headset camera system for computer assisted navigation in surgery |
US11160614B2 (en) | 2017-10-25 | 2021-11-02 | Synaptive Medical Inc. | Surgical imaging sensor and display unit, and surgical navigation system associated therewith |
US11207150B2 (en) | 2020-02-19 | 2021-12-28 | Globus Medical, Inc. | Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment |
US11382699B2 (en) | 2020-02-10 | 2022-07-12 | Globus Medical Inc. | Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery |
US11382700B2 (en) | 2020-05-08 | 2022-07-12 | Globus Medical Inc. | Extended reality headset tool tracking and control |
US11389252B2 (en) | 2020-06-15 | 2022-07-19 | Augmedics Ltd. | Rotating marker for image guided surgery |
US11464581B2 (en) | 2020-01-28 | 2022-10-11 | Globus Medical, Inc. | Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums |
US11510750B2 (en) | 2020-05-08 | 2022-11-29 | Globus Medical, Inc. | Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications |
US11532250B2 (en) * | 2017-01-11 | 2022-12-20 | Sony Corporation | Information processing device, information processing method, screen, and information drawing system |
US11607277B2 (en) | 2020-04-29 | 2023-03-21 | Globus Medical, Inc. | Registration of surgical tool with reference array tracked by cameras of an extended reality headset for assisted navigation during surgery |
US11701185B2 (en) | 2017-12-28 | 2023-07-18 | Cilag Gmbh International | Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices |
US11701189B2 (en) | 2020-01-31 | 2023-07-18 | Curexo, Inc. | Device for providing joint replacement robotic surgery information and method for providing same |
US11737831B2 (en) | 2020-09-02 | 2023-08-29 | Globus Medical Inc. | Surgical object tracking template generation for computer assisted navigation during surgical procedure |
US11744604B2 (en) | 2017-12-28 | 2023-09-05 | Cilag Gmbh International | Surgical instrument with a hardware-only control circuit |
US11751958B2 (en) | 2017-12-28 | 2023-09-12 | Cilag Gmbh International | Surgical hub coordination of control and communication of operating room devices |
US11766296B2 (en) | 2018-11-26 | 2023-09-26 | Augmedics Ltd. | Tracking system for image-guided surgery |
US11771487B2 (en) | 2017-12-28 | 2023-10-03 | Cilag Gmbh International | Mechanisms for controlling different electromechanical systems of an electrosurgical instrument |
US11775682B2 (en) | 2017-12-28 | 2023-10-03 | Cilag Gmbh International | Data stripping method to interrogate patient records and create anonymized record |
US11779337B2 (en) | 2017-12-28 | 2023-10-10 | Cilag Gmbh International | Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices |
US11786379B2 (en) | 2017-08-11 | 2023-10-17 | Think Surgical, Inc. | System and method for implant verification |
US11786251B2 (en) | 2017-12-28 | 2023-10-17 | Cilag Gmbh International | Method for adaptive control schemes for surgical network control and interaction |
US11793537B2 (en) | 2017-10-30 | 2023-10-24 | Cilag Gmbh International | Surgical instrument comprising an adaptive electrical system |
US11801098B2 (en) | 2017-10-30 | 2023-10-31 | Cilag Gmbh International | Method of hub communication with surgical instrument systems |
US11801115B2 (en) | 2019-12-22 | 2023-10-31 | Augmedics Ltd. | Mirroring in image guided surgery |
US11818052B2 (en) | 2017-12-28 | 2023-11-14 | Cilag Gmbh International | Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs |
US11832899B2 (en) | 2017-12-28 | 2023-12-05 | Cilag Gmbh International | Surgical systems with autonomously adjustable control programs |
US11839396B2 (en) | 2018-03-08 | 2023-12-12 | Cilag Gmbh International | Fine dissection mode for tissue classification |
US11844579B2 (en) | 2017-12-28 | 2023-12-19 | Cilag Gmbh International | Adjustments based on airborne particle properties |
US11857152B2 (en) | 2017-12-28 | 2024-01-02 | Cilag Gmbh International | Surgical hub spatial awareness to determine devices in operating theater |
CN117338437A (en) * | 2023-08-24 | 2024-01-05 | 首都医科大学宣武医院 | Mixed reality robot system |
US11864728B2 (en) | 2017-12-28 | 2024-01-09 | Cilag Gmbh International | Characterization of tissue irregularities through the use of mono-chromatic light refractivity |
US11864845B2 (en) | 2017-12-28 | 2024-01-09 | Cilag Gmbh International | Sterile field interactive control displays |
US11871901B2 (en) | 2012-05-20 | 2024-01-16 | Cilag Gmbh International | Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage |
US11890065B2 (en) | 2017-12-28 | 2024-02-06 | Cilag Gmbh International | Surgical system to limit displacement |
US11896322B2 (en) | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub |
US11896443B2 (en) | 2017-12-28 | 2024-02-13 | Cilag Gmbh International | Control of a surgical system through a surgical barrier |
US11903587B2 (en) | 2017-12-28 | 2024-02-20 | Cilag Gmbh International | Adjustment to the surgical stapling control based on situational awareness |
US11911045B2 (en) | 2017-10-30 | 2024-02-27 | Cllag GmbH International | Method for operating a powered articulating multi-clip applier |
US11925350B2 (en) | 2019-02-19 | 2024-03-12 | Cilag Gmbh International | Method for providing an authentication lockout in a surgical stapler with a replaceable cartridge |
US11931027B2 (en) | 2018-03-28 | 2024-03-19 | Cilag Gmbh Interntional | Surgical instrument comprising an adaptive control system |
US11969142B2 (en) | 2017-12-28 | 2024-04-30 | Cilag Gmbh International | Method of compressing tissue within a stapling device and simultaneously displaying the location of the tissue within the jaws |
US11969216B2 (en) | 2017-12-28 | 2024-04-30 | Cilag Gmbh International | Surgical network recommendations from real time analysis of procedure variables against a baseline highlighting differences from the optimal solution |
US11974887B2 (en) | 2018-05-02 | 2024-05-07 | Augmedics Ltd. | Registration marker for an augmented reality system |
US11980506B2 (en) | 2019-07-29 | 2024-05-14 | Augmedics Ltd. | Fiducial marker |
US11986233B2 (en) | 2018-03-08 | 2024-05-21 | Cilag Gmbh International | Adjustment of complex impedance to compensate for lost power in an articulating ultrasonic device |
US11986185B2 (en) | 2018-03-28 | 2024-05-21 | Cilag Gmbh International | Methods for controlling a surgical stapler |
US11992373B2 (en) | 2019-12-10 | 2024-05-28 | Globus Medical, Inc | Augmented reality headset with varied opacity for navigated robotic surgery |
US11998193B2 (en) | 2017-12-28 | 2024-06-04 | Cilag Gmbh International | Method for usage of the shroud as an aspect of sensing or controlling a powered surgical device, and a control algorithm to adjust its default operation |
US12009095B2 (en) | 2017-12-28 | 2024-06-11 | Cilag Gmbh International | Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes |
US12029506B2 (en) | 2021-06-29 | 2024-07-09 | Cilag Gmbh International | Method of cloud based data analytics for use with the hub |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101863574B1 (en) | 2016-12-29 | 2018-06-01 | 경북대학교 산학협력단 | Application method to fluoroscopy in laser guidance system, recording medium and laser guidance system including calibration tool for performing the method |
KR102107019B1 (en) * | 2020-02-24 | 2020-05-28 | 대구보건대학교산학협력단 | Dental sculpture training system and dental sculpture training method using the same |
KR20240041681A (en) * | 2022-09-23 | 2024-04-01 | 큐렉소 주식회사 | Apparatus for planning cutting path of surgical robot, and mehtod thereof |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080243142A1 (en) * | 2007-02-20 | 2008-10-02 | Gildenberg Philip L | Videotactic and audiotactic assisted surgical methods and procedures |
US8010180B2 (en) * | 2002-03-06 | 2011-08-30 | Mako Surgical Corp. | Haptic guidance system and method |
US20130293578A1 (en) * | 2012-05-02 | 2013-11-07 | Empire Technology Development LLC. a corporation | Four Dimensional Image Registration Using Dynamical Model For Augmented Reality In Medical Applications |
US20160148052A1 (en) * | 2013-07-16 | 2016-05-26 | Seiko Epson Corporation | Information processing apparatus, information processing method, and information processing system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002085421A (en) * | 2000-09-20 | 2002-03-26 | Takahiro Ochi | Surgery supporting system |
KR101334007B1 (en) | 2012-01-12 | 2013-11-27 | 의료법인 우리들의료재단 | Surgical Robot Control System and Method therefor |
-
2014
- 2014-04-29 KR KR1020140051464A patent/KR101570857B1/en active IP Right Grant
-
2015
- 2015-04-28 US US14/697,840 patent/US20150305828A1/en not_active Abandoned
US11510750B2 (en) | 2020-05-08 | 2022-11-29 | Globus Medical, Inc. | Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications |
CN111563932A (en) * | 2020-05-18 | 2020-08-21 | 苏州立威新谱生物科技有限公司 | Overlapping coaxial surgical operation control method, system and readable storage medium |
US11389252B2 (en) | 2020-06-15 | 2022-07-19 | Augmedics Ltd. | Rotating marker for image guided surgery |
US11737831B2 (en) | 2020-09-02 | 2023-08-29 | Globus Medical Inc. | Surgical object tracking template generation for computer assisted navigation during surgical procedure |
US12029506B2 (en) | 2021-06-29 | 2024-07-09 | Cilag Gmbh International | Method of cloud based data analytics for use with the hub |
CN117338437A (en) * | 2023-08-24 | 2024-01-05 | 首都医科大学宣武医院 | Mixed reality robot system |
Also Published As
Publication number | Publication date |
---|---|
KR101570857B1 (en) | 2015-11-24 |
KR20150125069A (en) | 2015-11-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20150305828A1 (en) | Apparatus for adjusting a robotic surgery plan | |
US10951872B2 (en) | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays with real time visualization of tracked instruments | |
US11844574B2 (en) | Patient-specific preoperative planning simulation techniques | |
US11806085B2 (en) | Guidance for placement of surgical ports | |
US11660142B2 (en) | Method for generating surgical simulation information and program | |
TWI707660B (en) | Wearable image display device for surgery and surgery information real-time system | |
KR20190088419A (en) | Program and method for generating surgical simulation information | |
KR101940706B1 (en) | Program and method for generating surgical simulation information | |
WO2020210972A1 (en) | Wearable image display device for surgery and surgical information real-time presentation system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: CUREXO, INC, KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PARK, YOUNG-BAE;SONG, CHANG-HUN;LEE, JAE-JUN;SIGNING DATES FROM 20150416 TO 20150420;REEL/FRAME:035510/0994 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |