WO2014077613A1 - Reduction procedure robot and drive control method thereof - Google Patents
Reduction procedure robot and drive control method thereof (정복 시술 로봇 및 그의 구동 제어 방법)
- Publication number
- WO2014077613A1 (PCT/KR2013/010387)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- bone
- region
- area
- fracture
- interpolation image
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/56—Surgical instruments or methods for treatment of bones or joints; Devices specially adapted therefor
- A61B17/58—Surgical instruments or methods for treatment of bones or joints; Devices specially adapted therefor for osteosynthesis, e.g. bone plates, screws, setting implements or the like
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/56—Surgical instruments or methods for treatment of bones or joints; Devices specially adapted therefor
- A61B17/58—Surgical instruments or methods for treatment of bones or joints; Devices specially adapted therefor for osteosynthesis, e.g. bone plates, screws, setting implements or the like
- A61B17/60—Surgical instruments or methods for treatment of bones or joints; Devices specially adapted therefor for osteosynthesis, e.g. bone plates, screws, setting implements or the like for external osteosynthesis, e.g. distractors, contractors
- A61B17/66—Alignment, compression or distraction mechanisms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
- B25J11/009—Nursing, e.g. carrying sick persons, pushing wheelchairs, distributing drugs
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/211—Image signal generators using stereoscopic image cameras using a single 2D image sensor using temporal multiplexing
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/505—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of bone
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10081—Computed x-ray tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10088—Magnetic resonance imaging [MRI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10116—X-ray image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30008—Bone
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
- G06V20/647—Three-dimensional objects by matching two-dimensional images to three-dimensional objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
- H04N13/117—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
Definitions
- The present invention relates to a reduction robot and a drive control method thereof, capable of displaying a fracture site together with a three-dimensional interpolation image of the symmetrical position where no fracture has occurred.
- Robots are mechanical devices that make it possible to perform many tasks that are difficult or dangerous for people. Recently, robots have been applied not only in industry but also in the medical field.
- A robot for reduction surgery includes a photographing unit that captures a plurality of bone images at different angles for each of a patient's fracture region and a normal region corresponding to the fracture region; an image processor configured to generate a 3D interpolation image of each of the fracture region and the normal region using the captured bone images; and a display unit that displays the generated 3D interpolation image of the fracture region together with the generated 3D interpolation image of the normal region.
- The photographing unit may photograph the fracture area and the normal area using at least one of an X-ray, MRI, or CT device.
- The image processor may extract the diaphysis portion of the bone image and generate a 3D interpolation image of the extracted diaphysis portion.
- The image processor may generate the 3D interpolation image in consideration of at least one of the diameter, bending, and length of the bone in the bone image.
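As an illustration of such parameter-based interpolation, the bone shaft could be modeled as a circular arc whose diameter, bending, and length match the measured values. The following Python sketch is only an assumed parameterization; the function name, the arc model, and the sample count are not from the patent:

```python
import math

def interpolate_shaft(length, diameter, bend_deg, samples=20):
    """Hypothetical sketch: build a 3-D centerline for a bone shaft from the
    three parameters the patent names (length, diameter, bending), modeling
    the bend as a circular arc whose arc length equals the bone length."""
    if bend_deg == 0:
        centerline = [(0.0, 0.0, length * i / (samples - 1)) for i in range(samples)]
    else:
        theta = math.radians(bend_deg)
        radius = length / theta  # arc radius chosen so the arc length equals `length`
        centerline = [
            (radius * (1 - math.cos(theta * i / (samples - 1))),
             0.0,
             radius * math.sin(theta * i / (samples - 1)))
            for i in range(samples)
        ]
    return {"diameter": diameter, "centerline": centerline}

model = interpolate_shaft(length=300.0, diameter=25.0, bend_deg=10.0)
```

A surface model could then be produced by sweeping a circle of the given diameter along the centerline.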
- The robot may further include a position detecting unit configured to detect the position of the movable bone area within the fracture area, and the display unit may overlap the 3D interpolation image of the normal area with the 3D interpolation image of the fracture area based on the detected position.
- The reduction robot may further include a matching unit that extracts a bone outline from each of the generated 3D interpolation image of the fracture region and the generated 3D interpolation image of the normal region and calculates the degree of coincidence of the extracted bone outlines, and the display unit may display the calculated degree of coincidence.
- The matching unit may divide the 3D interpolation image into a plurality of regions, and calculate the degree of coincidence of the bone outlines for each divided region.
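A minimal sketch of the per-region matching degree, assuming each outline is reduced to bone widths sampled at common stations along the axis; the representation and the scoring formula are assumptions, not the patent's method:

```python
def region_match(widths_a, widths_b, n_regions=3):
    """Hypothetical per-region matching degree: each bone outline is reduced
    to widths sampled at the same stations along the axis, the samples are
    split into n_regions equal slices (e.g. upper/middle/lower), and each
    slice is scored as 1 minus the normalized mean width difference."""
    assert len(widths_a) == len(widths_b)
    n = len(widths_a)
    scores = []
    for r in range(n_regions):
        lo, hi = r * n // n_regions, (r + 1) * n // n_regions
        diffs = [abs(a - b) for a, b in zip(widths_a[lo:hi], widths_b[lo:hi])]
        scale = max(max(widths_a[lo:hi]), max(widths_b[lo:hi])) or 1.0
        scores.append(1.0 - (sum(diffs) / len(diffs)) / scale)
    return scores
```

A perfect reduction would score 1.0 in every region; a region-specific mismatch points at where the fragment still deviates.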
- The reduction robot may include a driving unit fixed to one side of the movable bone area within the fracture area to move the bone area, and an operation unit that receives a control command for controlling the operation of the driving unit.
- The operation unit may receive a stop command as a control command, and when the stop command is input, the driving unit may maintain the position of the bone region in its current state.
- The reduction robot may further include a path calculation unit that calculates a movement path for reduction of the movable bone area within the fracture area, and a control unit that stops the operation of the driving unit if the bone area deviates from the calculated movement path.
- the driving unit may be connected to one side of the bone region through a fixing pin, a connecting portion, and an outer fixing portion.
- The fixing pin may include a spiral region inserted into the fractured bone region and a threshold region protruding so that the spiral region is inserted only to a predetermined depth.
- an outer shape of the threshold region may have a polygonal shape with respect to the central axis of the fixing pin.
- connection part may include a first area capable of fixing the fixing pin on one side and a second area capable of fixing the external fixing part on the other side.
- The outer fixing portion may have at least one of a U shape, an annular ring shape, a semicircle shape, or a straight shape.
- The drive control method of the reduction robot includes photographing a plurality of bone images at different angles for each of the patient's fracture area and the normal area corresponding to the fracture area, generating a 3D interpolation image of each of the fracture area and the normal area using the photographed bone images, and displaying the generated 3D interpolation image of the fracture area together with the generated 3D interpolation image of the normal area.
- the fracture region and the normal region may be photographed using at least one device of X-ray, MRI, and CT.
- The diaphysis portion of the bone image may be extracted, and a 3D interpolation image of the extracted diaphysis portion may be generated.
- The 3D interpolation image may be generated in consideration of at least one of the diameter, bending, and length of the bone in the bone image.
- The driving control method may further include detecting the position of the movable bone area within the fracture area, and the displaying may overlap the 3D interpolation image of the normal area with the 3D interpolation image of the fracture area on the screen based on the detected position.
- The driving control method may further include extracting a bone outline from each of the generated 3D interpolation image of the fracture area and the generated 3D interpolation image of the normal area, calculating the degree of coincidence of the extracted bone outlines, and displaying the calculated degree of coincidence.
- The 3D interpolation image may be divided into a plurality of regions, and the degree of coincidence of the bone outlines may be calculated for each divided region.
- The drive control method may further include receiving a control command for controlling the driving of the reduction surgery robot, and moving the movable bone area within the fracture area according to the input control command.
- The receiving of the control command may include receiving a stop command, and when the stop command is input, the driving maintains the position of the bone area in its current state.
- The drive control method may further include calculating a movement path for reduction of the movable bone area within the fracture area, and stopping the driving of the reduction robot if the bone area deviates from the calculated movement path.
- FIG. 1 is a simplified block diagram of a reduction robot according to an embodiment of the present invention;
- FIG. 2 is a specific block diagram of a reduction robot according to an embodiment of the present invention;
- FIG. 3 is a view showing the shape of the fixing pin according to the present embodiment
- FIGS. 4 and 5 are views illustrating the connection part and the external fixing part according to the present embodiment and an example of coupling them;
- FIG. 7 to 9 illustrate various examples of a user interface window that may be displayed on the display of FIG. 1;
- FIG. 10 is a flowchart illustrating a driving control method of a reduction robot according to an embodiment of the present invention.
- FIG. 11 is a flowchart for describing the driving control method of FIG. 10 in detail.
- FIG. 1 is a simplified block diagram of a reduction robot according to an embodiment of the present invention.
- Referring to FIG. 1, the reduction robot 100 may include a photographing unit 110, an image processing unit 120, and a display unit 130.
- The imaging unit 110 captures a plurality of bone images at different angles for each of the patient's fracture area and normal area.
- Specifically, the photographing unit 110 may photograph a plurality of bone images while rotating around each of the patient's fracture area and the normal area corresponding to it, using a technique capable of bone imaging such as X-ray, MRI, or CT.
- the normal region is a bone region located opposite to the bone located in the fracture region. For example, when the fracture region is the left femur, the normal region is the right femur.
- The reason for photographing the same area at different angles is that it is difficult to check the rotational state of the bone and the like when the fracture reduction is checked using only a two-dimensional image. Therefore, in the present embodiment, the same region is photographed at different angles, and three-dimensional interpolation is performed on the bone images photographed at different angles in the image processing unit 120 described later.
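The multi-angle rationale can be sketched as follows: two roughly orthogonal projections sampled at the same heights suffice to recover a 3-D centerline, since one view supplies the x-offset and the other the y-offset at each station. This assumes exactly orthogonal, matched views, which the patent does not require:

```python
def fuse_projections(ap_centerline, lateral_centerline):
    """Combine 2-D centerlines from two orthogonal bone images into 3-D
    points: the AP view gives (x, z) and the lateral view gives (y, z) at
    the same axial stations z. Illustrative assumption, not the patent's
    interpolation algorithm."""
    points = []
    for (x, z1), (y, z2) in zip(ap_centerline, lateral_centerline):
        assert abs(z1 - z2) < 1e-6  # both views sampled at the same height
        points.append((x, y, z1))
    return points
```

A single projection collapses one axis, which is why rotation about the bone axis cannot be judged from one 2-D image alone.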
- The image processor 120 generates a 3D interpolation image of each of the fracture area and the normal area using the plurality of captured bone images.
- Specifically, the image processor 120 may use the plurality of bone images photographed by the photographing unit 110 to generate a three-dimensional interpolation image of the fracture region (hereinafter, a 3D fracture image) and a three-dimensional interpolation image of the normal region (hereinafter, a 3D normal image).
- A more specific method of generating the interpolation image will be described later with reference to FIG. 6.
- The image processor 120 may perform image processing on the generated 3D interpolation image according to the position of the bone detected by the sensing unit 140, described later. Specifically, the bone of the fracture area is moved by the reduction operation. Accordingly, the image processor 120 may update the position of the movable region on the 3D fracture image according to the degree of bone movement sensed by the sensing unit 140.
- The display unit 130 displays various types of information provided by the reduction robot 100.
- The display unit 130 displays the 3D interpolation image of the fracture area (i.e., the 3D fracture image) and the 3D interpolation image of the normal area (i.e., the 3D normal image).
- The display unit 130 may overlay the 3D fracture image on the 3D normal image, as shown in FIG. 7.
- The display unit 130 may display the matching degree calculated by the matching unit, described later, as shown in FIG. 9.
- Various user interface windows that may be displayed on the display unit 130 will be described later with reference to FIGS. 7 to 9.
- As described above, the reduction robot 100 generates and displays three-dimensional interpolation images of the fracture region and of the normal region corresponding to it, so the doctor can perform a more precise reduction procedure.
- Although the photographing unit 110, the image processing unit 120, and the display unit 130 are illustrated and described as applied only to the reduction robot 100, in implementation the photographing unit 110, the image processing unit 120, and the display unit 130 may be implemented as a diagnostic device for measuring the degree of a patient's fracture.
- The reduction robot 100 may further include other components in addition to the aforementioned configuration. A more detailed configuration of the reduction robot 100 will be described below with reference to FIG. 2.
- FIG. 2 is a detailed block diagram of the reduction robot according to an embodiment of the present invention.
- Referring to FIG. 2, the reduction robot 100 includes a photographing unit 110, an image processing unit 120, a display unit 130, a sensing unit 140, a matching unit 150, a driving unit 160, an operation unit 170, a path calculator 180, and a controller 190.
- the sensing unit 140 detects a position of a bone area that can be moved among the fracture areas.
- The reduction robot 100 moves the movable bone area within the fracture area to reduce the bone.
- the bone position of the fracture area is changed by this movement, so it is necessary to confirm how much such movement should be performed.
- Conventionally, the state of the bone after such movement is confirmed through additional X-ray imaging.
- In the present embodiment, a marker is installed on each of the fixed region and the movable region of the fracture area, and the sensing unit 140 tracks the markers installed on the movable region to detect the position of the movable bone area. By using such a sensing unit 140, the movement of the movable bone area within the fracture area can be tracked without additional X-ray imaging.
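A simplified sketch of marker-based position detection, estimating the movable fragment's displacement from the shift of its marker centroid between two sensor frames (rotation estimation is omitted for brevity; this is illustrative, not the patent's tracking method):

```python
def marker_translation(markers_prev, markers_now):
    """Estimate the movable fragment's displacement as the shift of its
    marker centroid between two sensor frames. The marker coordinates are
    assumed to be 3-D points from an external tracker; a full tracker would
    also recover rotation, e.g. by point-set registration."""
    n = len(markers_prev)
    prev_c = [sum(m[axis] for m in markers_prev) / n for axis in range(3)]
    now_c = [sum(m[axis] for m in markers_now) / n for axis in range(3)]
    return tuple(now_c[axis] - prev_c[axis] for axis in range(3))
```

The resulting displacement is what the image processor would apply to the movable region on the 3D fracture image.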
- The matching unit 150 extracts a bone outline from each of the generated 3D interpolation image of the fracture area and the generated 3D interpolation image of the normal area, and calculates the degree of matching of the extracted bone outlines.
- The matching unit 150 may divide the 3D interpolation image into a plurality of regions, and calculate the matching degree of the bone outlines for each divided region.
- For example, the matching unit 150 may divide the 3D interpolation image into an upper region, an intermediate region, and a lower region, and calculate the matching degree of each region separately. By calculating the match over a plurality of regions in this way, the rotation, shortening, or over-lengthening of the entire bone can be determined.
- In the present embodiment, the bone outline is extracted from the 3D interpolation image, and the degree of coincidence between the two bones is calculated using the extracted bone outlines; however, various other parameters, such as the volume and area of the bone, may also be used to calculate the degree of coincidence between the two bones.
- the degree of coincidence calculated in this manner may be displayed on the display unit 130 described above.
- the calculation of the matching degree may be performed in real time according to the movement of the bone.
- the driving unit 160 is fixed to one side of the bone area that can be moved among the fracture areas, and moves the bone area.
- the driving unit 160 includes a motor and a robot arm, and moves the bone area, which is movable in the fracture area, using the robot arm based on an operation command of the operation unit 170 to be described later.
- the driving unit 160 may be connected to the fracture area through the apparatus as shown in FIGS. 3 to 5.
- The operation unit 170 provides a plurality of function keys for setting or selecting the various functions supported by the reduction robot 100, and receives control commands for controlling the operation of the driving unit 160.
- Although the display unit 130 and the operation unit 170 are illustrated and described as separate components, in implementation they may be realized as a single device in which input and output are performed simultaneously, such as a touch screen.
- The path calculator 180 calculates a movement path for reducing the movable bone region within the fracture region. Specifically, using the 3D normal image and the 3D fracture image, the path calculator 180 can calculate the movement path along which the fractured bone region on the 3D fracture image is brought to the same position as the normal bone on the 3D normal image.
- The movement path may be calculated by an algorithm optimized through various experiments.
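As an illustration, the simplest such path is a straight line of waypoints from the fragment's current position to the target pose taken from the mirrored 3D normal image. This sketch handles positions only; an actual planner would also interpolate orientation and, per the text above, be tuned experimentally:

```python
def reduction_path(current, target, steps=10):
    """Sketch of the path calculator: evenly spaced straight-line waypoints
    from the fragment's current 3-D position to the target position derived
    from the 3D normal image. Hypothetical simplification of the patent's
    experimentally optimized algorithm."""
    return [
        tuple(c + (t - c) * k / steps for c, t in zip(current, target))
        for k in range(steps + 1)
    ]
```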
- The controller 190 controls each component in the reduction robot 100. Specifically, when a command for starting a reduction operation is input, the controller 190 may control the photographing unit 110 to image the fracture area and the normal area. The controller 190 may control the image processor 120 to generate 3D interpolation images from the plurality of bone images, and may control the display 130 to display the generated 3D interpolation images.
- The controller 190 may control the path calculator 180 to calculate a movement path based on the generated 3D interpolation image.
- the driving unit 160 may be controlled to perform driving according to the received driving command.
- The sensing unit 140 detects the movement of the bone, and the controller 190 may update the 3D interpolation image displayed on the display unit 130 according to the detected movement.
- When the controller 190 receives a stop command through the manipulation unit 170, the controller 190 may control the driving unit 160 to maintain the position of the bone region in its current state, that is, to hold the motor of the driving unit 160 (more specifically, the motor rotation angle) in its current state.
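The off-path safety behavior described above can be sketched as a distance check of the sensed fragment position against the calculated waypoints; the 2 mm tolerance below is an assumed value, not one given in the patent:

```python
import math

def should_stop(sensed_position, path, tolerance_mm=2.0):
    """Illustrative safety check for the controller: stop the driving unit
    when the sensed fragment position strays farther than tolerance_mm from
    every waypoint of the calculated reduction path."""
    nearest = min(math.dist(sensed_position, waypoint) for waypoint in path)
    return nearest > tolerance_mm
```

In a real controller this check would run in the same loop that reads the marker tracker, so the drive halts within one control cycle of a deviation.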
- As described above, the reduction robot 100 generates and displays 3D interpolation images of the fracture region and of the corresponding normal region, and updates the 3D interpolation image in real time in response to bone movement during surgery, so that doctors can perform the reduction more easily.
- Using the three-dimensional interpolation image, the reduction procedure can be performed while more precisely checking the rotational state, shortening state, and the like of the entire bone.
- FIG. 3 is a view showing the shape of the fixing pin according to the present embodiment.
- the fixing pin 200 is connected to the bone of the fracture region and the robot arm of the driving unit 160. Specifically, the fixing pin 200 includes a spiral region 210 and a threshold region 220.
- Spiral region 210 is a region inserted into the fractured bone region. Such a spiral region may be approximately 7 cm long. However, the present invention is not limited to such a value, and the length of the spiral region 210 may vary depending on the condition of the patient.
- The threshold region 220 is a protruding region that prevents the fixing pin 200 from being inserted into the bone beyond a predetermined depth.
- The threshold region 220 may have the form of a circle or a polygon (triangle, quadrilateral, etc.).
- Although the threshold region 220 is shown as occupying a predetermined length at the end of the spiral region 210 of the fixing pin 200, in implementation the threshold region 220 may extend from the end of the spiral region 210 to the end of the fixing pin 200.
- Likewise, although the diameter of the threshold region 220 is shown as constant, in implementation the threshold region 220 may have a varying diameter between the end of the spiral region 210 and the end of the fixing pin 200.
- the appearance of the threshold region is preferably a polygonal shape based on the central axis of the fixing pin 200.
- Because the threshold region has a polygonal shape, the fixing pin can easily be inserted into a bone with a general tool such as a driver or a monkey wrench, without using a special fixing-pin insertion tool, and its strength can be made greater than that of a circular one.
- The fixing pin 200 is provided with the threshold region 220 to prevent the pin from being inserted into the bone beyond a predetermined depth, thereby preventing the fixing pin from penetrating through the bone.
- FIG. 4 is a view showing the shape of the connection part according to the present embodiment.
- connection part 300 couples the fixing pin 200 and the external fixing part 400.
- the connection part 300 includes a first area for fixing the fixing pin on one side and a second area for fixing the outer fixing part 400 on the other side.
- Specifically, the connection part 300 may include a first area capable of fixing the fixing pin on one side; the fixing pin 200 is positioned in the first area and fastened using a wrench or the like.
- the shape of the first region may have a shape corresponding to the outer shape of the fixing pin 200. For example, if the external shape of the fixing pin 200 is circular, the external shape of the first region may also be circular. If the external shape of the fixing pin 200 is triangular, the external shape of the first region may also be triangular.
- the shape in which the external fixing part 400 is attached to the connection part 300 is illustrated in FIG. 5.
- the outer fixing part 400 may have at least one of a U shape, an annular ring shape, a semicircle shape, and a straight shape, and the shape of the second region corresponds to the outer shape of the outer fixing part 400. It may have a shape.
- the external fixing part 400 may be connected to the robot arm of the driving unit 160 described above. Accordingly, the driving unit 160 is fixed to the fractured bone through the continuous coupling of the robot arm, the external fixing part 400, the connecting part 300, and the fixing pin 200, and can thereby move the fractured bone area.
- in FIG. 5, only the connection relationship between one external fixing part 400 and one connecting part 300 is illustrated.
- however, a plurality of fixing pins 200, a plurality of connecting parts 300, and a plurality of external fixing parts 400 may also be connected in combination.
- FIG. 6 is a view for explaining the structure of the bone.
- the bone shown in FIG. 6 is a human thigh bone (femur); such a bone can be divided into the epiphyses at the upper and lower ends, the diaphysis in the middle, and the metaphyses between them.
- the robot according to the present embodiment mainly targets fractures in the middle region of the bone, that is, the diaphysis.
- the image processor 120 may extract parameters such as bone length, diameter, and degree of bending from each of the plurality of images, and generate a 3D interpolation image using the extracted parameters.
- because a three-dimensional interpolation image can be generated from as few as two bone images, the radiation exposure of the patient may be reduced, and the present invention may be applied even to pregnant women or pediatric patients.
- the three-dimensional interpolation as described above may be performed not only on the whole bone but also on a certain area of the bone.
- for the epiphyseal and metaphyseal regions, where 3D interpolation is difficult, standardized data of an average person may be used, and 3D interpolation may be performed only on the middle part (i.e., the diaphysis), where interpolation is relatively simple.
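As a rough illustration of this kind of parameter-based interpolation, the sketch below (a hypothetical example, not the patent's implementation) reconstructs a 3D volume from two orthogonal bone silhouettes by stacking elliptical cross-sections: the per-row widths of each view give the diameters, and the per-view centerlines capture the bending.

```python
import numpy as np

def extract_params(mask):
    """Per-row width and centerline of a binary bone silhouette (rows = along bone axis)."""
    widths, centers = [], []
    for row in mask:
        cols = np.flatnonzero(row)
        if cols.size == 0:
            widths.append(0.0)
            centers.append(0.0)
        else:
            widths.append(float(cols[-1] - cols[0] + 1))
            centers.append(float(cols.mean()))
    return np.array(widths), np.array(centers)

def interpolate_volume(mask_ap, mask_lat):
    """Stack elliptical cross-sections whose axes come from two orthogonal views."""
    w_ap, c_ap = extract_params(mask_ap)      # e.g. anteroposterior view
    w_lat, c_lat = extract_params(mask_lat)   # e.g. lateral view
    n = len(w_ap)
    size = mask_ap.shape[1]
    vol = np.zeros((n, size, size), dtype=bool)
    yy, xx = np.mgrid[0:size, 0:size]
    for z in range(n):
        a, b = w_ap[z] / 2.0, w_lat[z] / 2.0
        if a == 0 or b == 0:
            continue
        # ellipse centered at the per-view centerlines models the bending
        vol[z] = ((xx - c_ap[z]) / a) ** 2 + ((yy - c_lat[z]) / b) ** 2 <= 1.0
    return vol
```

A real system would also need calibration between the views and smoothing along the axis; this only shows how length, diameter, and bending parameters can drive the interpolation.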
- FIG. 7 to 9 illustrate various examples of a user interface window that may be displayed on the display of FIG. 1.
- the UI window overlays and displays the 3D fracture image 720 on the 3D normal image 710.
- because the 3D fracture image 720 is overlaid on the 3D normal image, a surgeon may more easily identify the movement path of the movable bone area 721 within the fracture area. Such a user interface window may be updated in real time as the bone moves.
- the user interface window may display the overlaid 3D fracture image 720 with a color, luminance, or brightness different from that of the 3D normal image.
- in addition, the overlay degree (i.e., transparency) and the overlay position of the displayed 3D fracture image may be changed by a user's manipulation.
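A minimal sketch of such an adjustable overlay, assuming the two rendered views are available as RGB arrays; the function name, `alpha`, and `tint` parameters are illustrative choices, not part of the patent.

```python
import numpy as np

def overlay(normal_rgb, fracture_rgb, fracture_mask, alpha=0.5, tint=(1.0, 0.3, 0.3)):
    """Blend the rendered fracture image over the normal image.

    alpha controls the overlay transparency; tint recolors the fracture
    overlay so it is visually distinct from the normal image.
    """
    out = normal_rgb.astype(float).copy()
    tinted = fracture_rgb.astype(float) * np.asarray(tint)
    m = fracture_mask.astype(bool)
    out[m] = (1.0 - alpha) * out[m] + alpha * tinted[m]
    return np.clip(out, 0.0, 255.0).astype(np.uint8)
```

Re-rendering with a new `alpha` in response to a user control gives the adjustable transparency described above.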
- in FIG. 7, an example of overlaying a 3D fracture image on a 3D normal image is illustrated. However, as shown in FIG. 8, the 3D fracture image 820 and the 3D normal image may also be displayed separately, without an overlay.
- referring to FIG. 9, the user interface window 900 displays an image 910 in which a 3D fracture image is overlaid on a 3D normal image, as in FIG. 7, together with a matching degree 920 for each region. Because the user interface window 900 displays the matching degree in this way, the doctor can easily determine during the operation whether reduction has been achieved, and by checking the degree of matching by area, can verify whether rotation, shortening, or other misalignment of the entire bone has occurred.
- FIG. 10 is a flowchart illustrating a driving control method of a reduction procedure robot according to an embodiment of the present invention.
- first, a plurality of bone images at different angles are photographed for each of a fracture area of a patient and a normal area corresponding to the fracture area (S1010).
- specifically, the plurality of bone images may be taken while rotating around each of the fracture area and the corresponding normal area, using imaging techniques capable of imaging bone, such as X-ray, MRI, and CT.
- the normal region is a bone region located opposite to the bone located in the fracture region. For example, when the fracture region is the left femur, the normal region is the right femur.
- a 3D interpolation image of each of the fracture area and the normal area is generated using the plurality of captured bone images.
- the 3D interpolation image of the generated fracture region and the 3D interpolation image of the generated normal region are displayed together (S1030).
- the 3D interpolation image of the fracture area may be overlaid and displayed on the generated 3D interpolation image of the normal region.
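The steps S1010 to S1030 above can be sketched as a simple pipeline. Here `capture`, `interpolate`, and `display` are hypothetical callables standing in for the photographing unit, the image processor, and the display; they are assumptions for illustration only.

```python
def reduction_display_pipeline(capture, interpolate, display):
    """Sketch of the flow of FIG. 10: photograph multi-angle images of the
    fracture and normal regions, build 3D interpolation images for each,
    and display them together (fracture overlaid on normal)."""
    fracture_imgs = capture("fracture")            # S1010: fracture area
    normal_imgs = capture("normal")                # S1010: corresponding normal area
    fracture_3d = interpolate(fracture_imgs)       # generate 3D interpolation images
    normal_3d = interpolate(normal_imgs)
    display(normal_3d, overlay=fracture_3d)        # S1030: display together
    return fracture_3d, normal_3d
```

In a real system the three callables would wrap the imaging hardware and rendering stack; the point is only the ordering of the steps.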
- as described above, the driving control method according to the present embodiment generates and displays 3D interpolation images corresponding to the fracture area and the normal area, so that a doctor can more easily perform the reduction procedure using the displayed 3D interpolation images.
- the drive control method as shown in FIG. 10 may be executed on the reduction procedure robot 100 having the configuration of FIG. 1 or FIG. 2, or on a reduction procedure robot having another configuration.
- the driving control method as described above may be implemented as a program including an executable algorithm that can be executed on a computer, and the program may be stored and provided in a non-transitory computer-readable medium.
- the non-transitory readable medium refers to a medium that stores data semi-permanently and is readable by a device, as opposed to a medium that stores data for a short time, such as a register, a cache, or a memory.
- examples of the non-transitory readable medium include a CD, a DVD, a hard disk, a Blu-ray disc, a USB memory, a memory card, and a ROM.
- FIG. 11 is a flowchart for describing the driving control method of FIG. 10 in detail.
- first, the doctor connects the fractured area to the robot arm by using the fixing pin, the connection part, and the external fixing part described with reference to FIGS. 3 to 5 (S1110).
- then, a plurality of bone images at different angles are photographed for the fracture area and the normal area corresponding to the fracture area of the patient.
- that is, the bone images are taken after the operation of connecting the fractured area to the robot arm.
- a 3D fracture image and a 3D normal image are generated by using the plurality of photographed bone images.
- a process of generating a 3D interpolation image has been described above with reference to FIG. 6, and thus description thereof will not be repeated.
- the degree of matching is calculated using the generated 3D fracture image and the 3D normal image (S1140). Specifically, the matching degree may be calculated using the outlines or other parameters (e.g., volume) of the two generated 3D interpolation images. The calculation of the matching degree may be performed for each divided region (e.g., upper region, middle region, and lower region).
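One plausible way to realize a per-region matching degree is an intersection-over-union score computed over regions split along the bone axis. The patent does not specify the exact metric, so the IoU used here is an assumption standing in for its outline/volume comparison.

```python
import numpy as np

def matching_degree(vol_fracture, vol_normal, n_regions=3):
    """Per-region matching degree between two aligned binary bone volumes.

    Splits the volumes into n_regions along the bone axis (e.g. upper,
    middle, lower) and scores each region with intersection-over-union;
    the overall degree is the mean of the regional scores.
    """
    scores = []
    for fa, na in zip(np.array_split(vol_fracture, n_regions),
                      np.array_split(vol_normal, n_regions)):
        union = np.logical_or(fa, na).sum()
        inter = np.logical_and(fa, na).sum()
        scores.append(1.0 if union == 0 else inter / union)
    return scores, float(np.mean(scores))
```

The regional scores map directly onto the per-area matching degree 920 shown in the FIG. 9 interface.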
- the movement path of the bone area that can be moved among the fracture areas is calculated.
- the generated 3D fracture image and the 3D normal image are displayed (S1160). Specifically, the 3D fracture image may be overlaid and displayed on the generated 3D normal image. In this case, the matching degree calculated previously may be displayed together.
- a driving command is input from a user (specifically, a doctor).
- the robot arm is moved to move the movable bone area of the fracture area (S1180).
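A minimal sketch of the stop-on-deviation behavior also recited in claim 10: the arm follows the precomputed reduction path and halts if the sensed bone position strays from it. `sense_position`, `move_to`, and the tolerance are hypothetical stand-ins for the position sensing unit, the robot-arm driver, and a safety margin.

```python
import numpy as np

def drive_along_path(path, sense_position, move_to, tol=2.0):
    """Move the bone along a precomputed reduction path, stopping if the
    sensed bone position deviates from the current waypoint by more than
    tol. Returns True if the full path was traversed, False if stopped."""
    for waypoint in path:
        move_to(waypoint)
        actual = np.asarray(sense_position(), dtype=float)
        if np.linalg.norm(actual - np.asarray(waypoint, dtype=float)) > tol:
            return False  # deviation detected: stop and hold current state
    return True
```

On a False return the robot would hold the bone in its current position, matching the stop-command behavior described for the manipulation unit.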
- as described above, the driving control method according to the present embodiment generates and displays 3D interpolation images of the fracture region and the corresponding normal region, and updates the displayed 3D interpolation image in real time in response to bone movement during surgery.
- accordingly, doctors can more easily perform the reduction procedure.
- in addition, using the 3D interpolation images, the reduction procedure can be performed while more precisely checking the rotational state, shortening state, and the like of the entire bone.
- the drive control method as shown in FIG. 11 may be executed on the reduction procedure robot 100 having the configuration of FIG. 2, or on a reduction procedure robot having another configuration.
- likewise, this driving control method may be implemented as a program including an executable algorithm that can be executed on a computer, and the program may be stored and provided in a non-transitory computer-readable medium.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Surgery (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Robotics (AREA)
- Heart & Thoracic Surgery (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Animal Behavior & Ethology (AREA)
- Molecular Biology (AREA)
- Biomedical Technology (AREA)
- Mechanical Engineering (AREA)
- Radiology & Medical Imaging (AREA)
- Multimedia (AREA)
- Quality & Reliability (AREA)
- Geometry (AREA)
- Data Mining & Analysis (AREA)
- Nursing (AREA)
- Artificial Intelligence (AREA)
- General Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Evolutionary Computation (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Biophysics (AREA)
- High Energy & Nuclear Physics (AREA)
- Optics & Photonics (AREA)
- Pathology (AREA)
- Human Computer Interaction (AREA)
- Manipulator (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Claims (25)
- A reduction procedure robot comprising: a photographing unit configured to photograph a plurality of bone images at different angles for each of a fracture region of a patient and a normal region corresponding to the fracture region; an image processing unit configured to generate a three-dimensional interpolation image for each of the fracture region and the normal region by using the plurality of photographed bone images; and a display unit configured to display together the generated three-dimensional interpolation image of the fracture region and the generated three-dimensional interpolation image of the normal region.
- The reduction procedure robot of claim 1, wherein the photographing unit photographs the fracture region and the normal region by using at least one of an X-ray, MRI, and CT apparatus.
- The reduction procedure robot of claim 1, wherein the image processing unit extracts a diaphysis portion of the bone image and generates a three-dimensional interpolation image of the extracted diaphysis portion.
- The reduction procedure robot of claim 1, wherein the image processing unit generates the three-dimensional interpolation image in consideration of at least one of a diameter, a bending, and a length of a bone in the bone image.
- The reduction procedure robot of claim 1, further comprising a position sensing unit configured to sense a position of a movable bone region within the fracture region, wherein the display unit overlaps and displays the three-dimensional interpolation image of the fracture region on the three-dimensional interpolation image of the normal region on the basis of the sensed position.
- The reduction procedure robot of claim 1, further comprising a matching unit configured to extract a bone outline from each of the generated three-dimensional interpolation image of the fracture region and the generated three-dimensional interpolation image of the normal region and to calculate a degree of matching between the extracted bone outlines, wherein the display unit displays the calculated degree of matching.
- The reduction procedure robot of claim 6, wherein the matching unit divides the three-dimensional interpolation image into a plurality of regions and calculates the degree of matching of the bone outlines for each of the divided regions.
- The reduction procedure robot of claim 1, further comprising: a driving unit fixed to one side of a movable bone region within the fracture region and configured to move the bone region; and a manipulation unit configured to receive a control command for controlling an operation of the driving unit.
- The reduction procedure robot of claim 8, wherein the manipulation unit receives a stop command as the control command, and the driving unit maintains the position of the bone region in its current state when the stop command is input.
- The reduction procedure robot of claim 8, further comprising: a path calculation unit configured to calculate a movement path for reducing the movable bone region within the fracture region; and a control unit configured to stop the operation of the driving unit when the bone region deviates from the calculated movement path.
- The reduction procedure robot of claim 8, wherein the driving unit is connected to one side of the bone region through a fixing pin, a connection part, and an external fixing part.
- The reduction procedure robot of claim 11, wherein the fixing pin includes a spiral region to be inserted into the fractured bone region and a protruding threshold region that allows the spiral region to be inserted only to a predetermined depth.
- The reduction procedure robot of claim 12, wherein an outer shape of the threshold region is polygonal with respect to a central axis of the fixing pin.
- The reduction procedure robot of claim 11, wherein the connection part includes a first region capable of fixing the fixing pin on one side and a second region capable of fixing the external fixing part on the other side.
- The reduction procedure robot of claim 11, wherein the external fixing part has at least one of a U shape, an annular ring shape, a semicircular shape, and a straight shape.
- A method of controlling driving of a reduction procedure robot, the method comprising: photographing a plurality of bone images at different angles for each of a fracture region of a patient and a normal region corresponding to the fracture region; generating a three-dimensional interpolation image for each of the fracture region and the normal region by using the plurality of photographed bone images; and displaying together the generated three-dimensional interpolation image of the fracture region and the generated three-dimensional interpolation image of the normal region.
- The method of claim 16, wherein the photographing comprises photographing the fracture region and the normal region by using at least one of an X-ray, MRI, and CT apparatus.
- The method of claim 16, wherein the generating of the three-dimensional interpolation image comprises extracting a diaphysis portion of the bone image and generating a three-dimensional interpolation image of the extracted diaphysis portion.
- The method of claim 16, wherein the generating of the three-dimensional interpolation image comprises generating the three-dimensional interpolation image in consideration of at least one of a diameter, a bending, and a length of a bone in the bone image.
- The method of claim 16, further comprising sensing a position of a movable bone region within the fracture region, wherein the displaying comprises overlapping and displaying the three-dimensional interpolation image of the fracture region on the three-dimensional interpolation image of the normal region on the basis of the sensed position.
- The method of claim 16, further comprising extracting a bone outline from each of the generated three-dimensional interpolation image of the fracture region and the generated three-dimensional interpolation image of the normal region and calculating a degree of matching between the extracted bone outlines, wherein the displaying comprises displaying the calculated degree of matching.
- The method of claim 21, wherein the calculating comprises dividing the three-dimensional interpolation image into a plurality of regions and calculating the degree of matching of the bone outlines for each of the divided regions.
- The method of claim 16, further comprising: receiving a control command for controlling driving of the reduction procedure robot; and driving to move a movable bone region within the fracture region according to the input control command.
- The method of claim 23, wherein the receiving of the control command comprises receiving a stop command, and the driving comprises maintaining the position of the bone region in its current state when the stop command is input.
- The method of claim 23, further comprising: calculating a movement path for reducing the movable bone region within the fracture region; and stopping the driving of the reduction procedure robot when the bone region deviates from the calculated movement path.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/443,027 US10015470B2 (en) | 2012-11-16 | 2013-11-15 | Robot for repositioning procedure, and method for controlling operation thereof |
JP2015542951A JP6291503B2 (ja) | 2012-11-16 | 2013-11-15 | Reduction procedure robot and drive control method thereof |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2012-0130543 | 2012-11-16 | ||
KR1020120130543A KR101433242B1 (ko) | 2012-11-16 | 2012-11-16 | Robot for reduction procedure and drive control method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014077613A1 true WO2014077613A1 (ko) | 2014-05-22 |
Family
ID=50731456
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2013/010387 WO2014077613A1 (ko) | Robot for reduction procedure and drive control method thereof | |
Country Status (4)
Country | Link |
---|---|
US (1) | US10015470B2 (ko) |
JP (1) | JP6291503B2 (ko) |
KR (1) | KR101433242B1 (ko) |
WO (1) | WO2014077613A1 (ko) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105643640A (zh) * | 2014-11-12 | 2016-06-08 | 沈阳新松机器人自动化股份有限公司 | Joint positioning and recognition system |
CN111590584A (zh) * | 2020-05-27 | 2020-08-28 | 京东方科技集团股份有限公司 | Method and device for determining safety limit zone, reduction method, and medical robot |
WO2021142213A1 (en) * | 2020-01-09 | 2021-07-15 | Smith & Nephew, Inc. | Methods and arrangements to describe deformity of a bone |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6951116B2 (ja) * | 2016-05-11 | 2021-10-20 | キヤノンメディカルシステムズ株式会社 | 医用画像処理装置、医用画像診断装置、及び画像処理方法 |
KR101871601B1 (ko) * | 2016-11-30 | 2018-06-26 | 한국과학기술연구원 | 안와벽 재건술을 위한 수술 계획 생성 방법, 이를 수행하는 수술 계획 생성 서버, 및 이를 저장하는 기록매체 |
KR101961682B1 (ko) * | 2017-07-13 | 2019-07-17 | 재단법인대구경북과학기술원 | 골절 교정을 위한 내비게이션 장치 및 방법 |
KR102168431B1 (ko) * | 2017-10-24 | 2020-10-21 | 경북대학교 산학협력단 | 능동형 견인장치 및 이의 제어 방법 |
CN109820590B (zh) * | 2019-02-15 | 2024-04-12 | 中国人民解放军总医院 | 一种骨盆骨折复位智能监控系统 |
US11109914B2 (en) | 2019-03-08 | 2021-09-07 | DePuy Synthes Products, Inc. | Outcome-based splinting |
KR102081834B1 (ko) * | 2019-09-03 | 2020-02-26 | 의료법인 명지의료재단 | 종골 골절의 정복 판단 방법 |
EP4065023A1 (en) * | 2019-11-26 | 2022-10-05 | Howmedica Osteonics Corporation | Pre-operative planning and intra operative guidance for orthopedic surgical procedures in cases of bone fragmentation |
KR102581628B1 (ko) * | 2020-12-18 | 2023-09-21 | 가톨릭관동대학교산학협력단 | 임플란트 자동 컨투어링 시스템 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20000011134A (ko) * | 1996-05-15 | 2000-02-25 | 인드래니 머캐르지 | 정위 수술 방법 및 장치 |
WO2003105659A2 (en) * | 2002-06-17 | 2003-12-24 | Mazor Surgical Technologies Ltd. | Robot for use with orthopaedic inserts |
US20090054910A1 (en) * | 2006-12-15 | 2009-02-26 | Ao Technology Ag | Method and device for computer assisted distal locking of intramedullary nails |
KR20100081588A (ko) * | 2009-01-06 | 2010-07-15 | 삼성전자주식회사 | 로봇 및 그 제어방법 |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5546942A (en) * | 1994-06-10 | 1996-08-20 | Zhang; Zhongman | Orthopedic robot and method for reduction of long-bone fractures |
US20020045812A1 (en) * | 1996-02-01 | 2002-04-18 | Shlomo Ben-Haim | Implantable sensor for determining position coordinates |
US5799055A (en) | 1996-05-15 | 1998-08-25 | Northwestern University | Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy |
US8944070B2 (en) * | 1999-04-07 | 2015-02-03 | Intuitive Surgical Operations, Inc. | Non-force reflecting method for providing tool force information to a user of a telesurgical system |
US20040068187A1 (en) * | 2000-04-07 | 2004-04-08 | Krause Norman M. | Computer-aided orthopedic surgery |
US20020194023A1 (en) | 2001-06-14 | 2002-12-19 | Turley Troy A. | Online fracture management system and associated method |
JP4056791B2 (ja) * | 2002-05-22 | 2008-03-05 | 策雄 米延 | 骨折整復誘導装置 |
US8750960B2 (en) * | 2002-07-19 | 2014-06-10 | Warsaw Orthopedic, Inc. | Process for selecting bone for transplantation |
JP5323885B2 (ja) | 2003-02-12 | 2013-10-23 | 剛 村瀬 | 骨矯正のための方法、部材、システムおよびプログラム |
US8860753B2 (en) * | 2004-04-13 | 2014-10-14 | University Of Georgia Research Foundation, Inc. | Virtual surgical system and methods |
US9782229B2 (en) * | 2007-02-16 | 2017-10-10 | Globus Medical, Inc. | Surgical robot platform |
US7835559B1 (en) * | 2007-04-27 | 2010-11-16 | The Board Of Trustees Of The Leland Stanford Junior University | Method and apparatus for quantitative and comparative analysis of image intensities in radiographs |
US20090015680A1 (en) * | 2007-07-10 | 2009-01-15 | Harris David P | Device, System and Method for Aligning Images |
JP4369969B2 (ja) * | 2007-10-26 | 2009-11-25 | 株式会社アールビーエス | 生体適合性材料並びにその製造方法 |
US8600149B2 (en) * | 2008-08-25 | 2013-12-03 | Telesecurity Sciences, Inc. | Method and system for electronic inspection of baggage and cargo |
US20110033516A1 (en) * | 2009-08-06 | 2011-02-10 | Medical University Of South Carolina | Methods and compositions for bone healing by periostin |
US8870799B2 (en) * | 2010-05-28 | 2014-10-28 | Fixes 4 Kids Inc. | Systems, devices, and methods for mechanically reducing and fixing bone fractures |
US20110313479A1 (en) * | 2010-06-22 | 2011-12-22 | Philip Rubin | System and method for human anatomic mapping and positioning and therapy targeting |
JP2012075702A (ja) * | 2010-10-01 | 2012-04-19 | Fujifilm Corp | 管状構造物内画像再構成装置、管状構造物内画像再構成方法および管状構造物内画像再構成プログラム |
WO2012056686A1 (ja) * | 2010-10-27 | 2012-05-03 | パナソニック株式会社 | 3次元画像補間装置、3次元撮像装置および3次元画像補間方法 |
JP5606423B2 (ja) * | 2011-11-04 | 2014-10-15 | 三菱電機株式会社 | ロボットハンド及びフィンガー交換装置 |
CN103903303B (zh) * | 2012-12-27 | 2018-01-30 | 清华大学 | 三维模型创建方法和设备 |
2012
- 2012-11-16 KR KR1020120130543A patent/KR101433242B1/ko active IP Right Grant
2013
- 2013-11-15 WO PCT/KR2013/010387 patent/WO2014077613A1/ko active Application Filing
- 2013-11-15 JP JP2015542951A patent/JP6291503B2/ja active Active
- 2013-11-15 US US14/443,027 patent/US10015470B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20000011134A (ko) * | 1996-05-15 | 2000-02-25 | 인드래니 머캐르지 | 정위 수술 방법 및 장치 |
WO2003105659A2 (en) * | 2002-06-17 | 2003-12-24 | Mazor Surgical Technologies Ltd. | Robot for use with orthopaedic inserts |
US20090054910A1 (en) * | 2006-12-15 | 2009-02-26 | Ao Technology Ag | Method and device for computer assisted distal locking of intramedullary nails |
KR20100081588A (ko) * | 2009-01-06 | 2010-07-15 | 삼성전자주식회사 | 로봇 및 그 제어방법 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105643640A (zh) * | 2014-11-12 | 2016-06-08 | 沈阳新松机器人自动化股份有限公司 | Joint positioning and recognition system |
WO2021142213A1 (en) * | 2020-01-09 | 2021-07-15 | Smith & Nephew, Inc. | Methods and arrangements to describe deformity of a bone |
CN111590584A (zh) * | 2020-05-27 | 2020-08-28 | 京东方科技集团股份有限公司 | Method and device for determining safety limit zone, reduction method, and medical robot |
WO2021238876A1 (zh) | 2020-05-27 | 2021-12-02 | 京东方科技集团股份有限公司 | Method and device for determining safety limit zone, reduction method, and medical robot |
CN111590584B (zh) * | 2020-05-27 | 2021-12-10 | 京东方科技集团股份有限公司 | Method and device for determining safety limit zone, reduction method, and medical robot |
Also Published As
Publication number | Publication date |
---|---|
JP6291503B2 (ja) | 2018-03-14 |
JP2016503322A (ja) | 2016-02-04 |
US10015470B2 (en) | 2018-07-03 |
KR101433242B1 (ko) | 2014-08-25 |
KR20140063321A (ko) | 2014-05-27 |
US20150341615A1 (en) | 2015-11-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014077613A1 (ko) | Robot for reduction procedure and drive control method thereof | |
WO2018056544A1 (ko) | Augmented reality system for dental surgery and implementation method therefor | |
WO2017192020A1 (ko) | Dental three-dimensional data processing apparatus and method therefor | |
WO2019132427A1 (ko) | Laser target projection apparatus and control method therefor, and laser surgery guidance system including laser target projection apparatus | |
KR20220025186A (ko) | Universal apparatus and method for integrating diagnostic testing into real-time treatment | |
JP2004049912A (ja) | Method and apparatus for positioning a patient in a medical diagnostic or therapeutic device | |
WO2013095032A1 (ko) | Method for automatically detecting the mid-sagittal plane using ultrasound images, and apparatus therefor | |
WO2017142223A1 (en) | Remote image transmission system, display apparatus, and guide displaying method thereof | |
WO2019132165A1 (ko) | Method and program for providing feedback on surgical outcome | |
WO2022035110A1 (ko) | User terminal for providing augmented reality medical image and method for providing augmented reality medical image | |
WO2018097596A1 (ko) | Radiography guide system and method | |
WO2021071336A1 (ko) | Smart glasses display device based on gaze detection | |
WO2019132244A1 (ko) | Method and program for generating surgical simulation information | |
WO2010058927A2 (ko) | Facial image photographing apparatus | |
JP2015019777A (ja) | Body position determination support device and medical image diagnostic device | |
WO2021162287A1 (ko) | Method for verifying registration of surgical target, apparatus therefor, and system including same | |
WO2021153973A1 (ko) | Apparatus and method for providing joint replacement robotic surgery information | |
WO2021045546A2 (ko) | Robot position guide apparatus, method therefor, and system including same | |
WO2021054659A2 (ko) | Surgical navigation apparatus and method therefor | |
WO2012108578A1 (ko) | Bone motion sensing and path correction system using a three-dimensional optical measuring instrument | |
WO2018080131A1 (ko) | Apparatus and method for rearranging protocols in a radiography system | |
WO2020197109A1 (ko) | Dental image registration apparatus and method | |
WO2020185049A2 (ko) | Pedicle screw fixation planning system and method | |
WO2021015449A2 (ko) | Method and apparatus for providing surgical location information | |
WO2022225132A1 (ko) | Augmented-reality-based medical information visualization system using landmarks, and method therefor | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13856000 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14443027 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2015542951 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13856000 Country of ref document: EP Kind code of ref document: A1 |